Co-active learning to adapt humanoid movement for manipulation

Ren Mao, John S. Baras, Yezhou Yang, Cornelia Fermüller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper we address the problem of interactive robot movement adaptation under various environmental constraints. A common approach is to use motion primitives to generate target motions from demonstrations; however, motion primitives generalize poorly to novel environments. Moreover, traditional motion generation methods do not account for the varied constraints imposed by different users, tasks, and environments. In this work, we propose a co-active learning framework for adapting the movement of robot end-effectors in manipulation tasks. It is designed to adapt the original imitation trajectories, learned from demonstrations, to novel situations with different constraints. The framework also incorporates user feedback on the adapted trajectories and learns to adapt movement through human-in-the-loop interaction. Experiments on a humanoid platform validate the effectiveness of our approach.
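The abstract gives no implementation details; the following Python snippet is only a minimal sketch of a generic co-active (preference-perceptron-style) trajectory update under environmental constraints, not the authors' formulation. The feature map trajectory_features, the path-length and obstacle-clearance features, and the step size eta are illustrative assumptions.

import numpy as np

def trajectory_features(trajectory, constraints):
    # Hypothetical feature map: summarizes an end-effector trajectory
    # (T x 3 array of waypoints) by its path length and its mean
    # clearance to obstacles given in the constraint dictionary.
    obstacles = constraints.get("obstacles", np.zeros((0, 3)))
    path_len = np.sum(np.linalg.norm(np.diff(trajectory, axis=0), axis=1))
    if len(obstacles) > 0:
        dists = np.linalg.norm(trajectory[:, None, :] - obstacles[None, :, :], axis=2)
        clearance = dists.min(axis=1).mean()
    else:
        clearance = 1.0
    return np.array([path_len, clearance])

def coactive_update(w, proposed_traj, feedback_traj, constraints, eta=0.1):
    # One co-active update: shift the weight vector toward the features of
    # the user's slightly improved feedback trajectory and away from the
    # features of the trajectory the robot proposed.
    phi_feedback = trajectory_features(feedback_traj, constraints)
    phi_proposed = trajectory_features(proposed_traj, constraints)
    return w + eta * (phi_feedback - phi_proposed)

# Human-in-the-loop iteration: the robot proposes an adapted imitation
# trajectory, the user nudges it to better satisfy a constraint, and the
# weights are updated from that implicit preference.
w = np.zeros(2)
proposed = np.linspace([0.0, 0.0, 0.0], [1.0, 0.0, 0.0], 20)  # initial imitation path
feedback = proposed + np.array([0.0, 0.05, 0.0])              # user pushes the path aside
constraints = {"obstacles": np.array([[0.5, -0.1, 0.0]])}
w = coactive_update(w, proposed, feedback, constraints)

Repeating this propose-feedback-update loop over several interactions is the general pattern of co-active learning; how the learned weights are then used to regenerate trajectories is specific to the paper and not reproduced here.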

Original language: English (US)
Title of host publication: Humanoids 2016 - IEEE-RAS International Conference on Humanoid Robots
Publisher: IEEE Computer Society
Pages: 372-378
Number of pages: 7
ISBN (Electronic): 9781509047185
DOIs
State: Published - Dec 30 2016
Event: 16th IEEE-RAS International Conference on Humanoid Robots, Humanoids 2016 - Cancun, Mexico
Duration: Nov 15 2016 - Nov 17 2016

Other

Other: 16th IEEE-RAS International Conference on Humanoid Robots, Humanoids 2016
Country: Mexico
City: Cancun
Period: 11/15/16 - 11/17/16

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Hardware and Architecture
  • Human-Computer Interaction
  • Electrical and Electronic Engineering
