Co-active learning to adapt humanoid movement for manipulation

Ren Mao, John S. Baras, Yezhou Yang, Cornelia Fermüller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper we address the problem of interactive robot movement adaptation under various environmental constraints. A common approach is to adopt motion primitives to generate target motions from demonstrations. However, their generalization capability is weak for novel environments. Additionally, traditional motion generation methods do not consider versatile constraints from different users, tasks, and environments. In this work, we propose a co-active learning framework for learning to adapt the movement of robot end-effectors for manipulation tasks. It is designed to adapt the original imitation trajectories, which are learned from demonstrations, to novel situations with different constraints. The framework also considers user feedback towards the adapted trajectories, and it learns to adapt movement through human-in-the-loop interactions. Experiments on a humanoid platform validate the effectiveness of our approach.
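The loop described in the abstract follows the general co-active learning pattern: the robot proposes a trajectory, the user returns a slightly improved one, and the learned utility weights are nudged toward the user's correction. The sketch below illustrates that pattern with a preference-perceptron-style update; the feature choices, function names, and learning rate are hypothetical and are not taken from the paper.

```python
import numpy as np

# Illustrative co-active learning round for end-effector trajectory
# adaptation. Features and update rule are a hypothetical sketch,
# not the paper's actual formulation.

def features(trajectory):
    """Map a trajectory (N x 3 array of end-effector waypoints) to a
    small feature vector: mean height, path length, minimum height.
    Purely illustrative feature choices."""
    traj = np.asarray(trajectory, dtype=float)
    heights = traj[:, 2]
    segments = np.diff(traj, axis=0)
    path_length = np.linalg.norm(segments, axis=1).sum()
    return np.array([heights.mean(), path_length, heights.min()])

def coactive_update(w, proposed, corrected, lr=0.5):
    """Move the utility weights toward the user's improved trajectory:
    w <- w + lr * (phi(corrected) - phi(proposed))."""
    return w + lr * (features(corrected) - features(proposed))

# One human-in-the-loop round: the robot proposes a flat reach, and the
# user lifts the mid-point (e.g. to clear an obstacle on the table).
w = np.zeros(3)
proposed  = [[0.0, 0.0, 0.1], [0.5, 0.0, 0.1], [1.0, 0.0, 0.1]]
corrected = [[0.0, 0.0, 0.1], [0.5, 0.0, 0.3], [1.0, 0.0, 0.1]]
w = coactive_update(w, proposed, corrected)
# The weights now favor higher, slightly longer paths like the user's.
```

Repeating this update over successive interactions is what lets the adapted trajectories drift toward the constraints the user implicitly encodes in their corrections.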

Original language: English (US)
Title of host publication: Humanoids 2016 - IEEE-RAS International Conference on Humanoid Robots
Publisher: IEEE Computer Society
Pages: 372-378
Number of pages: 7
ISBN (Electronic): 9781509047185
DOIs: 10.1109/HUMANOIDS.2016.7803303
State: Published - Dec 30 2016
Event: 16th IEEE-RAS International Conference on Humanoid Robots, Humanoids 2016 - Cancun, Mexico
Duration: Nov 15 2016 - Nov 17 2016



ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Hardware and Architecture
  • Human-Computer Interaction
  • Electrical and Electronic Engineering

Cite this

Mao, R., Baras, J. S., Yang, Y., & Fermüller, C. (2016). Co-active learning to adapt humanoid movement for manipulation. In Humanoids 2016 - IEEE-RAS International Conference on Humanoid Robots (pp. 372-378). [7803303] IEEE Computer Society. https://doi.org/10.1109/HUMANOIDS.2016.7803303

@inproceedings{84bcad616d7e46a89f57854c72bbc1ec,
title = "Co-active learning to adapt humanoid movement for manipulation",
abstract = "In this paper we address the problem of interactive robot movement adaptation under various environmental constraints. A common approach is to adopt motion primitives to generate target motions from demonstrations. However, their generalization capability is weak for novel environments. Additionally, traditional motion generation methods do not consider versatile constraints from different users, tasks, and environments. In this work, we propose a co-active learning framework for learning to adapt the movement of robot end-effectors for manipulation tasks. It is designed to adapt the original imitation trajectories, which are learned from demonstrations, to novel situations with different constraints. The framework also considers user feedback towards the adapted trajectories, and it learns to adapt movement through human-in-the-loop interactions. Experiments on a humanoid platform validate the effectiveness of our approach.",
author = "Mao, Ren and Baras, {John S.} and Yang, Yezhou and Ferm{\"u}ller, Cornelia",
year = "2016",
month = "12",
day = "30",
doi = "10.1109/HUMANOIDS.2016.7803303",
language = "English (US)",
pages = "372--378",
booktitle = "Humanoids 2016 - IEEE-RAS International Conference on Humanoid Robots",
publisher = "IEEE Computer Society",
address = "United States",
}

TY - GEN
T1 - Co-active learning to adapt humanoid movement for manipulation
AU - Mao, Ren
AU - Baras, John S.
AU - Yang, Yezhou
AU - Fermüller, Cornelia
PY - 2016/12/30
Y1 - 2016/12/30
N2 - In this paper we address the problem of interactive robot movement adaptation under various environmental constraints. A common approach is to adopt motion primitives to generate target motions from demonstrations. However, their generalization capability is weak for novel environments. Additionally, traditional motion generation methods do not consider versatile constraints from different users, tasks, and environments. In this work, we propose a co-active learning framework for learning to adapt the movement of robot end-effectors for manipulation tasks. It is designed to adapt the original imitation trajectories, which are learned from demonstrations, to novel situations with different constraints. The framework also considers user feedback towards the adapted trajectories, and it learns to adapt movement through human-in-the-loop interactions. Experiments on a humanoid platform validate the effectiveness of our approach.
AB - In this paper we address the problem of interactive robot movement adaptation under various environmental constraints. A common approach is to adopt motion primitives to generate target motions from demonstrations. However, their generalization capability is weak for novel environments. Additionally, traditional motion generation methods do not consider versatile constraints from different users, tasks, and environments. In this work, we propose a co-active learning framework for learning to adapt the movement of robot end-effectors for manipulation tasks. It is designed to adapt the original imitation trajectories, which are learned from demonstrations, to novel situations with different constraints. The framework also considers user feedback towards the adapted trajectories, and it learns to adapt movement through human-in-the-loop interactions. Experiments on a humanoid platform validate the effectiveness of our approach.
UR - http://www.scopus.com/inward/record.url?scp=85010203554&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85010203554&partnerID=8YFLogxK
U2 - 10.1109/HUMANOIDS.2016.7803303
DO - 10.1109/HUMANOIDS.2016.7803303
M3 - Conference contribution
AN - SCOPUS:85010203554
SP - 372
EP - 378
BT - Humanoids 2016 - IEEE-RAS International Conference on Humanoid Robots
PB - IEEE Computer Society
ER -