A system for learning continuous human-robot interactions from human-human demonstrations

David Vogt, Simon Stepputtis, Steve Grehl, Bernhard Jung, Hani Ben Amor

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

13 Citations (Scopus)

Abstract

We present a data-driven imitation learning system for learning human-robot interactions from human-human demonstrations. During training, the movements of two interaction partners are recorded through motion capture and an interaction model is learned. At runtime, the interaction model is used to continuously adapt the robot's motion, both spatially and temporally, to the movements of the human interaction partner. We show the effectiveness of the approach on complex, sequential tasks by presenting two applications involving collaborative human-robot assembly. Experiments with varied object hand-over positions and task execution speeds confirm the capabilities for spatio-temporal adaptation of the demonstrated behavior to the current situation.
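
The abstract describes a two-stage pipeline: learn a joint model of both partners' motions from motion-capture recordings, then condition that model on the human's observed movements at runtime to adapt the robot's motion in space and time. As a rough sketch of this general idea only (not the authors' implementation; the joint-Gaussian trajectory model, the nearest-window phase estimator, and all names below are illustrative assumptions, in the spirit of interaction-primitive approaches):

import numpy as np


class JointInteractionModel:
    """Joint Gaussian over time-aligned human and robot trajectory
    features, fit to paired demonstrations (in the human-human
    recordings, one partner's motion later stands in for the robot)."""

    def __init__(self, human_demos, robot_demos):
        # Each array has shape (n_demos, n_features): flattened mocap
        # trajectories resampled to a common length.
        joint = np.hstack([human_demos, robot_demos])
        self.d_h = human_demos.shape[1]
        self.mu = joint.mean(axis=0)
        # A small ridge keeps the covariance invertible with few demos.
        self.cov = np.cov(joint, rowvar=False) + 1e-6 * np.eye(joint.shape[1])

    def predict_robot(self, human_obs):
        # Spatial adaptation: condition the joint Gaussian on the
        # observed human features to get the expected robot features,
        # mu_r + S_rh S_hh^{-1} (human_obs - mu_h).
        d = self.d_h
        mu_h, mu_r = self.mu[:d], self.mu[d:]
        cov_hh = self.cov[:d, :d]
        cov_rh = self.cov[d:, :d]
        return mu_r + cov_rh @ np.linalg.solve(cov_hh, human_obs - mu_h)


def estimate_phase(prefix, mean_human):
    # Temporal adaptation, crudely: slide the observed prefix (t, dof)
    # along the mean demonstrated human trajectory (T, dof) and report
    # how far along the best match ends. A real system would use
    # dynamic time warping or a probabilistic phase estimator instead.
    t = len(prefix)
    errors = [np.linalg.norm(prefix - mean_human[s:s + t])
              for s in range(len(mean_human) - t + 1)]
    return (int(np.argmin(errors)) + t) / len(mean_human)

At runtime one would loop: estimate the interaction phase from the latest mocap prefix, then re-condition the model to update the robot's remaining trajectory. The paper's system does this continuously; the one-shot conditioning above only approximates that behavior.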

Original language: English (US)
Title of host publication: ICRA 2017 - IEEE International Conference on Robotics and Automation
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2882-2889
Number of pages: 8
ISBN (Electronic): 9781509046331
DOIs: 10.1109/ICRA.2017.7989334
State: Published - Jul 21 2017
Event: 2017 IEEE International Conference on Robotics and Automation, ICRA 2017 - Singapore, Singapore
Duration: May 29 2017 – Jun 3 2017

Other

Other: 2017 IEEE International Conference on Robotics and Automation, ICRA 2017
Country: Singapore
City: Singapore
Period: 5/29/17 – 6/3/17

Fingerprint

  • Human robot interaction
  • Demonstrations
  • Robots
  • End effectors
  • Learning systems
  • Experiments

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Artificial Intelligence
  • Electrical and Electronic Engineering

Cite this

Vogt, D., Stepputtis, S., Grehl, S., Jung, B., & Ben Amor, H. (2017). A system for learning continuous human-robot interactions from human-human demonstrations. In ICRA 2017 - IEEE International Conference on Robotics and Automation (pp. 2882-2889). [7989334] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICRA.2017.7989334

@inproceedings{5553f6aefe364e65bf28a089b28043ff,
  title = "A system for learning continuous human-robot interactions from human-human demonstrations",
  abstract = "We present a data-driven imitation learning system for learning human-robot interactions from human-human demonstrations. During training, the movements of two interaction partners are recorded through motion capture and an interaction model is learned. At runtime, the interaction model is used to continuously adapt the robot's motion, both spatially and temporally, to the movements of the human interaction partner. We show the effectiveness of the approach on complex, sequential tasks by presenting two applications involving collaborative human-robot assembly. Experiments with varied object hand-over positions and task execution speeds confirm the capabilities for spatio-temporal adaptation of the demonstrated behavior to the current situation.",
  author = "David Vogt and Simon Stepputtis and Steve Grehl and Bernhard Jung and {Ben Amor}, Hani",
  year = "2017",
  month = jul,
  day = "21",
  doi = "10.1109/ICRA.2017.7989334",
  language = "English (US)",
  isbn = "9781509046331",
  pages = "2882--2889",
  booktitle = "ICRA 2017 - IEEE International Conference on Robotics and Automation",
  publisher = "Institute of Electrical and Electronics Engineers Inc.",
  address = "United States",
}

TY  - GEN
T1  - A system for learning continuous human-robot interactions from human-human demonstrations
AU  - Vogt, David
AU  - Stepputtis, Simon
AU  - Grehl, Steve
AU  - Jung, Bernhard
AU  - Ben Amor, Hani
PY  - 2017/7/21
Y1  - 2017/7/21
N2  - We present a data-driven imitation learning system for learning human-robot interactions from human-human demonstrations. During training, the movements of two interaction partners are recorded through motion capture and an interaction model is learned. At runtime, the interaction model is used to continuously adapt the robot's motion, both spatially and temporally, to the movements of the human interaction partner. We show the effectiveness of the approach on complex, sequential tasks by presenting two applications involving collaborative human-robot assembly. Experiments with varied object hand-over positions and task execution speeds confirm the capabilities for spatio-temporal adaptation of the demonstrated behavior to the current situation.
AB  - We present a data-driven imitation learning system for learning human-robot interactions from human-human demonstrations. During training, the movements of two interaction partners are recorded through motion capture and an interaction model is learned. At runtime, the interaction model is used to continuously adapt the robot's motion, both spatially and temporally, to the movements of the human interaction partner. We show the effectiveness of the approach on complex, sequential tasks by presenting two applications involving collaborative human-robot assembly. Experiments with varied object hand-over positions and task execution speeds confirm the capabilities for spatio-temporal adaptation of the demonstrated behavior to the current situation.
UR  - http://www.scopus.com/inward/record.url?scp=85028018545&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=85028018545&partnerID=8YFLogxK
U2  - 10.1109/ICRA.2017.7989334
DO  - 10.1109/ICRA.2017.7989334
M3  - Conference contribution
AN  - SCOPUS:85028018545
SP  - 2882
EP  - 2889
BT  - ICRA 2017 - IEEE International Conference on Robotics and Automation
PB  - Institute of Electrical and Electronics Engineers Inc.
ER  -