Robots that anticipate pain: Anticipating physical perturbations from visual cues through deep predictive models

Indranil Sur, Hani Ben Amor

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

To ensure system integrity, robots need to proactively avoid any unwanted physical perturbation that may cause damage to the underlying hardware. In this paper, we investigate a machine learning approach that allows robots to anticipate impending physical perturbations from perceptual cues. In contrast to other approaches that require knowledge about sources of perturbation to be encoded before deployment, our method is based on experiential learning. Robots learn to associate visual cues with subsequent physical perturbations and contacts. In turn, these extracted visual cues are then used to predict potential future perturbations acting on the robot. To this end, we introduce a novel deep network architecture which combines multiple sub-networks for dealing with robot dynamics and perceptual input from the environment. We present a self-supervised approach for training the system that does not require any labeling of training data. Extensive experiments in a human-robot interaction task show that a robot can learn to predict physical contact by a human interaction partner without any prior information or labeling.
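
The abstract above describes a two-stream design (a perception sub-network plus a robot-dynamics sub-network) trained self-supervised against the robot's own subsequently measured contacts. Purely as an illustration, the Python sketch below (written with PyTorch, which the paper does not necessarily use) shows one way such a predictive model could be wired up; the layer sizes, joint and wrench dimensions, and prediction horizon are assumptions for this example and are not taken from the paper.

# Illustrative sketch only, not the authors' exact architecture: a two-stream
# model that fuses a camera frame with the robot's joint state to predict the
# external wrench (perturbation) expected a few time steps ahead. The target
# is the wrench the robot actually measures later, so no manual labels are
# needed, in the self-supervised spirit described in the abstract.
import torch
import torch.nn as nn

class PerturbationPredictor(nn.Module):
    def __init__(self, joint_dim=7, wrench_dim=6):  # dimensions are assumed
        super().__init__()
        # Perception sub-network: encodes an RGB frame into a feature vector.
        self.vision = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Dynamics sub-network: encodes joint positions and velocities.
        self.dynamics = nn.Sequential(
            nn.Linear(joint_dim * 2, 64), nn.ReLU(),
        )
        # Fusion head: regresses the wrench expected k steps in the future.
        self.head = nn.Sequential(
            nn.Linear(32 + 64, 64), nn.ReLU(),
            nn.Linear(64, wrench_dim),
        )

    def forward(self, image, joint_state):
        z = torch.cat([self.vision(image), self.dynamics(joint_state)], dim=-1)
        return self.head(z)

# Self-supervised training step: the wrench measured k steps after the
# observation serves as the regression target for that observation.
model = PerturbationPredictor()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(image_t, joint_t, wrench_t_plus_k):
    optim.zero_grad()
    loss = loss_fn(model(image_t, joint_t), wrench_t_plus_k)
    loss.backward()
    optim.step()
    return loss.item()
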

Original language: English (US)
Title of host publication: IROS 2017 - IEEE/RSJ International Conference on Intelligent Robots and Systems
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 5541-5548
Number of pages: 8
Volume: 2017-September
ISBN (Electronic): 9781538626825
DOIs: 10.1109/IROS.2017.8206442
State: Published - Dec 13 2017
Event: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017 - Vancouver, Canada
Duration: Sep 24 2017 – Sep 28 2017

Other

Other: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2017
Country: Canada
City: Vancouver
Period: 9/24/17 – 9/28/17

Fingerprint

  • Robots
  • Labeling
  • Human robot interaction
  • Network architecture
  • Learning systems
  • Hardware
  • Experiments

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Computer Vision and Pattern Recognition
  • Computer Science Applications

Cite this

Sur, I., & Ben Amor, H. (2017). Robots that anticipate pain: Anticipating physical perturbations from visual cues through deep predictive models. In IROS 2017 - IEEE/RSJ International Conference on Intelligent Robots and Systems (Vol. 2017-September, pp. 5541-5548). [8206442] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IROS.2017.8206442

@inproceedings{0defda706c464b6592dfbd227bcf4720,
title = "Robots that anticipate pain: Anticipating physical perturbations from visual cues through deep predictive models",
abstract = "To ensure system integrity, robots need to proactively avoid any unwanted physical perturbation that may cause damage to the underlying hardware. In this paper, we investigate a machine learning approach that allows robots to anticipate impending physical perturbations from perceptual cues. In contrast to other approaches that require knowledge about sources of perturbation to be encoded before deployment, our method is based on experiential learning. Robots learn to associate visual cues with subsequent physical perturbations and contacts. In turn, these extracted visual cues are then used to predict potential future perturbations acting on the robot. To this end, we introduce a novel deep network architecture which combines multiple sub-networks for dealing with robot dynamics and perceptual input from the environment. We present a self-supervised approach for training the system that does not require any labeling of training data. Extensive experiments in a human-robot interaction task show that a robot can learn to predict physical contact by a human interaction partner without any prior information or labeling.",
author = "Indranil Sur and {Ben Amor}, Hani",
year = "2017",
month = "12",
day = "13",
doi = "10.1109/IROS.2017.8206442",
language = "English (US)",
volume = "2017-September",
pages = "5541--5548",
booktitle = "IROS 2017 - IEEE/RSJ International Conference on Intelligent Robots and Systems",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
address = "United States",

}
