Optimal Use of Verbal Instructions for Multi-robot Human Navigation Guidance

Harel Yedidsion, Jacqueline Deans, Connor Sheehan, Mahathi Chillara, Justin Hart, Peter Stone, Raymond J. Mooney

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Efficiently guiding humans in indoor environments is a challenging open problem. Due to recent advances in mobile robotics and natural language processing, it has become possible to consider doing so with the help of mobile, verbally communicating robots. In the past, stationary verbal robots have been used for this purpose at Microsoft Research, and mobile non-verbal robots have been used at UT Austin in their multi-robot human guidance system. This paper extends that mobile multi-robot human guidance research by adding natural language instructions, which are dynamically generated from the robots’ path planner, and by implementing and testing the system on real robots. Generating natural language instructions from the robots’ plan opens up a variety of optimization opportunities, such as deciding where to place the robots, where to lead humans, and where to verbally instruct them. We present experimental results of the full multi-robot human guidance system and show that it is more effective than two baseline systems: one which only provides humans with verbal instructions, and another which uses only a single robot to lead users to their destinations.
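To make the planner-to-language idea concrete, below is a minimal, hypothetical sketch (in Python) of the kind of decision such a system faces: given a path through a topological map and a limited number of robots, choose for each segment whether a robot physically leads the person or hands off a verbal instruction. The Segment type, the threshold heuristic, the node names, and the instruction templates are illustrative assumptions only, not the authors' actual planner, optimization method, or language generator.

# Illustrative sketch only: a toy "lead vs. instruct" assignment over a planned
# path. All names, thresholds, and templates below are hypothetical.
from dataclasses import dataclass

@dataclass
class Segment:
    start: str          # node where the segment begins
    end: str            # node where the segment ends
    turns: int          # number of turns along the segment
    length_m: float     # segment length in meters

def guidance_plan(path, robots_available):
    """Assign 'lead' or 'instruct' to each segment of a planned path.

    Heuristic (hypothetical): lead when a free robot exists and the segment is
    long or has many turns (easy to get lost); otherwise give a verbal
    instruction and keep the robot free for other tasks.
    """
    plan = []
    free_robots = robots_available
    for seg in path:
        hard_to_describe = seg.turns >= 2 or seg.length_m > 30.0
        if free_robots > 0 and hard_to_describe:
            plan.append(("lead", seg, f"Follow me to {seg.end}."))
            free_robots -= 1
        else:
            plan.append(
                ("instruct", seg,
                 f"Walk about {int(seg.length_m)} meters, taking "
                 f"{seg.turns} turn(s), until you reach {seg.end}.")
            )
    return plan

if __name__ == "__main__":
    path = [
        Segment("entrance", "elevator", turns=1, length_m=20.0),
        Segment("elevator", "lab_corridor", turns=3, length_m=45.0),
        Segment("lab_corridor", "destination_room", turns=0, length_m=10.0),
    ]
    for mode, seg, utterance in guidance_plan(path, robots_available=1):
        print(f"{mode:9s} {seg.start} -> {seg.end}: {utterance}")

In the system the abstract describes, the corresponding choices (where to place robots, where to lead, and where to verbally instruct) are derived from the robots' path planner rather than from a fixed threshold as in this sketch.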

Original language: English (US)
Title of host publication: Social Robotics - 11th International Conference, ICSR 2019, Proceedings
Editors: Miguel A. Salichs, Shuzhi Sam Ge, Emilia Ivanova Barakova, John-John Cabibihan, Alan R. Wagner, Álvaro Castro-González, Hongsheng He
Publisher: Springer
Pages: 133-143
Number of pages: 11
ISBN (Print): 9783030358877
DOI: https://doi.org/10.1007/978-3-030-35888-4_13
State: Published - Jan 1 2019
Event: 11th International Conference on Social Robotics, ICSR 2019 - Madrid, Spain
Duration: Nov 26 2019 - Nov 29 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11876 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 11th International Conference on Social Robotics, ICSR 2019
Country: Spain
City: Madrid
Period: 11/26/19 - 11/29/19

Keywords

  • Human robot interaction
  • Indoor navigation
  • Multi robot coordination
  • Natural language

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

Cite this

Yedidsion, H., Deans, J., Sheehan, C., Chillara, M., Hart, J., Stone, P., & Mooney, R. J. (2019). Optimal Use of Verbal Instructions for Multi-robot Human Navigation Guidance. In M. A. Salichs, S. S. Ge, E. I. Barakova, J-J. Cabibihan, A. R. Wagner, Á. Castro-González, & H. He (Eds.), Social Robotics - 11th International Conference, ICSR 2019, Proceedings (pp. 133-143). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11876 LNAI). Springer. https://doi.org/10.1007/978-3-030-35888-4_13

@inproceedings{2fa6039d2cc9404db27a34cd7f5a8394,
title = "Optimal Use of Verbal Instructions for Multi-robot Human Navigation Guidance",
abstract = "Efficiently guiding humans in indoor environments is a challenging open problem. Due to recent advances in mobile robotics and natural language processing, it has become possible to consider doing so with the help of mobile, verbally communicating robots. In the past, stationary verbal robots have been used for this purpose at Microsoft Research, and mobile non-verbal robots have been used at UT Austin in their multi-robot human guidance system. This paper extends that mobile multi-robot human guidance research by adding natural language instructions, which are dynamically generated from the robots’ path planner, and by implementing and testing the system on real robots. Generating natural language instructions from the robots’ plan opens up a variety of optimization opportunities, such as deciding where to place the robots, where to lead humans, and where to verbally instruct them. We present experimental results of the full multi-robot human guidance system and show that it is more effective than two baseline systems: one which only provides humans with verbal instructions, and another which uses only a single robot to lead users to their destinations.",
keywords = "Human robot interaction, Indoor navigation, Multi robot coordination, Natural language",
author = "Harel Yedidsion and Jacqueline Deans and Connor Sheehan and Mahathi Chillara and Justin Hart and Peter Stone and Mooney, {Raymond J.}",
year = "2019",
month = "1",
day = "1",
doi = "10.1007/978-3-030-35888-4_13",
language = "English (US)",
isbn = "9783030358877",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer",
pages = "133--143",
editor = "Salichs, {Miguel A.} and Ge, {Shuzhi Sam} and Barakova, {Emilia Ivanova} and John-John Cabibihan and Wagner, {Alan R.} and {\'A}lvaro Castro-Gonz{\'a}lez and Hongsheng He",
booktitle = "Social Robotics - 11th International Conference, ICSR 2019, Proceedings",

}
