Human-Robot Dialogue and Collaboration in Search and Navigation (bibtex)
by Bonial, Claire, Lukin, Stephanie M., Foots, Ashley, Henry, Cassidy, Marge, Matthew, Pollard, Kimberly A., Artstein, Ron, Traum, David and Voss, Clare R.
Abstract:
Collaboration with a remotely located robot in tasks such as disaster relief and search and rescue can be facilitated by grounding natural language task instructions into actions executable by the robot in its current physical context. The corpus we describe here provides insight into the translation and interpretation a natural language instruction undergoes starting from verbal human intent, to understanding and processing, and ultimately, to robot execution. We use a ‘Wizard-of-Oz’ methodology to elicit the corpus data in which a participant speaks freely to instruct a robot on what to do and where to move through a remote environment to accomplish collaborative search and navigation tasks. This data offers the potential for exploring and evaluating action models by connecting natural language instructions to execution by a physical robot (controlled by a human ‘wizard’). In this paper, a description of the corpus (soon to be openly available) and examples of actions in the dialogue are provided.
Reference:
Human-Robot Dialogue and Collaboration in Search and Navigation (Bonial, Claire, Lukin, Stephanie M., Foots, Ashley, Henry, Cassidy, Marge, Matthew, Pollard, Kimberly A., Artstein, Ron, Traum, David and Voss, Clare R.), In Proceedings of the AREA Workshop: Annotation, Recognition, and Evaluation of Actions, AREA 2018, 2018.
Bibtex Entry:
@inproceedings{bonial_human-robot_2018,
	address = {Miyazaki, Japan},
	title = {Human-{Robot} {Dialogue} and {Collaboration} in {Search} and {Navigation}},
	url = {http://www.areaworkshop.org/wp-content/uploads/2018/05/4.pdf},
	abstract = {Collaboration with a remotely located robot in tasks such as disaster relief and search and rescue can be facilitated by grounding natural language task instructions into actions executable by the robot in its current physical context. The corpus we describe here provides insight into the translation and interpretation a natural language instruction undergoes starting from verbal human intent, to understanding and processing, and ultimately, to robot execution. We use a ‘Wizard-of-Oz’ methodology to elicit the corpus data in which a participant speaks freely to instruct a robot on what to do and where to move through a remote environment to accomplish collaborative search and navigation tasks. This data offers the potential for exploring and evaluating action models by connecting natural language instructions to execution by a physical robot (controlled by a human ‘wizard’). In this paper, a description of the corpus (soon to be openly available) and examples of actions in the dialogue are provided.},
	booktitle = {Proceedings of the {AREA} {Workshop}: {Annotation}, {Recognition}, and {Evaluation} of {Actions}},
	publisher = {AREA 2018},
	author = {Bonial, Claire and Lukin, Stephanie M. and Foots, Ashley and Henry, Cassidy and Marge, Matthew and Pollard, Kimberly A. and Artstein, Ron and Traum, David and Voss, Clare R.},
	month = may,
	year = {2018},
	keywords = {Virtual Humans, ARL, DoD}
}