Virtual Humans in Augmented Reality: A First Step towards Real-World Embedded Virtual Roleplayers
by Arno Hartholt, Sharon Mozgai, Ed Fast, Matt Liewer, Adam Reilly, Wendy Whitcup, Albert "Skip" Rizzo
Abstract:
We present one of the first applications of virtual humans in Augmented Reality (AR), which allows young adults with Autism Spectrum Disorder (ASD) the opportunity to practice job interviews. It uses the Magic Leap’s AR hardware sensors to provide users with immediate feedback on six different metrics, including eye gaze, blink rate, and head orientation. The system provides two characters, with three conversational modes each. The system was ported from an existing desktop application; the main development lessons learned were: 1) provide users with navigation instructions in the user interface, 2) avoid dark colors, as they are rendered transparently, 3) use dynamic gaze so characters maintain eye contact with the user, 4) use hardware sensors like eye gaze to provide user feedback, and 5) use surface detection to place characters dynamically in the world.
Reference:
Virtual Humans in Augmented Reality: A First Step towards Real-World Embedded Virtual Roleplayers (Arno Hartholt, Sharon Mozgai, Ed Fast, Matt Liewer, Adam Reilly, Wendy Whitcup, Albert "Skip" Rizzo), In Proceedings of the 7th International Conference on Human-Agent Interaction - HAI '19, ACM Press, 2019.
Bibtex Entry:
@inproceedings{hartholt_virtual_2019-1,
	address = {Kyoto, Japan},
	title = {Virtual {Humans} in {Augmented} {Reality}: {A} {First} {Step} towards {Real}-{World} {Embedded} {Virtual} {Roleplayers}},
	isbn = {978-1-4503-6922-0},
	url = {http://dl.acm.org/citation.cfm?doid=3349537.3352766},
	doi = {10.1145/3349537.3352766},
	abstract = {We present one of the first applications of virtual humans in Augmented Reality (AR), which allows young adults with Autism Spectrum Disorder (ASD) the opportunity to practice job interviews. It uses the Magic Leap’s AR hardware sensors to provide users with immediate feedback on six different metrics, including eye gaze, blink rate and head orientation. The system provides two characters, with three conversational modes each. Ported from an existing desktop application, the main development lessons learned were: 1) provide users with navigation instructions in the user interface, 2) avoid dark colors as they are rendered transparently, 3) use dynamic gaze so characters maintain eye contact with the user, 4) use hardware sensors like eye gaze to provide user feedback, and 5) use surface detection to place characters dynamically in the world.},
	booktitle = {Proceedings of the 7th {International} {Conference} on {Human}-{Agent} {Interaction}  - {HAI} '19},
	publisher = {ACM Press},
	author = {Hartholt, Arno and Mozgai, Sharon and Fast, Ed and Liewer, Matt and Reilly, Adam and Whitcup, Wendy and Rizzo, Albert "Skip"},
	month = oct,
	year = {2019},
	keywords = {MedVR, Virtual Humans},
	pages = {205--207}
}