Detection and computational analysis of psychological signals using a virtual human interviewing agent
by Albert Rizzo, Stefan Scherer, David DeVault, Jonathan Gratch, Ronald Artstein, Arno Hartholt, Gale Lucas, Stacy Marsella, Fabrizio Morbini, Angela Nazarian, Giota Stratou, David Traum, Rachel Wood, Jill Boberg, Louis-Philippe Morency
Abstract:
It has long been recognized that facial expressions, body posture/gestures and vocal parameters play an important role in human communication and the implicit signalling of emotion. Recent advances in low-cost computer vision and behavioral sensing technologies can now be applied to making meaningful inferences about user state when a person interacts with a computational device. Effective use of this additional information could promote interaction with virtual human (VH) agents in ways that enhance diagnostic assessment. This paper focuses on our current research in these areas within the DARPA-funded "Detection and Computational Analysis of Psychological Signals" project, with specific attention to the SimSensei application use case. SimSensei is a virtual human interaction platform that senses and interprets real-time audiovisual behavioral signals from users interacting with the system. It is specifically designed for health care support and leverages years of virtual human research and development at USC-ICT. The platform enables an engaging face-to-face interaction in which the virtual human automatically reacts to the state and inferred intent of the user through analysis of behavioral signals gleaned from facial expressions, body gestures and vocal parameters. Just as non-verbal behavioral signals shape human-to-human interaction and communication, SimSensei aims to capture and interpret users' non-verbal communication to improve engagement between a VH and a user. The system can also quantify and interpret sensed behavioral signals longitudinally, which can inform diagnostic assessment within a clinical context.
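As a rough illustration of the kind of multimodal inference the abstract describes, the following minimal Python sketch fuses per-frame facial and vocal features into a coarse user-state estimate that an agent could react to. This is not the SimSensei implementation; all feature names, weights and thresholds here are hypothetical.

# Minimal sketch (hypothetical, not the SimSensei pipeline): fuse per-frame
# nonverbal features into a coarse engagement estimate.

from dataclasses import dataclass
from statistics import mean

@dataclass
class FrameFeatures:
    smile_intensity: float   # 0..1, e.g. from a facial expression tracker
    gaze_down: float         # 0..1, fraction of frame with averted gaze
    voice_energy: float      # normalized vocal intensity

def user_state(window: list) -> str:
    """Classify a window of frames into a coarse engagement state."""
    smile = mean(f.smile_intensity for f in window)
    gaze = mean(f.gaze_down for f in window)
    energy = mean(f.voice_energy for f in window)
    # Hypothetical fusion rule: flat affect, averted gaze and a quiet voice
    # are treated as withdrawal; the agent could then back-channel or probe.
    score = 0.5 * smile + 0.3 * energy - 0.4 * gaze
    return "engaged" if score > 0.2 else "withdrawn"

frames = [FrameFeatures(0.1, 0.8, 0.2) for _ in range(30)]  # 1 s at 30 fps
print(user_state(frames))  # -> "withdrawn"

In a real system the windowed features would come from continuous vision and audio trackers, and the hand-set weights would be replaced by a trained classifier; the sketch only shows the shape of the sensing-to-reaction loop.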
Reference:
Detection and computational analysis of psychological signals using a virtual human interviewing agent (Albert Rizzo, Stefan Scherer, David DeVault, Jonathan Gratch, Ronald Artstein, Arno Hartholt, Gale Lucas, Stacy Marsella, Fabrizio Morbini, Angela Nazarian, Giota Stratou, David Traum, Rachel Wood, Jill Boberg, Louis-Philippe Morency), In Journal of Pain Management, 2016.
Bibtex Entry:
@article{rizzo_detection_2016,
	title = {Detection and computational analysis of psychological signals using a virtual human interviewing agent},
	issn = {1939-5914},
	url = {http://www.icdvrat.org/2014/papers/ICDVRAT2014_S03N3_Rizzo_etal.pdf},
	abstract = {It has long been recognized that facial expressions, body posture/gestures and vocal parameters play an important role in human communication and the implicit signalling of emotion. Recent advances in low-cost computer vision and behavioral sensing technologies can now be applied to making meaningful inferences about user state when a person interacts with a computational device. Effective use of this additional information could promote interaction with virtual human (VH) agents in ways that enhance diagnostic assessment. This paper focuses on our current research in these areas within the DARPA-funded "Detection and Computational Analysis of Psychological Signals" project, with specific attention to the SimSensei application use case. SimSensei is a virtual human interaction platform that senses and interprets real-time audiovisual behavioral signals from users interacting with the system. It is specifically designed for health care support and leverages years of virtual human research and development at USC-ICT. The platform enables an engaging face-to-face interaction in which the virtual human automatically reacts to the state and inferred intent of the user through analysis of behavioral signals gleaned from facial expressions, body gestures and vocal parameters. Just as non-verbal behavioral signals shape human-to-human interaction and communication, SimSensei aims to capture and interpret users' non-verbal communication to improve engagement between a VH and a user. The system can also quantify and interpret sensed behavioral signals longitudinally, which can inform diagnostic assessment within a clinical context.},
	journal = {Journal of Pain Management},
	author = {Rizzo, Albert and Scherer, Stefan and DeVault, David and Gratch, Jonathan and Artstein, Ronald and Hartholt, Arno and Lucas, Gale and Marsella, Stacy and Morbini, Fabrizio and Nazarian, Angela and Stratou, Giota and Traum, David and Wood, Rachel and Boberg, Jill and Morency, Louis-Philippe},
	month = nov,
	year = {2016},
	keywords = {MedVR, Virtual Humans, UARC},
	pages = {311--321}
}