ICT Research Leading to More Natural and Fluid Virtual Human Conversation Wins Award
Published: September 5, 2008
Category: News
ICT research scientist Louis-Philippe Morency won the Best Paper Award at the highly selective Intelligent Virtual Agents conference in Japan. In the paper, Morency and colleagues Iwan de Kok and Jonathan Gratch propose a new model for learning to recognize and generate meaningful multimodal behaviors, such as facial expressions, posture shifts, and other interactional signals, from examples of face-to-face interactions. The work is important not only as a means to improve the interactivity and expressiveness of virtual humans, but also as a fundamental tool for uncovering hidden patterns in human social behavior. Two other USC papers were finalists: for Best Paper, "Evaluation of Justina: A Virtual Patient with PTSD" by Patrick Kenny, Thomas Parsons, Jonathan Gratch, and Albert Rizzo; and for Best Student Paper, "The Relation between Gaze Behavior and Emotional State: An Empirical Study" by Brent Lance and Stacy Marsella.
Read the paper.
Information About the Conference