Research Leader: Jon Gratch
The Virtual Humans Group advances research in computer-generated characters that use language, employ appropriate gestures, show emotion, and react to verbal and non-verbal stimuli. Widely considered the most comprehensive research project of its kind, the ICT virtual human effort has applications in training and education and in furthering social science research about real people.
Research Leader: Paul Rosenbloom
Explores a new form of functionally elegant, grand unified cognitive architecture in support of virtual humans and, ultimately, intelligent agents and robots.
Research Leader: Stacy Marsella
Models and studies the impact of virtual human non-verbal behavior. SmartBody, an open source character animation system, grew out of this work.
Research Leaders: Jon Gratch and Stacy Marsella
Develops computational systems that use emotion to communicate more effectively, and investigates how interpretations of emotional behavior influence memory and decision-making.
Integrated Virtual Humans
Research Leader: Arno Hartholt
Builds a common platform for rapidly creating virtual human prototypes. The freely available ICT Virtual Human Toolkit allows the research community to create their own virtual humans.
Research Leader: Louis-Philippe Morency
Explores the recognition and analysis of visual cues, such as head nods and eye shifts, to facilitate more natural human-computer interaction.
Natural Language Processing
Research Leader: David Traum
Advances research in natural language processing for virtual humans, developing computational models of dialogue and systems that interact with people.