Labs + Groups

Virtual Humans

Research Lead: Jon Gratch

Download a PDF overview.

The Virtual Humans Group advances research in computer-generated characters that use language, display appropriate gestures, show emotion, and react to verbal and non-verbal stimuli. Widely considered the most comprehensive research project of its kind, the ICT virtual human effort has applications in training and education, and in furthering social science research about real people.

Cognitive Architecture
Research Leader: Paul Rosenbloom
Explores a new form of functionally elegant, grand unified cognitive architecture in support of virtual humans and, potentially, intelligent agents and robots as well. Visit site.

Character Animation and Simulation
Research Leader: Ari Shapiro
Models and studies the impact of virtual human non-verbal behavior. SmartBody, an open source character animation system, grew out of this work. Visit site.

Emotion
Research Leader: Jon Gratch
Develops computational systems that use emotion to communicate more effectively, and investigates how people interpret emotional behavior and how those interpretations influence memory and decision-making.

Integrated Virtual Humans
Research Leader: Arno Hartholt
Builds a common platform to rapidly create virtual human prototypes. The freely available ICT Virtual Human Toolkit allows the research community to create their own virtual humans.

MultiModal Communication
Research Leader: Stefan Scherer
Explores the recognition and analysis of visual cues, such as head nods and eye shifts, to facilitate more natural human-computer interaction. Visit site.

Natural Language Processing
Research Leader: David Traum
Advances research in natural language processing for virtual humans, developing computational models of dialogue and systems that interact with people. Visit site.