CNET, Mashable and Discovery Canada featured research from ICT’s MultiComp Lab that automatically tracks and analyzes facial expressions, body posture, acoustic features, linguistic patterns and other behaviors, such as fidgeting, in real time. These signals can help identify indicators of psychological distress.
These stories showcase SimSensei, ICT’s virtual human platform designed specifically for healthcare support. The platform enables an engaging face-to-face interaction in which the virtual human automatically reacts to the perceived state and intent of the user through its own speech and gestures. The SimSensei project is led by Louis-Philippe Morency and Skip Rizzo at ICT. The stories note that SimSensei is an aid to, not a replacement for, a human health care provider.
The coverage is based on a paper to be presented at the Automatic Face and Gesture Recognition conference in Shanghai, China, this month. The paper was a joint ICT effort from post-doctoral researcher Stefan Scherer and research programmer Giota Stratou, with Jonathan Gratch, associate director for virtual humans research, leading the data acquisition portion.
CNET, Mashable and Discovery Canada Feature SimSensei and Research in Automated Depression Recognition
Published: April 4, 2013
Category: News