New Scientist and Gizmodo Feature ICT Research Teaching Computers to Recognize Signs of Depression
Published: March 29, 2013
Category: News

Two articles highlight research from ICT’s MultiComp Lab that automatically tracks and analyzes, in real time, facial expressions, body posture, acoustic features, linguistic patterns and other behaviors such as fidgeting. These signals can help identify indicators of psychological distress. New Scientist (subscription may be required) covered the work in a story on a paper to be presented at the Automatic Face and Gesture Recognition conference in Shanghai, China, this month.
The paper was a joint ICT effort from postdoctoral researcher Stefan Scherer and research programmer Giota Stratou, with Jonathan Gratch, associate director for virtual humans research, leading the data acquisition. The team used ICT’s MultiSense system to identify characteristic movements that indicate someone may be depressed. “Presently broad screening is done by using only a checklist of yes/no questions or point scales, but all the non-verbal behaviour is not taken into account,” Scherer said. “This is where we would like to put our technology to work.”
This story, as well as one on Gizmodo, showcases SimSensei, ICT’s virtual human platform designed specifically for healthcare support. The platform enables an engaging face-to-face interaction in which the virtual human automatically reacts to the perceived user state and intent through its own speech and gestures. The SimSensei project is led by Louis-Philippe Morency and Skip Rizzo at ICT. The story notes that SimSensei is an aid, not a replacement, for a human health care provider.