A story by NPR’s mental health reporter Alix Spiegel covered ICT’s SimSensei project and research led by Skip Rizzo and Louis-Philippe Morency to develop a virtual human application that can be used to identify signals of depression and other mental health issues. The point, according to Rizzo, ICT’s associate director for medical virtual reality, is to analyze in almost microscopic detail the way people talk and move — to read their body language.
“We can look at the position of the head, the eye gaze,” Rizzo says in the story. “Does the head tilt? Does it lean forward? Is it static and fixed?”
The theory, says Spiegel in her report, is that a detailed analysis of those movements and vocal features can give us new insights into people who are struggling with emotional issues. The body, face and voice express things that words sometimes obscure.
The story notes that the idea is not for Ellie to actually diagnose people or replace trained therapists. She’s there to offer insight to therapists, Morency says, by providing objective measurements.
“Think about it as a blood sample,” he says. “You send a blood sample to the lab and you get the result. The [people] doing the diagnosis [are] still the clinicians, but they use these objective measures to make the diagnosis.”
The story also states that this work was commissioned by the U.S. Department of Defense as part of its suicide prevention efforts. To learn more about the project, visit the SimSensei page on our website.
NPR Features SimSensei and ICT Research Using Computers to Detect Signs of Emotional Distress
Published: May 21, 2013
Category: News