Lixing Huang, Louis-Philippe Morency, Jonathan Gratch: “Parasocial Consensus Sampling: Combining Multiple Perspectives to Learn Virtual Human Behavior”

May 10, 2010 | Toronto, Ontario

Speakers: Lixing Huang, Louis-Philippe Morency, Jonathan Gratch
Host: Ninth International Conference on Autonomous Agents and Multiagent Systems

Virtual humans are embodied software agents that should not only be realistic-looking but also exhibit natural, realistic behaviors. Traditional virtual human systems learn these interaction behaviors by observing how individuals respond in face-to-face situations (i.e., direct interaction). In contrast, this paper introduces a novel methodological approach called parasocial consensus sampling (PCS), which allows multiple individuals to vicariously experience the same situation in order to gain insight into the typical (i.e., consensus) human responses in social interaction. This approach can help tease apart what is idiosyncratic from what is essential, and help reveal the strength of cues that elicit social responses. Our PCS approach has several advantages over traditional methods: (1) it integrates data from multiple independent listeners interacting with the same speaker, (2) it associates a probability with how likely feedback is to be given at each point in time, (3) it can be used as a prior for analyzing and understanding face-to-face interaction data, and (4) it makes data collection much quicker and cheaper. In this paper, we apply our PCS approach to learn a predictive model of listener backchannel feedback. Our experiments demonstrate that a virtual human driven by our PCS approach creates significantly more rapport and is perceived as more believable than a virtual human driven by face-to-face interaction data.
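The core aggregation idea in advantages (1) and (2) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each parasocial listener marks the timestamps (in seconds) at which they would give backchannel feedback while watching the same speaker video, and that binning those marks across listeners yields a consensus probability curve over time. The function and parameter names are hypothetical.

```python
def consensus_curve(listener_events, duration, bin_size=1.0):
    """Fraction of listeners giving feedback in each time bin.

    listener_events: one list of feedback timestamps per listener,
                     all collected against the SAME speaker video.
    duration:        video length in seconds.
    bin_size:        width of each time bin in seconds.
    """
    n_bins = int(duration / bin_size)
    n_listeners = len(listener_events)
    curve = [0.0] * n_bins
    for events in listener_events:
        # A listener counts at most once per bin.
        marked = set()
        for t in events:
            b = int(t / bin_size)
            if 0 <= b < n_bins:
                marked.add(b)
        for b in marked:
            curve[b] += 1.0 / n_listeners
    return curve

# Three independent "parasocial" listeners respond to the same 5-second clip.
events = [
    [1.2, 3.4],   # listener A
    [1.4],        # listener B
    [1.1, 3.9],   # listener C
]
curve = consensus_curve(events, duration=5.0)

# Bins where a majority of listeners agree are consensus feedback points,
# separating essential cues from any one listener's idiosyncrasies.
consensus_points = [i for i, p in enumerate(curve) if p >= 0.5]
```

The resulting curve is exactly the kind of time-varying feedback probability described in advantage (2), and thresholding it (here at a majority) gives candidate moments for driving a virtual human's backchannel behavior.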