Expressive Behaviors for Virtual Worlds (bibtex)
by Marsella, Stacy C., Gratch, Jonathan and Rickel, Jeff
Abstract:
A person's behavior provides significant information about their emotional state, attitudes, and attention. Our goal is to create virtual humans that convey such information to people while interacting with them in virtual worlds. The virtual humans must respond dynamically to the events surrounding them, which are fundamentally influenced by users' actions, while providing an illusion of human-like behavior. A user must be able to interpret the dynamic cognitive and emotional state of the virtual humans using the same nonverbal cues that people use to understand one another. Towards these goals, we are integrating and extending components from three prior systems: a virtual human architecture with a wide range of cognitive and motor capabilities, a model of task-oriented emotional appraisal and socially situated planning, and a model of how emotions and coping impact physical behavior. We describe the key research issues and approach in each of these prior systems, as well as our integration and its initial implementation in a leadership training system.
Reference:
Expressive Behaviors for Virtual Worlds (Marsella, Stacy C., Gratch, Jonathan and Rickel, Jeff), Chapter in Life-Like Characters: Tools, Affective Functions, and Applications, 2004.
Bibtex Entry:
@incollection{marsella_expressive_2004,
	title = {Expressive {Behaviors} for {Virtual} {Worlds}},
	url = {http://ict.usc.edu/pubs/Expressive%20Behaviors%20for%20Virtual%20Worlds.pdf},
	abstract = {A person's behavior provides significant information about their emotional state, attitudes, and attention. Our goal is to create virtual humans that convey such information to people while interacting with them in virtual worlds. The virtual humans must respond dynamically to the events surrounding them, which are fundamentally influenced by users' actions, while providing an illusion of human-like behavior. A user must be able to interpret the dynamic cognitive and emotional state of the virtual humans using the same nonverbal cues that people use to understand one another. Towards these goals, we are integrating and extending components from three prior systems: a virtual human architecture with a wide range of cognitive and motor capabilities, a model of task-oriented emotional appraisal and socially situated planning, and a model of how emotions and coping impact physical behavior. We describe the key research issues and approach in each of these prior systems, as well as our integration and its initial implementation in a leadership training system.},
	booktitle = {Life-{Like} {Characters}: {Tools}, {Affective} {Functions}, and {Applications}},
	author = {Marsella, Stacy C. and Gratch, Jonathan and Rickel, Jeff},
	month = jun,
	year = {2004},
	keywords = {Social Simulation, Virtual Humans}
}