Automating the Transfer of a Generic Set of Behaviors onto a Virtual Character (bibtex)
by Andrew W. Feng, Yazhou Huang, Yuyu Xu, Ari Shapiro
Abstract:
Humanoid 3D models can be easily acquired through various sources, including online. The use of such models within a game or simulation environment requires human input and intervention in order to associate such a model with a relevant set of motions and control mechanisms. In this paper, we demonstrate a pipeline where humanoid 3D models can be incorporated within seconds into an animation system, and infused with a wide range of capabilities, such as locomotion, object manipulation, gazing, speech synthesis and lip syncing. We offer a set of heuristics that can associate arbitrary joint names with canonical ones, and describe a fast retargeting algorithm that enables us to instill a set of behaviors onto an arbitrary humanoid skeleton. We believe that such a system will vastly increase the use of 3D interactive characters due to the ease with which new models can be animated.
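The abstract mentions heuristics that map arbitrary joint names onto a canonical set. The paper's actual heuristics are not given here, but the general idea can be sketched as keyword matching against normalized names. The keyword lists, canonical joint names, and the `mixamorig:` prefix below are illustrative assumptions, not taken from the paper:

```python
import re

# Illustrative canonical joints with keyword patterns; a real system
# would use a much larger table covering the full humanoid skeleton.
CANONICAL_KEYWORDS = {
    "hips": ["hip", "pelvis", "root"],
    "spine": ["spine", "chest", "torso"],
    "head": ["head"],
    "l_shoulder": ["leftshoulder", "lshoulder", "shoulderl", "lclavicle"],
}

def normalize(name):
    """Lowercase and strip separators and namespace prefixes (e.g. 'rig:')."""
    name = name.split(":")[-1].lower()
    return re.sub(r"[\s_\-\.]", "", name)

def map_joints(joint_names):
    """Return {canonical_joint: original_name} for the first keyword match."""
    mapping = {}
    for canonical, keywords in CANONICAL_KEYWORDS.items():
        for original in joint_names:
            if any(kw in normalize(original) for kw in keywords):
                mapping.setdefault(canonical, original)
                break
    return mapping
```

For example, `map_joints(["mixamorig:Hips", "mixamorig:Spine", "mixamorig:Head", "mixamorig:LeftShoulder"])` would associate each canonical joint with the corresponding prefixed name, after which a retargeting step could transfer motions onto the matched skeleton.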
Reference:
Automating the Transfer of a Generic Set of Behaviors onto a Virtual Character (Andrew W. Feng, Yazhou Huang, Yuyu Xu, Ari Shapiro), In International Conference on Motion in Games, 2012.
Bibtex Entry:
@inproceedings{feng_automating_2012,
	address = {Rennes, France},
	title = {Automating the {Transfer} of a {Generic} {Set} of {Behaviors} onto a {Virtual} {Character}},
	url = {http://ict.usc.edu/pubs/Automating%20the%20Transfer%20of%20a%20Generic%20Set%20of%20Behaviors%20onto%20a%20Virtual%20Character.pdf},
	abstract = {Humanoid 3D models can be easily acquired through various sources, including online. The use of such models within a game or simulation environment requires human input and intervention in order to associate such a model with a relevant set of motions and control mechanisms. In this paper, we demonstrate a pipeline where humanoid 3D models can be incorporated within seconds into an animation system, and infused with a wide range of capabilities, such as locomotion, object manipulation, gazing, speech synthesis and lip syncing. We offer a set of heuristics that can associate arbitrary joint names with canonical ones, and describe a fast retargeting algorithm that enables us to instill a set of behaviors onto an arbitrary humanoid skeleton. We believe that such a system will vastly increase the use of 3D interactive characters due to the ease with which new models can be animated.},
	booktitle = {International {Conference} on {Motion} in {Games}},
	author = {Feng, Andrew W. and Huang, Yazhou and Xu, Yuyu and Shapiro, Ari},
	month = nov,
	year = {2012},
	keywords = {Virtual Humans, UARC}
}