Publications
1.
Feng, Andrew W.; Huang, Yazhou; Xu, Yuyu; Shapiro, Ari
Automating the Transfer of a Generic Set of Behaviors onto a Virtual Character Proceedings Article
In: International Conference on Motion in Games, Rennes, France, 2012.
@inproceedings{feng_automating_2012,
title = {Automating the Transfer of a Generic Set of Behaviors onto a Virtual Character},
author = {Andrew W. Feng and Yazhou Huang and Yuyu Xu and Ari Shapiro},
url = {http://ict.usc.edu/pubs/Automating%20the%20Transfer%20of%20a%20Generic%20Set%20of%20Behaviors%20onto%20a%20Virtual%20Character.pdf},
year = {2012},
date = {2012-11-01},
booktitle = {International Conference on Motion in Games},
address = {Rennes, France},
abstract = {Humanoid 3D models can be easily acquired through various sources, including online. The use of such models within a game or simulation environment requires human input and intervention in order to associate such a model with a relevant set of motions and control mechanisms. In this paper, we demonstrate a pipeline where humanoid 3D models can be incorporated within seconds into an animation system, and infused with a wide range of capabilities, such as locomotion, object manipulation, gazing, speech synthesis and lip syncing. We offer a set of heuristics that can associate arbitrary joint names with canonical ones, and describe a fast retargeting algorithm that enables us to instill a set of behaviors onto an arbitrary humanoid skeleton. We believe that such a system will vastly increase the use of 3D interactive characters due to the ease that new models can be animated.},
keywords = {UARC, Virtual Humans},
pubstate = {published},
tppubtype = {inproceedings}
}
Humanoid 3D models can be easily acquired through various sources, including online. The use of such models within a game or simulation environment requires human input and intervention in order to associate such a model with a relevant set of motions and control mechanisms. In this paper, we demonstrate a pipeline where humanoid 3D models can be incorporated within seconds into an animation system, and infused with a wide range of capabilities, such as locomotion, object manipulation, gazing, speech synthesis and lip syncing. We offer a set of heuristics that can associate arbitrary joint names with canonical ones, and describe a fast retargeting algorithm that enables us to instill a set of behaviors onto an arbitrary humanoid skeleton. We believe that such a system will vastly increase the use of 3D interactive characters due to the ease that new models can be animated.
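The abstract mentions heuristics for associating arbitrary joint names with canonical ones. As a rough illustration of that kind of name-mapping heuristic (the canonical names and synonym table below are invented for this sketch, not taken from the paper), one might normalize each rig's joint names and match them against known synonyms:

```python
# Hypothetical sketch of a joint-name canonicalization heuristic.
# The canonical names and synonym lists are illustrative assumptions,
# not the mapping used in the paper.

CANONICAL_SYNONYMS = {
    "hips": ["hips", "pelvis", "root"],
    "spine": ["spine", "chest", "torso"],
    "head": ["head"],
    "l_shoulder": ["leftshoulder", "l_clavicle", "lshoulder"],
    "r_shoulder": ["rightshoulder", "r_clavicle", "rshoulder"],
}

def normalize(name: str) -> str:
    """Lowercase and strip rig-specific separators and spaces."""
    return name.lower().replace(" ", "").replace("-", "").replace(":", "_")

def map_joints(skeleton_joints):
    """Return {canonical_name: original_joint_name} for matched joints."""
    mapping = {}
    for joint in skeleton_joints:
        n = normalize(joint)
        for canonical, synonyms in CANONICAL_SYNONYMS.items():
            if any(s in n for s in synonyms):
                # Keep the first joint that matches each canonical slot.
                mapping.setdefault(canonical, joint)
                break
    return mapping

# Example: a Mixamo-style rig with prefixed joint names.
print(map_joints(["mixamorig:Hips", "mixamorig:Spine",
                  "mixamorig:Head", "mixamorig:LeftShoulder"]))
```

A retargeting pipeline like the one described would presumably run this kind of matching once per imported model, then fall back to manual correction for joints that no heuristic resolves.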