Realistic Avatar Eye and Head Animation Using a Neurobiological Model of Visual Attention
by Itti, Laurent, Dhavale, Nitin and Pighin, Frédéric
Abstract:
We describe a neurobiological model of visual attention and eye/head movements in primates, and its application to the automatic animation of a realistic virtual human head watching an unconstrained variety of visual inputs. The bottom-up (image-based) attention model is based on the known neurophysiology of visual processing along the occipito-parietal pathway of the primate brain, while the eye/head movement model is derived from recordings in freely behaving rhesus monkeys. The system is successful at autonomously saccading towards and tracking salient targets in a variety of video clips, including synthetic stimuli, real outdoor scenes and gaming console outputs. The resulting virtual human eye/head animation yields realistic rendering of the simulation results, both suggesting applicability of this approach to avatar animation and reinforcing the plausibility of the neural model.
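The bottom-up attention stage summarized above follows the authors' well-known saliency-map architecture (Gaussian pyramids, center-surround differences, winner-take-all target selection). As a rough illustration only, and not the paper's implementation, the sketch below computes an intensity-only saliency map for a single frame and picks the most salient location as the next gaze target; the function names, pyramid depth, and scale pairs are assumptions made for this example.

```python
# Minimal, intensity-only sketch of a bottom-up saliency computation.
# This is an illustrative assumption, not the code used in the paper:
# channel choices, pyramid depth, and normalization are simplified.
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def gaussian_pyramid(img, levels=6):
    """Build a dyadic Gaussian pyramid (level 0 = full resolution)."""
    pyr = [img.astype(float)]
    for _ in range(1, levels):
        blurred = gaussian_filter(pyr[-1], sigma=1.0)
        pyr.append(blurred[::2, ::2])
    return pyr

def center_surround(pyr, center_levels=(1, 2), deltas=(2, 3)):
    """Across-scale differences |center - surround|, upsampled to level 0."""
    h, w = pyr[0].shape
    maps = []
    for c in center_levels:
        for d in deltas:
            s = c + d
            if s >= len(pyr):
                continue
            surround = zoom(pyr[s], 2 ** (s - c), order=1)
            surround = surround[:pyr[c].shape[0], :pyr[c].shape[1]]
            fm = np.abs(pyr[c] - surround)
            maps.append(zoom(fm, 2 ** c, order=1)[:h, :w])
    return maps

def saliency_map(img):
    """Sum range-normalized feature maps into a single saliency map."""
    maps = center_surround(gaussian_pyramid(img))
    sal = np.zeros_like(img, dtype=float)
    for m in maps:
        rng = m.max() - m.min()
        if rng > 0:
            sal += (m - m.min()) / rng   # crude stand-in for map normalization
    return sal / max(len(maps), 1)

if __name__ == "__main__":
    frame = np.random.rand(240, 320)      # stand-in for one video frame
    frame[100:120, 150:170] += 2.0        # an artificially bright, salient patch
    sal = saliency_map(frame)
    y, x = np.unravel_index(np.argmax(sal), sal.shape)
    print(f"most salient location (winner-take-all pick): ({x}, {y})")
```

In the full system described by the paper, a location chosen this way would then drive the eye/head movement model (fit to recordings from freely behaving monkeys) to produce the saccade and pursuit trajectories used to animate the avatar.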
Reference:
Realistic Avatar Eye and Head Animation Using a Neurobiological Model of Visual Attention (Itti, Laurent, Dhavale, Nitin and Pighin, Frédéric), In Proceedings of SPIE 48th Annual International Symposium on Optical Science and Technology, 2003.
Bibtex Entry:
@inproceedings{itti_realistic_2003,
	address = {San Diego, CA},
	title = {Realistic {Avatar} {Eye} and {Head} {Animation} {Using} a {Neurobiological} {Model} of {Visual} {Attention}},
	url = {http://ict.usc.edu/pubs/Realistic%20Avatar%20Eye%20and%20Head%20Animation%20Using%20a%20Neurobiological%20Model%20of%20Visual%20Attention.pdf},
	doi = {10.1117/12.512618},
	abstract = {We describe a neurobiological model of visual attention and eye/head movements in primates, and its application to the automatic animation of a realistic virtual human head watching an unconstrained variety of visual inputs. The bottom-up (image-based) attention model is based on the known neurophysiology of visual processing along the occipito-parietal pathway of the primate brain, while the eye/head movement model is derived from recordings in freely behaving rhesus monkeys. The system is successful at autonomously saccading towards and tracking salient targets in a variety of video clips, including synthetic stimuli, real outdoor scenes and gaming console outputs. The resulting virtual human eye/head animation yields realistic rendering of the simulation results, both suggesting applicability of this approach to avatar animation and reinforcing the plausibility of the neural model.},
	booktitle = {Proceedings of {SPIE} 48th {Annual} {International} {Symposium} on {Optical} {Science} and {Technology}},
	author = {Itti, Laurent and Dhavale, Nitin and Pighin, Frédéric},
	month = aug,
	year = {2003}
}