Using Kinect and OpenNI to Embody Avatars in Second Life

Published: January 20, 2011
Category: News

Continuing their previous work using gestures to play World of Warcraft, ICT researchers have developed new software, built on the OpenNI toolkit, that uses the Kinect to read gestures and trigger corresponding server-side scripts within Second Life. Instead of having to think about pressing the right sequence of keys to make a ‘wave’ gesture, the user can simply raise their hand and wave. The software was developed by ICT engineer Thai Phan, who is also a computer science grad student at the USC Viterbi School of Engineering. Phan expects a downloadable version to be ready in a few days and demonstrates how it works in this YouTube video.
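The pipeline described above, skeleton tracking feeding a gesture recognizer, which in turn fires a command that a server-side script acts on, can be sketched roughly as follows. This is an illustrative mockup, not Phan's actual code: the joint positions are simulated rather than read from OpenNI, and the classifier thresholds and the chat-style command names are assumptions made for the example.

```python
# Hypothetical sketch of a gesture-to-script pipeline (not Phan's code).
# A real implementation would stream skeleton joints from OpenNI/Kinect;
# here the joint data is hard-coded to show the control flow.

def is_wave(hand_y, elbow_y, hand_xs, min_swings=2, threshold=0.05):
    """Classify a 'wave': hand held above the elbow while the hand's
    horizontal position reverses direction repeatedly."""
    if hand_y <= elbow_y:
        return False
    swings = 0
    direction = 0
    for prev, curr in zip(hand_xs, hand_xs[1:]):
        delta = curr - prev
        if abs(delta) < threshold:  # ignore jitter below the threshold
            continue
        new_dir = 1 if delta > 0 else -1
        if direction and new_dir != direction:
            swings += 1  # one lateral direction reversal
        direction = new_dir
    return swings >= min_swings

def gesture_to_command(gesture):
    """Map a recognized gesture to a command string that a server-side
    script could listen for (command names are made up for illustration)."""
    commands = {"wave": "/avatar wave"}
    return commands.get(gesture)

# Simulated frames: hand above the elbow, swinging left and right.
hand_track = [0.0, 0.2, -0.2, 0.2, -0.2]
if is_wave(hand_y=1.6, elbow_y=1.2, hand_xs=hand_track):
    print(gesture_to_command("wave"))  # command forwarded to the avatar
```

In Second Life itself, the receiving end would typically be an LSL script listening on a chat channel, so the client-side recognizer only has to emit a short text command rather than drive the animation directly.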
The work comes out of the MxR Lab run by Mark Bolas, ICT’s associate director for mixed reality and an associate professor in the interactive media division at the USC School of Cinematic Arts.
“This work begins to uncover what can be expected as users begin to use their bodies to interact with computers,” said Bolas. “The line between what is and is not ‘me’ – where I begin and end – quickly becomes blurred.”