So, About Those iPhone Animojis

Published: September 12, 2017
Category: News

Apple’s animated emojis for the iPhone X, announced today as Animojis, are getting lots of attention, partly because the tech behind them likely stems from the company’s 2015 acquisition of Faceshift.
While that has certainly not been officially confirmed, Faceshift was at the time doing some very cool things: driving animated avatars directly (i.e., in real time) from video of your own face, coupled with depth-sensing tech, which is effectively what happens with these Animojis via the iPhone X’s cameras.
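To make that concrete on the Apple side, here is a minimal sketch of how an iOS developer can drive an avatar from the face-tracking data ARKit exposes on the iPhone X: the TrueDepth camera delivers per-frame blend-shape coefficients on an ARFaceAnchor, which can be copied onto a character rig. The AnimatedAvatar protocol and its setBlendShape(_:weight:) method are placeholders for whatever rig you happen to drive, not a real API; the ARKit calls themselves are real.

```swift
import ARKit

// Placeholder for an avatar rig; not a real framework type.
protocol AnimatedAvatar {
    func setBlendShape(_ name: String, weight: Float)
}

final class FacePuppetDriver: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private let avatar: AnimatedAvatar

    init(avatar: AnimatedAvatar) {
        self.avatar = avatar
        super.init()
        session.delegate = self
    }

    func start() {
        // Face tracking requires the TrueDepth camera (iPhone X).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.run(ARFaceTrackingConfiguration())
    }

    // Called each frame; the face anchor carries around fifty blend-shape
    // coefficients (0...1) describing the user's current expression.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        for (location, value) in face.blendShapes {
            // Copy each tracked coefficient onto the matching blend shape of the avatar rig.
            avatar.setBlendShape(location.rawValue, weight: value.floatValue)
        }
    }
}
```

That per-frame dictionary of coefficients (jaw open, eye blink, smile, and so on) is, conceptually, the same puppetry signal the research below set out to recover from video years earlier.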
Several other tools have of course been developed that use video of a facial performance to drive animated characters, but for fun, I thought it would be interesting to go back to specific pieces of computer graphics research from 2009, 2010 and 2011 that partly served as the origins of Faceshift.
Other continuing research efforts also played a part in the development of Faceshift, but the papers below (which have accompanying videos) were key, and they show how the facial animation of CG avatars could be driven in real time from video captured of human performances.
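In broad strokes, systems like these typically represent the avatar's face as a linear blendshape model, so driving the character each frame reduces to estimating a small set of weights from the tracked performance. A rough sketch of that standard formulation (the notation is mine, not taken from the papers):

\[ F(\mathbf{w}) = B_0 + \sum_{i=1}^{n} w_i\,(B_i - B_0), \qquad 0 \le w_i \le 1 \]

where \(B_0\) is the neutral face mesh, the \(B_i\) are expression targets (jaw open, smile, blink, and so on), and the weights \(w_i\) are solved for every frame from the incoming video and depth data.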
FACE/OFF: LIVE FACIAL PUPPETRY
Thibaut Weise, Hao Li, Luc Van Gool, Mark Pauly
Proceedings of the Eighth ACM SIGGRAPH / Eurographics Symposium on Computer Animation (SCA ’09), August 2009