Andrew Jones, Magnus Lang, Graham Fyffe, XueMing Yu, Jay Busch, Mark Bolas, Paul Debevec, Ian McDowall: “Achieving Eye Contact in a One-to-Many 3D Video Teleconferencing System”

August 3, 2009 | New Orleans, LA

Speakers: Andrew Jones, Magnus Lang, Graham Fyffe, XueMing Yu, Jay Busch, Mark Bolas, Paul Debevec, Ian McDowall

We present a set of algorithms and an associated display system capable of producing correctly rendered eye contact between a three-dimensionally transmitted remote participant and a group of observers in a 3D teleconferencing system. The participant’s face is scanned in 3D at 30Hz and transmitted in real time to an autostereoscopic horizontal-parallax 3D display, which shows the participant over a field of view of more than $180^\circ$, visible to multiple observers simultaneously. To render the geometry with correct perspective, we create a fast vertex shader based on a 6D lookup table that projects 3D scene vertices for a range of viewer angles, heights, and distances. We generalize the projection mathematics to arbitrarily shaped display surfaces, which allows us to employ a concave curved display surface that focuses the high-speed imagery toward individual observers. To achieve two-way eye contact, we capture 2D video from a cross-polarized camera reflected to the position of the virtual participant’s eyes, and display this 2D video feed on a large screen in front of the real participant, replicating the viewpoint of their virtual self. To achieve correct vertical perspective, we further leverage this image to track the position of each audience member’s eyes, allowing the 3D display to render correct vertical perspective for each of the viewers around the device. The result is a one-to-many 3D teleconferencing system able to reproduce the effects of gaze, attention, and eye contact generally missing in traditional teleconferencing systems.
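The core rendering idea above is a 6D lookup table: indexed by a vertex's 3D position and the viewer's angle, height, and distance, it returns the projected 2D display coordinate, so the per-vertex work reduces to a table fetch with interpolation. The sketch below illustrates that structure in Python under stated assumptions: `toy_projection` is a hypothetical stand-in for the paper's display-specific projection (the actual mapping depends on the mirror and screen geometry), and the grid resolutions and parameter ranges are arbitrary; only the table-plus-multilinear-interpolation pattern reflects the abstract.

```python
import numpy as np

# Hypothetical stand-in for the display-specific projection that fills the
# table; the real mapping in the paper depends on the spinning-mirror
# display geometry and is not reproduced here.
def toy_projection(x, y, z, angle, height, dist):
    """Project a vertex (x, y, z) for a viewer at (angle, height, dist)."""
    cx = x * np.cos(angle) - z * np.sin(angle)
    cz = x * np.sin(angle) + z * np.cos(angle) + dist
    return np.array([cx / cz, (y - height) / cz])

# Precompute the 6D table on a regular grid (resolutions are illustrative).
axes = [np.linspace(-1.0, 1.0, 5),    # vertex x
        np.linspace(-1.0, 1.0, 5),    # vertex y
        np.linspace(-1.0, 1.0, 5),    # vertex z
        np.linspace(-0.5, 0.5, 5),    # viewer angle (radians)
        np.linspace(-0.5, 0.5, 5),    # viewer height
        np.linspace(2.0, 4.0, 5)]     # viewer distance
grid = np.meshgrid(*axes, indexing="ij")
table = np.stack(toy_projection(*grid), axis=-1)  # shape (5,5,5,5,5,5,2)

def lookup(query):
    """Multilinear interpolation into the 6D table -- the kind of fetch a
    vertex shader would perform per vertex."""
    idx, frac = [], []
    for q, ax in zip(query, axes):
        # Fractional grid coordinate along this axis.
        t = (q - ax[0]) / (ax[-1] - ax[0]) * (len(ax) - 1)
        i = int(np.clip(np.floor(t), 0, len(ax) - 2))
        idx.append(i)
        frac.append(t - i)
    # Blend the 2^6 = 64 corners of the surrounding 6D cell.
    out = np.zeros(2)
    for corner in range(64):
        w, sel = 1.0, []
        for d in range(6):
            bit = (corner >> d) & 1
            w *= frac[d] if bit else (1.0 - frac[d])
            sel.append(idx[d] + bit)
        out += w * table[tuple(sel)]
    return out
```

In a GPU implementation the table would live in texture memory and the interpolation would be done in the vertex shader, trading a modest amount of memory for projection math that runs in constant time regardless of how complex the display-surface mapping is.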