CNN’s December 3rd episode of The Whole Story with Anderson Cooper featured deepfake-detection and face-tracking technology developed at USC ICT’s Vision and Graphics Lab (VGL). In the segment, A.I. and the Future of Humanity, Nick Watt speaks with Hany Farid of UC Berkeley about misinformation and deepfakes—the topic of VGL’s 2019 CVPR collaboration with Farid and his lab, “Protecting World Leaders Against Deep Fakes.” Later in the segment, Scott Mann of Flawless showcases TrueSync, software for efficient lip-syncing of film dialogue that was built upon VGL’s Face Model technology.
IMAGE: Figure: Shown from top to bottom are five example frames of a 10-second clip from the original video, a lip-sync deepfake, a comedic impersonator, a face-swap deepfake, and a puppet-master deepfake. “Protecting World Leaders Against Deep Fakes.” Shruti Agarwal, Hany Farid, Yuming Gu, Mingming He, Koki Nagano, Hao Li. CVPR 2019.
FOR MORE INFORMATION: For more on detecting deepfakes or the Face Model, please contact Kathleen Haase at firstname.lastname@example.org.