A Day of Filming Across ICT Labs with USC’s Digital Media Team

Published: October 9, 2025
Category: News
USC Central Video Shoot

The Institute for Creative Technologies hosted a video shoot today to document current research and leadership perspectives across several of its core laboratories. The production was directed by Sean Dube, USC Digital Media Production Manager, with Kristopher Head serving as sound recordist and production assistant. Head, a videographer in his own right, handled audio capture and equipment setup throughout the day. Still photography was managed by freelance photojournalist Brian Van Der Brug, formerly of the Los Angeles Times.

The three-hour schedule began on ICT’s top floor, in the office suite of Executive Director Dr. Randall W. Hill, Jr. Dr. Hill spoke on camera about ICT’s origin story, its research portfolio, and its 25-year history in wargaming and simulation. This segment was shot to anchor the video, outlining how ICT’s work integrates artificial intelligence, immersive media, and human behavioral research to support national priorities in training and education.

The next stop was the Learning Sciences Lab, where Chief Science Officer Dr. Bill Swartout and Lab Director Dr. Benjamin Nye described current research under the AI Research Center of Excellence for Education (AIRCOEE). The two-year, $4.5 million contract, funded by the U.S. Army Research Office, is addressing two central questions: how artificial intelligence can improve learning, and how to upskill the workforce in AI for 21st-century roles.

Within AIRCOEE, five primary research tracks are underway. The Army Writing Enhancement Toolset (AWE) applies intelligent tutoring methods to improve written communication through guided reasoning rather than simple editing. The AI-Enhanced Dashboard for Instructors and Students (AID) provides engagement metrics and supports timely instructional intervention. Meta-Tutoring for Self-Regulated Learning focuses on developing learning-to-learn skills through adaptive feedback. The AI-Assisted Revisions for Curricula (ARC) project explores semi-automated updates to training materials. Finally, AI-Upskilling with AI Tools (AI-UP) delivers role-specific, personalized AI instruction at scale.

During filming, Dr. Swartout and Dr. Nye demonstrated the AWE system, which uses generative AI as an analytical peer reviewer to prompt deeper critical thinking. Their presentation underscored how the lab’s approach to AI in education prioritizes pedagogy and instructor empowerment.

The crew then moved to the Mixed Reality (MxR) Lab, where Director David Nelson, Creative Producer David Cobbins, and Technical Lead Rhys Yahata presented current work in immersive simulation and extended reality. Originally known for its contributions to early virtual reality research that influenced the Oculus Rift and Google Cardboard, the MxR Lab now focuses on extended reality user interfaces, adaptive human-machine interaction, and generative AI content creation.

Two prototypes were featured. The Watercraft and Ship Simulator of the Future (WSSOF) is a cooperative effort between ICT and the U.S. Army Corps of Engineers Coastal and Hydraulics Laboratory. The system integrates geo-specific terrain data, accurate bathymetry, and nonlinear wave models within a portable, high-fidelity simulator that fits inside a single transport case. The Leaders Enhanced and Applied Doctrine System (LEADS) project aims to make complex Army doctrine more accessible and engaging through web-based, scenario-driven learning. Developed in collaboration with the Command and General Staff College and the Mission Command Center of Excellence, LEADS translates core concepts of multi-domain operations into interactive training exercises.

The final segment of the day was filmed in ICT’s Vision and Graphics Lab (VGL), directed by Dr. Yajie Zhao. The lab is recognized as the birthplace of the Light Stage, now in its tenth generation, which pioneered high-resolution human face capture and advanced relighting for digital performance. The technology has received two Academy Awards for Scientific and Technical Achievement and has been used in more than 50 major films and series, including Avatar and Blade Runner 2049.

Dr. Zhao conducted demonstrations of ongoing research, including AI-Based Avatar Creation, which enables the production of high-fidelity virtual humans directly from photographs or text prompts. The lab maintains a database of more than 44,000 detailed facial models to support this work. A second area of focus, CG-to-Real for Seamless Storytelling, combines avatar generation with large language and video diffusion models to integrate the entire creative process, from script to animation to rendering, within a single AI-driven pipeline. These technologies support advanced military simulation and training systems.

Filming concluded in the early afternoon. Over the course of the day, the production documented how ICT’s research teams are advancing applications of artificial intelligence, immersive technology, and human-centered design across multiple domains.
