Research Experience for Undergraduates (REU)

Applications are open (deadline: January 31, 2024)

The Institute for Creative Technologies (ICT) offers a 10-week summer research program for undergraduates in intelligent interactive experiences. A multidisciplinary research institute affiliated with the University of Southern California, the ICT was established in 1999 to combine leading academic researchers in computing with the creative talents of Hollywood and the video game industry. Having grown to encompass over 130 faculty, staff, and students in a diverse array of fields, the ICT represents a unique interdisciplinary community brought together with a core unifying mission: advancing the state-of-the-art for creating virtual experiences so compelling that people will react as if they were real.

Reflecting the interdisciplinary nature of ICT research, we welcome applications from students in computer science, as well as many other fields, such as psychology, art/animation, interactive media, linguistics, and communications. Undergraduates will join a team of students, research staff, and faculty in one of several labs focusing on different aspects of interactive virtual experiences. In addition to participating in seminars and social events, students will also prepare a final written report and present their projects to the rest of the institute at the end of the summer.

This Research Experiences for Undergraduates (REU) site is supported by a grant from the National Science Foundation.

ICT also offers another internship program for both undergraduate and graduate students, which requires a separate application; view a list of available positions here. For questions or additional information, please contact reu@ict.usc.edu.

Location and Housing

The ICT facility is located in the Playa Vista community of West Los Angeles, about 10 miles west of the main USC campus, and includes a 150-seat theater, game room, and gym. There are numerous restaurants and stores within walking distance, including the Westfield Culver City mall, and the beach is only a 10-minute drive away. Housing is on the main USC campus in downtown L.A., with a free shuttle between campus and the institute.

Benefits

    • Participate in a unique multidisciplinary community that combines academic research with the creative talents of Hollywood and the video game industry.
    • Work with some of the leading researchers in human-computer interaction, virtual reality, computer graphics, and virtual humans.
    • Receive a total of $7,650–$8,650 over the ten-week program in a combination of stipend, meal allowance, and transportation allowance.
    • Receive university housing for the duration of the program.
    • Travel will be reimbursed up to $600 for students living 95 miles or more outside of the Los Angeles area.

Eligibility

    • U.S. citizenship or permanent residency is required.
    • Students must be currently enrolled in an undergraduate program.
    • Students must not have completed an undergraduate degree prior to the summer program.

Important Dates

  • Application deadline: January 31, 2024
  • Notification of acceptance begins: February 12, 2024
  • Notification of declined applicants: April 2, 2024
  • Start Date: May 28, 2024
  • End Date: August 2, 2024

How to Apply

Applications are open! Submit your application now.

Tips for filling out your application:

  • In the personal statement, we recommend concentrating on your academic background and interests.
  • Include an unofficial transcript.
  • Provide the contact information of a faculty member who will write a letter of reference, and give your recommender plenty of notice.

Research Projects

When you apply, we will ask you to rank your interest in the research projects listed below. We encourage applicants to explore each mentor’s website to learn more about the individual research activities of each lab.

Natural Language Dialogue Processing

Mentors: David Traum, Ron Artstein, and Kallirroi Georgila
The Natural Language Dialogue Group at ICT is developing artificial intelligence and language technology to allow machines to participate in human-like natural dialogues with people. Our systems include virtual humans, robots, recorded real people, and voice and chat systems. An REU student will work on creating, extending, or evaluating such systems; analyzing conversational data or collecting new data; or other topics that use dialogue data and state-of-the-art machine learning methods. Specific projects can be chosen or defined by the REU student; examples include technology to allow an agent or robot to understand the context of a conversation, take initiative, or sustain interaction over multiple encounters; use of large language models for system response generation; and reinforcement learning of system policies. Previous REU students in the group have been lead authors and had their REU projects published in the proceedings of international scientific conferences.


AI Agents for Conflict Mediation

Mentors: Jonathan Gratch and Gale Lucas
Conflicts evoke emotion and often arise from differences in underlying values. Our research explores how values shape emotions, behaviors, and outcomes in negotiation and other conflict settings. We adopt a multidisciplinary approach that combines psychological theories with artificial intelligence methods to examine how AI agents can participate in interpersonal conflicts and aid in their resolution. Our current efforts focus on the use of animated virtual agents and large language models (e.g., GPT-4) both to annotate human behavior and to simulate the behavior of human partners within controlled experiments. REU students are sought to assist with research software development, experiments, and data analysis. Candidates should have good programming skills; prior experience with large language models and exposure to psychology or negotiation research is a benefit.

Impact of AI on Users’ Psychology

AI and Machine Learning for Intelligent Tutoring Systems

Mentor: Benjamin Nye
This project will explore applications of artificial intelligence and machine learning to support personalized tutoring and dialogue systems such as the Personal Assistant for Life-Long Learning (PAL3) or the CareerFair.ai virtual mentors (MentorPal). PAL3 is a system for delivering engaging and accessible education via mobile devices. PAL3 helps learners navigate learning resources through an embodied pedagogical agent that acts as a guide and keeps a persistent learning record to track what students have done, their level of mastery, and what they need to achieve. CareerFair.ai / MentorPal is an open-source project that is developing a virtual career fair framework in which STEM mentors can self-author a virtual mentor based on their video-recorded answers. The goal of the REU internship will be to expand the repertoire of the system to further enhance learning and engagement. The specific tasks will be determined collaboratively based on research goals and student research interests. Students will be expected to contribute to a peer-reviewed publication. Possible topics include: (1) generative AI models for developing intelligent tutoring content; (2) modifying the intelligent tutoring system and how it supports the learner; and (3) statistical analysis and/or data mining to identify patterns of interaction between human subjects and the intelligent tutoring system.

PAL3 app interface

Real-Time Modeling and Rendering of Virtual Humans

Mentor: Yajie Zhao
The Vision and Graphics Lab at ICT pursues research and works in production to perform high-quality facial scans for Army training and simulations, as well as for VFX studios and game development companies. Ongoing research examines how machine learning can model 3D data with data-driven deep learning networks rather than traditional methods. This requires large amounts of data, more than can be obtained from raw light stage scans alone. We are currently working on software both to visualize our new facial scan database and to animate and render virtual humans. To realize and test the usability of this work, we need a tool that can model and render the created avatar through a web-based GUI. The goal is a real-time, responsive web-based renderer running on the client and controlled by software hosted on the server. REU interns will work with lab researchers to develop a tool that visualizes assets generated by the machine learning model of the rendering pipeline in a web browser using a Unity plugin, and to integrate deep learning models that can be called through web-based APIs. This will include developing the latest techniques in physically based real-time character rendering and animation. Ideally, the intern will be familiar with physically based rendering, subsurface scattering techniques, hair rendering, and 3D modeling and reconstruction.

Real-time rendering of virtual humans

Machine Learning for Human Behavior Understanding

Mentor: Mohammad Soleymani
The Intelligent Human Perception Lab at USC’s Institute for Creative Technologies conducts research on automatic human behavior analysis. To model human emotions and behaviors, we research machine learning methods that can fuse multiple modalities and generalize to new people. We are seeking REU interns to work on related research tasks, including the following topics: computer vision for facial expression analysis; multimodal learning for emotion recognition and sentiment analysis; characterization of therapist-client interaction; and virtual agent behavior generation, for example, turn-taking and head-nod backchannels. REU interns who are interested in these topics should be comfortable handling data and able to program in Python or C#. Prior experience with machine learning frameworks (e.g., scikit-learn, Keras, PyTorch) is a plus.

Machine Learning for Human Behavior Understanding diagram

Human-inspired Adaptive Teaming Systems

Mentors: Volkan Ustun and Nikolos Gurney
The ICT Cognitive Architecture Lab leverages machine learning to generate Human-inspired Adaptive Teaming Systems for use in interactive simulations. Our current focus is on leveraging these systems as synthetic characters for military training simulations in which multiple players collaborate or compete against each other. REU interns will have the opportunity to use multi-agent reinforcement learning and to augment it with recent representation aggregation and behavior prediction techniques to stabilize behavior adaptation.

Human-inspired Adaptive Teaming Systems