The Institute for Creative Technologies (ICT) offers a 10-week summer research program for undergraduates in intelligent interactive experiences. A multidisciplinary research institute affiliated with the University of Southern California, the ICT was established in 1999 to combine leading academic researchers in computing with the creative talents of Hollywood and the video game industry. Having grown to encompass over 130 faculty, staff, and students in a diverse array of fields, the ICT represents a unique interdisciplinary community brought together by a core unifying mission: advancing the state of the art in creating virtual experiences so compelling that people will react as if they were real.
Reflecting the interdisciplinary nature of ICT research, we welcome applications from students in computer science, as well as many other fields, such as psychology, art/animation, interactive media, linguistics, and communications. Undergraduates will join a team of students, research staff, and faculty in one of several labs focusing on different aspects of interactive virtual experiences. In addition to participating in seminars and social events, students will also prepare a final written report and present their projects to the rest of the institute at the end of the summer.
This Research Experiences for Undergraduates (REU) site is supported by a grant from the National Science Foundation.
ICT also offers a separate internship program for both undergraduate and graduate students, which requires its own application; view a list of available positions here. For questions or additional information, please contact reu@ict.usc.edu.
Location and Housing
The ICT facility is located in the Playa Vista community of West Los Angeles, about 10 miles west of the main USC campus, and includes a 150-seat theater, game room, and gym. There are numerous restaurants and stores within walking distance, including the Westfield Culver City mall, and the beach is only a 10-minute drive away. Housing is on the main USC campus in downtown L.A., with a free shuttle between campus and the institute.
Benefits
- Participate in a unique multidisciplinary community that combines academic research with the creative talents of Hollywood and the video game industry.
- Work with some of the leading researchers in human-computer interaction, virtual reality, computer graphics, and virtual humans.
- Receive a total of $6,000 over the ten-week program.
- Receive university housing for the duration of the program, and an additional $1,650 for transportation and meal expenses.
- Travel will be reimbursed up to $600 for students living 95 miles or more outside of the Los Angeles area.
Eligibility
- U.S. citizenship or permanent residency is required.
- Students must be currently enrolled in an undergraduate program.
- Students must not have completed an undergraduate degree prior to the summer program.
Important Dates
- Application deadline: January 31, 2023
- Notification of acceptance begins: February 14, 2023
- Notification of declined applicants: March 31, 2023
- Start Date: May 30, 2023
- End Date: August 4, 2023
COVID-19
We are planning to hold the REU site in person. Please note that at present, USC requires all students to be vaccinated against COVID-19 (with limited medical or religious exemptions). As the situation evolves, we will continue to follow the guidance of USC, as well as public health and federal, state, and local authorities, and will update this page as necessary.
How to Apply
Applications for summer 2023 are closed. Please visit us again in December to apply for summer 2024.
Research Projects
When you apply, we will ask you to rank your interest in the research projects listed below. We encourage applicants to explore each mentor’s website to learn more about the individual research activities of each lab.
Natural Language Dialogue Processing
Mentors: David Traum and Ron Artstein
The Natural Language Dialogue Group at ICT is developing artificial intelligence and language technology that allow machines to participate in human-like natural dialogues with people. Systems have a variety of “embodiments”, including virtual humans, robots, recorded video of real people, and voice or chat systems. An REU student will work on creating, extending, or evaluating one or more of these systems, analyzing conversational data from these systems, or collecting new data. Specific projects can be chosen or defined by the REU student; examples include technology that allows an agent or robot to understand the context of a conversation, take initiative in conversation, or sustain interaction over multiple encounters. Previous REU students in the group have published their REU projects as lead authors in the proceedings of international scientific conferences.
Impact of AI on Users’ Psychology
Mentors: Jonathan Gratch and Gale Lucas
The ICT Affective Computing Group explores how to recognize emotional states and signals, model the impact of emotion on human decision-making, and understand the function of emotional signaling in social interaction. REU students will work on one of several projects: the effect of robots on ethical decision making, which examines whether people act less ethically when performing actions via an autonomous robot; emotion recognition across cultures, which attempts to recognize people’s emotional state from facial expressions or vocal features; and artificial intelligence as a teammate, which explores how framing an artificial agent as a partner or team member affects successful teamwork. In each of these projects, students will help construct and run experiments, and learn or improve relevant skills such as robot programming, machine learning, and data analysis.
AI and Machine Learning for Intelligent Tutoring Systems
Mentor: Benjamin Nye
This project will explore applications of Artificial Intelligence and Machine Learning to support personalized tutoring and dialog systems such as the Personal Assistant for Life-Long Learning (PAL3) or CareerFair.ai. PAL3 is a system for delivering engaging and accessible education via mobile devices. PAL3 helps learners navigate learning resources through an embodied pedagogical agent that acts as a guide, backed by a persistent learning record that tracks what students have done, their level of mastery, and what they still need to achieve. CareerFair.ai is an open-source project that is developing a virtual career fair framework, where STEM mentors can self-author a virtual mentor based on their video-recorded answers. The goal of the REU internship will be to expand the repertoire of the system to further enhance learning and engagement. The specific tasks will be determined collaboratively based on research goals and student research interests. Students will be expected to contribute to a peer-reviewed publication. Possible topics include: (1) developing dialog models for tutoring or mentoring; (2) modifying the intelligent tutoring system and how it supports the learner; and (3) statistical analysis and/or data mining to identify patterns of interactions between human subjects and the intelligent tutoring system.
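PAL3's actual learner model is not described here, but as a rough, hypothetical illustration of what tracking a "level of mastery" can look like, the sketch below applies a classic Bayesian Knowledge Tracing update to a stream of right and wrong answers; the function name and parameter values are made up for this example.

```python
# Hypothetical sketch only: a classic Bayesian Knowledge Tracing update,
# not PAL3's actual learner model. Parameter values are illustrative.

def bkt_update(p_mastery, correct, slip=0.1, guess=0.2, learn=0.15):
    """Update the estimated probability that a learner has mastered a skill
    after observing one correct or incorrect response."""
    if correct:
        # P(mastered | correct answer)
        posterior = (p_mastery * (1 - slip)) / (
            p_mastery * (1 - slip) + (1 - p_mastery) * guess)
    else:
        # P(mastered | incorrect answer)
        posterior = (p_mastery * slip) / (
            p_mastery * slip + (1 - p_mastery) * (1 - guess))
    # Allow for the chance of learning the skill on this practice opportunity.
    return posterior + (1 - posterior) * learn

p = 0.3  # prior belief that the skill is already mastered
for answered_correctly in [True, True, False, True]:
    p = bkt_update(p, answered_correctly)
    print(f"estimated mastery: {p:.2f}")
```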
Real-Time Modeling and Rendering of Virtual Humans
Mentor: Yajie Zhao
The Vision and Graphics Lab at ICT pursues research and production work on high-quality facial scans for Army training and simulations, as well as for VFX studios and game development companies. There has been continued research on how data-driven deep learning networks, rather than traditional methods, can be used to model 3D data with far less manual effort. This requires large amounts of data, more than can be obtained from raw light stage scans alone. We are currently working on software both to visualize our new facial scan database and to animate and render virtual humans. To realize and test the usability of this work, we need a tool that can model and render the created avatar through a web-based GUI. The goal is a real-time, responsive web-based renderer on a client, controlled by software hosted on the server. REU interns will work with lab researchers who are developing tools to visualize assets generated by the machine learning model in the rendering pipeline in a web browser using a Unity plugin, and will also integrate deep learning models to be called through web-based APIs. This work will include understanding the latest techniques in physically based real-time character rendering and animation. Ideally, the intern would have familiarity with physically based rendering, subsurface scattering techniques, hair rendering, and 3D modeling and reconstruction.
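As a minimal sketch of the "deep learning model called through a web-based API" pattern described above (not the lab's actual pipeline), the example below wraps a placeholder model in an HTTP endpoint that a browser or Unity client could POST to; the route name, payload format, and toy model are all assumptions made for illustration.

```python
# Minimal sketch of serving a model behind a web API; not ICT's actual
# rendering pipeline. The endpoint, payload, and "model" are placeholders.
from flask import Flask, request, jsonify

app = Flask(__name__)

def placeholder_face_model(latent):
    # Stand-in for a trained network that maps a latent code to mesh or
    # texture parameters; here just a toy deterministic transform.
    scale = sum(latent)
    return {"vertices": [round(0.1 * i * scale, 3) for i in range(12)]}

@app.route("/generate", methods=["POST"])
def generate():
    # A web or Unity client posts a latent code and receives renderable
    # assets as JSON for the client-side renderer to display.
    latent = request.get_json().get("latent", [0.0])
    return jsonify(placeholder_face_model(latent))

if __name__ == "__main__":
    app.run(port=8000)
```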
Machine Learning for Human Behavior Understanding
Mentors: Mohammad Soleymani and Trang Tran
The Intelligent Human Perception Lab at USC’s Institute for Creative Technologies conducts research on automatic human behavior analysis. To model human emotions and behaviors, we research machine learning methods that can fuse multiple modalities and generalize to new people. We are seeking REU interns to work on related research tasks, including the following topics: computer vision for facial expression analysis; multimodal learning for emotion recognition and sentiment analysis; characterization of therapist-client interaction; and virtual agent behavior generation, for example turn-taking and head-nod backchannels. REU interns who are interested in these topics should be comfortable handling data and able to program in Python or C#. Prior experience with machine learning frameworks (e.g., scikit-learn, Keras, PyTorch) is a plus.
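For applicants unfamiliar with multimodal learning, the sketch below shows one simple way modalities can be fused: separate audio and facial-feature branches whose outputs are concatenated before classification. It is an illustrative PyTorch example with made-up feature dimensions and class count, not one of the lab's models.

```python
# Illustrative sketch, not one of the lab's models: simple late fusion of
# audio and facial-expression features for emotion classification.
import torch
import torch.nn as nn

class LateFusionEmotionNet(nn.Module):
    def __init__(self, audio_dim=40, visual_dim=136, num_emotions=6):
        super().__init__()
        self.audio_branch = nn.Sequential(nn.Linear(audio_dim, 64), nn.ReLU())
        self.visual_branch = nn.Sequential(nn.Linear(visual_dim, 64), nn.ReLU())
        self.classifier = nn.Linear(128, num_emotions)

    def forward(self, audio_feats, visual_feats):
        # Encode each modality separately, then concatenate ("late fusion").
        fused = torch.cat([self.audio_branch(audio_feats),
                           self.visual_branch(visual_feats)], dim=-1)
        return self.classifier(fused)  # unnormalized emotion scores

model = LateFusionEmotionNet()
audio = torch.randn(8, 40)    # e.g., per-utterance acoustic features
visual = torch.randn(8, 136)  # e.g., facial landmark features
print(model(audio, visual).shape)  # torch.Size([8, 6])
```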
Human-like Autonomous Social Cognitive Systems
Mentor: Volkan Ustun
The ICT Cognitive Architecture Lab leverages machine learning to generate Human-like Autonomous Social Cognitive systems for use in interactive simulations. These characters possess a Theory of Mind and are perceptual, autonomous, interactive, affective, and adaptive. Our current focus is on creating such synthetic characters for military training simulations in which multiple players collaborate or compete against each other. REU interns will have the opportunity to learn and use multi-agent reinforcement learning, along with recent representation-aggregation and behavior-prediction techniques that stabilize behavior adaptation.
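To give a feel for the multi-agent setting in its simplest form, the toy sketch below runs independent Q-learning for two agents in a repeated coordination game; it is a self-contained illustration, not the lab's training environments or algorithms.

```python
# Toy illustration only: independent Q-learning for two agents in a
# repeated coordination game. Not the lab's environments or methods.
import random

N_ACTIONS = 2
ALPHA, EPSILON = 0.1, 0.2

def reward(a0, a1):
    # Both agents are rewarded only when they choose the same action.
    return 1.0 if a0 == a1 else 0.0

q = [[0.0] * N_ACTIONS for _ in range(2)]  # one Q-table per agent

for step in range(5000):
    actions = []
    for agent in range(2):
        if random.random() < EPSILON:          # explore
            actions.append(random.randrange(N_ACTIONS))
        else:                                  # exploit current estimate
            actions.append(max(range(N_ACTIONS), key=lambda a: q[agent][a]))
    r = reward(*actions)
    for agent, a in enumerate(actions):        # each agent updates independently
        q[agent][a] += ALPHA * (r - q[agent][a])

print(q)  # both agents end up preferring the same action
```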