ARL 12 – Research Assistant, Integrated Circuit Design

Project Name
Integrated Circuit Design for Army Research Applications

Project Description
Development of state-of-the-art integrated circuits to be used in military applications. These computing platforms leverage digital, analog, and mixed-signal components, along with heterogeneous integration of different technologies, to provide efficient solutions.

Job Description
The research assistant will work with ARL scientists to design and verify circuit components that will be used in larger Systems on Chip (SoCs). Cadence design environments will be used to create circuit schematics and simulate circuit functionality. Time permitting, students will also participate in designing circuit layout.

Preferred Skills

  • An undergraduate or graduate student in electrical engineering
  • Experience with Cadence or SPICE simulation and analysis preferred
  • Ability to work in a collaborative environment as well as independently
  • Interest in integrated circuit design
  • Strong quantitative skills

ARL 11 – Research Assistant, The Biomechanics of Ballistic-Blunt Impact Injuries

Project Name
The Biomechanics of Ballistic-Blunt Impact Injuries

Project Description
The primary purpose of this project is to research the mechanisms and injuries associated with ballistic-blunt impacts. The motivation for this project results from body armor design requirements. Body armor is primarily designed to prevent bullets from penetrating into the body. However, to absorb the energy of the incoming bullet, body armor can witness a large degree of backface deformation (BFD). Higher energy threats, new materials and new armor designs may increase the risk of injury from these events. Even if the body armor systems can stop higher energy rounds from penetrating, the BFD may be severe enough to cause serious injury or death. Unfortunately, there is limited research on the relationship between BFD and injury, hindering new and novel armor developments. Consequently, there is a need to research these injuries and their mechanisms.

Job Description
The research assistant will help design and execute hands-on lab research related to injury biomechanics, collect and analyze data, and document and present findings of the work.

Preferred Skills

  • Graduate student in biomedical engineering, mechanical engineering, or related field
  • Some experience working in a laboratory setting
  • Some experience in the medical field
  • Experience in software for data collection, processing and analysis

ARL 10 – Programmer, Anonymized Human Injury Database

Project Name
Anonymized Human Injury Database

Project Description
The Anonymized Human Injury Database project will develop a tool for the surveillance, assessment, and investigation of acute injuries. The goal is to develop a database to describe injuries and information on the factors associated with the injury event. The proposed database will provide multiple organizations with the ability to query the database to support the investigation of injury mechanisms and the identification of potential mitigation strategies.

Job Description
The Army Research Laboratory (ARL) is looking for an enthusiastic, self-motivated student with a background in web-based programming. The student will take the lead in understanding the requirements for the Anonymized Human Injury Database and developing a schema that will satisfy those requirements. The student will develop a password-protected web-based application that will permit the data to be viewed, queried, and exported. The application should also provide a way for new data records to be inserted, allow files to be uploaded and attached to records, and provide a mechanism for allowing any files associated with an individual record to be retrieved. The student will be mentored and supported by a senior web developer at ARL.
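
The posting does not specify a schema, so as a hedged sketch of the kind of design the student might propose, the tables, columns, and sample data below are illustrative assumptions only (shown with Python's built-in sqlite3 rather than the MySQL engine named in the skills list):

```python
import sqlite3

# Hypothetical minimal schema for an anonymized injury database.
# Table and column names are invented for illustration, not the project's schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE injury_event (
    event_id   INTEGER PRIMARY KEY,
    event_date TEXT,   -- ISO date; no personally identifying fields stored
    mechanism  TEXT,   -- e.g. 'blunt impact'
    context    TEXT    -- factors associated with the injury event
);
CREATE TABLE injury (
    injury_id   INTEGER PRIMARY KEY,
    event_id    INTEGER REFERENCES injury_event(event_id),
    body_region TEXT,
    severity    TEXT
);
CREATE TABLE attachment (
    file_id   INTEGER PRIMARY KEY,
    injury_id INTEGER REFERENCES injury(injury_id),
    filename  TEXT     -- uploaded file associated with a record
);
""")
conn.execute("INSERT INTO injury_event VALUES (1, '2019-06-01', 'blunt impact', 'training')")
conn.execute("INSERT INTO injury VALUES (1, 1, 'thorax', 'moderate')")

# A query joining injuries to their event context, as the web application
# would need for viewing and exporting records.
rows = conn.execute(
    "SELECT e.mechanism, i.body_region FROM injury i "
    "JOIN injury_event e ON i.event_id = e.event_id"
).fetchall()
print(rows)
```

A production version would additionally need the password protection, file upload/retrieval, and export features the job description calls for.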

Preferred Skills

  • Fluent in one or more web development languages, such as PHP, HTML, CSS, and JavaScript
  • Strong familiarity with one or more database engines, such as MySQL
  • Basic familiarity with web hosting environments
  • Background in modern web-based frameworks and design is a plus

ARL 9 – Research Assistant, Modeling & Simulation with VR

Project Name
Modeling & Simulation with Virtual Reality

Project Description
Modeling & Simulation with Virtual Reality (VR) will explore implementing a VR device (Oculus) with an interactive simulation currently built in Unreal. The researcher will explore current frameworks for the VR device for specific game engines as well as more open rendering capabilities (e.g. OpenSceneGraph).

Job Description
The research assistant will help design and run the study, develop working prototypes, and document and present findings of the work.

Preferred Skills

  • 3D environment programming experience (e.g. Unity or Unreal)
  • Some experience in interactive/game design
  • 3D modeling or game asset creation a plus

ARL 8 – Research Assistant, Geometry Limits in a Game Engine

Project Name
Geometry Limits in a Game Engine

Project Description
Geometry Limits in a Game Engine will compare game engine capabilities and limitations of rendering high-resolution geometries. The student will research current methods of loading and interacting with geometry and use this to further modeling and simulation. The student will document the development of prototypes, which compare at least two different game engines.

Job Description
The research assistant will design the study, develop working prototypes, and document and present findings of the work.

Preferred Skills

  • 3D environment programming experience (e.g. Unity or Unreal)
  • Some experience in interactive/game design
  • Experience in 3D modeling

ARL 7 – Research Assistant, Machine Learning, State Estimation, Sensor Modeling

Project Name
Machine Learning with State Estimation and Sensor Modeling

Project Description
Applied research and development in machine learning, stochastic/Kalman filtering and control systems.

Job Description
The candidate will work with postdoc-level researchers to implement machine learning algorithms and analyze multimodal datasets, as well as develop sensor and process models from these datasets that are applicable to heterogeneous autonomous systems. The candidate should have experience implementing machine learning algorithms such as Gaussian mixture models, deep learning algorithms, and information-theoretic methods, and experience applying these algorithms to real-world datasets. A BS/MS in computer science with experience in machine learning and the implementation of machine learning algorithms, neural networks, stochastic/Kalman filtering, and control systems is expected, preferably with applied research experience. Laboratory experience in applied research and development is preferred. Experience with community-standard machine learning libraries such as Keras, Mallet, and TensorFlow is desired. Experience implementing and running control system algorithms such as extended Kalman filters in C++, Java, and Python is highly desirable.
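
As a toy illustration of the stochastic/Kalman filtering mentioned above (not code from the project), a scalar Kalman filter with a random-walk state model can be sketched in a few lines; the noise variances q and r are invented for the example:

```python
import numpy as np

def kalman_1d(zs, q=1e-4, r=0.1, x0=0.0, p0=1.0):
    """Scalar Kalman filter with a random-walk state model (illustrative only).

    q: process noise variance, r: measurement noise variance (both assumed).
    """
    x, p = x0, p0
    estimates = []
    for z in zs:
        # Predict: a random-walk state keeps the same mean; variance grows by q.
        p = p + q
        # Update: blend the prediction with the measurement z.
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(0)
true_value = 1.0
zs = true_value + rng.normal(0.0, 0.3, size=200)   # noisy measurements
est = kalman_1d(zs)
print(abs(est[-1] - true_value))   # filtered estimate ends near the true value
```

An extended Kalman filter, as named in the posting, follows the same predict/update cycle but linearizes nonlinear state and measurement models at each step.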

Required Skills

  • Experience developing software in the Ubuntu Linux environment and mathematical software packages such as MATLAB
  • Experience with software build environments such as CMake, Ant, Maven
  • A solid foundation in machine learning and prototyping and implementing machine learning algorithms directly from theory
  • Experience applying machine learning algorithms to real-world datasets
  • Fluency in C++, Java, and Python
  • Experience with software and revision management tools such as Subversion and Git

Preferred Skills

  • Strong familiarity with the mathematics of fusion algorithms for either autonomous applications or state estimation, and the ability/willingness to learn the other
  • Experience with implementation and use of stochastic filtering and control algorithms
  • Proficiency with community-standard machine learning libraries such as TensorFlow
  • Experience with generating mathematical sensor models involving complex systems
  • Experience with implementing and testing multi-sensor fusion and state estimation algorithms for robotics
  • Practical understanding of experimental statistics, including statistical experiment design, data analysis, validation, and verification
  • Experience with the Robot Operating System (ROS) and associated build environments

ARL 6 – Research Assistant, Adaptive Automation Experimenter

Project Name
Adaptive Automation Experimentation

Project Description
The research project is designed for empirical study of how and when to trigger adaptive computer aids. The aids would be designed to help a user who is giving signs of having difficulty performing a task. By the start of the internship, the project will be at the stage of collecting data from human-subjects experimentation with self-initiated automated aids.

Job Description
The research assistant will help debug experiment programs in E-Prime (Visual Basic programming), run the study, analyze the data, and possibly contribute to a publication and presentation of the work at a scientific conference.

Preferred Skills

  • Experience programming in Visual Basic
  • Some experience running human-subjects studies
  • Some experience in data analysis

ARL 5 – Research Assistant, Gamification and Wearable Physiological Sensors

Project Name
Gamification and Wearable Physiological Sensors

Project Description
The research project seeks to detect the peripheral physiological impact of a gamified learning task. Our team plans to conduct an experiment where participants have to learn to withhold responses under various gamification point schemes. A specific research question for this work is whether the independent variable of gamification type has a statistical interaction with an individual’s traits and if this is observable in peripheral physiological sensor data. These include continuous heart rate (ECG), impedance cardiography (ICG), and blood pressure.

Job Description
The research assistant will help run the study and preprocess and analyze time series data. The internship will be located at ARL West facilities in Playa Vista and/or UC Santa Barbara.

Preferred Skills

  • Experience conducting behavioral studies with human subjects
  • Experience with wearable sensors
  • Experience analyzing continuous time series data
  • Coursework in statistical hypothesis testing and related tools, e.g., SPSS, R, Matlab
  • Programming experience is a plus

ARL 4 – Research Assistant, 3D Visualization for Designing View-Dependent Materials & Patterns

Project Name
3D Visualization for Designing View-Dependent Materials & Patterns

Project Description
One of our research areas involves the investigation of how objects are perceived from multiple viewpoints and how these views affect the saliency of those objects. The research includes exploring the use of lighting and object models in the construction of shapes and patterns whose perceived appearance may vary depending on viewing angle. These 3D models will be used to study the visual salience of materials with view-dependent optical characteristics. The goal of the project is to set upper bounds on the desired material properties of view-dependent surfaces to support figure-ground breaking.

Job Description
The research assistant will create 3D environments with a mix of 2D and 3D elements with different view dependent properties. They may possibly contribute to a publication and presentation of the work at a scientific conference.

Preferred Skills

  • Experience programming in a 3D environment
  • Experience in 3D graphics or 3D animation is a plus
  • A background in optical physics or visual perception is a plus.

ARL 3 – Research Assistant, Real-Time Context-Aware State Estimation and Prediction for Multi-Aspect Analytics

Project Name
Real-Time Context-Aware State Estimation and Prediction for Multi-Aspect Analytics

Project Description
The research project seeks to detect and classify various human states (e.g., affect) in real-time. The team will run an experiment where participants perform a gesturing dual-task while wearing multiple sensors, such as EKG and EEG. We will explore existing and develop new machine learning algorithms that integrate sensor data from multiple sources and operate at multiple timescales to best model human states.

Job Description
The research assistant will help run the study, preprocess and analyze time series data, and possibly contribute to a publication and presentation of the work at a scientific conference or journal venue.

Preferred Skills

  • Experience programming in Matlab
  • Experience programming in Python
  • Experience with time series data
  • Experience with wearable sensors, such as EEG
  • Experience with Unity is a plus
  • Experience and/or interest in machine learning is a plus

ARL 2 – Research Assistant, Brain Computer Interface for Quality of Information

Project Name
Brain Computer Interface for Quality of Information

Project Description
The research project will develop data processing and analysis techniques for extracting estimates of viewed image quality from the brain activity of the viewer. Electroencephalography (EEG) data have already been collected from participants monitoring a rapid stream of images with varying levels of objective image quality (i.e. image compression). The research assistant will apply machine learning techniques to develop a classifier that estimates the image compression level from the EEG of the viewer.

Job Description
The research assistant will process the EEG data and apply existing machine learning techniques. The research assistant will develop a document or presentation summarizing the work for possible inclusion in a journal article or presentation at a scientific conference.
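
As a hedged illustration of the kind of baseline classifier a research assistant might start from (the real project uses recorded EEG and its own methods; everything below, including the synthetic Gaussian features, is invented for the example):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for per-trial EEG features at two compression levels.
# These Gaussians are purely illustrative, not EEG data.
n_per_class, n_features = 100, 8
heavy = rng.normal(0.0, 1.0, (n_per_class, n_features))   # heavy compression
light = rng.normal(1.0, 1.0, (n_per_class, n_features))   # light compression
X = np.vstack([heavy, light])
y = np.array([0] * n_per_class + [1] * n_per_class)

# Nearest-centroid classifier: one of the simplest baselines to try
# before heavier machine learning methods.
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(samples):
    # Distance from each sample to each class centroid; pick the nearest.
    d = np.linalg.norm(samples[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

acc = (predict(X) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

In practice, the features would come from preprocessed EEG epochs time-locked to image presentations, and cross-validated accuracy would replace this training-set check.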

Preferred Skills

  • Experience working with noisy data
  • General familiarity with machine learning, especially as applied to BCI
  • Familiarity with at least one computer programming language (MATLAB is preferred, but not necessary)

ARL 1 – Research Assistant, Individual Traits and Training Effectiveness: Predicting the Effectiveness of Gamified Training

Project Name
Individual Traits and Training Effectiveness: Predicting the Effectiveness of Gamified Training

Project Description
A research focus of The Army Research Laboratory is the study of how individual traits modulate the effectiveness of game-like reward structures to enhance training outcomes. Electroencephalography (EEG) will be used to track behavior and cognitive states during training in a gamification environment to monitor trainee motivation in a continuous manner.

Job Description
The research assistant will work with ARL scientists to identify state changes based on behavioral and neural data within a gamified training environment and during the subsequent transfer task. The summer project will aim to define components of these state changes relating to regulatory focus and other trait-based measures by linking behavioral data with conventional EEG analyses.

Preferred Skills

  • A PhD student working in the field of cognitive neuroscience
  • Experience with EEG/MEG analysis preferred
  • Ability to work in a collaborative environment as well as independently
  • Interest in human behavior and training
  • Strong quantitative skills

300 – Research Assistant, Data-Driven Multiagent Modeling of Human Behavior/OpenMind project

Project Name
Data-Driven Multiagent Modeling of Human Behavior/OpenMind

Project Description
The Social Simulation Lab works on modeling and simulation of social systems from small group to societal level interactions, as well as data-driven approaches to validating these models. Our approach to simulation relies on multiagent techniques where autonomous, goal-driven agents are used to model the entities in the simulation, whether individuals, groups, organizations, etc.

Job Description
The research assistant will investigate automated methods for building agent-based models of human behavior. The core of the task will be developing and implementing algorithms that can analyze human behavior data and find a decision-theoretic model (or models) that best matches that data. The task will also involve using those models in simulation to further validate their potential predictive power.

Preferred Skills

  • Knowledge of multi-agent systems, especially decision-theoretic models like POMDPs
  • Experience with Python programming
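
As a minimal, self-contained illustration of the decision-theoretic models named above, value iteration on a toy two-state MDP (a POMDP adds partial observability on top of this); the states, actions, and rewards are invented for the example:

```python
# Toy two-state MDP, solved by value iteration.
# transition[s][a] = list of (probability, next_state); reward[s][a] = immediate reward.
transition = {
    0: {"stay": [(1.0, 0)], "go": [(0.8, 1), (0.2, 0)]},
    1: {"stay": [(1.0, 1)], "go": [(1.0, 0)]},
}
reward = {0: {"stay": 0.0, "go": -1.0}, 1: {"stay": 2.0, "go": 0.0}}
gamma = 0.9  # discount factor

V = {0: 0.0, 1: 0.0}
for _ in range(200):  # iterate the Bellman backup to near-convergence
    V = {
        s: max(
            reward[s][a] + gamma * sum(p * V[s2] for p, s2 in transition[s][a])
            for a in transition[s]
        )
        for s in V
    }

# Greedy policy with respect to the converged values.
policy = {
    s: max(
        transition[s],
        key=lambda a: reward[s][a]
        + gamma * sum(p * V[s2] for p, s2 in transition[s][a]),
    )
    for s in V
}
print(policy)  # the agent pays a one-step cost to reach the rewarding state
```

Fitting such a model to human behavior data, as the job description outlines, would mean searching for transition and reward parameters whose induced policy best matches observed decisions.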

299 – Research Assistant, Virtual Doppelganger

Project Name
Virtual Doppelganger

Project Description
The research project will examine the effect of avatar appearance on user performance. The project team will run a study with participants assigned to different appearance conditions. Statistical analyses will be used to test for significant differences between conditions to evaluate the impact of avatar appearance.

Job Description
The research assistant will help design and run the study, process and analyze the data, and possibly contribute to a publication and presentation of the work at a scientific conference.

Preferred Skills

  • Experience programming Unity
  • Some experience in human-computer interaction
  • Some experience in experimental design

298 – Research Assistant, Natural Language Annotation

Project Name
Natural Language Annotation

Project Description
The Natural Language Dialogue team collects linguistic data for use in developing, evaluating, training and extending coverage of our conversational dialogue systems. The annotation project consists of annotating conversation transcripts of human dialogue and/or human-machine dialogue for features relevant to understanding and engaging in dialogue.

Job Description
Annotation of conversation transcripts, using semantic and pragmatic representations or relations that have been developed specifically for our implemented systems. The job is suitable primarily for undergraduate students. Interns who reside locally in the Los Angeles area may be able to continue working at ICT after the summer. Topics of interest may include use of stories and extended narrative in dialogue, human-robot dialogue, cross-cultural dialogue, and conversation-related games.

Preferred Skills

  • Highly proficient in spoken & written English or Spanish (native or near-native competence preferred)
  • Some background in Linguistics or a related field
  • General feel for language and working with linguistic material

297 – Research Assistant, Natural Language Dialogue Processing for Virtual Humans

Project Name
Natural Language Dialogue Processing for Virtual Humans

Project Description
ICT is developing artificial intelligence and natural language processing technology to allow virtual humans to engage in spoken and face-to-face interactions with people for a variety of purposes, including training of conversational tasks with virtual role-players. Current research areas include embodied dialogue, socio-cultural & affective dialogue, meta-dialogue and topic switching, casual chat dialogue, dialogue architectures, computational theories of dialogue genres, evaluation of dialogue systems, and dialogue authoring.

Job Description
The student intern will work with the Natural language research group (including Professors, other professional researchers, and students) to advance one or more of the research areas described above. If the student has a particular goal or related work at their home institution they should briefly describe this in the application letter.

Preferred Skills

  • Some familiarity with dialogue systems or natural language dialogue
  • Either programming ability or experience with statistical methods and data analysis
  • Ability to work independently as well as in a collaborative environment

296 – Research Assistant, Human-Robot Dialogue

Project Name
Human-Robot Dialogue

Project Description
ICT has several projects involving applying natural language dialogue technology developed for use with virtual humans to physical robot platforms. Tasks of interest include remote exploration, joint decision-making, social interaction, and language learning. Robot platforms include humanoid (e.g. NAO) and non-humanoid flying or ground-based robots.

Job Description
This internship involves participating in the development and evaluation of dialogue systems that allow physical robots to interact with people using natural language conversation. The student intern will be involved in one or more of the following activities: 1. Porting language technology to a robot platform, 2. Design of tasks for human-robot collaborative activities, 3. Programming of the robot for such activities, or 4. Use of a robot in experimental activities with human subjects.

Preferred Skills

  • Experience with one or more of:
  • Using and programming robots
  • Dialogue systems, computational linguistics
  • Multimodal signal processing, machine learning

295 – Research Assistant, Interactive Experience with a Holocaust Survivor

Project Name
New Dimensions in Testimony

Project Description
New Dimensions in Testimony is a joint effort of ICT, the USC Shoah Foundation, and Conscience Display, intended to create an interactive experience that replicates a live conversation with Holocaust survivors. The project will gather the survivors’ answers to hundreds of questions, recording them using advanced filming technologies which enable 3-D projection using current and future displays, and storing them in a computer database. The project will create systems that allow individuals to ask questions in conversation, and the survivor will answer from the testimony as if he were in the room, utilizing language understanding technology which allows the computer to find the most appropriate reaction to a user’s utterance.

Job Description
The intern will assist with developing, improving, and analyzing the systems. Tasks may include running user tests, analyzing content and interaction results, and improving the systems. The precise tasks will be determined based on the skills and interests of the selected applicant, as well as the demands of the project during the time of the internship.

Preferred Skills

  • Very good spoken and written English (native or near-native competence preferred)
  • General computer operating skills (some programming experience desirable)
  • Experience in one or more of the following: 1. Interactive story authoring & design, 2. Linguistics or language processing, or 3. A related field, such as museum-based informal education or Holocaust research and survivor testimonies

289 – Programmer, Captivating Virtual Instruction for Training

Project Name
Captivating Virtual Instruction for Training (CVIT)

Project Description
The objective of this task order is to plan, design, and develop an online learning application centered around the IA program of instruction topics. As part of the overall CVIT project, and outside the scope of this modification, part of the intent of developing this courseware will be to validate and verify next-generation e-learning methods and technologies in order to maximize the return on learning and the engagement of course participants. The expected online course will be approximately 10 hours in length and will cover comprehension, application, and evaluation of IA topics and course material. The application may be used standalone by individual course participants or as part of an instructor-led resident course in the classroom. Three courses have been developed and need support and extensions: Advanced Situational Awareness, the Supervisor Development Course, and the Intelligence Analyst course.

Job Description
Provide new features and correct bugs on CVIT training applications. Help develop new user interfaces and features. Test and ensure operation of system and responsiveness of deployment.

Preferred Skills

  • Full-stack development
  • JSON, Java
  • Web services – AWS
  • UI/UX design and prototyping skills

288 – Programmer, Virtual Acquisition Career Guide

Project Name
Virtual Acquisition Career Guide (VCG)

Project Description
The project is composed of a base phase and an option phase. The base phase will develop a prototype VCG that is loosely integrated with existing USAASC systems. It will be made available to USAASC personnel and selected ALTWF members for testing and demonstration but will not be accessible to the broader acquisition community. Should the government opt for the follow-on effort, the final VCG will be tightly integrated with existing USAASC systems and deployed for widespread use by the ALTWF.

For the base phase, the development of an ALTWF VCG consists of two primary technical efforts: 1) the design and development of a virtual guide that interacts with users in the contracting career field, specifically on the topic of certification management; and 2) initial integration with the existing ALTWF Career Acquisition Management Portal (CAMP) and Career Acquisition Personnel and Position Management Information System (CAPPMIS). The VCG will be built with the University of Southern California Institute for Creative Technologies’ (USC-ICT) SimCoach technology platform. The SimCoach platform combines a web-delivered virtual human with a comprehensive set of web-based tools for content creation. The proposed ALTWF VCG system will be fully persistent, maintaining a record of users’ career information as well as a record of previous interactions with the VCG.

For the option phase, the focus will be on hardening the system to: 1) support large numbers of simultaneous users; 2) integrate tightly and robustly with CAPPMIS; 3) expand the dialogue base to handle large variations in user interactions; and 4) deploy the system for widespread use by the acquisition community. In addition to these tasks, an operations and maintenance (O&M) plan will be put into place that details the requirements for sustaining the VCG after the contract has ended.

Job Description
Work on language recognition, FAQs, bug fixes, and general improvements needed on SimCoach.

Preferred Skills

  • Full-stack
  • JSON, Java
  • Web services – AWS

287 – Programmer, Graphic Programmer Internship

Project Name
Terrain Mod 10

Project Description
USC-ICT will research and develop a proof-of-concept capability for the ingestion, processing, storage, rendering and simulation of alternative sources of geo-referenced terrain data in next-generation game platform(s). Example data sources include elevation data, vegetation indices, commercial satellite imagery, social media, point clouds, buildings & surface features, roads, subterranean, and cultural features. From this source data, 3D models, materials, textures and features are algorithmically classified and imported into one or more game-based simulation environments. The goal is to procedurally convert, store and use this data in a platform that may serve as the foundation for future Army synthetic training.

Job Description
Work on the photogrammetry pipeline as assigned. Work on delineating, modeling, and tagging objects in the environment. Help bring models into game engines and automate the conversion and processing pipeline. Create terrains in game engines from images and point cloud data.

Preferred Skills

  • Photogrammetry and Procedural Model creation
  • Game engines (Unity, UE, or others)
  • Knowledge of flying UAS and autopilot programs (desirable)
  • Point cloud data processing

286 – Research Assistant, Mixed Reality Lab (MxR) Techniques and Technologies

Project Name
Mixed Reality Lab (MxR) Techniques and Technologies

Project Description
The ICT MxR Lab researches and develops the techniques and technologies to advance the state-of-the-art for immersive virtual reality and mixed reality experiences. With the guidance of the principal investigators (Evan Suma Rosenberg and David Krum), students working in the lab will help to design, create, and evaluate prototypes and experiments designed to explore specific research questions in virtual reality and human computer interaction. Specific projects may include research in redirected walking, perception and cognition, avatar-mediated communication, and learning in virtual worlds.

Job Description
Duties will include brainstorming and rapid prototyping of novel techniques, developing virtual environments using the Unity game engine, running user studies, and analyzing experiment data. Some projects may include programming (such as C#, Python, Unity, Arduino), fabrication (3D design and 3D printing), 3D modeling, and audio design.

Preferred Skills

  • Development experience using game engines such as Unity
  • Prior experience with virtual reality technology or 3D/touch interfaces
  • Programming in C++, C#, or similar languages
  • Familiar with experimental design and user study procedures
  • Prior experience with rapid prototyping equipment (optional)

285 – Programmer, Building a Backbone for Multi-Agent Intelligent Tutoring Systems

Project Name
Building a Backbone for Multi-Agent Intelligent Tutoring Systems

Project Description
Over the last few years, a likely solution has emerged: a service-oriented design that combines components from multiple Intelligent Tutoring Systems (ITS), which leverage artificial intelligence to speed up learning. The research problems that this work attempts to address are: 1) Multi-Agent ITS Services: rapid and seamless integration of multiple ITS services that interact like agents in real time to provide a coherent and effective learning experience; 2) Plug-and-Play Interoperability: reducing barriers to adding services to an ITS down to a level at which a class of students could, within a day, add a new agent as a service; and 3) Blending Expert Knowledge with Machine Learning: models that allow experts to explicitly declare knowledge and then learn from data to improve agent performance, without needing to throw out the existing expert knowledge.

Job Description
The goal of this internship will be to program new web services that leverage artificial intelligence, machine learning, and semantic messaging to make it faster to build AI-based services that support learning. The specific tasks will be determined based on the status of the project at the time of the internship as well as your interests. Possible topics include work with: (1) building new tutoring system components and models; (2) machine learning models that identify efficient ways to leverage user data to improve the performance of tutoring system components in real time; and (3) running brief usability tests of these components with new users of a minimal-working-example tutoring system.

Preferred Skills

  • Python, JavaScript, Java
  • AI Programming or Statistics
  • Strong interest in artificial intelligence, machine learning, and human and virtual behavior

284 – Programmer, Option J – Assessment and Evaluation Programmer

Project Name
Option J – Assessment and Evaluation

Project Description
PAL3 is a system for delivering engaging and accessible education via mobile devices. It is designed to provide on-the-job training and support lifelong learning and ongoing assessment. The system features a library of curated training resources containing custom content and pre-existing tutoring systems, tutorial videos and web pages. PAL3 helps learners navigate learning resources through: 1) An embodied pedagogical agent that acts as a guide; 2) A persistent learning record to track what students have done, their level of mastery, and what they need to achieve; 3) A library of educational resources that can include customized intelligent tutoring systems as well as traditional educational materials such as webpages and videos; 4) A recommendation system that suggests library resources for a student based on their learning record; and 5) Game-like mechanisms that create engagement (such as leaderboards and new capabilities that can be unlocked through persistent usage).

Job Description
The goal of the internship will be to expand the repertoire of the system to further enhance learning and engagement. The specific tasks will be determined based on the status of the project at the time of the internship as well as your interests. Possible topics include: (1) models driving the dialog systems for PAL3 to support goal-setting, teamwork, or fun/rapport-building; (2) modifying the intelligent tutoring system and how it supports the learner; and (3) statistical analysis and/or data mining to identify patterns of interactions between human subjects and the intelligent tutoring system.

Preferred Skills

  • C#, Java, Python, R
  • Dialog Systems, Basic AI Programming, or Statistics
  • Strong interest in intelligent agents, human and virtual behavior, and social cognition

283 – Programmer, Engage: Promoting Engagement in Virtual Learning Environments

Project Name
Promoting Engagement in Virtual Learning Environments

Project Description
The Engage project at ICT seeks to investigate motivation and engagement in game-based, virtual learning experiences. Specifically, the project focuses on how interactions with virtual humans can be made more effective and compelling for learners. If you have ever interacted with characters in video games or web-chat programs, you probably know there is much room for improvement! In this stage of the project, we will be analyzing data and finalizing service-oriented modules for optimizing interactions with agents. This analysis will include investigating the role of emotions and feedback from the system and its impact on engagement experienced by learners.

Job Description
The goal of the internship will be to expand the repertoire of the system to further enhance learning and engagement. The specific tasks will be determined based on the status of the project at the time of the internship as well as your interests. Possible topics include: (1) modifying the intelligent tutoring system and how it supports the learner; (2) models driving the virtual human utterances and behaviors; and (3) emotion coding, statistical analysis, and/or data mining to identify patterns of interactions between human subjects and the intelligent tutoring system.

Preferred Skills

  • C#, Java, Python
  • Basic AI Programming or Statistics
  • Strong interest in human and virtual behavior and cognition

282 – Programmer, Emerging Concepts in Virtual Environments for Training Programmer/Developer

Project Name
Emerging Concepts in Virtual Environments for Training

Project Description
Redefine the role of virtual reality in training by developing new immersive interaction and presentation techniques, and by studying collaboration and learning in complex virtual environments, with a focus on narrative scenario development that explores the emerging ‘language of VR’ for content creation.

Job Description
The programmer will join a team of student developers, artists, and designers working on an immersive scenario.

Preferred Skills

  • Development experience using Unity game engine
  • Prior experience with virtual reality technology (e.g. head-mounted displays, motion tracking, virtual humans, etc.)
  • Prior experience with rapid prototyping immersive experiences

281 – Programmer, Lightweight and Deployable 3D Human Performance Capture for Automultiscopic Virtual Humans

Project Name
Lightweight and Deployable 3D Human Performance Capture for Automultiscopic Virtual Humans

Project Description
The lab is developing a lightweight 3D human performance capture method that uses very few sensors to obtain a highly detailed, complete, watertight, and textured model of a subject (a clothed human with props) that can be rendered properly from any angle in an immersive setting. Our recordings are performed in unconstrained environments, and the system should be easily deployable. Even assuming well-calibrated high-resolution cameras (e.g., GoPros), synchronized video streams (e.g., Raspberry Pi-based controls), and a well-lit environment, any existing passive multi-view stereo approach based on sparse cameras would significantly underperform dense configurations due to challenging scene textures, lighting conditions, and backgrounds. Moreover, far less of the body can be covered when using a small number of cameras.

Job Description
We propose a machine learning approach to address this challenge, posing 3D surface capture of human performances as an inference problem rather than a classic multi-view stereo task. The intern will work with researchers to demonstrate that models trained on massive amounts of 3D data can infer visually compelling and realistic geometries and textures in unseen regions. Our goal is to capture clothed subjects (uniformed soldiers, civilians, props and equipment, etc.), which entails an immense amount of appearance variation as well as highly intricate garment folds.

Preferred Skills

  • C++, OpenGL, GPU programming
  • Experience with computer vision techniques: multi-camera stereo, optical flow, facial feature detection, bilinear morphable models, texture synthesis, Markov random fields
  • Operating System: Windows

280 – Programmer, Head-Mounted Facial Capture and Rendering for Augmented Reality

Project Name
Head-Mounted Facial Capture and Rendering for Augmented Reality

Project Description
The lab is developing techniques for enabling natural and expressive face-to-face communication between subjects in an augmented reality (AR) environment by removing the barriers introduced by immersive head-mounted displays (HMDs). The degree to which users in AR environments can expressively interact with each other is hindered by HMDs, which occlude a large portion of the face. We propose a method to overlay a virtual face that replicates the subject’s appearance and expressions using facial performance capture. While state-of-the-art real-time face tracking technologies fail in the presence of occlusions, recent efforts by USC CS and the USC ICT Graphics Lab have resulted in algorithms and systems that allow wearers of virtual reality HMDs to transfer their expressions to an avatar using sensors mounted on the HMDs.

Job Description
The intern will work with researchers to develop a prototype AR HMD device based on Microsoft’s HoloLens and to develop new algorithms for lightweight facial performance capture, as well as new techniques for appearance synthesis. The intern will support research efforts to fill in occluded facial regions using a digital face. This will also require capturing and rendering the dynamic lighting conditions on the face.

Preferred Skills

  • C++, OpenGL, GPU programming
  • Experience with computer vision techniques: multi-camera stereo, optical flow, facial feature detection, bilinear morphable models, texture synthesis, Markov random fields
  • Operating System: Windows

279 – Programmer, Authoring Novel Facial Performances for Digital Characters

Project Name
Authoring Novel Facial Performances for Digital Characters

Project Description
Authoring realistic digital characters for interactive applications is becoming a practical possibility, leveraging numerous technologies developed at ICT. For example, Digital Ira, developed in collaboration with industry, is a photo-real, real-time digital character driven by facial performance capture. An unanswered question is how to author novel performances for an existing character without requiring additional performances from the original actor. Can we also further automate the authoring of difficult areas around the eyelids and lip contours, which presently require artistic attention? We identify three avenues of research to address these questions: 1) authoring novel facial animations for digital characters; 2) automatic synthesis of the character’s appearance in accordance with novel performances (including animated reflectance maps and geometric details); and 3) improved automation in authoring eye movement, eyelid animation, and lip contour animation, especially during speech.

Job Description
The candidate will focus on how to author novel performances for an existing character without requiring additional performances from the original actor. Research will also focus on how to further automate the authoring of difficult areas around the eyelids and lip contours, which presently require artistic attention. Additional focus areas include authoring novel facial animations for digital characters, automatic synthesis of the character’s appearance in accordance with novel performances, and improved automation in authoring eye movement, eyelid animation, and lip contour animation.

Preferred Skills

  • C++, OpenGL, GPU programming
  • Experience with computer vision techniques: multi-camera stereo, optical flow, facial feature detection, bilinear morphable models, texture synthesis, Markov random fields
  • Operating System: Windows

267 – Research Assistant, Real-time Behavior Interpretation

Project Name
Real-time Behavior Interpretation

Project Description
The Real-time Behavior Interpretation (RBI) project is developing technologies for the automated interpretation of time-series data using a form of automated reasoning, called logical abduction, in a way that integrates closely with probability theory. We are especially interested in the interpretation of movies in the style of the famous Heider-Simmel film, depicting the shenanigans of two triangles and a circle around a box with a door.
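The core idea of combining abduction with probability theory can be shown in a toy form: score each candidate explanation by its prior times the likelihood of the observations it explains, and keep the best. The hypotheses, priors, and likelihoods below are made-up illustrations, not the RBI project's actual reasoning engine:

```python
# Toy probabilistic abduction: pick the hypothesis h maximizing
# P(h) * P(observations | h). All numbers are hypothetical.

hypotheses = {
    # hypothesis: (prior, {observation: likelihood given hypothesis})
    "chasing": (0.3, {"triangle_moves_toward_circle": 0.9, "circle_flees": 0.8}),
    "wandering": (0.7, {"triangle_moves_toward_circle": 0.2, "circle_flees": 0.1}),
}

def best_explanation(observations):
    def score(h):
        prior, likelihoods = hypotheses[h]
        p = prior
        for obs in observations:
            # Small default likelihood for observations a hypothesis
            # does not explain, so unexplained evidence is penalized.
            p *= likelihoods.get(obs, 0.01)
        return p
    return max(hypotheses, key=score)

print(best_explanation(["triangle_moves_toward_circle", "circle_flees"]))
```

With no observations the prior dominates and "wandering" wins; once the Heider-Simmel-style evidence arrives, "chasing" becomes the best explanation. The project's logical abduction over first-order representations is far richer, but the scoring principle is the same.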

Job Description
The RBI project team is seeking a summer intern with either a deep love of both first-order logic and probability theory, or a strong familiarity with contemporary deep-learning approaches to event segmentation and classification (preferably both). The intern is expected to contribute to technical research developing systems that generate high-level narrative interpretations of low-level observable behavior.

Preferred Skills

  • Practical knowledge of Theano/Keras or TensorFlow
  • Automated deduction and theorem proving
  • Propositional and first-order logic

266 – Research Assistant, Data-driven Interactive Narrative Engine

Project Name
Data-driven Interactive Narrative Engine

Project Description
The Data-driven Interactive Narrative Engine (DINE) project is creating a new platform for interactive fiction that allows for free-text input for textual scenarios and free-speech input for audio/video scenarios.
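One simple way to handle free-text input in interactive fiction is to match the player's words against authored continuations. This sketch uses bag-of-words overlap; the scenario text and triggers are hypothetical examples, not DINE content or its actual matching algorithm:

```python
# Toy free-text matcher: choose the authored continuation whose
# trigger phrase shares the most words with the player's input.
# Scenario text and triggers below are hypothetical.

continuations = {
    "open the door": "The door creaks open onto a dark hallway.",
    "look around": "The room is bare except for a dusty desk.",
}

def respond(player_input):
    words = set(player_input.lower().split())
    def overlap(trigger):
        return len(words & set(trigger.split()))
    best = max(continuations, key=overlap)
    return continuations[best]

print(respond("I slowly open the old door"))
```

A production system would use richer semantic similarity (and speech recognition for the audio/video scenarios), but word overlap is enough to show the author-side data structure: a mapping from expected inputs to narrative continuations.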

Job Description
The DINE project team is seeking a summer intern with skills and interests in human-computer interaction and user-interface evaluations to help us design and evaluate different interaction approaches for audio-based interactive storytelling.

Preferred Skills

  • Familiarity with experimental design and experimental hypothesis testing
  • Python programming, Unix/Linux shell scripting, database SQL queries
  • HTML/JavaScript programming, and data analysis with MATLAB, R, or pandas

265 – Research Assistant, Multimodal Representation Learning of Human Behaviors/Machine Learning

Project Name
Multimodal Representation Learning of Human Behaviors/Machine Learning

Project Description
Machine learning in general relies heavily on good representations, or features, of data that yield better discriminatory capability in classification and regression experiments. To derive efficient representations of data, researchers have adopted two main strategies: (1) manually crafted feature extractors designed for a particular task, and (2) algorithms that derive representations automatically from the data itself. The latter approach is called representation learning (RL) and has received growing attention because of the increasing availability of both data and computational resources. In fact, RL was responsible for large performance boosts in a number of machine learning applications, including performance improvements in speech recognition and facial expression analysis. At ICT, we are particularly interested in advancing the state of the art in deep neural networks and machine learning approaches that allow us to learn multimodal representations of human behavior. We will use these representations to assess an individual’s well-being and affective state.
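The contrast between hand-crafted features and learned representations can be illustrated with the simplest possible representation learner, a principal component derived from the data itself. This is a pure-NumPy sketch on synthetic data, not the project's multimodal deep models:

```python
# Minimal representation learning illustration: instead of hand-crafting
# a feature, derive one automatically from the data (the top principal
# component, computed via SVD). Synthetic data; not the project's models.
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D data that mostly varies along a single direction.
base = rng.normal(size=(100, 1))
X = np.hstack([base, 0.5 * base]) + 0.01 * rng.normal(size=(100, 2))

Xc = X - X.mean(axis=0)                      # center the data
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
representation = Xc @ Vt[0]                  # one learned feature per sample
```

Deep networks generalize this idea with nonlinear, hierarchical, and (for this project) multimodal representations, but the principle is identical: the feature is fit to the data rather than designed by hand.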

Job Description
The candidate should have experience with machine learning approaches and be comfortable programming in Python. The candidate will participate in machine learning experiments that aim to better predict a person’s psychological well-being, e.g., depression recognition. We have access to the largest dataset of depression screening interviews and will leverage big data resources to train successful models. Experience with deep learning toolboxes such as TensorFlow, Theano, or Keras is a big plus.

Preferred Skills

  • Python
  • Machine Learning
  • Linux
  • Deep Learning Toolboxes

264 – Research Assistant, Cognitive Architecture Research Assistant

Project Name
The Sigma Cognitive Architecture

Project Description
This project is developing a new cognitive architecture; i.e., a computational hypothesis about the fixed structures underlying a mind, whether natural or artificial. Sigma is built in Lisp and is based on the elegant but powerful formalism of graphical models, which enable combining both statistical/neural and symbolic aspects. We are working on a broad variety of topics, including (deep) learning and memory, problem solving and decision making, perception and imagery, speech and language, and social and affective processing. We are also developing adaptive virtual humans – graphically embodied humanoids that can learn from their experience.

Job Description
Looking for a student interested in developing, applying, analyzing and/or evaluating new intelligent capabilities in an architectural framework.

Preferred Skills

  • Programming (Lisp preferred, but can be learned after arrival)
  • Graphical models (experience preferred, but ability to learn quickly is essential)
  • Cognitive architectures (experience preferred, but interest is essential)

263 – Programmer, Integrated Virtual Humans Programmer

Project Name
Integrated Virtual Humans

Project Description
The Integrated Virtual Humans project (IVH) seeks to create a wide range of virtual human systems by combining the various research efforts within USC and ICT into a general Virtual Human Architecture. These virtual humans range from relatively simple, statistics-based question/answer characters to advanced cognitive agents that are able to reason about themselves and the world they inhabit. Our virtual humans can engage with real humans and each other both verbally and nonverbally; i.e., they are able to hear you, see you, use body language, talk to you, and think about whether or not they like you. The Virtual Humans research at ICT is widely considered among the most advanced in its field and brings together a variety of research areas, including natural language processing, nonverbal behavior, vision perception and understanding, task modeling, emotion modeling, information retrieval, knowledge representation, and speech recognition.

Job Description
IVH seeks an enthusiastic, self-motivated programmer to help further advance and iterate on the Virtual Human Toolkit. Additionally, the selected intern will research and develop potential tools to be used in the creation of virtual humans. Working within IVH requires a solid understanding of general software engineering principles and distributed architectures. The work touches on a variety of Computer Science areas, including Artificial Intelligence and Human-Computer Interaction. Given the scope of the Virtual Human Architecture, the ability to quickly learn how to use existing components and develop new ones is essential.

Preferred Skills

  • Fluent in C++, C#, or Java
  • Fluent in one or more scripting languages, such as Python, TCL, LUA, or PHP
  • Excellent general computer skills
  • Background in Artificial Intelligence a plus

262 – Technical Artist/Graphics, Technical Artist

Project Name
Art Group

Project Description
The Art Group (AG) helps ICT projects reach their full potential by collaboratively defining and meeting art and pipeline needs. Starting with clients’ and researchers’ core concepts and needs, we design and create all aspects of immersive experiences: UI, scripts, storyboards, audio, all visual assets, and the tools and pipelines needed to make them.

Job Description
AG seeks an enthusiastic, self-motivated, detail-oriented technical artist to focus on the artistic and design verification process and/or asset creation. The selected intern will work closely with our tech-art and QA teams to identify and create the tools and assets needed for projects, asset creation, and more efficient QA procedures.

Preferred Skills

  • Excellent general computer skills
  • Excellent Adobe Suite skills
  • Ability to quickly learn new systems
  • Familiarity with Unity a plus

261 – Research Assistant, Digital Character Generation and Control

Project Name
Human Modeling, Simulation and Control

Project Description
Digital characters are an important part of entertainment, simulations, and digital social experiences. Characters can be designed to emulate or imitate human-like (and non-human-like) behavior. However, humans are very complicated entities; in order to create a convincing virtual human, it is necessary to model various elements, such as human-like appearance, behaviors, and interactions. 3D characters can fail to be convincing representations because of improper appearance, behavior, or reactions. The goal of this internship is to advance the state of the art in character simulation by improving or adding aspects of a digital character that make it a more convincing representation of a real person.

Job Description
Research, develop, and integrate methods for virtual characters that improve their fidelity, interactivity, or realism. Design or implement algorithms from research papers and integrate them into the animation/simulation system SmartBody.

Preferred Skills

  • C++
  • Computer graphics and animation knowledge
  • Research in character/animation/simulation/human modeling