West Point’s Pointer View Says ELITE Offers Ideal Way for Leadership Students to Practice Counseling Skills

An article in Pointer View, the newspaper serving the West Point community, covered how the ELITE counseling trainer is being used by all students in the military academy’s Military Leadership course. The article states that ELITE, which was developed by ICT, has already proven advantageous over previous training approaches by eliminating the need for live role players.

“To get that kind of one-on-one training with someone senior to them—faculty or staff—is 623 hours minimum I need to ask of my colleagues,” said Lt. Col. Darcy Schnack, director of the leadership course.

Schnack said it’s been exciting to conduct this first iteration of counseling instruction with ELITE.

“I’m hopeful to hear some quality feedback from cadets on how helpful this was in their learning,” she said. “I’ve been really pleased in having an opportunity to use ELITE this semester and I can see all kinds of ways we can use this going forward.”

The article also notes that scenarios include SHARP-related storylines and that the system provides hints, feedback and After Action Review.

Advances in Photoreal Digital Humans in Film and in Real-Time

ADOLESCENT SUICIDAL RISK ASSESSMENT IN CLINICIAN-PATIENT INTERACTION: A STUDY OF VERBAL AND ACOUSTIC BEHAVIORS

Suicide among adolescents is a major public health problem: it is the third leading cause of death in the US for ages 13-18. To date, there is no objective way to assess suicidal risk, i.e., whether a patient is non-suicidal, a suicidal re-attempter (i.e., repeater) or a suicidal non-repeater (i.e., an individual with one suicide attempt or showing signs of suicidal gestures or ideation). Therefore, features of the conversation, including verbal information and nonverbal acoustic information, were investigated from 60 audio-recorded interviews of 30 suicidal (13 repeaters and 17 non-repeaters) and 30 non-suicidal adolescents interviewed by a social worker. The interaction between clinician and patients was statistically analyzed to reveal differences between suicidal vs. non-suicidal adolescents and to investigate suicidal repeaters’ behaviors in comparison to suicidal non-repeaters. By using a hierarchical ensemble classifier we were able to successfully discriminate non-suicidal patients, suicidal repeaters and suicidal non-repeaters.
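The hierarchical classification described above can be sketched as a two-stage cascade: one ensemble separates suicidal from non-suicidal speakers, and a second separates repeaters from non-repeaters among the suicidal group. The feature names, thresholds, and majority-vote rules below are hypothetical illustrations, not the study’s actual model, which is learned from the interview data.

```python
# Minimal sketch of a two-stage (hierarchical) ensemble classifier.
# Stage 1 separates suicidal from non-suicidal speakers; stage 2
# separates repeaters from non-repeaters among the suicidal group.
# Feature names and thresholds are hypothetical, for illustration only.

def vote(rules, features):
    """Simple ensemble: majority vote over boolean threshold rules."""
    hits = sum(1 for rule in rules if rule(features))
    return hits > len(rules) / 2

# Stage 1: suicidal vs. non-suicidal (hypothetical verbal/acoustic cues)
stage1_rules = [
    lambda f: f["pause_ratio"] > 0.30,       # longer silences
    lambda f: f["pitch_variation"] < 0.15,   # flatter intonation
    lambda f: f["negative_word_rate"] > 0.05,
]

# Stage 2: repeater vs. non-repeater, applied only to suicidal cases
stage2_rules = [
    lambda f: f["speech_rate"] < 3.5,        # slower speech (words/sec)
    lambda f: f["jitter"] > 0.02,            # more voice irregularity
    lambda f: f["first_person_rate"] > 0.08,
]

def classify(features):
    if not vote(stage1_rules, features):
        return "non-suicidal"
    return "repeater" if vote(stage2_rules, features) else "non-repeater"

example = {"pause_ratio": 0.42, "pitch_variation": 0.10,
           "negative_word_rate": 0.07, "speech_rate": 3.1,
           "jitter": 0.03, "first_person_rate": 0.04}
print(classify(example))  # → repeater
```

The cascade structure mirrors the clinical decision: screen for risk first, then refine the risk category only for the positive cases.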

What’s the Next Big Thing in Gaming – Panel Discussion

Jon Gratch’s Emotion Modeling Work Featured in Communications of the ACM

The December issue of Communications of the ACM features an article by Jon Gratch and Stacy Marsella on their joint research efforts to develop computational models of emotion. This video accompanies the article and explains their work.

LA Times Cites ICT in Examples of Playa Vista Tech Innovators

In a story about Google buying 12 acres of land in Playa Vista, The Los Angeles Times mentioned that the USC Institute for Creative Technologies is located in Playa Vista and called the area a hub for technology companies and innovation. The story noted that Oculus Rift founder Palmer Luckey once worked at ICT.

3-D Portrait of President Obama Garners Press Coverage for ICT Contributions

The Los Angeles Times featured a 3-D printed bust of President Obama created using digital imaging techniques developed by the USC Institute for Creative Technologies. MSNBC quoted Paul Debevec of the USC Institute for Creative Technologies about the bust’s creation. The story was also covered by The Associated Press, National Journal, CBS News, Newsweek, New York Post, CNET, Engadget, The Huffington Post, Yahoo News, Gizmodo, Belfast Telegraph, ABC News Las Vegas affiliate KNTV-TV and Fox News Los Angeles affiliate KTTV-TV.

USC Institute for Creative Technologies Helped Create 3D Portraits of President Obama Now on Display at the Smithsonian

White House Releases New Behind-the-Scenes Video Detailing Process Used to Scan a Sitting President for the First Time

The Smithsonian has put on view the first ever 3-D Presidential portraits created using technology from the University of Southern California Institute for Creative Technologies.

A newly released White House video explains the process.

The University of Southern California Institute for Creative Technologies (ICT) was part of a Smithsonian-led team that created 3-D portraits of President Obama. A modified Light Stage X, a high-speed system with eight cameras and 50 LED lights that captures detailed shape and reflectance properties of a face in seconds, recorded the President’s facial features in high resolution. The Light Stage data was processed by the ICT Graphics Lab team, and subsequently combined with additional data captured by the Smithsonian team to create a life-sized bust and life mask of the president.

The portraits, part of the collection of the Smithsonian’s National Portrait Gallery, will be on view in the Commons gallery of the Smithsonian Castle starting today through Dec. 31. They were previously displayed at the White House Maker Faire on June 18.

The White House-released video shows ICT researchers, part of a Smithsonian-led team creating a 3-D presidential portrait, on location in the White House to scan the president using their Mobile Light Stage system.

“It can be used to record, almost certainly, the highest resolution digital model that’s ever been made of a head of state,” says ICT’s Paul Debevec in the video.

The video explains that Smithsonian’s inspiration for the project came from the Lincoln life masks in the institution’s National Portrait Gallery. These masks were directly taken from Lincoln’s likeness by putting plaster on his face, with holes poked in the nostrils so he could breathe.

“Seeing that made us think what would happen if we could actually do that with a sitting president, using modern day technologies and tools to create a similarly authentic experience that connects us to history, that connects us to a moment in time and connects us to a person’s likeness,” said Gunther Waibel, director of the Smithsonian Digitization Program.

The video documents President Obama taking part in the Light Stage scanning process, which Debevec explains involves creating different lighting conditions using 50 custom-built LED lights, eight high-resolution sports photography cameras, an additional six wider-angle cameras, and a one-second presidential pose.

“That will give us everything we need to understand the shape of his face and how it transforms incident illumination into the images we see of him,” said Debevec. “Ten years ago it was just barely possible to think this could be done.”

The video then shows the Smithsonian team scanning the president’s body and face with a handheld structured light 3-D scanner. The White House states that the president getting his likeness scanned – as cool as that is – is also about a broader trend, the third industrial revolution.

“It is the combination of the digital world and the physical world that is allowing students and entrepreneurs to go from idea to prototype in the blink of an eye,” said Tom Kalil of the White House Office of Science and Technology Policy.

The video goes on to show some of the raw data taken from the scans and then shows the president viewing the 3-D bust that was created.

“This is the first bust that’s created of a head of state from objective 3-D scan data,” said Adam Metallo, of the Smithsonian’s 3D Digitization Program. “So this isn’t an artistic likeness of the president. This is actually millions upon millions of measurements that create a 3D likeness of the president that we can now 3D print and make something that has never been done before.”

For more information visit the ICT Graphics Lab.

Read the USC News story.

Advances in Photoreal Digital Humans in Film and in Real-Time

Dyadic Behavior Analysis in Depression Severity Assessment Interviews

Computational Analysis of Persuasiveness in Social Multimedia: A Novel Dataset and Multimodal Prediction Approach

Our lives are heavily influenced by persuasive communication, and it is essential in almost any type of social interaction, from business negotiation to conversation with our friends and family. With the rapid growth of social multimedia websites, it is becoming ever more important and useful to understand persuasiveness in the context of social multimedia content online. In this paper, we introduce our newly created multimedia corpus of 1,000 movie review videos obtained from a social multimedia website called ExpoTV.com, which will be made freely available to the research community. Our research results presented here revolve around the following 3 main research hypotheses. Firstly, we show that computational descriptors derived from verbal and nonverbal behavior can be predictive of persuasiveness. We further show that combining descriptors from multiple communication modalities (audio, text and visual) improves the prediction performance compared to using those from a single modality alone. Secondly, we investigate if having prior knowledge of a speaker expressing a positive or negative opinion helps better predict the speaker’s persuasiveness. Lastly, we show that it is possible to make comparable prediction of persuasiveness by only looking at thin slices (shorter time windows) of a speaker’s behavior.
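The multimodal-fusion and thin-slice ideas above can be sketched as follows. This is a hypothetical illustration: the per-window scores, modality weights, and function names are invented, and the paper’s actual predictors are learned from data rather than hand-set.

```python
# Sketch of multimodal late fusion for persuasiveness prediction:
# each modality (audio, text, visual) produces a score per time window,
# and modality-level scores are combined by a weighted average.
# All scores and weights here are hypothetical, for illustration only.

def fuse(modality_scores, weights):
    """Weighted late fusion of per-modality persuasiveness scores."""
    total = sum(weights.values())
    return sum(weights[m] * s for m, s in modality_scores.items()) / total

def thin_slice(window_scores, n_windows):
    """Average a modality's score over only the first n windows
    (a 'thin slice' of the speaker's behavior)."""
    slice_ = window_scores[:n_windows]
    return sum(slice_) / len(slice_)

# Hypothetical per-window scores for one movie-review video
audio_windows  = [0.6, 0.7, 0.5, 0.8]
text_windows   = [0.4, 0.5, 0.6, 0.5]
visual_windows = [0.7, 0.6, 0.8, 0.7]

scores = {
    "audio":  thin_slice(audio_windows, 2),
    "text":   thin_slice(text_windows, 2),
    "visual": thin_slice(visual_windows, 2),
}
weights = {"audio": 1.0, "text": 1.5, "visual": 1.0}

print(round(fuse(scores, weights), 3))  # → 0.564
```

Late fusion keeps each modality’s pipeline independent, so a missing modality (say, no usable video) degrades the prediction gracefully rather than breaking it.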

A Multimodal Context-based Approach for Distress Assessment

PC World Features Ari Shapiro and SIGGRAPH’s Motion in Games Conference at ICT

An article about the recent ICT-hosted SIGGRAPH Motion in Games Conference (MIG) focused on the work of ICT’s Ari Shapiro as well as larger trends in gaming and academia that are creating ever-more realistic video game characters. The story noted that Shapiro is the face of Digital Ira, ICT’s collaboration with Activision and Nvidia.

“With animation, it’s all about human factors. It’s not just ‘does the character look real’ but does it act real, in socially appropriate ways, alongside the other players in the game,” Shapiro said.

The story noted that MIG 2014 brought academics and industry professionals together for three days to share breakthroughs, debate papers, and then translate the latest research so that it could be used to enhance future games. The story explained that the university research presented at the conference is important for both academia and the game industry.

“This sort of research takes years and is outside the scope of the average game company’s R&D budget. This is one major reason why the industry is now in close collaboration with academia—and vice versa. Because university research departments can better attract top talent if there’s a groundbreaking game that results from years of study. At MIG 2014, both sides were learning from each other and that can only bode well for the future of gaming,” said the story.

NPR Features Virtual Veteran Created by ICT

NPR affiliate KPCC reported on MILES, a USC project that uses virtual humans to train social workers in how to counsel veterans. The story notes that this first-of-its-kind technology, a collaboration between ICT and the USC School of Social Work, aims to improve how caregivers relate to the growing number of vets struggling with mental health issues.

According to the story, only about half of veterans struggling with mental health issues say they’re getting the care they need. Part of the problem is that not all service providers know how to work with the unique challenges facing today’s vets.

The ICT-developed computer simulated veteran is a teaching tool allowing caregivers a chance to learn strategies to help this often at-risk population.

The story states that about 60 Veterans Affairs employees have trained with the program and there are plans for more caregivers to do so as well.

USC mHealth Collaboratory Hosts Self Tracking Technology Event at ICT- PBS Covers

The USC mHealth Collaboratory, directed by Donna Spruijt-Metz and co-directed by ICT’s Bill Swartout, hosted the Quantified Self Los Angeles (QSLA) Show & Tell Meetup in August at ICT. The PBS documentary “Future of You” featured experts and issues from the event. USC News also covered the meeting, noting that ICT is a supporter of the mHealth Collaboratory.

The USC mHealth Collaboratory, part of the USC Dornsife Center for Social and Economic Research, aims to foster a collaborative mHealth research community, support incubation for new ideas, and facilitate creative information exchange in mobile and connected health. We will kick off a series of mobile and connected health symposia/training events, bring together mHealth researchers across institutions to develop a West Coast mHealth Consortium, help junior researchers get their mHealth efforts off the ground, work with education programs to train in transdisciplinary science, and develop exciting relationships with industry.

Skip Rizzo Featured on RTÉ, Ireland

RTÉ (Raidió Teilifís Éireann), Ireland’s national public-service media, featured Skip Rizzo and his work using virtual reality for treating post-traumatic stress. The story, based on Rizzo’s presentation at Ireland’s Web Summit, noted that Rizzo has been using virtual reality since the 1990s and has been experimenting with the Oculus Rift with more detailed simulations of deployments in Iraq and Afghanistan, with positive results. This intersection of gaming, virtual reality, cinematic storytelling and clinical psychology could see exposure therapy applied in numerous other contexts, from road traffic accidents to sex attacks, stated the story.

Isolated word recognition in the Sigma cognitive architecture

Efficient message computation in Sigma’s graphical architecture

Virtual Standardized Patient Authoring Workshop

7th International ACM SIGGRAPH Conference on Motion in Games 2014 (MIG 2014)

Games have become a very important medium for education, therapy and entertainment. Motion plays a crucial role in computer games. Characters move around, objects are manipulated or move due to physical constraints, entities are animated, and the camera moves through the scene. Even the motion of the player nowadays is used as input to games. Motion is currently studied in many different areas of research, including graphics and animation, game technology, robotics, simulation, computer vision, and also physics, psychology, and urban studies. Cross-fertilization between these communities can considerably advance the state-of-the-art in the area. The goal of the Motion in Games conference is to bring together researchers from this variety of fields to present their most recent results, to initiate collaborations, and to contribute to the establishment of the research area. The conference will consist of regular paper sessions, poster presentations, as well as presentations by a selection of internationally renowned speakers in all areas related to games and simulation. The conference includes entertaining cultural and social events that foster casual and friendly interactions among the participants.

Virtual Reality Demos and Panel Discussion

Defense News and Leavenworth Times Cover ELITE Leadership Trainer

The Army-wide availability of ELITE, the ICT-developed laptop-based counseling skills trainer, was covered by Defense News.

The article noted that ELITE is available to soldiers across the Army, and it is in use at West Point, ROTC, the basic officers’ leadership course and the Warrior Leader Course.

“Interpersonal skills are very important, no matter what environment you’re in,” said Marco Conners, chief of the Army’s Games for Training Program at Fort Leavenworth, Kansas. “What we wanted to do was develop a scenario and a counseling tool that would teach counseling processes. [ELITE] provides you with some resources and some techniques for how to effectively counsel.”

As more and more people learn about ELITE Lite, the Army is getting more requests, including from operational units, for the program, said Tim Wansbury, a technology transition officer with the Army Research Laboratory.

“That’s really encouraging to us,” he said. “When commanders start contacting [us] and say, ‘how do I get my hands on this software?’ ”

The Leavenworth Times also covered the story, noting that some of the ELITE scenarios are designed to address the Army’s focus on sexual harassment and sexual assault.

An Authoring Tool for Movies in the Style of Heider and Simmel

Seventy years ago, psychologists Fritz Heider and Marianne Simmel described an influential study of the perception of intention, where a simple movie of animated geometric shapes evoked in their subjects rich narrative interpretations involving their psychology and social relationships. In this paper, we describe the Heider-Simmel Interactive Theater, a web application that allows authors to create their own movies in the style of Heider and Simmel’s original film, and associate with them a textual description of their narrative intentions. We describe an evaluation of our authoring tool in a classroom of 10th grade students, and an analysis of the movies and textual narratives that they created. Our results provide strong evidence that the authors of these films, as well as Heider and Simmel by extension, intended to convey narratives that are rich with social, cognitive, and emotional concerns.

Civilian Analogs of Army Tasks

Over the last decade, millions of people have used their public weblogs to tell personal stories about their life experiences. Often intended only for close friends and associates, publishing these stories on the web affords storytelling across enormous distances and time to unimagined audiences, who learn from the experiences narrated in these posts and apply their lessons to their own situations. This artwork, entitled “Civilian Analogs of Army Tasks,” challenges the audience to consider the longevity of these personal stories, and what their collective wisdom offers for audiences far away in time and space.

The piece consists of two main parts. First, through the medium of a digital graphic novel, it tells the story of a post-apocalyptic future where a dying father and his young daughter survive in a bomb shelter under the rubble of a drive-in theater. Second, through an interactive browser application, and a mouse interface, museum patrons navigate high level categories of military relevance, including “Battle Command,” “Stability Operations,” and “Leadership.” For each, they are presented with lists of civilian activities with overlapping skill sets, e.g. “firefighting,” “construction work,” and “sports refereeing.” In turn, each of these is linked to lists of weblog posts that tell personal stories about these activities, written by real people. In all, 1,425 stories are organized into 102 civilian activities, and eight high level military concerns. These stories are displayed as they appeared on the web in 2014.

Learning Agents

The Warrior’s Return: From Surge to Suburbia

When we ask young men and women to go to war, what are we asking of them? When their deployments end and they return—many of them changed forever—how do they recover some facsimile of normalcy? MacArthur award-winning author David Finkel discusses the struggling veterans chronicled in his deeply affecting book, Thank You for Your Service with Skip Rizzo, Director for Medical Virtual Reality at the Institute for Creative Technologies at USC—who has pioneered the use of virtual reality-based exposure therapy to treat veterans suffering from PTSD.
*Presented in association with The L.A. Odyssey Project

Graphics Talk and Panel Discussion

Virtual Humans: Applications in Medicine, Psychology and Nursing

NPR’s Press Play Interviews Skip Rizzo on VR for PTSD

Madeleine Brand of KCRW’s Press Play interviewed Skip Rizzo on virtual reality therapy for treating PTSD, calling him a pioneer in the field. Listen to the interview here.

digiSTORY 2014 Keynote and Breakout Session

Debevec is part of the technical team working on The Hunger Games: Mockingjay Part 2. In his morning keynote at digiSTORY2014, he will talk about their efforts to complete scenes after star Philip Seymour Hoffman’s tragic death. In his afternoon breakout session, Debevec will reveal some of the emerging tools and techniques in special effects.

Beware of computers bearing smiles: Modeling the social and cognitive effects of emotion in virtual humans

Over forty years ago, Herb Simon argued that emotions would be required by any intelligent entity that must act in a dynamic, semi-predictable and social world. Nonetheless, the cognitive science revolution had comparatively little impact on emotion research. This has changed in recent years with greater interest in functional approaches to emotion in psychology and economics, and an explosion of interest in computer science on techniques for recognizing, modeling and exploiting emotions in simulations, decision models, and human-machine interaction. In this interdisciplinary talk, I will broadly overview my research on the social and cognitive functions of emotion for both humans and machines. With regard to cognition, I will review my research on the EMA model of emotion, based on appraisal theory, and illustrate how it predicts the antecedents and consequences of emotion in several decision-making tasks. With regard to social cognition, I will highlight how expressed emotion fundamentally alters how we make decisions with other decision-makers (both natural and artificial). I will illustrate these points using studies across a variety of domains including medical interviews, economic decision-making and computer games. I will discuss both the theoretical consequences of these findings for human cognition as well as their practical implications for human-computer, computer-mediated and human-robot interaction. Throughout, I will argue the need for an interdisciplinary partnership between the social and computational sciences around the topic of emotion.

Wall Street Journal Reports New Benefits for Virtual Reality Therapy

Skip Rizzo is quoted in this article that covers the development, research results and new applications for virtual reality exposure therapy, including treating victims of sexual assault who are suffering from post-traumatic stress. The story notes that virtual reality exposure therapy dates back to the early 1990s and that researchers say it has been shown to be successful in treating eating disorders, alcoholism and phobias, like a fear of flying or of public speaking.

“All we’re really doing here is taking an evidence-based treatment and delivering it with new technology,” said Rizzo, who developed the Virtual Iraq/Afghanistan exposure therapy treatment that is featured in the story. “People will never forget the horrible things they’ve gone through. But that doesn’t mean they’ve earned a life sentence of pain.”

Secretary of the Army Learns about ELITE

Secretary of the Army John McHugh receives a briefing from Fort Leavenworth’s MAJ Greg Pavlichko about a new program, called the Emergent Leader Immersive Training Environment (ELITE). ELITE provides training scenarios to help young leaders develop their counseling skills. Pavlichko, of the Army Games for Training Program, demonstrated ELITE at the recent Association of the United States Army convention in Washington.

The screen shot comes from an ELITE training scenario. ELITE software can now be downloaded by Soldiers (with a CAC card) from the Army MilGaming portal.

ELITE was developed by ICT in collaboration with the Army Research Lab’s Human Research and Engineering Directorate, Simulation and Training Technology Center.

Photo credit: US Army

SimSensei and Skip Rizzo in the New Yorker

An article on therapeutic uses for virtual humans featured ICT’s SimSensei platform, along with recent ICT research findings suggesting that people feel less judged in interviews with virtual humans. The story notes that SimSensei has received funding from DARPA and incorporates a virtual human, a microphone, a webcam and sensors. As Ellie asks questions of subjects and they respond, she accumulates data on their speech patterns and motions in order to assess their mental condition, states the story. Albert (Skip) Rizzo, a psychologist who’s co-leading the project, explained that people with post-traumatic stress disorder often touch their faces. In one ongoing study, soldiers in the Colorado National Guard met with Ellie before combat deployment to Afghanistan, and will meet with her at least once more when they return. The goal is to determine whether the data Ellie gathers can be used to predict mental-health difficulties.

Interactive Media Forum

Mind-Reading for Robots: Anthropomorphism in Human-Computer Interaction

This talk is from 2:00 pm to 3:00 pm

Join Andrew S. Gordon, research associate professor of computer science at USC’s Institute for Creative Technologies, as he discusses his current research into computer-human interaction and the ongoing effort to explain to computers how humans think.

The lives of children are filled with playful anthropomorphism, where every train, kangaroo, and sunflower takes on human emotions, expectations, desires, plans, and beliefs. These tendencies become even more important in the everyday lives of adults, where anthropomorphism helps people cope with complexity – particularly the complexities of computer technologies. While it might be useful to treat computers as if they were people, it would be even better if our computers would treat us as people: as creatures with human emotions, expectations, desires, plans, and beliefs.

This talk is presented in conjunction with an exhibit by artist Miri Chase, entitled Re:Mind, on themes related to the human mind and artificial intelligence.

We’re Hiring: Research Programmer Position

ICT is seeking a Research Programmer focused on the real-time rendering and simulation of large physical spaces using photogrammetry and similar techniques. We are flying quadcopters at ~75m, capturing still imagery that is then fed into photogrammetric software to produce 3D, textured models. However, the resulting models are currently not sufficient for real-time rendering in a game engine due to unoptimized geometry, lack of collision volumes and other issues.

Strong skills with geometric and texture manipulation and optimization a must. Knowledge of real-time rendering of large environmental 3D models, terrains, and the Unity game engine is helpful. Strong graphics programming experience a must. Specializations in geometry manipulation/calculation and texture application desired.
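As one illustration of the kind of geometry optimization the posting describes, a vertex-clustering decimation pass is a common first step for taming dense photogrammetry meshes. The sketch below is generic and simplified, not ICT’s actual pipeline; the function and the toy mesh data are invented for the example.

```python
# Sketch of a common mesh-optimization step for photogrammetry output:
# vertex clustering, which snaps vertices to a coarse grid, merges
# duplicates, and drops collapsed triangles, reducing polygon count
# for real-time rendering. Simplified illustration only.

def cluster_vertices(vertices, triangles, cell=1.0):
    """Merge vertices that fall in the same grid cell; discard
    triangles that become degenerate (two or more corners merged)."""
    key_to_new = {}   # grid cell -> new vertex index
    remap = []        # old vertex index -> new vertex index
    new_vertices = []
    for x, y, z in vertices:
        key = (round(x / cell), round(y / cell), round(z / cell))
        if key not in key_to_new:
            key_to_new[key] = len(new_vertices)
            new_vertices.append((key[0] * cell, key[1] * cell, key[2] * cell))
        remap.append(key_to_new[key])
    new_triangles = []
    for a, b, c in triangles:
        a, b, c = remap[a], remap[b], remap[c]
        if a != b and b != c and a != c:   # keep non-degenerate faces
            new_triangles.append((a, b, c))
    return new_vertices, new_triangles

# Toy mesh: two nearly coincident vertices collapse on a 1 m grid,
# so one of the two triangles degenerates and is dropped.
verts = [(0.0, 0.0, 0.0), (0.1, 0.1, 0.0), (2.0, 0.0, 0.0), (0.0, 2.0, 0.0)]
tris = [(0, 2, 3), (0, 1, 2)]
v, t = cluster_vertices(verts, tris, cell=1.0)
print(len(v), len(t))  # → 3 1
```

In practice the cell size trades fidelity for performance, and production pipelines use error-driven methods (e.g. quadric edge collapse) rather than a uniform grid, but the remap-then-filter structure is the same.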

Apply here.

Hollywood Reporter Covers Paul Debevec and New Effort to Advance Realistic Digital Characters

A story covered the newly launched “Digital Human League,” which includes Paul Debevec and other members of the ICT Graphics Lab. Along with industry partners, including Chaos Group’s Christopher Nichols and Academy Award winner Steve Preeg, the group plans to create an open-source reference guide for how to create realistic digital humans. Plans could be available by March, according to the story.

Cara Santa Maria Interviews Paul Debevec on Talk Nerdy Podcast

Paul Debevec joined science journalist Cara Santa Maria on her podcast to share the process of making a digital character, the importance of crossing the uncanny valley, and how you can tell that the moon landing was not a hoax.

Randy Hill and ICT Virtual Humans in the Huffington Post

An article exploring artificial intelligence in the workforce quoted Randy Hill, ICT’s executive director, on the ways technology advances can improve leadership skills. Hill expects that in the next 10 years, it will be much more common for managers to have access to AI-based tools to help develop stronger leaders.

That might include practise in managing an employee who may be experiencing personal issues or rehearsing ahead of complex negotiations. Artificial intelligence will also allow companies to play “what-if” scenarios when making a decision, to predict employee and market reaction, stated the story. These AI-based systems, Hill explained, will not only understand natural language but also body language, interpreting emotion. Imagine Siri, the voice in your iPhone, responding to your body language and reacting to your jokes.

“Virtual humans are analogous to flight simulators — just as a pilot can practise flying different types of aircraft in a variety of scenarios, a virtual human can provide the ability to practise high-stakes personal interactions in a low-risk environment, where mistakes can be transformed into lessons learned through practise, feedback and automated tutoring,” he said.

The story also included a link to the ELITE prototype, stating that ICT uses AI to provide junior leaders in the military with a practice environment to improve their interpersonal communication skills.

A Conceptual Audio Model for Simulating Pulmonary Auscultation in a High-Fidelity Virtual Physical Examination

Physician lung auscultation skills in the USA are suboptimal, primarily due to lazy technique and weak recognition of clinical findings. Virtual patients often play sampled breath sounds, but a sound in isolation is limiting because an actual lung exam varies with stethoscope location and breathing phase according to the degree of inspiratory and expiratory range. A simulator should, in theory, assess and inform technique, recognition skills and diagnostic association. This pilot effort examines the practicality of a simulation that more faithfully replicates the patient lung exam experience.
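A conceptual audio model of this kind could, for instance, key sound playback on the stethoscope’s location and the patient’s breathing phase, as the abstract describes. The sketch below is a hypothetical illustration: the site names, sound-file names, and breathing-cycle timing are assumptions, not the simulator’s actual design.

```python
# Sketch of how a virtual-patient lung exam might select a breath sound
# from stethoscope location and breathing phase. Site names, clip names,
# and cycle timing are hypothetical, for illustration only.

# Hypothetical mapping: (auscultation site, breathing phase) -> sound clip
SOUND_BANK = {
    ("left_upper_lobe", "inspiration"):  "vesicular_insp.wav",
    ("left_upper_lobe", "expiration"):   "vesicular_exp.wav",
    ("right_lower_lobe", "inspiration"): "crackles_insp.wav",
    ("right_lower_lobe", "expiration"):  "crackles_exp.wav",
}

def breathing_phase(t, cycle=4.0, insp_fraction=0.4):
    """Phase of a simple breathing cycle at time t (seconds):
    inspiration for the first insp_fraction of each cycle,
    expiration for the remainder."""
    return "inspiration" if (t % cycle) / cycle < insp_fraction else "expiration"

def auscultate(site, t):
    """Pick the clip for the stethoscope's current site and the
    patient's current breathing phase; silence off the chest wall."""
    phase = breathing_phase(t)
    return SOUND_BANK.get((site, phase), "silence.wav")

print(auscultate("right_lower_lobe", 1.0))  # → crackles_insp.wav
```

Tying playback to both location and phase is what lets a simulator assess technique: a student who never listens through a full inspiratory-expiratory cycle at each site would miss phase-specific findings.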

Popular Science, Engadget Cover ICT’s Virtual Reality Exposure Therapy

Popular Science Magazine notes that virtual reality has been used as a treatment for post-traumatic stress and cites Skip Rizzo, whose team has built a system for veterans who served in Iraq and Afghanistan, in which participants could be subjected–under a clinician’s care–to more than a dozen different scenarios from the Middle East.

To create the most immersive VR therapy experience, Rizzo incorporates extra sensory stimuli. He adds realistic sounds such as boots on gravel, military banter, and even indigenous birdsong, the story states.

Engadget also covered this story, adding that the system has been adopted by over 60 facilities, including military bases, university centers and VA hospitals. It allows therapists to recreate a trauma in a controlled environment. By leveraging virtual reality headgear (though not specifically the Oculus Rift), directional audio, force feedback and olfactory stimuli, a trained therapist can help patients confront their trauma at their own pace.

Researchers think the system could be used to help prevent trauma as well, and are working on adapting it into a training program for both stress resilience and PTS diagnosis, notes the story.

Keynote Talk: Designing the User in User Interfaces

In the good old days, the human was here, the computer there, and a good living was to be made by designing ways to interface between the two. Now we find ourselves unthinkingly pinching to zoom in on a picture in a paper magazine. User interfaces are changing instinctual human behavior and instinctual human behavior is changing user interfaces. We point or look left in the “virtual” world just as we point or look left in the physical.

It is clear that nothing is clear anymore: the need for “interface” vanishes when the boundaries between the physical and the virtual disappear. We are at a watershed moment when to experience being human means to experience being machine. When there is not a user interface – it is just what you do. When instinct supplants mice and menus and the interface insinuates itself into the human psyche.

We are redefining and creating what it means to be human in this new physical/virtual integrated reality – we are not just designing user interfaces, we are designing users.

A Raycast Approach to Hybrid Touch / Motion Capture Virtual Reality User Experience

Poster Presentation

Real-time and Robust Grasping Detection

Poster Presentation

The California Sunday Magazine Quotes Mark Bolas in Virtual Reality Cover Story

Writer Carina Chocano spoke to Mark Bolas for her cover story on virtual reality. She wrote that Bolas began creating rudimentary virtual worlds 20 years ago and recalls them as fondly as a European backpacking trip he took. “I’m nostalgic for worlds. To me, those were very real places. They don’t exist anymore because the computer’s gone, that head-mount’s gone,” he said. “I’m not the only researcher to say that. I’ve talked to some other people from the old days, and they’re, like, ‘Yeah, I miss my worlds.’”

Mr. Clue – A Virtual Agent That Can Play Word-Guessing Games

Army AL&T Magazine Features the ELITE New Leader Training System

An article showcased the laptop version of the Emergent Leader Immersive Training Environment. This new training system uses a low-cost, game-based “interpersonal communications simulation” in which students conduct interactive counseling sessions with virtual characters in scenarios specifically designed to represent sessions that are challenging for young, inexperienced leaders. These include on-the-job personnel conflicts, financial and family stresses, substance abuse, and other personal difficulties and performance concerns.

The ELITE platform represents a new generation of training simulation—one that employs sound instructional design principles along with game technologies to help junior leaders develop “soft” leadership skills like interpersonal communication and counseling. The system employs “virtual human” role-players instead of live actors in engaging practice exercises. It uses artificial intelligence technologies to assess student performance and provides embedded coaching and tutoring as students work through instruction and practice exercises. Much of the research and development (R&D) that produced the ELITE experiences takes place at the University of Southern California Institute for Creative Technologies (USC ICT), a DOD-approved, Army-sponsored, university-affiliated research center, the story noted.

The Army Games for Training Program has begun to distribute the ELITE Lite software Armywide via the Army MilGaming website.

Hollywood Reporter Covers Debevec to Co-Chair Academy Sci-Tech Council

The Hollywood Reporter mentioned that Paul Debevec of the USC Institute for Creative Technologies is a co-chair of the Academy of Motion Picture Arts and Sciences’ Science and Technology Council. Deadline also mentioned Debevec’s position.

LA Times Covers ICT-Developed Technologies

A story about digital doubles focused on LightStage LLC in Burbank, a company that licensed the ICT Graphics Lab’s Light Stage technologies to use in commercial projects for film, television and entertainment. The story also noted that ICT collaborated with Image Metrics to help create a widely viewed double of actress Emily O’Brien.

New Republic Features Accent Study from ICT Researchers

The New Republic covered a recent study led by Morteza Dehghani, from USC’s departments of computer science and psychology, along with ICT researchers Peter Khooshabeh, Angela Nazerian and Jonathan Gratch. The story notes that the study, to be published in the Journal of Language and Social Psychology, suggests accents can have a meaningful impact on the way we interpret the world. Just listening to a voice with a foreign accent can prime people to see a situation according to the values of the foreign culture, if they have ties to it—or it can reinforce the standards of their own culture, if they’re “monocultural.”

This study, conducted when Dehghani was at ICT, uses virtual humans and was funded by the Army Research Lab.

Opening Speech: Disruptive TV Technologies on the Horizon for 2020

Paul Debevec in Red Orbit

RedOrbit quoted Paul Debevec about new light technology that debunks moon landing conspiracy theories. Debevec explains that light behaves in a complex way. It does not simply bounce straight to a camera, but also bounces off various parts of a scene.

What Will It Mean to Fly in the Age of Cyber-Humanism?

Forbes Calls Mark Bolas a Virtual Reality Pioneer

An article about Oculus VR founder Palmer Luckey in Forbes stated that in the summer of 2011, Luckey landed a part-time job working with virtual reality pioneer Mark Bolas at his lab in the Institute for Creative Technologies at the University of Southern California.

MSNBC Features ICT Virtual Human Work Addressing PTSD

On their Big Idea segment, MSNBC interviewed Skip Rizzo and Gale Lucas about SimSensei, ICT’s virtual human based sensing system that tracks non-verbal behaviors and can detect signs of mental health distress.

“There are a lot of subtle parameters that really in depth psychologists are attuned to. We are trying to give the computer that kind of perceptual ability on a person,” said Rizzo.

Recent research, led by Lucas, suggests people are more comfortable opening up to a virtual human than a real one.

“They feel more comfortable and less likely to be judged and so they’ll share more personal information,” said Lucas.

The SimSensei project is funded by DARPA. ICT’s virtual human development and evaluation is funded through the Army Research Lab.

Forbes Features ICT Virtual Reality Projects for Mental Health

In a Forbes column, Steven Kotler describes ICT’s virtual reality projects as being on the “bleeding edge.”

Kotler writes about Ellie, part of ICT’s DARPA-funded SimSensei project.

“Ellie is a diagnostic tool capable of reading 60 non-verbal cues a second—everything from eye-gaze to face tilt to voice tone—in the hopes of identifying the early warning signs of depression and (part of the long term goal) stemming the rising tide of soldier suicide.

And early reports indicate that Ellie is both good at her job and that soldiers like talking to an AI-psychologist more than they like talking to a human psychologist (AI’s don’t judge).”

He also covers Bravemind, a VR-based protocol for the treatment of PTSD, developed by psychologist Skip Rizzo.

“It’s an impressive piece of tech. With soldiers returning from combat, already Rizzo’s protocol has proven itself more effective than traditional methods,” he wrote.

Beware of computers bearing smiles: Modeling the social and cognitive effects of emotion

Over forty years ago, Herb Simon argued that emotions would be required by any intelligent entity that must act in a dynamic, semi-predictable and social world. Nonetheless, the cognitive science revolution had comparatively little impact on emotion research. This has changed in recent years with greater interest in functional approaches to emotion in psychology and economics, and an explosion of interest in computer science in techniques for recognizing, modeling and exploiting emotions in simulations, decision models, and human-machine interaction. In this interdisciplinary talk, I will broadly overview my research on the social and cognitive function of emotion for both humans and machines. With regard to cognition, I will review my research on the EMA model of emotion, based on appraisal theory, and illustrate how it predicts the antecedents and consequences of emotion in several decision-making tasks. With regard to social cognition, I will highlight how expressed emotion fundamentally alters how we make decisions with other decision-makers (both natural and artificial). I will illustrate these points using studies across a variety of domains including medical interviews, economic decision-making and computer games. I will discuss both the theoretical consequences of these findings for human cognition as well as their practical implications for human-computer, computer-mediated and human-robot interaction. Throughout, I will argue the need for an interdisciplinary partnership between the social and computational sciences around the topic of emotion.

Army News Service Spotlights Army-Wide Release of ELITE Counseling Trainer

An Army News Service story covered the Army-wide availability of ELITE, a computer-based counseling training system developed by ICT.

The article explains that students can interact with uniformed avatars that have problems ranging from disagreements with their platoon sergeant to driving under the influence and sexual harassment. Responses provided to the avatars determine the direction of the counseling sessions.

Five ELITE Lite training modules are now being used as part of cadet leadership classes at the U.S. Military Academy. And the virtual scenarios may soon be part of the curriculum for junior NCOs in the Warrior Leader Course.

This new type of interactive training is the wave of the future, said Marco Conners, chief of the Army Games for Training program at the National Simulation Center, Fort Leavenworth, Kansas.

Today’s training tools need to have an element of “captivation and entertainment,” he said.

“Soldiers today have grown up in a digital age,” Conners said. “Students tend to learn faster and more if you place it into an interactive game environment instead of standing up there with a butcher board.”

Simulations fill a vital need, he added.

“It’s critical that our young leaders learn how to counsel Soldiers,” Conners said. “Counseling skills help these leaders prepare Soldiers for any mission. Just as important, ELITE helps Army leaders develop to their full potential.”

The story notes that ELITE was developed in collaboration with the Army Research Lab’s Human Research and Engineering Directorate, Simulation and Training Technology Center (STTC). It also states that the Program Executive Office for Simulation, Training and Instrumentation helped develop long-term logistics support for sustaining the software.

KNX News Features ICT

KNX News Radio featured the work of ICT as part of the “On Your Corner” series focused on the Playa Vista area. Reporter Ed Mertz interviewed Randy Hill about ICT’s work using simulation technologies in healthcare, education, training and entertainment.

“[The goal is] to give people the most immersive realistic experience possible so they are prepared for whatever they are going to face,” said Hill.

Advances in Photoreal Digital Humans in Film and in Real-Time

The Argonaut Features ICT Futuristic Research and Technologies

The cover story in the weekly Argonaut showcases ICT research and technologies that are delivering experiences for training and entertainment. The article includes descriptions of Project BlueShark and several ICT-developed systems for projecting, scanning and interacting with virtual humans. The story also notes that funding comes from the U.S. Army Research Lab, the Office of Naval Research and the National Science Foundation.

The story states that ICT’s focus is less about the microchips and wires that make virtual reality work than it is about exploring how that technology is experienced by the end user.

“Los Angeles is about people who create experiences, not things. When you’re watching a movie, they don’t want you to know they used the latest and greatest technology to get that shot; they just want you to experience it,” said Mark Bolas, in the story. “In the end, that’s what we’re about, too. Even for hardcore Department of Defense training applications, it’s not the technology in front of the person but whether they have the experience you need them to have.”

Detection and computational analysis of psychological signals using a virtual human interviewing agent

It has long been recognized that facial expressions, body gestures and vocal features/prosody play an important role in human communication signaling. Recent advances in low-cost computer vision and sensing technologies can now be applied to sensing such behavioral signals and, from them, making meaningful inferences as to user state when a person interacts with a computational device. Effective use of this additional information could serve to enhance human interaction with virtual human (VH) agents and to improve engagement in Telehealth/Teletherapy approaches between remote patients and care providers. This paper will focus on our current research in these areas within the DARPA-funded “Detection and Computational Analysis of Psychological Signals” project, with specific attention to our SimSensei application use case. SimSensei is a virtual human platform able to sense real-time audio-visual signals from users interacting with the system. It is specifically designed for health care support and is based on years of expertise at ICT with virtual human research and development. The platform enables an engaging face-to-face interaction in which the virtual human automatically reacts to the estimated state and intent of the user through vocal parameters and gestures. Much as non-verbal behavioral signals have an impact on human-to-human interaction and communication, SimSensei aims to capture and infer from a user’s non-verbal communication to improve engagement between a VH and a user. The system can also quantify sensed signals over time to inform diagnostic assessment within a clinical context.

Science Friday Live Features Paul Debevec in the Science of Feature Films

Science Friday is headed to the movies! Paul Debevec, ICT’s chief visual officer, joined actor Stephen Lang onstage with host Ira Flatow to discuss the science behind digital actors and more. Watch a live stream here.

The Verge Features ICT in Virtual Reality Special

A multi-part feature in the Verge featured ICT research and researchers and their role in virtual reality past, present and future. A history of VR section highlighted the work of Skip Rizzo and Mark Bolas. Another section showed how to build a VR headset using the VR2GO plans available from the ICT Mixed Reality Lab.

Science Friday Live: The Science of the Silver Screen

Science Friday is headed to the movies!
Ira Flatow and the SciFri staff will join KPCC’s Sanden Totten at Caltech’s Beckman Auditorium on Wednesday, Aug. 27, at 7:30 p.m. for an entertaining and thought-provoking evening all about the science of cinema. Paul Debevec, ICT’s chief visual officer, will be on hand to discuss digital actors, movie special effects and more.
Tickets are on sale now.

Generative Models of Cultural Decision Making for Virtual Agents Based on User’s Reported Values

Poster Presentation


Abstract.
Building computational models of cultural decision making for virtual agents based on behavioral data is a challenge because finding a reasonable mapping between the statistical data and the computational model is a difficult task. This paper shows how the weights on a multi-attribute utility based decision-making model can be set according to the values held by people elicited through a survey. If survey data from different cultures is available then this can be done to simulate cultural decision-making behavior. We used the survey data of two sets of players from the US and India playing the Dictator Game and the Ultimatum Game online. Analyzing their reported values in the survey enabled us to set up our model’s parameters based on their culture and simulate their behavior in the Ultimatum Game.
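The weighting scheme the abstract describes can be illustrated with a minimal multi-attribute utility sketch. The attribute names, weights, and normalization below are hypothetical placeholders for illustration, not the paper’s actual parameters:

```python
# Hypothetical sketch: a multi-attribute utility model whose weights
# could be set from survey-reported values. Attribute names, weights,
# and normalization are illustrative, not the paper's parameters.

def utility(offer, weights):
    """Multi-attribute utility: weighted sum of normalized attributes."""
    return sum(weights[attr] * value for attr, value in offer.items())

# Survey-derived value weights for two illustrative cultural profiles.
profiles = {
    "self_interest_leaning": {"own_payoff": 0.8, "fairness": 0.2},
    "fairness_leaning":      {"own_payoff": 0.4, "fairness": 0.6},
}

# An Ultimatum Game split of 10 units: proposer keeps 7, offers 3.
# Both attributes are normalized to [0, 1].
offer = {"own_payoff": 7 / 10, "fairness": 1 - abs(7 - 3) / 10}

for name, weights in profiles.items():
    print(name, round(utility(offer, weights), 3))
```

With different weight vectors derived from each culture’s survey responses, the same offer receives different utilities, which is the mechanism the paper uses to simulate culture-specific behavior.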

Rapid Avatar Capture Demo

Workshop on Architectures and Standards for IVAs

The scope of building a complete intelligent virtual agent is too vast for a single research group to fully tackle. It requires interdisciplinary collaborations between research groups and reuse of existing components. Starting with the SAIBA framework, an important current research direction deals with facilitating the collaboration between groups using modular architectures, interface standards, reuse of data & assets, and supporting tools, that allow researchers to better leverage each other’s work.

The Economist Features SimSensei and Benefits of Speaking to a Virtual Human

The Economist featured a recent ICT study that suggests people share more with virtual humans than with real ones. The story notes that study co-author Jonathan Gratch believes virtual humans, like Ellie in ICT’s SimSensei project, will be of particular value in assessing the psychological problems of soldiers. The story states the work was funded in part by DARPA.

Paul Rosenbloom Named Fellow of the Cognitive Science Society

The Cognitive Science Society has named Paul Rosenbloom a fellow. Rosenbloom is a professor of computer science at the University of Southern California and ICT’s director for cognitive architecture research. Fellow is the society’s highest level of membership, recognizing individuals whose research has exhibited sustained excellence and had sustained impact on the Cognitive Science community.

Mashable, Fox News and Live Science Feature ICT Virtual Reality Work in PTSD Treatment, Training and Prevention

A Mashable story, originally posted to Live Science, highlighted Bravemind, ICT’s virtual reality therapy system for treating post-traumatic stress disorder. The story noted that the system builds on traditional exposure therapy, which studies have shown can be effective in treating PTSD. The story also mentioned additional ICT virtual reality work including using the virtual-reality system as a preventative therapy before soldiers are deployed, developing a virtual patient project in which clinicians can practice working with a simulated trauma victim before they work with a real person, and launching a military sexual trauma project for service members who have experienced sexual assault.

“The [virtual reality] format may appeal to a generation of service members who have grown up with the digital world, and feel comfortable with it,” said Skip Rizzo, ICT’s director of medical virtual reality. In addition, the virtual-reality program is wireless, making it convenient for veterans to use, he added.

Fox News also ran the story.

Hair Today: Measurement and Modeling of Microfacet Distribution under Deformation

A model that simulates the dynamic microfacet distribution of rough surfaces, such as human skin, capturing how surface reflectance changes under deformation. For rendering, dynamic micro-geometries can be used to exhibit such effects.

Real-Time Live! Make Your Own Avatar

This near-automatic pipeline captures a human subject and, in just a few minutes, simulates the person in a virtual scene. The process can be fully managed by the capture subject, who operates a single Microsoft Kinect. No additional assistance is required. The speed and accessibility of this process fundamentally changes the economics of avatar capture and simulation in 3D. Because the avatar-capture cost is near zero, and the technology to perform this capture has been deployed in millions of households worldwide, this technology has the potential to significantly expand the use of realistic-looking avatars.

The short capture time allows frequent, even daily, avatar creation. The avatars are of sufficient resolution to be recognizable to those familiar with the human subject, and they are suitable for use at a medium distance (such as third-person perspectives) and in crowd scenes.

The pipeline consists of three stages: capture and 3D reconstruction; rigging and skinning; and animation and simulation. The capture process requires the subject to remain steady for about 15 seconds at each of four angles with 90-degree offsets from each other. The automatic rigging and skinning uses voxel-based approaches that do not require watertight meshes, and are thus suitable for capture methods that use reconstruction from points. The animation and simulation stage performs online retargeting of a large variety of behaviors, ranging from locomotion to reaching and gazing.
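The three stages described above might be orchestrated roughly as follows. The function names and data shapes here are illustrative placeholders, not the actual Rapid Avatar API:

```python
# Illustrative sketch of the three-stage avatar pipeline described above.
# All names and data shapes are hypothetical placeholders for the real
# capture, rigging, and animation components.

def reconstruct(depth_scans):
    """Stage 1: fuse four 90-degree-offset depth scans into one point cloud."""
    return [pt for scan in depth_scans for pt in scan]

def rig_and_skin(point_cloud):
    """Stage 2: voxel-based auto-rigging; tolerates non-watertight input."""
    return {"mesh": point_cloud, "skeleton": ["root", "spine", "head"]}

def animate(avatar, behavior):
    """Stage 3: retarget a behavior (locomotion, reaching, gazing) online."""
    return f"{behavior} retargeted onto {len(avatar['skeleton'])}-joint rig"

# Four toy "scans", one per capture angle.
scans = [[(0, 0, 1)], [(1, 0, 0)], [(0, 0, -1)], [(-1, 0, 0)]]
avatar = rig_and_skin(reconstruct(scans))
print(animate(avatar, "locomotion"))
```

The point of the sketch is the data flow: each stage consumes only the previous stage’s output, which is what lets the whole process run unattended in minutes.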

Binghamton University Magazine Features Skip Rizzo

The summer issue of the Binghamton University magazine features the resiliency work of Skip Rizzo, an alumnus. The story describes how Rizzo has developed a virtual reality exposure therapy system for treating PTSD and how current work is focused on using virtual reality scenarios for predeployment preparation.

The idea, states the story, is that letting soldiers confront the fear, the uncertainty and the kill-or-be-killed decision making before they go into combat will help them be better adjusted both on the battlefield and when they return to civilian life.

The story notes that the system will be studied at West Point this fall.

Rapid Avatar Capture and Simulation Using Commodity Depth Sensors

This system can capture a human figure using a Microsoft Kinect without assistance, create a 3D model of the subject, then automatically rig, skin, and simulate it in a 3D environment in a matter of minutes.

Creating a Life-Sized Automultiscopic Morgan Spurlock for CNN’s “Inside Man”

This new system for capturing and rendering life-size 3D human subjects on an automultiscopic projector array was used to create a 3D digital Morgan Spurlock featured on the CNN documentary “Inside Man”.

Digital Ira and Beyond: Creating Photoreal Real-Time Digital Characters

This course summarizes the process of creating Digital Ira, presented at SIGGRAPH 2013 Real-Time Live! It covers the complete set of technologies, from high-resolution facial scanning, blendshape rigging, video-based performance capture, and animation compression to real-time skin and eye shading and hair rendering. The course also presents and explains late-breaking results and refinements, and points the way to future directions that may increase the quality and efficiency of this kind of digital-character pipeline.

For this project, an actor was scanned in 30 high-resolution expressions, from which eight were chosen for real-time performance rendering. Performance clips were captured using multi-view video. Expression UVs were interactively correlated with the neutral expression, then retopologized to an artist mesh. An animation solver created a performance graph representing dense GPU optical flow between video frames and the eight expressions. Dense optical flow and 3D triangulation were computed, yielding per-frame spatially varying blendshape weights approximating the performance.

The performance was converted to standard bone animation on a 4k mesh using a bone-weight and transform solver. Surface stress values were used to blend albedo, specular, normal, and displacement maps from the high-resolution scans per-vertex at run time. DX11 rendering includes SSS, translucency, eye refraction and caustics, physically based two-lobe specular reflection with microstructure, DOF, antialiasing, and grain.
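The per-frame blendshape combination at the core of this kind of rig can be sketched generically. This is a toy illustration of linear blendshapes, not the Digital Ira code:

```python
import numpy as np

def blend(neutral, expressions, weights):
    """Linear blendshape combination: neutral mesh plus weighted deltas.

    neutral:     (V, 3) array of vertex positions
    expressions: (E, V, 3) array of expression meshes
    weights:     (E,) per-frame blendshape weights (e.g. the spatially
                 varying weights a performance solver produces)
    """
    deltas = expressions - neutral           # (E, V, 3) offsets from neutral
    return neutral + np.tensordot(weights, deltas, axes=1)

# Toy example: two expressions on a one-vertex "mesh".
neutral = np.zeros((1, 3))
expressions = np.array([[[1.0, 0.0, 0.0]],
                        [[0.0, 1.0, 0.0]]])
frame = blend(neutral, expressions, np.array([0.25, 0.5]))
# frame == [[0.25, 0.5, 0.0]]
```

In the actual pipeline the weights additionally vary per region of the face; the sketch shows only the basic neutral-plus-weighted-deltas formulation.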

The course explains each of these processes, why each design choice was made, and alternative components that could replace any of the steps. It also covers emerging technologies in performance capture and facial rendering. Attendees receive a solid understanding of the techniques used to create photoreal digital characters in video games and other applications, and the confidence to incorporate some of the techniques into their own pipelines.

Countering User Deviation During Redirected Walking

Virtual Reality Goes to War—Advances in the Assessment and Treatment of PTSD

This panel, hosted by Rizzo, also includes: Update and Expansion of the Virtual Iraq/Afghanistan PTSD Exposure Therapy System.
The stressful experiences that have been characteristic of the combat environments in Iraq and Afghanistan have produced significant numbers of returning service members at risk for developing posttraumatic stress disorder (PTSD) and other psychosocial/behavioral health conditions. A growing literature now suggests that Virtual Reality Exposure Therapy (VRET) may be a useful approach for addressing this healthcare challenge. This presentation will describe the newly updated “Virtual Iraq/Afghanistan” virtual reality (VR) system for the delivery of VRET and a set of projects that are expanding its application to a wider range of trauma experiences. The talk will start with a brief detailing of the factors that led to the initial development of the Virtual Iraq/Afghanistan VRET system and the clinical outcomes that have been reported with its use. We will then discuss the current efforts to update the VRET system with more advanced software to expand the VR content and features based on input from clinical users of the previous 2007 version of the system. Following a description of this new Virtual Iraq/Afghanistan application, we will then focus on a general overview of two new projects that aim to provide relevant and customizable options for conducting VRET with users having a wider range of military trauma experiences: combat medics/corpsmen and victims of military sexual trauma. The talk will conclude with a description of the further expansion of the system to create a VR tool for preventing the incidence of combat-related PTSD via pre-deployment resilience training.

Reinforcement Learning for Adaptive Theory of Mind in the Sigma Cognitive Architecture

The Guardian Features ICT Virtual Reality Research

An article exploring the past and future of virtual reality quotes Skip Rizzo and describes some of the VR work taking place at ICT.

The story states that Rizzo leads a team looking at the use of virtual reality environments in everything from classrooms for children with attention deficit disorder to exposure therapy for war veterans suffering from post-traumatic stress disorder.

Advances in graphics, computing power and interface devices such as the Nintendo Wii or Microsoft Kinect have opened the door to a new level of sophistication of virtual reality, said Rizzo. Most important, though, has been the continuing drop in cost of virtual reality technology, a trend largely driven by the gaming industry.

“Right now, a headset is $2,000… if you could replace that with a $350 headset [such as the Oculus Rift] and have that be better then you’re golden – that’s the direction we’re heading,” says Rizzo, whose lab the Oculus Rift’s inventor, Palmer Luckey, worked in before launching the headset.

The article also noted that Oculus Rift founder Palmer Luckey worked at ICT before starting his company.

Distributed Vector Representations of Words in the Sigma Cognitive Architecture

Sigma Cognitive Architecture Tutorial

Situated Pedagogical Authoring Demo and Talk

NBC News Highlights How ICT Virtual Humans Research Could Help Treat PTSD

A story on NBC News features ICT’s SimSensei project and states that it could be a key screening tool for identifying PTSD. The article also covers a recent study of the system that suggests people disclose more to virtual humans than to real ones. It states that the work was funded by DARPA and the U.S. Army.

The story notes that the technology may be available in clinics, kiosks and via laptops and that the goal is to have an inexpensive, easy-to-carry tool that can spot mental health problems early and track them consistently over time, letting real-life therapists do their jobs more effectively.

“Ultimately, the program can give people a sense of safety,” said ICT’s Gale Lucas, who led the disclosure study. “A human therapist can encourage a sense of safety and make people feel anonymous, but they probably can never make someone feel as anonymous as they do talking to a computer.”

We’re Hiring: Director for Learning Science Research

ICT is seeking a Director for Learning Science Research to define the overall vision for our learning sciences research. The ideal candidate will have demonstrated success securing research funding, and will be widely recognized for his or her achievements in computer-supported learning and educational technology, with a national or international reputation. The candidate will be knowledgeable about intelligent tutoring systems, artificial intelligence, instructional design, user assessment design, and systems evaluation design. The candidate will be familiar with good practices for system design. Click here for more info.

Graphics Research Talk

Deconstructing Episodic Memory and Learning in Sigma

Harvard Business Review Features ICT Post on How Virtual Humans Build Leadership Skills

Harvard Business Review featured an article by Randall Hill, executive director of the USC Institute for Creative Technologies, which focused on how virtual humans could be used to develop new skills. Hill wrote that ICT’s research on virtual humans has provided training exercises for stressful situations, like veterans who have lost a comrade, or even those who are studying cultural differences ahead of an international meeting. Other efforts train law students to interview child witnesses, clinicians on how to improve their diagnostic skills and autistic adults on job interview skills.

Voice of America Covers Rapid Avatar Generation

Voice of America featured Ari Shapiro and Rapid Avatar Generation, a three-minute process for turning people into avatars.

“For me it is exciting to see what we now have,” said Shapiro, describing a low-cost technology that may revolutionize social media. “Soon a lot of people will have the capability to model themselves very, very quickly for almost no cost.”

The story also mentioned ICT’s Mixed Reality Lab and the VRLA event they hosted.

Los Angeles Times Quotes Paul Debevec on Role of Science in Film Animation

A story about the increasing number of scientists involved in film animation quotes Paul Debevec.

“The physics behind what’s happening in these movies is incredibly complicated,” said Debevec, a computer scientist and chief visual officer at the USC Institute for Creative Technologies. “You need real scientists to understand what’s going on. These are Ph.D.-level folks who could have been publishing papers in Physics Today. Instead, they are working on Hollywood blockbuster films.”

The story noted that film studios increasingly rely on the services of scientists to create complex algorithms to simulate realistic-looking water, fire, dust and other elements in movies packed with action and special effects.

Motherboard Quotes Skip Rizzo on Virtual Reality

An article in Motherboard exploring the ability of virtual reality to alter people’s perceptions quoted Skip Rizzo, cited his work treating post-traumatic stress disorder and noted that he leads the Medical Virtual Reality Group at ICT.

Forbes Covers ICT’s SimSensei Project and Research Suggesting Computers can Help Address PTSD and Mental Health Issues

A Forbes column by Steven Kotler discusses Ellie, ICT’s virtual human interviewer, who is part of the DARPA-funded SimSensei project.

Ellie evolved from the suspicion that our twitches and twerks and tones reveal much more about our inner state than our words (thus Ellie tracks 60 different “features”—that’s everything from voice pitch to eye gaze to head tilt), states the story.

The article notes coverage of ICT’s recent study that found that patients are much more willing to open up to a robot shrink than a human shrink. Here’s how Neuroscience News explained it: “The mere belief that participants were interacting with only a computer made them more open and honest, researchers found, even when the virtual human asked personal questions such as, ‘What’s something you feel guilty about?’ or ‘Tell me about an event, or something that you wish you could erase from your memory.’ In addition, video analysis of the study subjects’ facial expressions showed that they were also more likely to show more intense signs of sadness — perhaps the most vulnerable of expressions — when they thought only pixels were present.”

The reason for this success is pretty straightforward, states the author. Robots don’t judge. Humans do.

Paul Debevec Participates in Reddit’s Ask Me Anything

In advance of the SIGGRAPH 2014 conference, Paul Debevec, ICT’s chief visual officer, answered questions on Reddit’s Ask Me Anything. Debevec fielded questions on computer graphics education and careers, movie special effects, future research directions and more.

KTLA Covers ICT Research Suggesting People Reveal More to Virtual Humans

CW News Los Angeles affiliate KTLA-TV featured research by the USC Institute for Creative Technologies on the use of virtual humans in diagnosing post-traumatic stress. The study found that people are more comfortable divulging sensitive personal history to a computer than to a person due to a fear of judgment. The story noted that the research was funded in part by the U.S. Army.

Los Angeles Times Features ICT Mixed Reality Lab and VRLA Event

The Los Angeles Times, in a story about the applications of virtual reality, highlighted VRLA, a meet-up of technologists at the USC Institute for Creative Technologies’ MxR Lab. Attendees experienced the work of Nonny de la Peña of the USC School of Cinematic Arts, whose immersive journalism projects have simulated real-life incidents, such as a rocket attack in Syria. The story noted that Oculus Rift creator Palmer Luckey developed his technology in the MxR Lab. “Virtual reality can touch everything that reality touches. It’s stunningly difficult to limit the areas that I think it’s going to influence,” said Mark Bolas of the USC Institute for Creative Technologies.

fxguide Features Ari Shapiro and ICT’s Make You an Avatar Demo at SIGGRAPH 2014

A podcast spotlighting the SIGGRAPH RealTime Live session features Ari Shapiro and his work on the Make You an Avatar demonstration that will be part of the presentation. The podcast site also included this video of the Rapid Avatar Capture and Generation process.

“You can essentially go from the real world to the simulated world in just a few minutes,” said Shapiro.

Psychology Today Features Study Suggesting People Disclose More to Virtual Humans

Psychology Today interviewed Gale Lucas of the ICT virtual humans group about recent findings that suggest patients are more willing to disclose personal information to virtual humans than actual ones, in large part because computers lack the proclivity to look down on people the way another human might.

The research, which was funded by the Defense Advanced Research Projects Agency and the U.S. Army, is promising for people suffering from post-traumatic stress and other mental anguish, said Lucas, a social psychologist at USC’s Institute for Creative Technologies, who led the study.

In an online Q&A Lucas discusses the research.

“We provide the first empirical evidence that virtual humans can increase disclosure in a clinical interview context by making participants feel as though their responses are not currently being observed,” she said.

Fast Company Features STRIVE, ICT’s Virtual Reality Resilience Trainer

Fast Company highlighted research by Albert “Skip” Rizzo on the use of virtual reality to address post-traumatic stress.

The article noted that Rizzo works on several DoD-funded efforts related to virtual reality simulations and PTSD, including STRIVE, a simulation that runs soldiers through Iraq and Afghanistan narratives but adds a training session when the plot takes a dark turn.

There’s some evidence that realistic wartime scenarios can actually help make people more resilient before they hit the front lines, said Rizzo. At the same time, he acknowledges, researchers have to be careful.

Rizzo notes that the STRIVE simulation offers a mentor character to walk the trainee through his or her response.

“He’ll walk out from behind a tree in one environment. He’ll show up in the front seat of the Humvee,” Rizzo said. “And he walks up and walks you through the types of emotional resilience training activities that have been found to be beneficial.”

Fast Company Cites Skip Rizzo and the Use of Virtual Reality Therapy

An article about the use of computer games to treat medical conditions noted that the virtual reality therapy system developed by ICT’s Skip Rizzo is used to treat PTSD in veterans.

Voice of America Features Evan Suma’s Redirected Walking Research

Reporter Elizabeth Lee attended the VRLA Meetup and featured Evan Suma’s redirected walking research.

The story stated that Suma is currently studying how to trick a person’s senses, an essential part of immersing them in a virtual environment. This Army-funded research could help soldiers train for combat or those suffering from post-traumatic stress disorder, the story noted.

“If I rotate the world slowly around your head while you are in virtual reality, as long as it’s small enough, you will actually believe what you are seeing more than what you are feeling, and I can use that to actually get you to curve and walk in a circle while perceiving that you are walking straight. So that is something that came straight out of psychology,” he said.
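The mechanism Suma describes, injecting a world rotation small enough to go unnoticed, can be sketched in a few lines of Python; the rotation gain below is an assumed illustrative value, not a detection threshold from the research:

```python
ROTATION_GAIN_DEG_PER_SEC = 1.5  # assumed imperceptible injected rotation

def redirect_heading(world_yaw_deg, dt_seconds):
    """Inject a slow rotation of the virtual world each frame.

    Walking 'straight' toward a virtual target, the user physically
    curves, eventually walking a circle while perceiving a straight line.
    """
    return (world_yaw_deg + ROTATION_GAIN_DEG_PER_SEC * dt_seconds) % 360.0

# After 60 one-second steps the world has drifted 90 degrees,
# steering the user through a quarter of a physical circle.
yaw = 0.0
for _ in range(60):
    yaw = redirect_heading(yaw, 1.0)
print(yaw)  # → 90.0
```

In practice the gain would be scaled by the user's walking and turning speed so the injected rotation stays below perceptual thresholds.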

Bridging the gap between human and non-human decision makers

Gravity Lighting R&D, Digital Ira, and a Hologram of Morgan Spurlock

Computer Graphics Talk

Deseret News Features ICT Technologies for Health Treatment and Assessment

An article examining the role of technology in revolutionizing treatment featured ICT’s Bravemind, SimSensei and Games for Rehab projects. The article quotes Skip Rizzo, ICT’s director of medical virtual reality, and calls ICT’s Bravemind virtual reality exposure therapy treatment an example of how some innovators are taking video game technology and using it in new ways — in this case, to help veterans make sense of traumatic experiences. The story also notes that future scenarios will address sexual trauma.

“The goal is to help a person confront the things that are hard for them,” said Rizzo.

The article described an interaction with Ellie, a virtual character who is part of SimSensei, which can detect, analyze and infer information about nonverbal behaviors people relay in conversation.

ICT’s Stefan Scherer said he hopes SimSensei will one day hasten access to care with kiosks at places like VA hospitals or any place where people can get help when suicidal or homicidal thoughts occur. Early trials also suggest that people have no problem opening up to a computer like Ellie.

“People came in and were able to talk to Ellie in such a natural way, and it surprised us that Ellie was able to hold a conversation like that,” Scherer said. “Everybody loved her and could see the immediate benefit of just talking to her.”

ICT’s Games for Rehab, a program that uses Kinect technology to create individual, accurate physical and cognitive therapy, was also featured. Using different games, patients can achieve their physical therapy goals with Kinect technology that is calibrated to their individual bodies and abilities — a step up from traditional instructions, says project manager Kevin Feeley.

“Typically, physical therapy patients are given a piece of paper that says something like, lift these soup cans. That’s kind of like the dentist and flossing. And if you do it wrong, you won’t know until you go back,” Feeley said. “This is more fun and it verifies that it’s being done.”

Voice of America Features Mark Bolas and Project BlueShark

Reporter Elizabeth Lee attended the VRLA Meetup at ICT’s Mixed Reality Lab and covered Project BlueShark, a virtual reality interface for imagining the future of communication and collaboration.

According to Mark Bolas, the aim of this Navy-funded effort is to imagine technology that will be available in 15 years so Navy ships can become lighter, better protected and provide more information to sailors to help them make better decisions.

“Instead of looking at data on 2-D screens, for example, you can have it floating as a 3-D map in front of you,” he said. “If you’re going to build a ship, you better be thinking about the design of that ship at least 15 years before you build it because that’s how long it’s going to take to make it.”

Make Your Own Avatar

This near-automatic pipeline captures a human subject and, in just a few minutes, simulates the person in a virtual scene. The process can be fully managed by the capture subject, who operates a single Microsoft Kinect. No additional assistance is required. The speed and accessibility of this process fundamentally changes the economics of avatar capture and simulation in 3D. Because the avatar-capture cost is near zero, and the technology to perform this capture has been deployed in millions of households worldwide, this technology has the potential to significantly expand the use of realistic-looking avatars.

The short capture time allows frequent, even daily, avatar creation. The avatars are of sufficient resolution to be recognizable to those familiar with the human subject, and they are suitable for use at a medium distance (such as third-person perspectives) and in crowd scenes.

The pipeline consists of three stages: capture and 3D reconstruction; rigging and skinning; and animation and simulation. The capture process requires the subject to remain steady for about 15 seconds at each of four angles offset 90 degrees from each other. The automatic rigging and skinning uses voxel-based approaches that do not require watertight meshes, and is thus suitable for capture methods that reconstruct from points. The animation and simulation stage performs online retargeting of a large variety of behaviors, ranging from locomotion to reaching and gazing.
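The three-stage flow described above can be sketched as follows; every function name and data shape here is a hypothetical stand-in for illustration, not the actual Rapid Avatar code:

```python
# Hypothetical sketch of the three pipeline stages; all names invented.

def capture_and_reconstruct(depth_scans):
    # Stage 1: fuse depth scans taken at four 90-degree-offset angles
    # into a single point-based (non-watertight) reconstruction.
    return {"points": [p for scan in depth_scans for p in scan]}

def rig_and_skin(mesh):
    # Stage 2: voxel-based auto-rigging and skinning; here reduced to
    # attaching a stub skeleton to the reconstructed mesh.
    mesh["skeleton"] = ["root", "spine", "head",
                       "l_arm", "r_arm", "l_leg", "r_leg"]
    return mesh

def animate(avatar, behavior):
    # Stage 3: online retargeting of behaviors (locomotion, reaching,
    # gazing) onto the captured avatar's rig.
    return f"retargeting {behavior} onto a {len(avatar['skeleton'])}-joint rig"

# One (x, y, z) sample per capture angle, standing in for real scans.
scans = [[(0.0, 0.0, 1.0)], [(1.0, 0.0, 0.0)],
         [(0.0, 0.0, -1.0)], [(-1.0, 0.0, 0.0)]]
avatar = rig_and_skin(capture_and_reconstruct(scans))
print(animate(avatar, "locomotion"))
```

The chaining of the three calls mirrors the near-automatic, single-operator workflow: output of each stage feeds the next with no manual cleanup step in between.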

Trojan Family Magazine Spotlights Mark Bolas

USC’s alumni magazine featured Mark Bolas and his open-door, open-source philosophy towards helping people create new virtual reality technologies and immersive experiences.

“I believe a big part of my job is helping to develop amazing people,” he says.

The story notes that Bolas hired Palmer Luckey, whose virtual reality company Oculus Rift was purchased by Facebook for $2 billion.

Today, Bolas makes sure that the Mixed Reality Lab’s doors are open to students of all backgrounds. Humanists, social scientists and artists “aren’t preoccupied with the limitations of technology,” he says. “We take technology, push it to the edge of its limitations and, hopefully, break it.”

CBS News Features ICT’s Mixed Reality Lab, Working at the Forefront of Virtual Reality

CBS News featured the work of Mark Bolas and the ICT Mixed Reality Lab. The story highlights Mark Bolas’ 30 years working in VR and the lab’s influence on virtual reality past, present and future.

“The way people collaborate and communicate is going to shift, to where physical presence isn’t as important as virtual presence,” said Mark Bolas, director of ICT’s Mixed Reality Lab and professor in the USC School of Cinematic Arts.

The story stated that posting USC’s VR findings and designs online for free helped a little-known start-up company named Oculus VR develop its own virtual reality headset, the Oculus Rift.

The segment also noted that USC is now bringing the technology to the U.S. Navy. As part of Project BlueShark, the Mixed Reality Lab designed a virtual world that turns plastic tables into digital menus that can be used to man a ship. It also noted the lab’s award-winning DIY work pairing smartphones with $2 lenses and special software to create low-cost, hand-made headsets.

“We see smartphones as a way that we brought computers out into the real world,” said Bolas. “But virtual reality’s really the way we’re now gonna go into the virtual world. And I think it’ll be as common as smartphones — we’ll all have one, and we’ll bring them with us.”

Bloomberg Business Week, Time and The Verge Feature ICT’s Smartphone Virtual Reality Viewers

Bloomberg Businessweek, in a story about Google’s new smartphone virtual reality kit, reported that the USC Institute for Creative Technologies has been developing similar kits since 2011. “I don’t see a dividing line between virtual reality and real reality. I think in a few years they’re going to be indistinguishable,” said Mark Bolas of ICT’s Mixed Reality Lab. Time and The Verge also covered the Mixed Reality Lab’s precursor kits.

Interviews with Mark Bolas from VRLA event

Hackertrips, a blog that explores emerging hacker trends, covered the VRLA Meetup #2 event at ICT’s Mixed Reality Lab and featured a series of interviews with Mark Bolas. Approximately 300 people attended the virtual reality showcase held on June 23.

Civilian Analogs of Army Tasks: Supporting Pedagogical Storytelling Across Domains

Friends You Haven’t Met Yet Presentation

The Washington Post Features ICT’s Work Helping Create 3-D Bust of President Obama

The Washington Post featured the first 3D printed bust of a sitting president, which was created using facial scans of President Obama with assistance from the USC Institute for Creative Technologies.

Re/Code Features Mixed Reality Lab’s Research and Role in Advancing Virtual Reality

Games writer Eric Johnson wrote a story about the VRLA Virtual Reality Meetup at ICT’s Mixed Reality Lab, a place he calls a de facto incubator for VR projects. Both Oculus VR’s Palmer Luckey and immersive gaming startup Survios came through Bolas’s tutelage, he wrote. The story notes the sense of optimism attendees shared about the future of virtual reality.

“The movie business came to Hollywood because they needed the light,” said Mark Bolas, in the story. “We don’t need light anymore, but we need people. And LA is one of the hotbeds.”

Johnson described his experience in Project Blueshark, ICT’s Office of Naval Research-funded project that explores the future of collaboration and work.

“VR had successfully tricked me into believing something that wasn’t real,” he said.

He also wrote about Evan Suma’s redirected walking work, noting that it uses a perceptual trick called change blindness to make a small physical space infinitely large inside a virtual world.

West Point Cadets Use ICT’s Virtual Human Toolkit

An article in the summer issue of West Point Magazine features cadets working with ICT’s Virtual Human Toolkit as part of the Engineering Psychology major, a program described as having its roots in Soldier-oriented research. Images in the story show cadets in a seminar that taught them how to use the toolkit to build interactive training systems in which trainees interact with virtual humans. Colonel James Ness, director of the West Point Engineering Psychology program, wrote the story.

Single-Agent vs. Multi-Agent Techniques for Concurrent Reinforcement Learning of Negotiation Dialogue Policies

Pacific Standard Features ICT Findings Showing People Disclose More When Talking to Virtual Humans

Pacific Standard covered research by Gale Lucas and colleagues from the ICT Virtual Humans group finding that people are more likely to disclose personal health information to a virtual human than to an actual human health professional. The “more honest responding comes from the sense that no one is observing or judging,” Lucas said.

The study was published in the August issue of Computers in Human Behavior.

Can a conversation be a game? Designing conversational strategy games for health & medical training

Associated Press Features Smithsonian’s 3D Portrait of Obama, Notes Use of USC Light Stage Scanner

A widely carried Associated Press story stated that a team at the Smithsonian Institution has scanned President Barack Obama to create the first 3D portrait of a sitting president.

The story noted that scanning used two distinct 3D processes. The team used handheld 3D scanners and cameras to record 3D data to create a bust of the president and experts from the University of Southern California used their “light stage” face scanner to document the president’s face from ear to ear.

According to the story, a 3D printed bust and life mask of Obama will become part of his presidential depictions in the National Portrait Gallery collection. Both were shown Wednesday, June 18 at a gathering of inventors, entrepreneurs and students at the White House.

In describing the scanning process, which took about 5 minutes, the story explains that the Smithsonian set out to update plaster casting with 21st century technology. The result was a 3D scan of the president at a higher resolution than is currently possible to print in 3D.

“You can see down to the wrinkles in the skin and the pores on his face,” said Vince Rossi, a 3D imaging specialist.

Photo Credit: Smithsonian Institution

The Future Holistic Training Environment Panel Discussion

As part of CAC-T’s Training and Education 2025 Industry Forum, panelists cover the current status of the Army’s Integrated Training Environment and the future merging of environments to provide simpler, low-overhead, easily accessible and higher-fidelity training.

Todd Richmond Discusses Futuristic Training in Daily Press

Todd Richmond was quoted in a story about the future of Army training, including challenges in developing a massive “global terrain database” that includes not only land and sea, but air and outer space.

That mapping is well underway in some cases, said Richmond, who was speaking on a panel at the Combined Arms Center – Training (CAC-T) Training and Education 2025 Industry Forum.

“Pretty much the entire world, the land terrain, at least, will be mapped to 1- to 2-meter resolution within the next year or two,” he said. “Low resolution data is going to exist … but we also need higher resolution than that in certain areas and for certain needs. Where can we get away with low-res? How do we swap in high-res? These are all tricky problems.”

A Demonstration of Dialogue Processing in SimSensei Kiosk

This demonstration highlights the dialogue processing in SimSensei Kiosk, a virtual human dialogue system that conducts interviews related to psychological distress conditions such as depression, anxiety, and post-traumatic stress disorder (PTSD). The dialogue processing in SimSensei Kiosk allows the system to conduct coherent spoken interviews of human users that are 15-25 minutes in length, and in which users feel comfortable talking and openly sharing information. We present the design of the individual dialogue components, and show examples of natural conversation flow between the system and users, including expressions of empathy, follow-up responses and continuation prompts, and turn-taking.
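As a rough illustration of the response categories named above (empathy, follow-up responses, continuation prompts), a toy rule-based policy might look like this; the rules and phrasings are invented for the sketch and are not SimSensei Kiosk's actual dialogue policy:

```python
import random

# Invented rules and phrasings; not SimSensei's real dialogue manager.
EMPATHY = "I'm sorry to hear that."
FOLLOW_UP = "Can you tell me more about that?"
CONTINUATION_PROMPTS = ["Uh huh.", "I see.", "Go on."]

def respond(utterance, sentiment):
    # Express empathy when the user shares something distressing.
    if sentiment == "negative":
        return EMPATHY
    # Short answers get a follow-up question to elicit more detail.
    if len(utterance.split()) < 5:
        return FOLLOW_UP
    # Otherwise a brief continuation prompt keeps the user talking.
    return random.choice(CONTINUATION_PROMPTS)

print(respond("I haven't been sleeping well", "negative"))
```

Even this crude turn-taking logic hints at why a 15-25 minute interview can feel natural: most of the agent's moves are short acknowledgments that cede the floor back to the user.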

Panel: Entertainment Technology in the Internet Age

Visit The Epicenter of Virtual Reality in LA: USC Institute for Creative Technologies Mixed Reality Lab to Host VRLA Meet Up #2

Experience the latest in immersive media and virtual reality

Meet the leaders of the SoCal-centered VR revolution

ICT Media Contact: Orli Belman, belman@ict.usc.edu, 310 301-5006

WHEN: 6:30 – 10:30 pm Mon., June 23

WHERE: ICT Mixed Reality Lab, 5318 McConnell Ave, LA, CA 90066

WHAT: VRLA, the bi-monthly event positioned at the confluence of the entertainment industry and the rapidly expanding world of virtual reality, is bringing its next gathering of technologists and industry creatives to the USC Institute for Creative Technologies Mixed Reality Lab. The event will feature exciting VR demos from ICT, as well as guest demos from some of the most inventive developers in the field, including Epic Games and Sixense.

Mixed Reality Lab Demos Include:

E2C2: Project BlueShark: Take the helm on the bridge of a ship, fly along with a drone or remotely operate a real robotic arm in this virtual reality vision of the workspace of the future developed with the U.S. Navy.
Stretching Space/Redirected Walking: Experience the holy grail of full-body locomotion, precision tracking and a mind-altering 150-degree field-of-view head-mounted display in this fully immersive demo.
VR2GO Suite of Immersive Viewers: Transform smartphones and tablets into portable virtual reality systems for training, education, health and fitness, entertainment and more with the lab’s open source toolkit.
Rapid Avatar Capture: Get into the game with this system that creates 3D versions of people in under four minutes.
Project Syria: Witness real events as they transpire in an immersive journalism experience that uses real-time graphics, high-resolution VR goggles and compelling audio to put participants inside the story.
Light Stage 6 – See part of ICT’s Academy Award-winning technology that allows for the creation of photorealistic digital doubles as they would appear in any lighting condition.

WHO: Introduction and welcome by VR pioneer, Mark Bolas, director of the Mixed Reality Lab (MxR) at ICT and a professor of interactive media at the USC School of Cinematic Arts. Bolas supervised Palmer Luckey when he worked at ICT and advised Survios founders James Illiff and Nate Burba when they were his students. Bolas continues to serve as a teacher and mentor to the generation that is currently defining and refining this new medium.

About the USC Institute for Creative Technologies:

At the University of Southern California Institute for Creative Technologies (ICT), leaders in artificial intelligence, graphics, virtual reality and narrative advance low-cost immersive techniques and technologies to solve problems facing service members, students and society. Established in 1999, ICT is a DoD-sponsored University Affiliated Research Center working in collaboration with the U.S. Army Research Laboratory. ICT brings film and game industry artists together with computer and social scientists to study and develop immersive media for military training, health therapies, education and more.

ICT is the hub of much of the current activity around virtual reality in Los Angeles. Situated in the hotbed of ‘Silicon Beach’ and combining the best and brightest from the entertainment industry and computer science and engineering, the MxR Lab is changing the way we interact with each other and the world around us.

Special thanks to PhaseSpace, Inc.

Jonathan Gratch Named AAAI Fellow

The Association for the Advancement of Artificial Intelligence (AAAI) has elected Jonathan Gratch to its Fellows Program, in recognition of his sustained contributions to the field of artificial intelligence.

Gratch is the director of virtual humans research at ICT and a research professor in the department of computer science at the USC Viterbi School of Engineering.

His research focuses on virtual humans and computational models of emotion. He studies the relationship between cognition and emotion, the cognitive processes underlying emotional responses, and the influence of emotion on decision-making and physical behavior.

Gratch and the other 2014 fellows will be honored at a dinner as part of AAAI-14 in Quebec City, Canada, in late July.

Fast Company Covers ICT’s Virtual Reality Tools for Treatment and Training

An article in Fast Company featured Bravemind, ICT’s virtual reality exposure therapy system for treating post-traumatic stress. The article noted that ICT is working on virtual reality scenarios for soldiers injured by IED blasts, for combat medics with PTSD, and even soldiers with post-traumatic stress stemming from sexual assault.

Albert “Skip” Rizzo, ICT’s director of medical virtual reality, told Co.Exist that a big part of Bravemind’s future applications is giving therapists new options.

“This might not work for everybody, but for some it works very well. The goal for PTSD treatment is to have a range of options. It certainly isn’t one size fits all,” he said.

The article also mentioned ICT Executive Director Randall W. Hill, Jr., and the recent Human Matters event, which was organized in partnership with ReD Associates and the Harvard Business Review to explore how a human-centered approach to business and technology can make for more meaningful experiences.

“Virtual environments are now advanced enough to serve as top-notch training tools, and they offer something especially important to sprawling organizations like the Defense Department: standardization. Virtual environments mean that there’s less risk of a bad instructor. They can also guarantee that troops get a better learning experience,” stated the story.

The article also stated that ICT projects are collaborations with Hollywood and the U.S. military.

Public Radio International Features Andrew Gordon’s Heider Simmel Storytelling Research

The Public Radio International series, “The Really Big Questions” featured Andrew Gordon’s artificial intelligence research, which is training computers to recognize and tell stories, in an episode about storytelling. Gordon spoke about his Navy-funded project, the Heider-Simmel Interactive Theater.

In 1944, Fritz Heider and Marianne Simmel created an animated film that depicted the motion of two triangles and a circle as they moved in and around a box that alternated between being opened and closed. Heider and Simmel asked people to describe what they saw in what is now considered a classic work in the field of social psychology. The subjects responded with creative narratives that ascribed human-like goals, plans, beliefs and emotions to the moving objects. Gordon’s Heider-Simmel Interactive Theater is a Web-based application that allows people to make their own movies and write their own stories.

Gordon’s work is also featured on the program website and his interview can be found at approximately 19 minutes into the audio hour.

Learn more about Gordon’s research here.

NPR Features Pioneering VR Work of the MxR Lab

NPR featured Mark Bolas and his virtual reality research at the Mixed Reality Lab. The Facebook acquisition of Oculus Rift, a VR company started by former research employee Palmer Luckey, was “a huge success moment for my lab,” Bolas said. “We are here to train and educate the next generation of people to go after new technologies and new content and change the world.” The story spotlighted a number of cutting-edge VR projects to come out of ICT, including Blueshark, a VR system being developed for the U.S. Navy; virtual environments for military veterans to confront their PTSD; and immersive journalistic storytelling about issues like the crisis in Syria.

Keynote Address: From Avatar to Gravity: Lighting Photoreal Digital Actors

Amazingly, we have entered an age where even the human actors in a movie can be created as computer generated imagery. Somewhere between Final Fantasy in 2001 and The Curious Case of Benjamin Button in 2008, digital actors crossed the “uncanny valley” from looking strangely synthetic to believably real. This talk describes how the light stage scanning systems and HDRI lighting techniques developed at the USC Institute for Creative Technologies have helped create digital actors in a wide range of recent films. For in-depth examples, the talk describes how high-resolution face scanning, advanced character rigging, and performance-driven facial animation were combined to create “Digital Emily”, a collaboration with Image Metrics (now Faceware) yielding one of the first photoreal digital actors, and 2013’s “Digital Ira”, a collaboration with Activision Inc., yielding the most realistic real-time digital actor to date. The talk covers recent developments in HDRI lighting, polarization difference imaging, skin reflectance measurement, and 3D object scanning, and concludes with advances in autostereoscopic 3D displays enabling 3D teleconferencing, holographic characters, and cultural preservation.

National Post Covers ICT’s Virtual Reality Therapy for PTSD

An opinion piece in Canada’s National Post advocates for treatments for post-traumatic stress and mentions ICT’s virtual reality exposure therapy.

“Academic studies show the software’s ‘exposure therapy’ is one of the few things that produces a meaningful reduction in PTSD symptoms,” states the story. “It works by using a head-mounted display, as well as directional 3D audio, vibrations and smells to produce a simulation of an Afghan environment that’s the next best thing to reality.”

The article notes the system is being adapted for use in Canada.

Premiere of “Friends you Haven’t Met Yet”

“Friends You Haven’t Met Yet” premieres Tuesday, June 3 at the Dances with Films film festival at the famous Chinese Theaters in Hollywood. The documentary short film, which is part of the ICT Narrative Group’s research, chronicles encounters between extremely prolific bloggers and a computer scientist who uses their personal narratives for research. It explores issues related to public sharing of personal stories, the ethical obligations of researchers who use web data, and the changing nature of online privacy. The film was produced by Psychic Bunny.

The screening begins at 2:45 pm. Tickets are $13 and can be purchased online.

CBS News Showcases ICT’s Role in Making Movie Magic

A feature story on CBS News This Morning highlights the future-forward work of Paul Debevec and the ICT Graphics Lab. The story notes the lab’s role in creating digital doubles for films like Maleficent and Gravity, in advancing research in virtual characters and digital projections for military training projects, and in the New Dimensions in Testimony collaboration with the USC Shoah Foundation, in partnership with Conscience Display.

Debevec is the chief visual officer at ICT. Inside Science also covered this work.

Privacy Considerations for Public Storytelling

The popularity of the web and social media has afforded researchers unparalleled access to content about the daily lives of people. Human research ethics guidelines, while actively expanding to meet the new challenges posed by web research, still rely on offline principles of interaction that are a poor fit to modern technology. In this context, we present a study of the identifiability of authors of socially sensitive content. With the goal of identity obfuscation, we compare this to the identifiability of the same content translated to and then back from a foreign language, focusing on how easily a person could locate the original source of the content. We discuss the risk to these authors presented by dissemination of their content, and consider the implications for research ethics guidelines.

Popular Science Covers ICT Research Using the Kinect to Study Babies

Popular Science featured an exercise system for babies at risk of developing cerebral palsy designed by Barbara Sargent of the USC Division of Biokinesiology and Physical Therapy and Kevin Feeley of ICT. Sargent designed the gaming system so babies would practice kicking their legs using a modified Microsoft Kinect. “We’re taking stuff that was never intended for this, and proving that the Kinect can track movement in infants. That’s a really cool thing, from a system that’s relatively affordable,” Feeley said.

Popular Mechanics Cover Story Features MxR Lab

The June issue of Popular Mechanics charts the rise of virtual reality and the development of the Oculus Rift. In telling the story of Palmer Luckey, the developer of the Oculus Rift who worked at ICT before launching his startup, the article describes Luckey’s time at ICT and his experiences using the Wide5 HMD.

“In 2011, Luckey got a dream job working as a technician at the Mixed Reality Lab (MxR) at the University of Southern California’s Institute for Creative Technologies (ICT). There, he had a chance to cross deep into VR territory. On Sept. 25, he posted the following on MTBS3D. [These excerpts are condensed and lightly edited.]

I told you guys I would give a writeup on what it feels like to use the Wide5/body tracking/Unity based engine setup we have at my work … It sounds crazy, I know, but The Matrix is so much closer than we all think … ”

The article also mentions Mark Bolas and notes that the MxR Lab is a joint effort between ICT and the USC School of Cinematic Arts that trains art and engineering students in virtual-reality design and also works on a number of military systems. It mentions that Bolas and a business partner started developing the Wide5 in 2005. It is an HMD with a horizontal field of view of about 150 degrees, and it is studded with light-emitting diodes (LEDs). The story states that Bolas, who was working on developing low-cost VR, assigned Luckey to a team already at work on some of these projects, which include several generations of inexpensive, immersive head-mounted displays.

Towards Cloth-Manipulating Characters

Virtual Coaches over Mobile Video

New Dimensions in Testimony in South Florida Sun-Sentinel

South Florida Sun-Sentinel covered the interactive 3D projections of Holocaust survivors that are being developed by the USC Shoah Foundation Institute for Visual History and Education and the USC Institute for Creative Technologies.

Towards a Multimodal Taxonomy of Dialogue Moves for Word-Guessing Games

We develop a taxonomy of guesser and clue-giver dialogue moves in word-guessing games. The taxonomy is designed to aid in the construction of a computational agent capable of participating in these games. We annotate the word-guessing game of the multimodal Rapid Dialogue Game (RDG) corpus, RDG-Phrase, with this scheme. The scheme classifies clues, guesses, and other verbal actions, as well as non-verbal actions such as gestures, into different types. Cohen's kappa inter-annotator agreement statistics for clue/non-clue and guess/non-guess are both approximately 76%, and the kappas for clue type and guess type are 59% and 75%, respectively. We discuss phenomena and challenges we encountered during annotation of the videos, such as co-speech gestures, gesture disambiguation, and gesture discretization.
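For readers unfamiliar with the metric, Cohen's kappa corrects raw inter-annotator agreement for the agreement expected by chance. A minimal stdlib-only sketch of the computation, using made-up clue/non-clue labels rather than the actual RDG-Phrase annotations:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators labeling the same items."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of items the annotators label identically.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: from each annotator's marginal label distribution.
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    expected = sum(counts_a[k] * counts_b[k] for k in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical judgments from two annotators (illustrative only).
a = ["clue", "clue", "non-clue", "clue", "non-clue", "clue", "non-clue", "non-clue"]
b = ["clue", "clue", "non-clue", "non-clue", "non-clue", "clue", "clue", "non-clue"]
print(cohens_kappa(a, b))  # 0.5: 75% raw agreement, 50% expected by chance
```

A kappa of 0.76, as reported for the clue/non-clue distinction, is conventionally read as substantial agreement; a kappa of 0 would mean agreement no better than chance.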

Orange County Register Covers ICT Work Pushing Boundaries of Virtual Reality

In a story headlined, “These Scientists Intend to Mess with your Mind“, the Orange County Register describes boundary-pushing virtual reality work coming out of the MxR Lab as well as other areas of ICT. The story features Project Blueshark, New Dimensions in Testimony, Fast Avatar Capture and the ICT Graphics Lab.

A day spent with the USC Mixed Reality Lab and its technology demos brings to mind a quote from “Blade Runner”: “I’ve seen things you people wouldn’t believe.”

That’s how hard it is to put into words technologies that are designed to mess with our perception of what’s real.

Back in 2011 one of the engineers at that lab was Palmer Luckey. He worked on a team creating a virtual reality (VR) experience by holding a cellphone to your face. Luckey moved on in 2012 to start Oculus VR, an Irvine company that was snatched up by Facebook for $2 billion earlier this year.

Luckey has since landed on magazine covers (he’s staring out from under a hoodie on June’s Wired: “It will blow your mind.”). The lab, meanwhile, continues to plug away at mind-bending projects under the direction of Mark Bolas. Continue…

A Multimodal Corpus of Rapid Dialogue Games

Rapid Avatar Capture and Simulation using Commodity Depth Sensors

The Atlantic Covers SimSensei – ICT’s Virtual Reality Tool for Detecting Depression and PTSD

A story in The Atlantic featured Ellie, ICT’s virtual human interviewer who is capable of reading and responding to human emotion in real time. And capable, more to the point, of offering those responses via a human-like animation, states the story.

The article notes that Ellie is part of the DARPA-funded SimSensei effort to help detect symptoms of anxiety, depression and PTSD. In studies, over 500 people have spoken to Ellie.

And, here’s the surprising thing, the story states: they seem to enjoy the experience. The set time for each demo was initially 15 minutes, yet people kept extending their time with Ellie. That’s because, according to project co-leader Louis-Philippe Morency, “they don’t feel judged” by her.

“Ellie is an interviewer, but she is there as a computer,” he said. “She doesn’t have judgment directly. So people love talking to her…. they’re more themselves. They’re really expressing and showing something that usually if you know that people are around you—or as an interviewer—they think, ‘Oh, I’m going to be careful.’ But with Ellie, they’re more themselves.”

Affective Computing for the Assessment of Depression

Keynote Talk – Virtual Humans: Bridging the Gap between People and Machines

Wall Street Journal, Tech Crunch and Venture Beat Cover ICT’s Mixed Reality Lab

Several news stories noted the leading virtual reality work taking place at the Mixed Reality Lab. Wall Street Journal, Tech Crunch and Venture Beat stories about VR start-up Survios noted that the founders were students of Mark Bolas and that they worked alongside Palmer Luckey at MxR.

“Survios shares roots with Oculus. Both companies grew with support from USC’s Institute for Creative Technologies,” stated the Wall Street Journal, which called Southern California the epicenter for virtual reality equipment.

LA Business Journal Covers MxR Lab’s Connection to VR Start Up Survios

Los Angeles Business Journal featured Survios, a virtual-reality company that was developed with support from the USC Institute for Creative Technologies. It was founded by former USC students James Illiff and Nathan Burba.

Fast Company Says ICT is Changing the Nature of Video Games

Fast Company featured the USC Institute for Creative Technologies and the companies that have been launched through research at the MxR Lab – Oculus VR and Survios, the latter of which just received $4 million in venture capital funding. The story noted that ICT “is changing the nature of video games.” Survios founder and USC alumnus James Illiff said the company is developing hardware and software for a new generation of virtual reality gaming. “If you want something that’s like the holodeck or the Matrix, you really have to be able to use your body naturally. Not just use buttons,” Illiff said.

Human Matters: Developing Technologies and Strategies with Meaningful Impact

The USC Institute for Creative Technologies, ReD Associates, and Harvard Business Review invite you to a panel discussion about the power of a human-centered approach – how human sciences, engaging storytelling and high-tech simulations of human interactions can benefit organizations and your customers.

Many organizations get caught up in big data, industry trends, and the latest technology. They miss the opportunities to understand what truly makes a difference – what people really want and need in their products and services. The most forward-leaning business, academic and military organizations are already leveraging the human sciences to identify problems and create solutions based on how people really live and learn.

This event will share some of those lessons. Moderated by Gardiner Morse, senior editor of the Harvard Business Review, a panel discussion will take its point of departure from the new book “The Moment of Clarity: Using the Human Science to Solve Your Toughest Business Problems.”

Author and senior partner at ReD Associates, Christian Madsbjerg will discuss an approach that puts people in the center, creating solutions that are more reliable because they are more human. Dr. Randall Hill, executive director of the USC Institute for Creative Technologies, will describe how new and emerging virtual reality technologies are creating a new era for learning and development. A combination of gaming and artificial intelligence advances, instructional design and creative content are helping people to lead and remain resilient in the face of humanity’s greatest challenges.

Following the discussion, guests will be invited to see and experience ICT’s cutting edge technologies.

Click here for more information and to reserve a seat at the event.

Schedule: Please note the press preview has been moved from 4 pm to 5 pm.
5 pm Press Preview

5-6 pm Registration

6-7 pm Panel Discussion

7-8 pm Tech Demonstrations and Reception

VITA: Virtual Interactive Training Agent

Download a PDF overview.

The Virtual Interactive Training Agent (VITA) is a virtual reality job interview practice system for building competence and reducing anxiety in young adults with Autism Spectrum Disorder (ASD) and other developmental disabilities. It was developed by the USC Institute for Creative Technologies (ICT), in partnership with the Dan Marino Foundation (DMF).

While it is recognized that many persons with ASD have the necessary capabilities for success in vocational activities, many report that the process of engaging in a job interview is anxiety provoking, as well as socially and verbally challenging; these factors may limit their success in job seeking situations.

Combining DMF expertise on ASD and ICT virtual human technology, VITA provides the opportunity for ASD users to repetitively practice job interviewing in a safe simulated VR environment. There are a total of six characters, three male and three female, each with three behavioral dispositions: soft-touch, neutral and hostile. Various scenarios have been created that simulate real life job interview situations. Video recording of the user’s interaction allows a vocational expert and the user to visually review and analyze the user’s performance.

Although VITA is still in the testing phase at the Dan Marino Campus in Fort Lauderdale, Florida, it is expected that exposure to the interview process through VITA will support users’ efforts to overcome the anxiety that they report during this process and provide a platform where job interviewing skills can be practiced with the support of a vocational expert. The result of the practice and habituation through using VITA will be improved job interview skills for individuals with developmental disabilities.

Imaging, Simulation and Animation in Medicine

From Avatar to Gravity: Lighting Photoreal Digital Actors

Upcoming Event – Human Matters: Developing Technologies and Strategies with Meaningful Impact

On Tuesday, May 20, the USC Institute for Creative Technologies, ReD Associates, and Harvard Business Review invite you to a panel discussion about the power of a human-centered approach – how human sciences, engaging storytelling and high-tech simulations of human interactions can benefit organizations and your customers.

Click here for more information and to reserve a seat at the event.


Uncertainty During Interpersonal Negotiations: Perceptual and Physiological Ramifications

Emotion is important in motivated performance scenarios such as negotiations. But studying complex interpersonal tasks is difficult with existing behavioral science research methods because of challenges to ecological validity and experimental control. To overcome these challenges, we have built realistic virtual humans that can express complex behaviors, such as emotional facial expressions, as a technological innovation on traditional research methods. Longstanding theories of emotion suggest that facial expressions provide enough information to perceive another person’s internal affective state. Alternatively, the contextual emotion hypothesis posits that situational factors bias the perception of emotion in others’ facial displays. This hypothesis predicts that individuals will have different perceptions of the same facial expression depending upon the context in which the expression is displayed. In this study, cardiovascular indexes of motivational states (i.e., challenge vs. threat) were recorded while players engaged in a multi-issue negotiation where the opposing negotiator (a confederate) displayed emotional facial expressions (angry vs. happy); the confederate’s negotiation strategy (cooperative vs. competitive) was factorially crossed with his facial expression. During the game, participants’ eye fixations and cardiovascular responses, indexing task engagement and challenge/threat motivation, were recorded. Results indicated that participants playing against confederates with incongruent facial expressions (e.g., cooperative strategy, angry face) exhibited a greater threat response, which arises from increased uncertainty. Eye fixations also suggest that participants looked at the face more in order to acquire information to reconcile their uncertainty in the incongruent condition. Taken together, these results suggest that context matters in the perception of emotion.

MxR in the Los Angeles Times

A story about Oculus founder Palmer Luckey notes that he got his start working in his parents’ garage and at the Mixed Reality Lab at ICT.

Invited Talk: Sequential Decision-Making in Human-Agent Interactions

It’s Only a Computer: The Impact of Human-agent Interaction in Clinical Interviews

Gesture Generation with Low-dimensional Embeddings

International Coverage of Virtual Human Studies on Facial Expressions and Cooperation

Asian News International featured a study by Peter Carnevale of the USC Marshall School and colleagues, including ICT’s Jonathan Gratch, on how facial expressions affect cooperation. “Good negotiators are adept at making offers and talking in negotiation, but also at managing their facial expressions,” Carnevale said. RedOrbit also covered the story, noting that the study was a collaboration with Celso M. de Melo of the USC Marshall School and researchers at the USC Dornsife College and USC Institute for Creative Technologies.

Game Informer Covers MxR

Game Informer magazine featured the transformative virtual reality work of Mark Bolas, director of ICT’s MxR Lab and an associate professor in the Interactive Media Division at the USC School of Cinematic Arts.

“With the benefit of academia, Bolas’ lab has the opportunity to start with very expensive and powerful virtual reality solutions and slowly scale back in service of finding what Bolas calls ‘essential elements,’ or what might otherwise be considered a minimum spec. Academia leads design of prototypes and eventually consumer models. For instance, lab testing determined a minimum field of view of 90-100 degrees, since applied in both the Rift and Morpheus,” stated the story.

The story states that the MxR Lab is pushing forward the field of virtual reality research. Palmer Luckey, developer of the Oculus Rift, worked for the MxR Lab for about a year, the story noted. The lab is developing a “virtual bridge” for the U.S. Navy that will allow personnel to interact with virtual displays and do work on more than one ship at a time, without actually being there. “I haven’t read a science-fiction book yet, or seen a science-fiction movie about virtual reality, that has come up with something I don’t believe could happen,” Bolas said.

Evolving Innovation: Fostering Cutting-Edge Thinking in Army Science and Technology

The April-June issue of Army AL&T Magazine featured an article on the Army Sci Tech Reconnaissance program co-authored by ICT’s Julia Kim. ICT is one member of a team working to identify trends that can help shape the Army of the future.
Read the full article beginning on page 86.

Panel: R&D Educational Engagement with the Industry

Do We Punish the Robot

“Gravity”, Lighting R&D and “Digital Ira”

The New Yorker Profiles Paul Debevec

In “Pixel Perfect: The Scientist Behind the Digital Cloning of Actors”, the April 28 issue of The New Yorker features Paul Debevec, ICT’s chief visual officer, and his role in advancing technologies for creating realistic digital doubles. In addition to showcasing Debevec and his research, the article includes quotes from ICT’s David Traum, Oleg Alexander and Ari Shapiro. It also mentions other ICT work, including military-funded projects to create VR training simulations, ICT’s Virtual Human Embodiment and Computational Emotions groups, and New Dimensions in Testimony, an ICT collaboration with the USC Shoah Foundation and Conscience Display that is creating interactive 3-D projections of Holocaust survivors that can answer questions from students or museum visitors. Subscription is required to read the full story.

ICT Technology Helps Morgan Spurlock in His Quest to Live Forever: CNN’s Inside Man Sunday, April 20 at 10/7pm

The Morgan Spurlock program Inside Man will feature ICT’s Chief Visual Officer Paul Debevec and the Light Stage technology that was used to turn Spurlock into an interactive hologram as part of his episode on futurism. The program airs Sunday, April 20 at 10:00 EST/7:00 PST and will appear online as well. Be sure to tune in.

ICT REU Intern Shares Her Experiences Getting Published and Presenting Her Paper Overseas

Washington and Lee computer science major Haley Archer-McClellan was part of ICT’s Research Experience for Undergraduates (REU) program last summer. REU is a National Science Foundation-funded effort to expose students at liberal arts colleges to graduate level research. Archer-McClellan worked with Andrew Gordon and Melissa Roemmele of ICT’s Narrative Group. She ended up being an author on an accepted publication and traveled to Israel to present at a conference. You can read about her experience here.

“Haley’s post demonstrates that our REU program is already making a meaningful impact for the students,” said ICT’s Evan Suma, who oversees the program here. “This is really what the REU is all about.”

Learning from Experience: Viterbi News Features Andrew Gordon’s Narrative Work

A story on the Viterbi website highlights Andrew Gordon’s narrative research. It also includes a video on his Heider-Simmel Interactive Theater project.

Gordon’s goal is for computers also to learn from experiences—experiences people share when they narrate and interpret the events of their lives. Driven by a desire to develop machines that can think like people, he is identifying, collecting and studying stories in order to give computers knowledge they can apply in new situations.

His narrative research intersects with many of the multidisciplinary topics explored at the 15-year-old ICT, a research center established at USC by the U.S. Army to advance the state of the art in simulation and training. Story is a common thread throughout the institute, which specializes in the creation, study and use of believable characters and scenarios.

“Soldiers swap war stories for a reason,” said Gordon, whose current work is funded by the Army, Navy and Defense Advanced Research Projects Agency, also known as DARPA. “They help people explain why things happened and predict what will happen next. They serve to pass on knowledge of things that people didn’t go through themselves.”

USC, home of top art and communication schools, is a natural fit for someone interested in narrative. What is unusual is to find this focus in a department of computer science, where Gordon’s research ranges from basic science analyzing the structure of stories to the development of training video games that incorporate real-world lessons. He has turned the Internet into a living laboratory by collecting and analyzing millions of personal accounts posted on blogs. He even developed a documentary about the bloggers he studied.
Gordon’s overarching challenge is to understand the processes that produce stories and program computers with the same interpretive powers. Read the full story here.

Can Computers Tell Stories? from USC Viterbi on Vimeo.

Michael Naimark Talk: Cycloramas Re-Imagined

“VR is back!” And while for most this means head-mounted displays and fantasy imagery, collective displays and real-world imagery remain untapped, both as a new art form and as a new market (e.g., “virtual travel”). Of course, display type and imagery style need not be coupled, but the concept of large-scale public exhibition of real-world imagery has a good hundred years of history to learn from. These spaces were called Cycloramas, and the hundred years spanned the 19th century, before cinema. So how would this be done today? For one thing, the rules for such real-world image capture and collective displays are not yet written, and different approaches to specifications, artifacts, and genres will be presented. In the end, these rules and approaches will be driven by the messages we wish to convey. A colorful exemplar use case will be described, as well as a fresh look at the 2008 USC Viewfinder project in this context.

Bio:
Michael Naimark is a media artist and researcher who often explores “place representation” and its impact on culture. He is noted in the Computer History Museum’s account of Street View; the Wikipedia entries on Projection Mapping, Virtual Reality, and New Media Art; and a short vision essay of his ranks #1 (of over 1 billion results) on Google searches for live global video. He has directed projects with support from Apple, Disney, Atari, Panavision, Lucasfilm, Interval, and Google; and from UNESCO, National Geographic, the Rockefeller Foundation, the Exploratorium, the Banff Centre, Ars Electronica, and the Paris Metro. Michael has recently served as faculty at the MIT Media Lab (2011-14), NYU Arts’ Interactive Telecommunications Program (2009-13), and USC Cinemas’ Interactive Media Division (2004-09). He’ll be keynoting at the upcoming First International Symposium on Immersive Creativity in Montreal next month.

RSVP: Valerie Dauphin, dauphin@ict.usc.edu

Here is a link to a PDF of the talk (12 MB, no video) for those who were unable to attend.

Photo Credit: 2014, Scott Snibbe

New Scientist Covers Mark Bolas and VR for Teaching

New Scientist quoted Mark Bolas of the USC Institute for Creative Technologies and USC School of Cinematic Arts about the use of augmented reality in teaching.

Voice of America: Visual Effects Could Change How Movies are Made

A Voice of America story on digital technologies in film featured the work of Paul Debevec and the ICT Graphics Lab. Debevec spoke about the challenges of perfecting the human face and the increasing interest in bringing back digital versions of performers who have passed away. The story noted that ICT researchers worked with Digital Domain to create a performing hologram of a deceased Asian pop star.

“With a ton of data from us and a ton of artistic effort and technical know-how from Digital Domain, they were able to create a singing face of Teresa Teng where she performed not only one of her original songs but two songs that weren’t even written at the time she was alive with Jay Chou,” said Debevec.

Paul Debevec Appears on the Katie Couric Show

Paul Debevec appeared on the Katie Couric show, along with Morgan Spurlock, to discuss the process of turning Spurlock into an interactive hologram for his CNN program, Inside Man, which airs April 20.

Debevec explains this digital technology and its implications for the future, including a mention of New Dimensions in Testimony, ICT’s collaboration with the USC Shoah Foundation, in partnership with Conscience Display.

Mark Bolas on Virtual Reality in the Wall Street Journal

A video story in the Wall Street Journal features Mark Bolas discussing the present and future of virtual reality, with a focus on uses beyond gaming. The segment highlights Project BlueShark, ICT’s mixed reality project with the Office of Naval Research.

“Virtual reality is going to change the nature of collaboration and communication,” said Bolas, director of the MxR Lab at ICT and associate professor at the USC School of Cinematic Arts.

VentureBeat Covers MxR Lab

VentureBeat noted that the virtual reality gaming companies Survios and Oculus VR got their start in the USC Institute for Creative Technologies’ Mixed Reality Lab.

ICT Research Recognized at IEEE VR Conference

ICT virtual reality research was recognized at the IEEE VR conference this week.

Rapid Avatar Capture received an honorable mention for Best Research Demo. The team includes Ari Shapiro, Andrew Feng, Ruizhe Wang, Gerard Medioni, Mark Bolas and Evan Suma.

New research in redirected walking received an honorable mention for Best Poster. The team includes Mahdi Azmandian, Rhys Yahata, Mark Bolas and Evan Suma.

Congratulations to everyone.

ICT Virtual Human Health Prototype Named Top Ten Digital Innovation by NetExplo Forum

Awards presentation held at UNESCO headquarters in Paris on March 26.

The USC Institute for Creative Technologies’ SimSensei prototype was named one of the year’s top ten most promising digital initiatives by the NetExplo Forum, a partnership with UNESCO to recognize technologies that are shaping the future in areas including education, health and communication.

SimSensei is a virtual human interviewer that can be used to identify signals of depression and other mental health issues. In recognizing ICT’s innovation, NetExplo’s event organizers noted SimSensei’s potential as “a state-of-the-art tool that health care providers can use for screening and monitoring patients.”

SimSensei leverages ICT’s advances in developing interactive virtual humans – computer-generated characters that use language, have appropriate gestures, show emotion, and react to verbal and non-verbal stimuli. It also incorporates ICT’s MultiSense technology which provides real-time tracking and analysis of non-verbal behaviors, including facial expressions, eye gaze, body posture and voice intonation. From these signals, SimSensei can engage a user in conversation, follow up with appropriate questions based on an individual’s answers and body language, and use this data to infer signs of emotional distress. SimSensei is not designed for therapy or medical diagnosis, but is intended as a support tool for clinicians and healthcare providers.

“Think about SimSensei as a diagnostic tool, like a blood sample,” said Louis-Philippe Morency, director of ICT’s MultiComp Lab, who co-leads the project with Skip Rizzo, director for medical virtual reality at ICT. “You send a blood sample to the lab and you get the result. The people doing the diagnosis are the clinicians, but they use these objective measures to make the diagnosis.”

The SimSensei research and development effort is funded as part of DARPA’s Detection and Computational Analysis of Psychological Signals (DCAPS) project. This effort aims to address the high numbers of soldiers affected with post-traumatic stress by developing new tools to assess their mental health and enable them to seek timely help.

“Many people suffer in silence because they fear the stigma that may come from seeking help through traditional channels or because they simply don’t know where to turn,” said Rizzo. “Computer‐mediated care offers anonymity and access that may help reach these service men and women who need it most and could help support them in deciding to seek help with a live provider.”

Additional ICT projects are using the underlying MultiSense technology to enable richer human-computer interactions. These include an interactive virtual human audience that can give feedback to people looking to improve their public speaking skills and virtual human role players that can help people to prepare for job interviews.

About the USC Institute for Creative Technologies

At the University of Southern California Institute for Creative Technologies (ICT), leaders in artificial intelligence, graphics, virtual reality and narrative advance low-cost immersive techniques and technologies to solve problems facing service members, students and society. Established in 1999, ICT is a DoD-sponsored University Affiliated Research Center working in collaboration with the U.S. Army Research Laboratory. ICT brings film and game industry artists together with computer and social scientists to study and develop immersive media for military training, health therapies, education and more.

Wall Street Journal, Los Angeles Times and New York Times Cover ICT Mixed Reality Lab’s Work Making Virtual Reality More Accessible

Several articles about the development of the Oculus Rift virtual reality head mounted display noted the role of Mark Bolas, head of the Mixed Reality Lab at ICT and a professor in the USC School of Cinematic Arts.

The Wall Street Journal wrote that Bolas was a critical influence in the development of Oculus Rift’s virtual reality goggles. Since 2010, Bolas has led a lab dedicated to making VR technology more affordable, the story said. He published a paper describing how headsets could be created using two iPhones.

The Los Angeles Times noted the role that USC, ICT and the MxR Lab are playing in developing virtual reality technology. Much of the activity is taking place at USC’s Institute for Creative Technologies, where Oculus founder Palmer Luckey, 21, once worked as a lab assistant, the story said. The Playa Vista institute has attracted several top VR researchers including Mark Bolas.

“We really do have an emerging hotbed here. I think this is the heart and soul of where virtual reality is going to be,” said Bolas, who is director of the Mixed Reality Lab, a joint effort between the institute and USC’s School of Cinematic Arts that trains art and engineering students in virtual reality design. “We’ve got a nexus of people here that have been working in the field for over 20 years.”

The story also noted that the Mixed Reality Lab is leading the virtual reality entrepreneurial effort by providing hardware and software plans for free to help foster start-ups.

The New York Times referred to Luckey’s time at MxR as an important break for him. The story also referenced a previous article that quoted Bolas about the cost of virtual reality systems, noted Bolas is an adviser to Oculus VR, and that elements of the company’s headset are based on USC research.

Demonstrations of E2C2 Work

Unity Virtual Reality Tutorial at IEEE VR

This tutorial will provide an introduction to Unity (http://www.unity3D.com) and several VR components that are designed to work with Unity. These VR components can be used in isolation or pieced together to provide fully immersive VR experiences.

Unity is a feature-rich, multi-platform game engine for the creation of interactive 3D content. It includes an intuitive interface while at the same time allowing low-level access for developers. Thousands of assets provided by other content creators can be reused to quickly develop immersive experiences. Because of its intuitive interface, well-designed architecture, and ability to easily reuse assets, 3D software can be developed in a fraction of the time required by traditional development.

Consumer-level virtual-reality hardware combined with Unity has recently empowered hobbyists, professionals, and academics to quickly create virtual reality applications. Because of Unity’s widespread adoption and ease of use, several virtual reality companies now fully support it. During this tutorial, participants will learn how to quickly build virtual reality applications from some of the leaders of Unity virtual reality development. By the end of the tutorial, attendees will understand how to use multiple VR components with Unity and will have enough knowledge to start building their own VR applications.

http://ieeevr.org/2014/tutorials.html

ICT’s Andrew Gordon Brings Seminal 1940s Social Science Experiment Online

Andrew Gordon studies stories. He leads the Narrative Group at the University of Southern California Institute for Creative Technologies where his research is devoted to getting computers to be able to read and generate stories and to use the knowledge in stories in order to become more intelligent.

Gordon is also a professor in the USC Department of Computer Science and his newest project involves reimagining a 70-year-old social science experiment for the digital age.

In 1944, Fritz Heider and Marianne Simmel created a simple animated film depicting the motion of two triangles and a circle as they moved in and around a box that alternated between being open and closed. Heider and Simmel asked people to describe what they saw. In what is now a classic work in the field of social psychology, the subjects responded with creative narratives that ascribed human-like goals, plans, beliefs, and emotions to the moving objects. Popular themes included romantic relationships and prison breaks.

In short, viewers of the Heider-Simmel film treated the shapes as if they were people.

Can Computers Tell Stories? from USC Viterbi on Vimeo.

Gordon wondered if a computer could be taught to do the same thing. Thus, the Heider-Simmel Interactive Theater was born. It is a web-based application that allows people to make their own movies and write their own stories using triangles. A companion site is called Triangle Charades. Here, people can make their own animations of different actions, and guess the intended actions in other people’s animations.

“This research is trying to solve a fundamental problem in human-computer interaction,” said Gordon. “The end goal is to collect enough data to test and train our systems to recognize actions and narrative so that computers will tell stories that are as creative and compelling as the ones people are telling.”

To inform the computer, Gordon is hoping to collect large amounts of data so that the system can be tested and trained to recognize actions and motivations. Anyone interested in participating as a study subject and content creator can follow the links below.

Learn more about the projects here.
Try the Heider-Simmel Interactive Theater online here.
Play Triangle Charades online here.

This Office of Naval Research-funded effort is a collaboration with Jerry R. Hobbs of the USC Information Sciences Institute and Louis-Philippe Morency and Melissa Roemmele at the USC Institute for Creative Technologies.

Facebook Buys Oculus VR: USA Today Notes that Founder Palmer Luckey Worked at ICT

In coverage of Facebook’s purchase of virtual reality headset maker Oculus VR, an article in USA Today notes that Oculus founder Palmer Luckey worked at ICT’s Mixed Reality Lab.

Congratulations Palmer!

Relative Facial Action Unit Detection

Mahmoud Khademi of McMaster University and ICT’s Louis-Philippe Morency will present their paper.

ICT Executive Director Explains How To Create Simulators for Leadership Skills

Lessons from chapter of new book on using experience to develop talent

At the USC Institute for Creative Technologies, Executive Director Randall W. Hill, Jr. steers the institute’s exploration of how virtual reality and video games can be used to develop meaningful learning experiences.

Hill explains how new and emerging technologies are enabling the modern equivalent of flight simulators for social skills in “Virtual Reality and Leadership Development”, a chapter of the new book Using Experience to Develop Leadership Talent. Edited by Morgan McCall, a professor of management and organization at the USC Marshall School of Business, and Cynthia D. McCauley, a senior fellow at the Center for Creative Leadership, the Jossey-Bass-published book is part of the Professional Practice Series sponsored by the Society for Industrial and Organizational Psychology.

“Advances in the learning sciences and simulation technologies now make it possible to acquire leadership experience by practicing virtually before meeting reality,” said Hill. “These breakthroughs enable leaders to practice and learn in safe environments where mistakes are not costly in human resource terms but are sources for reflection and learning.”

The chapter describes ICT’s approach to experience-based learning that allows students to develop the mix of social skills that leaders can apply to conducting one-on-one conversations and to navigating the complex dynamics of vibrant organizations. ICT-developed social simulations mentioned in the chapter are the ELITE and INOTS training systems, which use digital media to demonstrate concepts and virtual human role players for teaching informal counseling skills. Hill also includes UrbanSim, a video game-based practice environment for understanding the intended and unintended consequences of a leader’s decisions that leverages real-life experiences and simulations that model the beliefs, actions, goals and attitudes of individuals and groups.

Hill imparts some lessons learned over ICT’s nearly 15 years creating virtual reality simulations designed to improve leadership, negotiation, cultural awareness and communication skills. His key messages:

• Be prepared to leverage new technologies. They are going to change the way we educate and train the workforce, particularly in the area of social skills.
• Don’t just use technology for technology’s sake. Make sure you have a sound instructional design that includes a clear idea of what is to be learned and how it will be learned.
• Story and play are powerful elements for increasing emotional engagement and making lessons stick.
• Require a feedback system that ensures the correct lessons are being learned, including what happened, why, and how to improve next time.

“The approach we have developed working with the U.S. Army and Department of Defense is applicable to business and to society as a whole,” said Hill. “The combination of technology, instructional design and creative content will make possible a new era for learning.”

About the USC Institute for Creative Technologies
At the University of Southern California Institute for Creative Technologies (ICT) leaders in artificial intelligence, graphics, virtual reality and narrative advance low-cost immersive techniques and technologies to solve problems facing service members, students and society. Established in 1999, ICT is a DoD-sponsored University Affiliated Research Center working in collaboration with the U.S. Army Research Laboratory. ICT brings film and game industry artists together with computer and social scientists to study and develop immersive media for military training, health therapies, education and more.

Al Jazeera America’s TechKnow Features the ICT Graphics Lab and Med VR Group

Science reporter Cara Santa Maria returned to the ICT Graphics Lab for a TechKnow story on movie magic. Cara was scanned in ICT’s Light Stage X and volunteered to be turned into a digital double. She discussed her experience on the Al Jazeera America blog. The full segment aired on TechKnow at 7:00 PM EST / 4:00 PM PST, Saturday, March 22.

The same episode also featured a rerun of Lindsay Moran’s earlier TechKnow segment on using virtual reality therapy for post-traumatic stress.

Check Al Jazeera America’s Channel Finder for information about your area.

Popular Mechanics Features Project BlueShark and the MxR Lab at ICT

A feature story on virtual reality included an interview with ICT’s MxR Lab Director Mark Bolas and a demonstration of Project BlueShark, ICT’s collaboration with the Navy that uses virtual reality to imagine the workspace of the future.

“The primary goal is breaking time and space,” Bolas said in the article. “You can virtually teleport someone in there from shoreside” to collaborate on tactics, for instance, the story states.

The article highlighted the MxR Lab’s work in virtual reality head-mounted displays, including low-cost DIY versions and the Wide5 HMD, a headset with a 150-degree field of view that Bolas and Ian McDowall started developing around 2005. The story also notes the lab’s experiments in redirected walking, which allows a fixed space to become much larger in the virtual world, and mentions that Palmer Luckey, founder of Oculus VR, worked in the military-funded lab.

Study Uses Internet Blog Posts to Understand Women’s Stroke Symptoms

A collaborative study between USC and Pomona College researchers analyzed internet weblog posts and found that women report more changes in mental health status than do male patients. The study was published in the Journal of Medical Internet Research and written about in the Pomona College news. ICT’s Andrew Gordon and Christopher Weinberg conducted this research with Pomona College Professor of Linguistics and Cognitive Science Deborah Burke.

Motherboard and VICE Canada Feature ICT Virtual Reality Mental Health Work

Canadian military veteran Jody Mitic visited ICT with a crew from Motherboard and VICE Canada to explore ICT’s work using virtual reality to address mental health issues such as post-traumatic stress and depression. Mitic spoke with Skip Rizzo, ICT’s director for medical virtual reality, and also tried out Bravemind, our virtual reality exposure therapy system, and SimSensei, a virtual human interviewer that can detect signs of depression. Mitic also spoke with Sgt. Jonathan Warren, who was treated with ICT’s VR therapy.

“This is probably the most effective treatment out there that I’ve had, and I’ve had many,” said Sgt. Warren in the online piece. “I think most of the Soldiers out there could benefit from something like this.”

In a story accompanying the online documentary, VICE reporter Patrick MacGuire wrote, “While we’re hoping that the technologies we saw at ICT will soon trickle down into militaries worldwide, there are clearly implications here for the entire planet.”

Mitic also spoke about his experience at ICT on CTV News.

Gizmodo Visits ICT to Test Drive Blue Shark, our VR Vision of the Future

Gizmodo reporter Brent Rose featured Blue Shark, an Office of Naval Research-funded project that uses virtual reality to imagine how Sailors will collaborate, communicate and train. In an accompanying video, Todd Richmond of ICT said the project is being developed for the next generation, who will have grown up immersed in virtual technologies. “It changes our conception of how ships should be designed, and how people could be collaborating and communicating,” said Mark Bolas of ICT’s MxR Lab.

GIZMODO – Blue Shark at the Institute for Creative Technologies from Gizmodo on Vimeo.

Triangle Charades: A Data-Collection Game for Recognizing Actions in Motion Trajectories

Toward Crowdsourcing Micro-Level Behavior Annotations: The Challenges of Interface, Training, and Generalization

Kitchen Talk: Can we build Passionate Intelligent Tutoring Systems?

The intersection of affective computing, educational games, and intelligent learning technologies has produced heightened interest in the construct of engagement and its role in learning. Although games researchers have largely dominated this conversation, in this talk I will focus on the application of embodied virtual agents to the challenge of promoting learning, sparking interest, and sustaining engagement. In work funded by the U.S. National Science Foundation, we designed and built three virtual humans that now inhabit the Boston Museum of Science (MOS) and interact with thousands of visitors a month. These animated, life-sized characters seek to promote general interest in STEM and motivate museum visitors to want to know more about computer science, robotics, and technology. I will discuss our experiences building the virtual human twins Ada and Grace, who answer general questions about themselves and technology, as well as Coach Mike, a pedagogical agent that teaches basic programming skills at an exhibit known as “Robot Park”. These characters seek not only to convey knowledge, but also to entertain visitors and increase engagement through humor, animation, and speech. They were designed with the intention of being role models for young learners. I will summarize the results from a 3-month evaluation and discuss lessons learned from deployment. As time permits, I will describe a future of pedagogical agents that are (1) passionate about what they teach, and (2) genuinely happy when they successfully promote learning for those they try to help.

ICT Tour and Reception for Next Med’s Medicine Meets Virtual Reality Conference

On Friday evening, participants from the Medicine Meets Virtual Reality Conference enjoyed a reception at the University of Southern California’s Institute for Creative Technologies. Attendees saw demonstrations of a number of research and development projects with health applications, including game-based rehabilitation, virtual reality exposure therapy for post-traumatic stress, and user sensing systems for health care support. USC’s Center for Body Computing also showcased some of its digital health research and innovations. See photos on our Facebook page.

Demo: Mobile Personal Healthcare Mediated by Virtual Humans

Building the Holodeck

Panel discussion

Integrating Intelligent Tutoring Systems in Virtual World Training/Learning

Advances in Photoreal Digital Humans in Film and in Real-Time