CES 2017: VR and AR Play Big

Virtual reality (VR) and augmented reality (AR) had a banner year in 2016. Capturing the public’s interest and attention, these mixed reality technologies spread into home devices from smartphones to headsets, into media from the NFL to the NBA, and into business markets including real estate and travel.

With CES just two weeks away, both VR and AR will play a big role in the coming technology show. Many expect new headsets to be previewed and announced, along with ways to integrate the technology into different markets and devices.

“2017 is the year that VR will be more mainstream, will have more market penetration,” says Todd Richmond, the director of the Mixed Reality Studio at the USC Institute for Creative Technologies, the director of the Mixed Reality Lab at USC’s School for Cinematic Arts and an IEEE (Institute of Electrical and Electronics Engineers) Fellow. “I think the biggest challenge is the headsets will have to be wireless because no one wants to be tethered.”

Will CES bring what Richmond predicts?

Read more on gearbrain.

None Of Us Are Any Good at Deciding Who Is and Isn’t Trustworthy

Fascinating new research from the University of Southern California and Temple University harnesses the power of computer science to determine what behaviors make someone seem trustworthy and what behaviors indicate that someone really is trustworthy.

Spoiler alert: The two sets of behaviors don’t match up.

Read the full article on Business Insider.

City Hall’s Call for Tech Integration

Garbage cans that play soothing music, smartphones that detect potholes by reacting to bumps, and augmented reality at City Hall meetings could soon be coming to Boston as officials call on tech companies to propose new programs aimed at changing how people use city services.

ICT’s Todd Richmond gives some insight into how Boston’s City Hall can utilize augmented reality as a way to unite and encourage participation at City Hall meetings.

Read the full article on Boston Herald.

The New Frontiers of Virtual Reality

This article explores virtual reality and immersion, and mentions ICT’s Bravemind. Read the full article on Live Mint.

Virtual Reality is Actually Here

If you think virtual reality exists today primarily in the gaming industry, you are seriously behind the times! VR is springing up all over, and in some rather surprising places. VR videos are already available on YouTube, the Google Play Store and the Apple App Store. Equipment prices and quality vary dramatically, from Google’s inexpensive Cardboard to incredibly expensive and complex systems.

VR has been evolving since the first commercial flight simulator was patented in 1931. That invention helped train over 500,000 pilots during World War II. More recently, consumer VR has been used in games, particularly shooter games. As processing power has increased and gaming engines have evolved, the virtual worlds created for games have expanded significantly. Realism has increased through better light and shadows, along with an improved representation of the physics of moving objects.

In parallel with gaming, VR is expanding into many other areas. Visit ARN to read the full article on the areas VR is breaking into, including a mention of ICT’s Bravemind.

Efforts Underway to Erase Male Sexual Assault Stigma, Focus on Prevention

The U.S. Army takes a look at its SHARP Academy and its Hologram Project.

Read more here.

IBM Wants to Build AI That Isn’t Socially Awkward

Big Blue says its latest Watson tech enables chatbots, robots, and even smart cars and houses that can understand and relate to humans.

Though artificial intelligence experts may cringe at the portrayals of humanlike AI in science fiction, some researchers are nudging us closer to those visions.

In this piece for Fast Company, ICT’s virtual humans research and Jonathan Gratch are featured, driving home the point that some service members are more comfortable opening up to a virtual human than to a real person who may be judgmental. Given the cost and ever shakier status of mental health care coverage, AI may fill a gap that human professionals can’t.

Read the full article on Fast Company.

ICT Wins Big at I/ITSEC 2016

Virtual Reality integrates real-time computer processing, interface technology, body tracking and sensory displays to support a user becoming immersed in a computer-generated simulated environment. It is a way for humans to communicate with computers and extremely complex data in a more naturalistic fashion; with such controllable, dynamic and interactive 3D stimulus environments, behavioral action can be monitored, recorded and measured.

Though VR has been around for decades, it is only now reaching its hype cycle in mainstream consumerism. USC researchers Albert ‘Skip’ Rizzo and Thomas B. Talbot won ‘Best Tutorial’ at the Interservice/Industry Training, Simulation and Education Conference (I/ITSEC) 2016 for their presentation about all things VR – past, present and future. Rizzo and Talbot brought to light the progression of the technology, gaps in our ability to use VR effectively, the compelling science that justifies using VR, and how to make it useful.

In addition, the Medical VR team at USC Institute for Creative Technologies took home the ‘Best Government Game’ award for USC Standard Patient, a freeware Virtual Standardized Patient (VSP) community for medical students, residents, continuing medical education and medical educators.

For more information, please visit http://medvr.ict.usc.edu and/or https://prod.standardpatient.org.

Choose Your Own Digital Adventure

On Nov. 15, the ninth annual International Conference on Interactive Digital Storytelling (ICIDS) brought more than 75 researchers from around the world to ICT’s Waterfront Drive offices for groundbreaking discussions about interactive fiction, virtual characters and untapped intersections of technology and narrative.

Nine juried art exhibits with revolutionary approaches to interactive digital storytelling featured prominently in the conference, which was free and open to the public. Playa Vista Direct dives deeper into each exhibit and the experience it offers; read more here.

SUMMER INTERNSHIP APPLICATIONS ARE NOW OPEN!

Are you a student wishing to pursue a career in simulation, interactive media and/or virtual reality? Join the ICT team this summer to create compelling immersive systems for military, entertainment and educational purposes. Click here for more information or to apply; the deadline is February 5, 2017.

Google Brain & Artificial Intelligence

Comic-Con’s “Impossible Science” visits ICT to meet with Jonathan Gratch and ‘Ellie’ for an episode centered around artificial intelligence.

Visit Comic-Con HQ’s site to view the full episode.

This Neural Network Creates 3D Faces from Blurred Photos

When actors like Kevin Spacey or Mads Mikkelsen want to star in a video game, the process is incredibly complex: the celebrities are scanned with dozens of cameras and specialized scanners that map every detail of their face. A research group at the Institute for Creative Technologies at the University of Southern California believes it can change that. For a realistic 3D image, they need only a single high-resolution photo that does not even have to show the entire face, Shunsuke Saito, Hao Li and colleagues claim.

WIRED Germany explores this further; click here for the full article.

The Emerging Tech of Virtual Reality Applications

As VR spreads far beyond the world of gaming, SYS-CON explores the state of VR in 2016 and where developers are taking it in the future.

Read the full article featuring ICT’s Medical VR initiative here.

The Virtual Laboratory Patient

For the Mitele TV show, MTMAD, ICT’s Skip Rizzo sat down with host Marcos Anton to discuss how technology can improve our health.

Watch the full segment here.


ELITE SHARP POST

Download PDF overview.

Emergent Leader Immersive Training Environment Sexual Harassment/Assault Response & Prevention, Prevention & Outreach Simulation Trainer (ELITE SHARP POST) is a laptop training application that educates United States (US) Army Sexual Assault Response Coordinators (SARCs) and Victim Advocates (VAs) on their roles and responsibilities related to Prevention and Outreach when supporting the Commander and other SHARP stakeholders responsible for implementing the SHARP Program. Developed under the guidance of the SHARP Program Management Office and in collaboration with the US Army SHARP Academy, the ELITE SHARP POST content incorporates evidence-based instructional design methodologies, and USC ICT research technologies. Virtual humans, story-based scenarios, and intelligent tutoring technology help create a challenging yet engaging training experience.

The ELITE SHARP POST software features four scenarios. These scenarios are based on relevant real-world incidents that SARCs and VAs may face on the job. Each scenario offers the user a chance to practice the Prevention and Outreach-related skills and activities required by SARCs and VAs to successfully help Commanders establish an environment where SHARP incidents are deterred. The package includes three phases: Up-front Instruction, Practice Environment, and an After Action Review (AAR).

The total training time for ELITE SHARP POST is anywhere from one and a half to two hours, varying with each user’s experience level, performance, and engagement. Some users may take time to review missed concepts based on how well they respond to quiz questions; some may choose to watch all of the suggested training vignettes and comparisons; some may thoroughly engage in the AAR after an interaction in the practice environment; and some may choose to practice every scenario.

ELITE SHARP POST offers US Army SARCs and VAs a unique opportunity to learn and practice their skills and activities for SHARP Program Prevention and Outreach so they’re better prepared to interact with the SHARP stakeholders they support. Upon completion of the ELITE SHARP POST training, users will be able to demonstrate their understanding of the critical SARC and VA roles and responsibilities when supporting the Commander’s SHARP Prevention and Outreach Program.

ELITE SHARP POST is available for download on the MilGaming web portal: https://milgaming.army.mil/

I/ITSEC 2016

I/ITSEC 2016 will be held 28 November – 2 December 2016 in the West Concourse of the Orange County Convention Center in Orlando, Florida. Stop by the Army and Navy booths to learn more and demo some of ICT’s cutting-edge research.

When Robots Feel Your Pain

Artificial intelligence is now in a position to transform psychiatric hospitals for the better in the year ahead. Read the full article featuring ICT research from Louis-Philippe Morency, Jonathan Gratch and Stefan Scherer on The Economist.

Does an App That Measures Suicide Risk Violate Your Rights?

ICT’s Stefan Scherer talks about these risks with Tonic, Vice’s health channel. Read the full article here.

2016 International Conference on Interactive Digital Storytelling (ICIDS)

This year, the International Conference on Interactive Digital Storytelling will take place at ICT. It also features a collaboration with the ninth edition of Intelligent Narrative Technologies (INT9), a related series of gatherings focused on artificial intelligence.

Terrorism is No Game, but Playing This One Could Help in the Fight

Fast Company’s Sean Captain sat down with ICT’s Ryan McAlinden to discuss CounterNet and how it’s used to help detect the possible ways in which groups and individuals use the Internet for . . . nefarious, criminal, terrorist purposes. To read the full article, click here.

USC Institute for Creative Technologies Hosts Digital Storytelling Showcase

By USC News Staff

The latest in creative digital works that integrate digital technologies, interaction and narrative were on display Tuesday at a special exhibition at USC’s Institute for Creative Technologies. The juried exhibition was the kickoff for the ninth International Conference on Interactive Digital Storytelling, being held through Friday at the Playa Vista center.

Art Exhibition at International Conference on Interactive Digital Storytelling (ICIDS) 2016

The Art Exhibition of International Conference on Interactive Digital Storytelling (ICIDS) 2016 is an annual juried showcase of new creative works that integrate digital technologies, interaction, and narrative in artistically significant ways. This year, the exhibition will be held on a single day (November 15, from 11 a.m. to 4 p.m.) at the ICIDS conference venue, the USC Institute for Creative Technologies in the Playa Vista neighborhood of Los Angeles, CA. The exhibition is open to the public through email reservations. For more information, visit http://icids2016.ict.usc.edu/exhibition/.

RSVP by email to icids2016exhibit@ict.usc.edu indicating number of guests and arrival time.

How Video Games Are Helping Young Veterans Cope

In honor of Veterans Day, Complex talks about alternative PTSD treatments featuring ICT’s Skip Rizzo and Bravemind.

Is the Retail World Ready for Virtual Reality?

Todd Richmond discusses the latest IEEE survey, and whether the retail world is ready for the new strides being made in virtual reality, with the Wall Street Journal’s Tech Talk.

How Virtual Reality is Helping Veterans Overcome PTSD

According to the US Department of Veterans Affairs, about 11 out of every 100 veterans who served in Operations Iraqi Freedom and Enduring Freedom have PTSD in a given year, as do about 12 out of every 100 veterans who served in Desert Storm.

To help our American heroes, medical professionals and VA hospitals are turning to virtual reality to heal them through gradual exposure therapy. At the University of Southern California Institute for Creative Technologies, Bravemind is a virtual reality exposure therapy system that has been shown to reduce symptoms of PTSD in veterans.

Guided by a therapist, the patient is taken through virtual scenarios that resemble their experience in Afghanistan or Iraq, in order to help them process their traumatic memory and talk through their fears and anxiety. To date, Bravemind has been distributed to over 60 sites including VA hospitals, military bases and university centers.

Read the full article in Tech.co.

Pinscreen Launches with High-Tech Distractions for a Nerve-Wracking Election

ICT’s Vision and Graphics Lab Director Hao Li, along with his team at Pinscreen, launches a system for generating what can be photorealistic three-dimensional images from a single image. While the technology’s current applications involve distorted faces for caricature, the same technology could be used for exceptionally realistic video.

Basically, it brings the technologies that were previously only available to folks at graphics studios like Industrial Light & Magic or Weta (the go-to studio for Peter Jackson’s Hobbit and Lord of the Rings series) to mainstream users.

Read the full article on TechCrunch.

Global Behavior Change Through Virtual Reality

With Virtual Reality (VR) entering its golden age, new uses abound. VR now has the potential to help us internalise experiences that are not filtered through our own prejudices. And while VR continues to make its mark on entertainment, it has immense potential to allow people to experience something groundbreaking: “duality of presence” – being in two realities at once. This has an extraordinary capacity for encouraging greater empathy, understanding, compassion, and connection to the “real world.”

What makes VR possible? Sophisticated yet simple, intuitive technology and lifelike graphics that create seamless transitions from the real world to a virtual world. Not only does virtual reality have the capacity to make you feel, it has the power to make you know. By immersing yourself in a “real” experience, VR provides evidence from prime sources and acts as a truth-telling representation of the real world.

Read the full article featuring ICT and Bravemind on MIS-Asia.

USC Institute for Creative Technologies to Host Free Interactive Art Exhibit


Media and the general public are invited to attend; see below for more information.

WHAT:

The Art Exhibition of International Conference on Interactive Digital Storytelling (ICIDS) 2016 is an annual juried showcase of new creative works that integrate digital technologies, interaction, and narrative in artistically significant ways. This year, the exhibition will be held on a single day (November 15) at the ICIDS conference venue, the USC Institute for Creative Technologies in the Playa Vista neighborhood of Los Angeles, CA. The exhibition is open to the public through email reservations.

WHO:

Loss of Grasp by Serge Bouchardon, Vincent Volckaert

A narrative and interactive experience about the feeling of losing grasp.

I Can Feel It in The Air Tonight by James Earl Cox III and Julie Buchanan

A 30-minute interactive fiction game with diegetic field recordings. The story switches between a young couple’s first date and a military tribunal, juxtaposing the nervous beginnings of a new love with a transcribed recounting of past actions.

Priya’s Mirror AR by Ram Devineni and Dan Goldman  

Dive into augmented reality and experience the interactive art of the comic’s next edition, “Priya’s Mirror,” which addresses acid attacks. The comic series received the Tribeca Film Institute New Media Fund from the Ford Foundation and was funded by the World Bank. “Priya’s Mirror” premiered at the 2016 New York Film Festival at Lincoln Center.

Moving through Space and Narrative by Mara Dionisio, Paulo Bala, Rui Trinade, Sandra Câmara, Dina Dionisio and Valentina Nisi

An art installation where participants are able to fully explore an interactive narrative called “The Old Pharmacy”, experimenting with different input methods for movement and exploration of a digital narrative. 

Walden, a Game by Tracy Fullerton and Walden Team

A first-person simulation of the life of American philosopher Henry David Thoreau during his experiment in self-reliant living at Walden Pond.

Axion by Ivaylo Getov, Jasmine Idun Isdrake and Kyle Gustafson

An immersive documentary game engaging players with the perspectives of contemporary scientists who confront the deepest questions in astrophysics. By allowing a player a measure of freedom to navigate and discover an evolving virtual environment, the project suggests a broad consonance between the process of scientific discovery and a personal, emotive confrontation with the unknown.

Pry by Tender Claws

An interactive story (intended for tablets) that allows the reader to unravel the fabric of memory and discover a story shaped by the lies we tell ourselves; lies revealed when we pry apart the narrative and read between the lines.

Nevermind by Erin Reynolds

A biofeedback-enhanced adventure horror game that takes you into the dark and surreal world of the subconscious.

Way by Annamaria Andrea Vitali

A short textual interactive novel about dissonances, desires and fears.

WHEN:

Tuesday, November 15, 2016. 11:00 a.m. to 4:00 p.m.

WHERE:

USC Institute for Creative Technologies

12015 Waterfront Drive, Playa Vista, California

RSVP:

To attend, please RSVP by email to icids2016exhibit@ict.usc.edu and indicate the number of guests in your party plus arrival time.



How Deepak Chopra is Using Virtual Reality to Improve Your Health

Tech.co explores Deepak Chopra’s new initiative using color therapy and Hindu chakras for a guided meditation VR application. In this feature, various works of VR as a tool for mental health are mentioned, as well as ICT’s PTSD tool.


5 Ways Humanitarian Bots Can Save the World

While bots have a long way to go to reach their potential, even the emotionless bots of today have changed the world for the better, from revealing epidemics of violence against young girls (U-Report) to automating government bureaucracies like homeless status applications (Do Not Pay). Bots can tell stories to help you empathize with humanitarian crises (Yeshi), assist your healthcare providers (GYANT, Ellie, Sensely), and help you quit your worst habits (Stoptober).

Read the full article about humanitarian bots changing the world on VentureBeat.

This Woman’s Experience Shows a Major Problem Facing Virtual Reality

Why experiencing abuse in VR can be damaging in real life.

Dr. Albert “Skip” Rizzo, the director of Medical Virtual Reality at the University of Southern California’s Institute for Creative Technologies, told ATTN: over the phone that Belamire’s feelings are valid:

“We’re very fond of saying that people suspend disbelief when they’re in a virtual environment, and we always say that when we’re talking about doing it for good entertainment, or in my area, for clinical purposes. We accept that VR can create positive entertaining experiences and also positive healthful or educational experiences. I don’t think we can draw a line when it comes to people [saying] well, it’s real enough that people feel considerable discomfort when they’re violated. When you’re in a virtual world and you’re really engaging with it as if it’s a real thing, to have some ignorant troll harassing you can have an impact. I don’t think anyone [in their] right mind that knows anything about the technology and its impact on how people interact and behave in the world should deny that.”

Dr. Rizzo noted recent examples of online bullying-related suicides to show that computer-based and virtual activity can have a real impact on people.

“Why is it any different if somebody transduces a physical action through an interface that creates a virtual experience that upsets people?” he said. “It’s real easy to say, ‘Oh, it didn’t really happen.’ Well a lot of things happen that have impacts on people, whether it’s words, digital, or virtual representation. I think it’s grossly insensitive and ignorant to take that kind of position. That’s my personal view as someone who tries to leverage the impact of the virtual world for healing.”

Read the full article on attn.

North America 17 Consumer Trends

USC Institute for Creative Technologies and Bravemind are featured in Mintel’s 2017 Consumer Trends report.

Imagine a Time…

SIGNAL Magazine talks about ARL’s work and collaboration with USC ICT.

Virtual Trauma: Could Home VR Affect PTSD Sufferers?

For years now, VR has been used to help people overcome fears and in the treatment of conditions such as PTSD, and it’s proven to be extremely successful.

This is because the VR headset is used to enhance an already scientifically proven method of treatment: exposure therapy. Patients are exposed to situations through the VR headset which resemble what they were traumatised by or are fearful of, and by doing this repeatedly the memory loses its emotional potency.

This process, sometimes called habituation, can take a long time and is always done in controlled environments with trained professionals who can support the patient should they become severely distressed.

This raises a question: if someone with PTSD can be triggered by a VR experience safely in a controlled environment, what happens if someone with latent PTSD is triggered in the uncontrolled and unsupervised environment of their own home?

To get a better idea of whether this is something that’s even likely, we spoke to psychologist Albert ‘Skip’ Rizzo, who told us that it’s not impossible and is certainly something worth monitoring.

As director for medical virtual reality at the USC Institute for Creative Technologies, Rizzo has used virtual reality in treatments himself and his work using virtual reality-based exposure therapy to treat PTSD received the American Psychological Association’s 2010 Award for Outstanding Contributions to the Treatment of Trauma.

Read the full article on TechRadar.

VR for Good – It Can Do Much Better Than Games

To inspire others about the power of VR to do good, AMD took some gear on the road to the VERGE conference in Silicon Valley.

The conference led off with a keynote from AMD Vice President Roy Taylor and VR pioneer Nonny de la Peña.

Roy talked about how quickly VR technology has advanced, citing applications like “Bravemind.” This is a virtual reality exposure therapy (VRET) simulation developed by Dr. Skip Rizzo at USC’s Institute for Creative Technologies that is used to help those suffering from post-traumatic stress disorder (PTSD). It uses an immersive and realistic virtual environment, enabled by AMD graphics technologies, to recreate unique interactive scenarios to help soldiers normalize the experiences they went through during combat. Bravemind has helped Marine Corps veteran Chris Merkle cope with the stress from his deployment.

Read the full article on Huffington Post.

Image Credit: Huffington Post 

25 of VR’s Greatest Innovators

Polygon compiled a list of VR’s Top 25 Innovators and to no surprise, a few current and former ICT’ers made the list. Check it out!

We Can Measure Feelings

France 2 Replay in Paris featured ICT and our researchers in a video about measuring feelings through technology.

How VR and Surgery are a Match Made in Heaven

A piece by Alphr featuring ICT’s Skip Rizzo; click here for the full article.

Image Credit: Alphr

‘Thank You For Your Service’ Documentary Highlights Alarming Mental Health Crisis

One of the most alarming statistics in the new documentary “Thank You For Your Service” is that 150,000 veterans died by suicide after Vietnam, and we are currently on track to have the same number of losses in the U.S. following Iraq and Afghanistan. This powerful film is now showing in select theaters in Los Angeles, Washington, D.C. and New York, and highlights the failed mental health policies within the U.S. military and their deadly consequences (including an average of 22 veteran suicides per day in the U.S., now reported to be 20 per day).

Separate from the film, Huffington Post met Psychologist and Research Scientist Skip Rizzo, who is working on PTSD Exposure Therapy using VR solutions to help veterans heal. He received the American Psychological Association’s 2010 Award for Outstanding Contributions to the Treatment of Trauma, and is the associate director for medical virtual reality at the USC Institute for Creative Technologies.

Visit Huffington Post for the full article.

These Doctors are Incorporating VR into Pain Management

PC Magazine talks with Skip Rizzo about his Clinical VR work with soldiers suffering from PTSD.

VR and Post-Traumatic Stress with Bravemind and STRIVE

What if VR could revolutionize our approach to treating those who suffer post-traumatic stress after wartime service? Albert “Skip” Rizzo and Arno Hartholt spearheaded a project that addresses just that, and VRCircle.com recently published a piece about it.

Can Virtual Therapists Treat Veterans?

BBC Radio’s ‘Up All Night’ host Dotun Adebayo talks with Skip Rizzo about virtual enhancement therapy.

Click here to listen.

What if Virtual Reality Can Make Us Better Citizens?

The Atlantic’s CityLab investigates and ICT’s Skip Rizzo gives insight into Bravemind.

Visit City Lab for the full article.

Image Credit: REUTERS/Stefanie Loos

Robots are Developing Feelings. Will They Ever Become “People”?

AI systems are beginning to acquire emotions. But whether that means they deserve human-type rights is the subject of a thorny debate. Paul Rosenbloom is featured in this Fast Company discussion. Below is an excerpt and here is a link to the full article.

“Adding emotions isn’t just a fun experiment: It could make virtual and physical robots that communicate more naturally, replacing the awkwardness of pressing buttons and speaking in measured phrases with free-flowing dialog and subtle signals like facial expressions. Emotions can also make a computer more clever by producing that humanlike motivation to stick with solving a problem and find unconventional ways to approach it.

Rosenbloom is beginning to apply Sigma to the ICT’s Virtual Humans program, which creates interactive, AI-driven 3D avatars. A virtual tutor with emotion, for instance, could show genuine enthusiasm when a student does well and unhappiness if a student is slacking off. “If you have a virtual human that doesn’t exhibit emotions, it’s creepy. It’s called uncanny valley, and it won’t have the impact it’s supposed to have,” Rosenbloom says.

Robots can also stand in for humans in role playing. ICT, which is largely funded by the U.S. military, has developed a training tool for the Navy called INOTS (Immersive Naval Officer Training System). It uses a virtual human avatar in the form of a sailor, Gunner’s Mate Second Class (GM2) Jacob Cabrillo, in need of counseling. Junior officers speak with Cabrillo, who is based on 3D scans of a real person, in order to practice how they would counsel people under their command. About 12,000 sailors have trained in the program since it started in 2012. INOTS draws from a deep reserve of canned replies, but the troubled sailor already presents a pretty convincing facsimile of real emotion.”

Why Are Millennials Already Nostalgic for Music from 10 Years Ago?

ICT’s Todd Richmond tells LA Weekly that, “We have faster access to news, information … everything’s been accelerated. So I think there’s been a compression in our desire for nostalgia.”

For the full feature, click here.


Image Credit: LA Weekly 

What VR Will do for Psychologists: The Current State

Part II of Medium’s Beyond the Headset sit-down with Skip Rizzo. In this piece, Skip discusses where his research currently stands, why he’s putting so much effort into mobile VR (Gear VR), how VR will go hand in hand with current treatment methods, and which virtual reality treatment options are currently available to therapists.

Read the full article here.

Virtual Reality Relieves Physical Pain

According to a report by Goldman Sachs, applications of virtual reality in the health field will represent a $5.1 billion market by 2025.

In an earlier report by L’Atelier, Bravemind, the initiative of Professor Skip Rizzo, exploits the benefits of virtual reality in the treatment of post-traumatic stress disorder, an anxiety disorder common among war veterans. But virtual reality can also affect our physical health; click here for more.

Image Credit: L’Atelier 

Meet Ellie: The Robot Therapist Treating Soldiers with PTSD

“The behavioural indicators that Ellie identifies will be summarised to the doctor, who will integrate it as part of the treatment or therapy. Our vision is that Ellie will be a decision support tool that will help human doctors and clinicians during treatment and therapy.”

For more, visit News.com.au.

Culture Saturday

Culture Saturday out of Denmark talks with Skip Rizzo about clinical virtual reality and how ‘illustrated universes’ can be used to help both veterans and people with autism.

Click here to listen.

How VR Could Change the Way We Treat Depression

IGN discusses how virtual reality helps in mental health treatment and cites Bravemind in the full article.

Visit IGN to read the entire story.

Can Virtual Reality Unlock the Cure to PTSD?

“If there’s one thing we can agree on in this country, it’s that we want to take care of veterans when they return from combat. We want to support them in living productive lives, in which the traumas of war no longer haunt them. In order to do so, we need to innovate, to be forward-thinking, and to combine science and proven therapeutic methods to build new approaches to an old problem: Post Traumatic Stress.

That’s what the team at USC’s Institute for Creative Technologies is doing. Albert “Skip” Rizzo and his team are using virtual reality — a technology in the midst of booming growth — to help combat veterans fully process and recover from PTSD. In Rizzo’s “Bravemind” program, patients revisit painful memories in a VR setting, under the care of a trained therapist. This sense memory allows them to access the memory clearly and, in doing so, to fully process it. It’s a revolutionary type of exposure therapy that has so far netted promising results.”

See the full feature on Uproxx here.


Image Credit: Uproxx

Motivational Interviewing Novice Demonstration (MIND)

Download a PDF overview.

The Motivational Interviewing Novice Demonstration (MIND) provides future therapists the opportunity to advance their skills in treating service members, veterans, or military-impacted family members through practice with a simulated patient.

Developed in collaboration with Veterans Affairs (VA) Puget Sound, the University of Washington Department of Psychology, and the University of Southern California Institute for Creative Technologies (USC ICT), MIND features research technologies such as virtual humans and intelligent tutoring to create a challenging yet engaging learning experience in which users can hone their proficiency in Motivational Interviewing (MI).

MIND replicates a therapist-client interaction with a simulated veteran using a multiple-choice-style progression through a therapy session. During the MIND experience, users interact with a virtual patient named Mike Baker. Mike is a National Guard veteran who recently returned from deployment. In the first scenario, Mike discusses the problems he’s having at home, but he is not convinced that talking to a therapist is right for him. The second scenario is a follow-up appointment with Mike, which is set a couple of months after the first. He admits he is still having problems at home, which may be the result of substance abuse.

During both scenarios, the encounter proceeds with a branching storyline as providers respond to a selection of multiple-choice clinical responses. The virtual human patient speaks audibly to the provider and his tone, demeanor and nonverbal behavior indicate how well, or poorly, the interaction is going based on the provider’s utilization of MI skills. The MIND software tracks the practice scenarios to generate a summarized and detailed After Action Review (AAR) based on each provider’s performance. At the conclusion of each practice scenario, the therapist is taken to the summarized review of their performance, and will then proceed to a detailed AAR that shows performance at each decision point, facilitates review of the other response options offered, and provides corresponding feedback on why a given response was ideal, mixed, or suboptimal in each instance.
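
The branching structure described above (a virtual patient line, a set of multiple-choice provider responses rated ideal, mixed, or suboptimal, and a decision log that feeds the After Action Review) can be pictured as a small decision graph. The Python sketch below is a minimal, hypothetical illustration of that general pattern; the node contents, ratings, scoring, and function names are invented for this example and are not taken from the MIND software.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional

# Hypothetical sketch of a branching, multiple-choice training scenario with a
# simple After Action Review (AAR). All content and scoring below is invented.

@dataclass
class Choice:
    text: str                  # what the provider says
    rating: str                # "ideal", "mixed", or "suboptimal"
    feedback: str              # why the response earns that rating
    next_node: Optional[str]   # id of the next node, or None to end the scenario

@dataclass
class Node:
    prompt: str                # the virtual patient's line
    choices: List[Choice]      # provider response options

SCENARIO: Dict[str, Node] = {
    "start": Node(
        prompt="I don't really see how talking about this helps.",
        choices=[
            Choice("What makes you say that?", "ideal",
                   "An open question that invites the client to elaborate.", "home"),
            Choice("You have to talk about it to get better.", "suboptimal",
                   "Confrontational; likely to increase resistance.", "home"),
        ],
    ),
    "home": Node(
        prompt="Things at home are rough, but I can handle it myself.",
        choices=[
            Choice("It sounds like handling things on your own matters to you.", "ideal",
                   "A reflection that acknowledges the client's autonomy.", None),
            Choice("Maybe, but let's talk about your drinking.", "mixed",
                   "Raises a real concern but ignores what the client just said.", None),
        ],
    ),
}

def run_scenario(picks: List[int]) -> List[dict]:
    """Walk the branching scenario with pre-selected choices, logging each decision."""
    log, node_id = [], "start"
    for pick in picks:
        node = SCENARIO[node_id]
        choice = node.choices[pick]
        log.append({"prompt": node.prompt, "response": choice.text,
                    "rating": choice.rating, "feedback": choice.feedback})
        if choice.next_node is None:
            break
        node_id = choice.next_node
    return log

def after_action_review(log: List[dict]) -> None:
    """Print a summary, then a decision-by-decision review with feedback."""
    ideal = sum(1 for d in log if d["rating"] == "ideal")
    print(f"Summary: {ideal} of {len(log)} responses were ideal.\n")
    for i, d in enumerate(log, 1):
        print(f"Decision {i}: patient said: {d['prompt']}")
        print(f"  You chose: {d['response']} [{d['rating']}]")
        print(f"  Feedback: {d['feedback']}\n")

if __name__ == "__main__":
    after_action_review(run_scenario(picks=[0, 1]))
```

In a full trainer, a decision log like this would also drive the detailed, decision-by-decision AAR screens described above, including the alternative responses the provider did not choose.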

MIND leverages a software platform previously developed for the Department of Defense under the direction of the Army Research Lab Simulation and Training Technology Center (ARL STTC).


What VR Will do for Psychologists: Part I

The history of medicinal VR with Skip Rizzo, from Medium’s Beyond the Headset.

10th Annual USC Global Body Computing Conference

International Conference on Intelligent Virtual Agents (IVA) 2016

IVA 2016 is the 16th meeting of an interdisciplinary annual conference and the premier scientific forum for presenting research on modeling, developing and evaluating intelligent virtual agents (IVAs), with a focus on communicative abilities and social behavior. IVAs are intelligent digital interactive characters that can communicate with humans and other agents using natural human modalities like facial expressions, speech, gestures and movement. They are capable of real-time perception, cognition, emotion and action, which allows them to participate in dynamic social environments. In addition to presentations on theoretical issues, the conference encourages the showcasing of working applications.

Who Should Attend?
Students, academics and industry professionals with an interest in learning about and presenting the most cutting-edge research being conducted today in the multi-disciplinary field of intelligent virtual agents should consider attending. Advances in IVA research are enabling increasingly autonomous agents that are now being utilized across a growing number of fields, including counseling, entertainment, medicine, the military, psychology and teaching. Researchers from the fields of human-human and human-robot interaction are also encouraged to share work relevant to intelligent virtual agents.

2nd ICT Workshop on the Use of Virtual Reality to address the Hidden Wounds of War

This workshop is designed to provide an overview of some exciting VR and related technology directions that have been pursued to address the healthcare needs of those who have experienced combat-related trauma. Co-located with and following the International Conference on Disability, Virtual Reality and Associated Technologies (September 19-22, http://www.icdvrat.org/), the VR and Wounds of War event is free to attend and provides a forum for speakers to present their work developing, implementing and evaluating clinical VR applications that address the behavioral healthcare needs of Service Members and Veterans.

11th International Conference on Disability, Virtual Reality and Associated Technologies (ICDVRAT)

Conference co-chaired by Skip Rizzo

How Virtual Reality Is Revolutionizing Clinical Therapy and Treatment Rehabilitation

“For many, virtual reality as a medium and a concept had failed. For clinical application, though, it was enough to get the idea of VR-inspired treatment rehabilitation off the ground. “It was sufficient, though costly, difficult to create and not easily modifiable and so what ended up happening was maybe a hundred or so dedicated clinicians hung in there for the last 20 years and gradually the technology got better,” explains psychologist Albert “Skip” Rizzo, the director of medical virtual reality at the University of Southern California’s Institute for Creative Technologies.”

For more, visit the VICE site here.

What Will Augmented and Virtual Reality Technology Do for Healthcare

From treating service members to training clinicians, augmented and virtual reality could soon reshape patient therapy, disease research, medical education and more.

ICT’s Skip Rizzo talks with Healthcare IT News, see the full story here.

Reducing the Psychological Pain of PTSD with Virtual Reality Exposure Therapy

Stroll in Mercury’s Sub-Conscious

Zeit Online talks with Todd Richmond about VR and the upcoming changes in music. Full article can be found here.

Young Researchers Roundtable on Spoken Dialogue Systems

YRRSDS 2016 is an open forum for spoken dialog researchers to discuss their work and research interests. This is the 12th edition of the annual roundtable, which provides a networking platform for young researchers in the field. It serves as a playground for stimulating new ideas, sharing tools, and discussing current issues in spoken dialog systems research.

How Technology Can Turn VR into Virtual Therapy

Most people associate virtual reality with gaming, but what if its immersive worlds could be used as therapy? What if virtual reality could help veterans with PTSD? At the University of Southern California Institute for Creative Technologies, clinical psychologist Dr. Albert “Skip” Rizzo and his medical V.R. research team are working on just that. With the help of state-of-the-art processors and lightning-fast computers, they’re giving veterans a chance to confront their past and heal.


For the full story, click here.

Program Helps Veterans Interview for Civilian Jobs

CBS News St. Louis speaks with Skip Rizzo about using VR to help veterans interview for jobs. “Interviews are challenging for anybody. With veterans, there are additional challenges particularly if they don’t know how to represent what they learned in the military for the civilian job,” said Skip Rizzo, University of Southern California Institute for Creative Technologies.

The segment also ran on CBS News – Greensboro.


Image Credit: KMOV

Conference on Special Interest Group On Discourse and Dialogue (SigDial) 2016

The SIGdial venue provides a regular forum for the presentation of cutting-edge research in discourse and dialogue to both academic and industry researchers. Continuing a series of sixteen successful previous meetings, this conference spans the research interest area of discourse and dialogue. The conference is sponsored by the SIGdial organization, which serves as the Special Interest Group on discourse and dialogue for both ACL and ISCA.

Your Brain on VR

In this “Future First” clip presented by Xprize and Popular Science, ICT’s Skip Rizzo discusses how virtual reality is paving the way for new technological therapeutics and how VR exposure therapy will help treat people with anxiety disorders and PTSD. For the full story, click here.

Talk on ICT Game/Simulation-based Technologies

Ryan McAlinden

Expertise

  • Live, Virtual, Constructive Simulations
  • UAS/Drones
  • Game-based technologies
  • Geospatial Data and Technologies
  • Cloud services
  • E-learning

Stefan Scherer

Expertise

  • machine learning
  • social signal processing
  • affective computing
  • human-computer interaction
  • virtual social skill training

Foreign Languages

  • German

More

For more information, click here.

Virtual Reality Treats US Soldiers

L’Atelier talks with Skip Rizzo and takes a look at ICT’s technologies behind Bravemind. The full story can be found here.

Ben Nye

Expertise

  • AI in Educational Technology
  • Intelligent Tutoring Systems
  • Artificial Intelligence
  • Educational Data Mining
  • Systems Architectures
  • Agent-Based Architectures
  • Learning Science
  • Education for Development
  • Socio-Cognitive Agents
  • Cognitive Modeling
  • Dialog Systems

Additional Information

  • Membership Chair for IAIED (International Society for AI in Education; 2013-Present)
  • Track Chair for Intelligent Learning Technologies at FLAIRS (Florida AI Research Society; 2015-Present)
  • Lifetime Member of AAAI (Association for the Advancement of Artificial Intelligence; 2012-Present)
  • Member of EDM (Educational Data Mining Society; 2014-Present)
  • Member of ARL GIFT Advisory Board (2013-Present)

4 Ways Olympic Viewing Will be Vastly Improved

ICT’s Todd Richmond talks the future of viewing experiences with Inverse just in time for the 2016 Olympics. For the full article, click here.



Visual Expert to Lead USC Institute’s Graphics Lab

Computer scientist Hao Li aims to modernize the way researchers capture, process and analyze data

By Amy Blumenthal 

Hao Li, an assistant professor of computer science at the USC Viterbi School of Engineering and holder of an Early Career Chair, will take the helm at the USC Institute for Creative Technologies Graphics Laboratory.

The lab developed the light stage that records 3-D appearances of human faces, bodies and objects using computer-controlled illuminations. It has collaborated with entertainment companies such as Sony Pictures, Digital Domain and Weta Digital on films such as Avatar, The Curious Case of Benjamin Button, The Hobbit, The Avengers and Gravity, as well as new glasses-free 3-D display technologies in collaboration with USC’s Shoah Foundation—The Institute for Visual History and Education to preserve the testimony of Holocaust survivors.

Before joining USC in 2013, Li was a research lead at Industrial Light & Magic/Lucasfilm, where he developed the next generation of real-time facial performance capture technologies for virtual production and visual effects. He also led the development of the high-end facial animation pipeline at Weta Digital, which was used to “reenact” the late actor Paul Walker in Furious 7.

Named one of the 35 innovators under 35 by MIT Technology Review in 2013, Li plans to blend the expertise of ICT’s Graphics Lab with his existing research in the Department of Computer Science at USC Viterbi, which focuses on more deployable tech-capture solutions such as building a 3-D avatar from a single image.

Digital Doings

Li’s goals include the expansion of ICT’s leading role in human digitization and exploring new augmented reality/virtual reality platforms.

While continuing his teaching duties at USC Viterbi, Li hopes to build databases of 3-D human shapes with the highest possible fidelity.

“We want to focus on real-time and data-driven algorithms that can perceive the 3-D world like humans do and even make sense of it, given limited visual input,” said Li, an expert in the field of 3-D digitization and performance capture.

“The most important thing,” Li said, “is to modernize the way we capture, process and analyze data. We want to be able to generate high-fidelity virtual humans instantly without any manual labor, develop deployable and low-cost capture systems that can be accessible by anyone, and enable real-time capabilities wherever possible. Immersive communication, AI [artificial intelligence] and virtual reality … these are the major applications that are pushing our technology.”

USC Viterbi Dean Yannis C. Yortsos said: “Hao Li is a true creative force in vision and graphics worldwide. His new leadership role at ICT will help further cement its pioneering role in shaping the frontiers of virtual reality and will continue to grow the impact of USC computer science in the vibrant Silicon Beach ecosystem.”

Virtual Reality is Making Marketing and Training More Effective for Businesses

The term “virtual reality” (VR) might evoke memories of the 1982 cult classic “Tron,” but VR has evolved to the point where yesterday’s fiction is quickly becoming today’s reality. Now, this technology is poised to fundamentally change the way business people interact with, test-drive and market their products.

But the potential uses for VR don’t stop there; experts agree that in the future, VR will offer even more powerful capabilities to businesses and individuals alike.


ICT’s Todd Richmond talks with Business News Daily; see the full story here.

Image Credit: Barone Firenze/Shutterstock

Tested Tours VR Projects at USC’s Mixed Reality Lab

At USC’s Institute for Creative Technologies, computer scientists and engineers have been tinkering with virtual reality, augmented reality, and everything in between. Tested was given a tour of ICT’s Mixed Reality Lab, where projects explore the intersections of VR and accessibility, avatars, and even aerial drones.

Check out the full video here.

Presentation: ELITE SHARP POST

USC-West Point partnership provides invaluable research, training experiences

A group of 13 cadets is spending time at the Institute for Creative Technologies, shaping technology they may someday use in the classroom or the field.

By Orli Belman

On a recent morning at the USC Institute for Creative Technologies (ICT), researchers gathered to hear a talk titled “Understanding Language through Machine Learning.” The topic is typical for this cross-disciplinary research institute; the speaker, 21-year-old Jareth Long-Garrett, is not. He’s a third-year cadet at the U.S. Military Academy who is getting immersed in basic research as part of a unique partnership between West Point and USC.

Long-Garrett is one of 13 West Point cadets spending time this summer at ICT as part of the academy’s Academic Individual Advanced Development (AIAD) program. Since 2005, more than 100 cadets – including two Rhodes Scholars – have spent three weeks at USC’s Department of Defense-sponsored University Affiliated Research Center, where they take an active role in conducting academic research and in shaping the virtual reality and game-based training technologies that may later help them in the classroom or the field.

“It is always a pleasure to have the West Point cadets here each summer,” said Randall W. Hill Jr., ICT’s executive director and himself a West Point graduate. “As students, they bring bright ideas to helping solve research problems. As our nation’s future Army leaders they provide insights into how we can develop technology tools to help them succeed.”

Cadets also get a glimpse into a work setting where flip-flops are more common than combat boots. In addition to what he learned in computer coding, Long-Garrett says ICT’s diverse, creative environment provided valuable leadership lessons about the power of good communication and collaboration.

“It was cool to be able to talk to different members and get different pieces of the puzzle for the project I was working on,” he said. “I’ve only been through a few classes for computer science but everyone treated me as a member of the team.”

Working together

Participation in the AIAD program is just one example of how ICT and West Point are working together to provide advanced tools and technologies that are helping to educate and train future leaders.

“We can speak the Army for ICT researchers and they can speak the science for what we are trying to do,” said Maj. Charlie Rowan, an instructor in West Point’s engineering psychology department. “That reciprocity is all benefiting the Army, which makes this partnership a great success story.”

ICT has set up a lab at West Point and has a staff member embedded in its Department of Behavioral Sciences to further support the partnership. ICT and the Academy’s engineering psychology program have also collaborated on a series of seminars that bring in ICT researchers and developers to teach cadets how to use ICT’s Virtual Human Toolkit, Roundtable and Rapid Avatar systems, which teams of cadets have then used for their senior capstone projects.

“We are trying to put our cutting-edge technologies in the hands of bright, young future leaders while they’re still formulating ideas about the art of the possible, with the hope they will draw on these experiences when charged with training their own units one day,” said Rich DiNinni, ICT’s liaison to West Point.

Prototypes in service

ICT was established in 1999 with a contract from the U.S. Army to advance the state of the art in simulation and training. Since that time, many ICT-developed prototypes have become part of the Army’s training curriculum. These include virtual human role players for reinforcing leadership communication skills and simulated scenarios for practicing decision-making under stress. Beyond the military, results of ICT’s Army Research Lab-funded research can be seen in the commercial low-cost virtual reality headsets currently on the market, in the believable digital characters featured in major motion pictures like Avatar and in virtual characters that are helping autistic young adults prepare for job interviews or assisting doctors-in-training with improving their clinical interview skills.

Summer in Southern California is not all work. The AIAD rotation includes field trips to entertainment industry companies, lunches at Los Angeles’ famed food trucks and also marked a different kind of immersion — Cadet Long-Garrett’s first visit to the Pacific Ocean. Just as he’s done at ICT, he jumped in feet first, no combat boots or flip-flops required.

4 Ways to Invest in Games

Pokemon Go, launched July 6, is worth about $3.65 billion. It was created by the Alphabet spinoff Niantic and is on track to make $740 million in revenue this year, according to Quartz. ICT’s Todd Richmond jumped into the conversation of how investors can engage with gaming for U.S. News and World Report. Below is a snippet; here’s the full story.

In connection, “Apple is poised to make billions of dollars from Pokemon as they get a cut of things from the app store,” says Todd Richmond, IEEE Member and director of prototype development at the University of Southern California’s Institute for Creative Technologies.

Capture of Aerial Imagery and the 3D Reconstruction Process

Capture the Two Harbors area to improve the photogrammetric pipeline.

Could VR Unlock the Cure for Vietnam Veterans’ Decades-old PTSD?

A deep dive into Bravemind with ICT’s Skip Rizzo and Digital Trends.

Revisiting Detection Thresholds for Redirected Walking: Combining Translation and Curvature Gains

Army Releases Disaster Response Video Game

National Defense Magazine featured the release of DisasterSim, a new Army video game that takes soldiers into the heart of foreign disaster zones and delivers real-world training from their laptops or tablets.

Disaster Sim was created by the Army Research Laboratory and programmers from the Institute for Creative Technologies at the University of Southern California as a cost-effective training tool for company grade officers and mid- to junior-level non-commissioned officers engaged in foreign disaster relief, Maj. Timothy Migliore, chief of the Army’s games for training program, said in the article.

Treating PTSD With Virtual Reality Therapy: A Way to Heal Trauma

ABC News highlighted work by Skip Rizzo of the USC Institute for Creative Technologies to treat PTSD in veterans. “My mission is to drag psychology kicking and screaming into the 21st century,” Rizzo said. He created 14 virtual worlds, with customization for each patient, to help them work through their trauma.


Sigma Spoken Language Understanding System

This thesis proposes the integration of incremental speech processing with language understanding and cognition. The speech signal obtained from a typical speech front end is combined with linguistic knowledge in the form of phonetic, syntactic and semantic knowledge sources, with cognition selecting the most likely word incrementally. This non-modular, supra-architectural integration of spoken language has never been attempted in cognitive architectures, making this work novel.
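
The abstract describes fusing the incoming speech signal with phonetic, syntactic and semantic knowledge sources while cognition selects the most likely word incrementally. As a loose, hypothetical illustration of incremental selection in general (not of Sigma's actual graphical-architecture implementation), the Python sketch below combines a toy language-model prior with per-frame acoustic scores and re-ranks the candidate words after every frame; all names and numbers are invented.

```python
import math
from typing import Dict, List

# Hypothetical sketch of incremental word selection: per-frame acoustic
# evidence is fused with a language-model prior, and the best word
# hypothesis is re-ranked as each new frame arrives. All scores are invented.

# Prior probability of each candidate word from a (toy) language model.
LM_PRIOR: Dict[str, float] = {"ship": 0.5, "sheep": 0.3, "chip": 0.2}

# Per-frame acoustic likelihoods P(frame | word), e.g. from a phonetic model.
FRAMES: List[Dict[str, float]] = [
    {"ship": 0.6, "sheep": 0.3, "chip": 0.4},   # frame 1
    {"ship": 0.5, "sheep": 0.6, "chip": 0.2},   # frame 2
    {"ship": 0.7, "sheep": 0.4, "chip": 0.1},   # frame 3
]

def incremental_decode(frames: List[Dict[str, float]]) -> None:
    """Maintain log-posteriors over candidate words and report the best
    hypothesis after every frame, rather than waiting for the utterance to end."""
    log_post = {w: math.log(p) for w, p in LM_PRIOR.items()}
    for t, frame in enumerate(frames, 1):
        for word in log_post:
            log_post[word] += math.log(frame[word])   # fuse new acoustic evidence
        best = max(log_post, key=log_post.get)
        print(f"after frame {t}: best hypothesis = {best!r} "
              f"(log posterior {log_post[best]:.2f})")

if __name__ == "__main__":
    incremental_decode(FRAMES)
```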

The Sigma Architecture

How Pokémon Go could change the course of technology

MarketWatch quotes Todd Richmond in an article about Pokemon Go and how its popularity may lead to a rise in augmented reality applications.

According to the story, Pokémon Go provides a needed example of how the digital and physical worlds might be able to coexist through mobile devices. While virtual reality requires a complete retreat from the surrounding physical world, augmented reality lives within it, potentially making it more attractive to a mainstream audience.

“Augmented reality is the bigger play because humans still get to touch, and still have a better connection with, their immediate real-time physical world,” said Todd Richmond, an IEEE member and director of advanced prototype development at University of Southern California’s Institute for Creative Technologies. “It is an easier transaction to process.”

“We’re in a period of novelty sells right now,” said Richmond. “Augmented reality will impact all verticals in all aspects of life—but right now virtual reality is the shiny object.”

Image Credit: CryZENx

MedCity News also covered this story.

What the Pokemon GO Mania Says About Modern Society

Forbes caught up with Todd Richmond to discuss the new phenomenon:

Todd Richmond, the director of advanced prototype development at University of Southern California’s Institute for Creative Technologies told MarketWatch’s Jennifer Booton “augmented reality will impact all verticals in all aspects of life—but right now virtual reality is the shiny object.” I beg to differ — augmented reality may be underdeveloped as an industry but AR is the real shiny new object here. Expect to see a whole slew of augmented reality games by the end of this year, and I’m even willing to bet one of those first new augmented reality games will show up on Facebook sans the glasses.

Read the full article here.

‘Pokemon Go’ digital popularity is also warping real life

The New York Times ran a widely carried Associated Press story on the popularity of ‘Pokemon Go’ that quotes ICT’s Todd Richmond, who says a big debate is brewing over who controls digital assets associated with real-world property.

“This is the problem with technology adoption — we don’t have time to slowly dip our toe in the water,” he says. “Tenants have had no say, no input, and now they’re part of it.”

ABC News, The Seattle Times and many other outlets ran this story.

Predictive models of malicious behavior in human negotiations

SoundStage’s virtual music studio may be the trippiest VR app yet

Yahoo News featured a story about ICT’s MxR Lab alum Logan Olson and SoundStage, his new VR music tool for the HTC Vive headset. The story notes that SoundStage is Olson’s first independent project after having previously built room-scale VR at the Institute for Creative Technologies, as well as for theme parks.

Image Credit: SoundStageVR.com


5 ways virtual reality is disrupting healthcare

A HealthCare Dive story cites ICT’s work using virtual reality for treating anxieties and for helping patients better understand medical conditions as two examples of how VR is disrupting healthcare. The first example is ICT’s Bravemind system for treating post-traumatic stress and the second is our work developing the Virtual Care Clinic with the USC Center for Body Computing.

How VR Will Change Sports. And How It Won’t

Skip Rizzo was quoted in a Backchannel article about how VR may change the experience of viewing sports.

“I don’t go to a Dodger game to sit by myself,” Albert “Skip” Rizzo, Director for Medical Virtual Reality at the University of Southern California Institute for Creative Technologies, told me. “And when you’re watching sports, you’re not spending that much time turning around. VR may create some kind of sense of presence, but it can also be underwhelming.”

Image credit: Cam Floyd via Backchannel.com

This LA Nonprofit Uses Virtual Humans to Prep Vets for Job Interviews

PC Magazine featured ICT’s ViTA for Veterans project, led by Skip Rizzo and Matthew Trimmer, to develop a job interview training program using virtual humans. Originally created to help people with autism practice for interviews, the six characters Rizzo and colleagues built employ various interviewing techniques and are now also being used by the United States Veterans Initiative in Los Angeles. According to Rizzo, ICT is in the process of making the virtual humans more responsive and conversational.

Time-Offset Conversations on a Life-Sized Automultiscopic Projector Array

Keynote Talk- Rapidly built Character Animation and Simulation

The 3D character will continue to be an integral part of 3D games, movies, and other simulated environments. Recently, the tools and technologies to produce characters have improved and become more affordable. In this talk, I describe my experiences in attempting to rapidly build 3D characters using sensing technology, behavior and modeling algorithms.

When Hollywood does AI, it’s fun but farfetched

A CNET article about AI’s portrayal in Hollywood quotes ICT’s David Pynadath on 2001: A Space Odyssey.

“HAL 9000 was not the first example of evil AI on film, but he marked a watershed moment. Suddenly, homicidal computers were within our grasp,” he said.

Historic Holograms Amplify Holocaust Survivors’ Testimonies

VICE Creator’s Project features New Dimensions in Testimony, produced by the USC Institute for Creative Technologies and the USC Shoah Foundation. The project combines language processing software, voice recognition technology and visualization to let people interact with a Holocaust survivor in real time, and it aims to make 3D interactive exhibits publicly available so that visitors can continue to ask questions of survivors for generations to come.

The advent of virtual humans

A CNET article about the growing application of virtual humans quotes Jon Gratch, ICT’s director of virtual humans research.

“In general, AI is moving into more artificial social intelligence,” he says. Gratch defines that as the ability “to understand people, how they think, how to communicate with them, what their emotional state is.”

Virtual Reality, Real Change

An article in Medical Daily quotes Jon Gratch and cites his research on virtual reality’s role in emotion and disclosure.

The story states that a future in which machines augment our emotions much like they boost our physical abilities is looking increasingly likely. Already in Gratch’s lab, participants have shown a greater willingness to disclose their mental health problems to virtual humans than to real-life therapists. The reduced sense of judgment they felt from the computer pushed them to open up and display their sadness more freely.

The virtual Holocaust survivor: how history gained new dimensions

The Guardian featured New Dimensions in Testimony and quotes ICT’s David Traum about the project’s use of natural language processing software to help create an interactive version of video footage, with vocal cues triggering responses from a pool of recorded speech.

“We have to distinguish two concepts of response. Everything that is said comes directly from Pinchas. However, the decision of which recording to present as a response to a specific question is made by the technology, based on its training data of how Pinchas and other people have answered questions,” says Traum.  “In the best cases, the answers are exactly how Pinchas himself would have responded. In some cases, it may seem that he is addressing a somewhat different question, but the response itself is still his.”

How VR Is Helping Soldiers With PTSD

A PC Magazine article features ICT’s Bravemind virtual reality exposure therapy system for treating post-traumatic stress and SimCoach, a virtual human that can interact with users over the web. Rizzo notes that people can reveal more to virtual humans than to real ones.

“That’s the big finding from this work: people will tell the virtual human stuff they don’t feel able to tell someone who is judging them. There’s less worry about impression management, more revelation of sad events, more self-disclosure – it works,” he said.

How Virtual Reality is Gaining Traction in Healthcare

An article in Healthline cites ICT’s work as examples of virtual reality’s growing role in healthcare. The story notes ICT’s Bravemind system for treating post-traumatic stress and our work developing the Virtual Care Clinic with the USC Center for Body Computing.

Sheffield ’16: New dimensions in storytelling

ReelScreen featured New Dimensions in Testimony, ICT’s collaboration with the USC Shoah Foundation to preserve the ability to ask questions of Holocaust survivors. The story notes that the technology is creating a potential new platform for filmmakers to store and present archival material.

Interactive 3D videos help Holocaust survivors tell stories

A BBC Click segment featured ICT’s 3D projection technology and how it is being used to help create a sense of presence in the New Dimensions in Testimony project, a collaboration with the USC Shoah Foundation.

“You can read about the Holocaust in a book or see it on a movie or on TV, but until you actually interact with someone who has actually lived this experience in person, it doesn’t have the same sort of … reality that these things actually happened and how horrendous it was,” said ICT research scientist Andrew Jones.

The story included other applications of ICT’s Light Stage technologies.

Virtual Humans on Mobile Devices

Complex virtual humans are typically found in museums, specialized training installations, and similar settings. In many cases, a virtual human resides in a particular place, and is often only available under formal or brief encounters. Mobile platforms such as smartphones and tablets are a pervasive technology, and are now capable of running the software and hardware components that are necessary for a convincing, interactive Virtual Human. Potentially, a virtual human running on a mobile platform changes three significant relationships with real humans:
(1) a mobile virtual human can be accessible to a mobile user at any time,
(2) a mobile virtual human can be accessible to a mobile user at any location, and
(3) the group of real users who could potentially interact with a mobile virtual human is broadened to all people with a smartphone.
Thus, rich, long-term interactions with a broad range of people would now be possible on mobile devices, across a broad set of domains.

We study the effects of using virtual humans on mobile devices and develop architectures for rapidly building mobile virtual humans that include speech, nonverbal behavior, and interaction.

We present studies of how the presentation of a virtual human on a mobile device can affect the user interaction. For example, does an embodied virtual human elicit better copresence than an audio-only interface? Does using a virtual character with video differ from using a virtual character presented using 3D technologies?

Will mobile VR experiences fizzle or take off in China?

A  China Daily story quoted Todd Richmond about mobile VR.

Richmond, director of advanced prototypes at the University of Southern California Institute for Creative Technologies, expressed his optimism about mobile VR’s future. He said in an interview with the Institute of Electrical and Electronics Engineers this March: “VR/augmented reality is likely to develop into the next smartphone — it will replace a smartphone or incorporate its features, but it will take 10 years.”

How VR is Helping the Victims of Terrorism

An article on Alpr.com features Skip Rizzo’s work using VR to treat post-traumatic stress, including recent efforts to address the needs of terror attack victims.

So far, Dr Rizzo’s work has focused on soldiers, but it’s set to shift to civilians, including victims of sexual assault, car accidents, extreme weather – such as having your house blown away by a tornado, he says – or terror attacks. VR was used successfully to help treat PTSD in victims of the New York World Trade Center attacks in 2001, and now Dr Rizzo is working with a consortium of European collaborators to build a virtual Paris scenario, states the story.

“Admittedly, it’s hard medicine for a hard problem. But when people say: ‘Why do you do this…’ I say it’s because we know it works. The science and research – it’s not 100%, but it’s one of the best treatments in clinical trials.”

We’re on the Cusp of Changing How we Treat one of America’s Most Ignored Health Problems

A Business Insider story on new ways to treat mental illness featured ICT’s SimSensei prototype and an interview with Jon Gratch, ICT’s director of virtual human research.

Gratch discussed the making of SimSensei and his research findings on why it is easier to be honest online.

One reason, he suggests in the story, is because we feel more anonymous there (whether we actually are is another matter).

“People are more honest on web forms,” Gratch says. “They just feel safer disclosing things that way.”

To Gratch, the future of therapy lies between these two important things: anonymity and rapport. And he thinks a “virtual person” might be part of the solution.

The article describes SimSensei’s AI-driven rapport capabilities, stating that Sensei can differentiate between when someone is asking a question and when they are making a statement. And, based on certain words she’s been trained to pick up, Sensei can appropriately respond with an expression that either conveys understanding — like an “Uhuh” or a nod — or a sense of empathy, like an “Oh I’m sorry.” To ask a question, Sensei leans in.
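
The behavior described above can be illustrated with a toy rule-based sketch; the keyword lists, responses, and question test below are invented stand-ins, not SimSensei’s actual models.

```python
# Toy sketch of the listening behavior described above: decide whether the
# user turn is a question or a statement, and pick a backchannel that signals
# either understanding or empathy. Keyword lists are hypothetical stand-ins.

QUESTION_WORDS = ("who", "what", "when", "where", "why", "how", "do", "did", "are", "is")
SADNESS_WORDS = {"sad", "lost", "alone", "depressed", "worried", "afraid"}

def react(utterance: str) -> str:
    tokens = utterance.lower().rstrip("?.!").split()
    if utterance.strip().endswith("?") or (tokens and tokens[0] in QUESTION_WORDS):
        return "[lean in]"                               # treat it as a question
    if SADNESS_WORDS & set(tokens):
        return "Oh, I'm sorry. [concerned expression]"   # empathy
    return "Uh-huh. [nod]"                               # simple acknowledgement

print(react("How long have you felt this way?"))
print(react("I have been feeling pretty sad lately."))
print(react("I went back to work last week."))
```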

Invited Talk: Cognition for Human Performance and Autonomous Systems Workshop

Towards a multi-dimensional taxonomy of stories in dialogue

The literature contains a wealth of theoretical and empirical analyses of discourse marker functions in human communication. Some of these studies address the phenomenon that discourse markers are often multifunctional in a given context, but do not study this in systematic and formal ways. In this paper we show that the use of multiple dimensions in distinguishing and annotating semantic units supports a more accurate analysis of the meaning of discourse markers. We present an empirically-based analysis of the semantic functions of discourse markers in dialogue. We demonstrate that the multiple functions, which a discourse marker may have, are automatically recognizable from utterance surface-features using machine-learning techniques.
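
As a rough illustration of the kind of approach described (not the paper’s actual feature set or annotation scheme), the sketch below trains an off-the-shelf classifier to predict one functional dimension of a discourse marker from simple utterance surface features; the features, labels, and examples are invented.

```python
# Minimal sketch of recognizing discourse-marker functions from utterance
# surface features with an off-the-shelf classifier (one classifier per
# functional dimension). The features and labels are toy examples.

from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

def surface_features(tokens, i):
    """Simple surface features for the marker at position i."""
    return {
        "marker": tokens[i],
        "utterance_initial": i == 0,
        "prev": tokens[i - 1] if i > 0 else "<s>",
        "next": tokens[i + 1] if i + 1 < len(tokens) else "</s>",
    }

# (tokens, marker position, label on one dimension, e.g. feedback vs. turn-management)
data = [
    (["okay", "let's", "move", "on"], 0, "turn-management"),
    (["okay", "I", "see"], 0, "feedback"),
    (["well", "I", "am", "not", "sure"], 0, "turn-management"),
    (["right", "that", "makes", "sense"], 0, "feedback"),
]

vec = DictVectorizer()
X = vec.fit_transform(surface_features(t, i) for t, i, _ in data)
y = [label for _, _, label in data]
clf = LogisticRegression(max_iter=1000).fit(X, y)

test = ["okay", "so", "what", "next"]
print(clf.predict(vec.transform([surface_features(test, 0)])))
```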

Towards Automatic Identification of Effective Clues for Team Word-Guessing Games

Team word-guessing games where one player, the clue-giver, gives clues attempting to elicit a target-word from another player, the receiver, are a popular form of entertainment and also used for educational purposes. Creating an engaging computational agent capable of emulating a talented human clue-giver in a timed word-guessing game depends on the ability to provide effective clues (clues able to elicit a correct guess from a human receiver). There are many available web resources and databases that can be mined for the raw material for clues for target-words; however, a large number of those clues are unlikely to be able to elicit a correct guess from a human guesser. In this paper, we propose a method for automatically filtering a clue corpus for effective clues for an arbitrary target-word from a larger set of potential clues, using machine learning on a set of features of the clues, including point-wise mutual information between a clue’s constituent words and a clue’s target-word. The results of the experiments significantly improve the average clue quality over previous approaches, and bring quality rates in-line with measures of human clue quality derived from a corpus of human-human interactions. The paper also introduces the data used to develop this method; audio recordings of people making guesses after having heard the clues being spoken by a synthesized voice (Pincus and Traum, 2016).
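
One of the feature families named above, point-wise mutual information between a clue’s constituent words and its target word, can be sketched as follows; the toy corpus, counts, and scoring function are illustrative stand-ins, not the paper’s actual pipeline.

```python
# Minimal sketch of one clue feature: point-wise mutual information (PMI)
# between a clue's words and its target word, estimated from toy co-occurrence
# counts. A real system would compute counts from a large corpus and feed PMI
# (with other features) into a trained classifier.

import math
from collections import Counter

documents = [                          # toy "corpus" of word sets
    {"lion", "mane", "roar", "africa"},
    {"lion", "king", "pride", "cub"},
    {"piano", "keys", "music"},
    {"roar", "engine", "car"},
]

word_df = Counter(w for doc in documents for w in doc)
pair_df = Counter(frozenset((a, b)) for doc in documents
                  for a in doc for b in doc if a < b)
N = len(documents)

def pmi(a, b):
    joint = pair_df[frozenset((a, b))]
    if joint == 0 or word_df[a] == 0 or word_df[b] == 0:
        return float("-inf")
    return math.log((joint / N) / ((word_df[a] / N) * (word_df[b] / N)))

def clue_score(clue, target):
    """Average PMI between the clue's in-vocabulary words and the target word."""
    words = [w for w in clue.lower().split() if w in word_df and w != target]
    if not words:
        return float("-inf")
    return sum(pmi(w, target) for w in words) / len(words)

print(clue_score("it has a mane and a roar", "lion"))   # scores relatively well
print(clue_score("it has keys and music", "lion"))      # scores poorly
```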

Can Virtual Reality Make us Kinder?

A story on WHYY’s The Pulse focused on the use of virtual reality to change human attitudes, and increase the user’s compassion and empathy.

The story states that ICT’s Skip Rizzo has seen the empathetic benefits of these tools. He and his team designed a combat simulation to help soldiers battling post-traumatic stress disorder. But along the way, he realized the virtual reality experience could also benefit the soldier’s family.

How Tai Chi Can Help Conquer Fear of Falling as We Age

The Wall Street Journal quotes Skip Rizzo in a story about research into fall prevention, including a study of virtual reality exposure therapy to address a fear of falling.

“It shows some promise,” said Skip Rizzo, a psychologist and director of medical virtual reality at the University of Southern California Institute for Creative Technologies, who wasn’t involved in the research. The area is ripe for study, Dr. Rizzo said. “We recognize this is a significant problem that leads to disability and chronic progression into loss of functional independence.”

Rendering the Digital Self

CCTV-America featured research by Ari Shapiro and Andrew Feng to create user-like avatars. Shapiro and his colleagues’ Rapid Avatar Generator work reproduces a digital version of the user within minutes and at a fraction of the cost of existing technology.

“We want to capture that unique aspect of what makes you, you, and then embed it into an avatar,” says Shapiro in the video story.

 

How will virtual reality change our lives?

The BBC turned to ICT experts Mark Bolas and Skip Rizzo for a feature on the future of virtual reality. Listen to the interviews here. Read them here.

Mark Bolas:

“I’m worried by our current computer interfaces. I watch people walking around like zombies with cell phones in their hands, and I have to maneuver a mouse to fill out little boxes on web forms in a horribly frustrating way. I think VR will allow us to transcend this.

I don’t worry so much about where VR is going, I worry about where we currently are.”

Skip Rizzo:

“We know that the brain is quite good at suspending disbelief, so even though people know these aren’t real people, they relate to them as if they are.

This is why VR is so compelling, because whatever is learned in those worlds hopefully will benefit how the person translates their behavior in the real world.”

The Godfather of Virtual Reality Wants to Heal Your Wounds

A “Rising Star” story in online magazine Ozy profiles Skip Rizzo as a virtual reality pioneer who has been at the forefront of medical applications of the technology.

Last year, Rizzo received the Pioneer in Medicine Award from the Society for Brain Mapping and Therapeutics and the Brain Mapping Foundation. Michael Roy, director of military internal medicine at the Uniformed Services University, who met Rizzo 12 years ago, calls him “the godfather of virtual reality.” Another calls him “the king, a very special guy with a special energy” — that’s collaborator Barbara Rothbaum from Emory University, notes the story.

New video game trains response to foreign disasters

An article on the Army homepage features DisasterSim, the ICT-developed video game that trains soldiers in how to respond to foreign natural disasters. The article notes that the game can be downloaded for free from the Army’s Milgaming web site https://milgaming.army.mil/.

Developing the game involved: U.S. Army South; the Army Research Laboratory; the Army Games for Training Program, Program Executive Office for Simulation, Training and Instrumentation; the Institute for Creative Technologies at the University of Southern California, and the Office of Foreign Disaster Assistance (OFDA), states the story.

Fierce IT and TRADOC also covered the story.

Towards a Computational Model of Human Opinion Dynamics in Response to Real-World Events

Accurate multiagent social simulation requires a computational model of how people incorporate their observations of real-world events into their beliefs about the state of their world. Current methods for creating such agent-based models typically rely on manual input that can be both burdensome and subjective. In this investigation, we instead pursue automated methods that can translate available data into the desired computational models. For this purpose, we use a corpus of real-world events in combination with longitudinal public opinion polls on a variety of opinion issues. We perform two experiments using automated methods taken from the literature. In our first experiment, we train maximum entropy classifiers to model changes in opinion scores as a function of real-world events. We measure and analyze the accuracy of our learned classifiers by comparing the opinion scores they generate against the opinion scores occurring in a held-out subset of our corpus. In our second experiment, we learn Bayesian networks to capture the same function. We then compare the dependency structures induced by the two methods to identify the event features that have the most significant effect on changes in public opinion.
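
As a rough sketch of the first experiment (not the authors’ actual code or features), the snippet below trains a maximum-entropy classifier, implemented as logistic regression, to map toy event features to the direction of an opinion change and applies it to a held-out event.

```python
# Minimal sketch of the first experiment described above: a maximum-entropy
# classifier (logistic regression) that maps features of real-world events to
# the direction of change in an opinion score. Events and labels are toy data.

from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

# Each row: event features in the polling window -> did the opinion score rise?
events = [
    ({"type": "policy_announcement", "actor": "government", "violent": False}, "up"),
    ({"type": "protest", "actor": "opposition", "violent": True}, "down"),
    ({"type": "economic_report", "actor": "government", "violent": False}, "up"),
    ({"type": "protest", "actor": "opposition", "violent": False}, "down"),
]

vec = DictVectorizer()
X = vec.fit_transform(feats for feats, _ in events)
y = [label for _, label in events]

maxent = LogisticRegression(max_iter=1000).fit(X, y)

held_out = {"type": "protest", "actor": "government", "violent": True}
print(maxent.predict(vec.transform([held_out])))
```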

How Lifelike Can Your Avatar Get? – Rapid Avatar Capture in PrimeMind

An article in PrimeMind features the rapid avatar generation research from Ari Shapiro and colleagues. The article states that Shapiro and the other researchers at ICT are studying the emotional connections people have with their avatars. It also notes that the group is working with the United States Military Academy at West Point on an application that allows soldiers to see themselves in their virtual squad-training fields. Shapiro believes avatars could be used in myriad training situations, preparing even doctors and teachers to respond appropriately under given circumstances, says the story.

Shapiro says that there’s still work that has to be done in fleshing out these avatars. Now in the second phase of the project, they are working on color-correction issues and more accurate representations of people’s real-life behaviors.

“I don’t think I’m going to get to the finish line anytime soon,” Shapiro says. “People are infinitely complex. You can always go deeper and deeper.”

Near-Instant Capture of High-Resolution Facial Geometry and Reflectance

Modeling realistic human characters is frequently done using 3D recordings of the shape and appearance of real people across a set of different facial expressions [Pighin et al. 1998; Alexander et al. 2010] to build blendshape facial models. Believable characters which cross the “Uncanny Valley” require high-quality geometry, texture maps, reflectance properties, and surface detail at the level of skin pores and fine wrinkles. Unfortunately, there has not yet been a technique for recording such datasets which is near-instantaneous and relatively low-cost. While some facial capture techniques are instantaneous and inexpensive [Beeler et al. 2010; Bradley et al. 2010], these do not generally provide lighting-independent texture maps, specular reflectance information, or high-resolution surface normal detail for relighting. In contrast, techniques which use multiple photographs from spherical lighting setups [Weyrich et al. 2006; Ghosh et al. 2011] do capture such reflectance properties, but this comes at the expense of longer capture times and complicated custom equipment.

In this paper, we present a near-instant facial capture technique which records high-quality facial geometry and reflectance using commodity hardware. We use a 24-camera DSLR photogrammetry setup similar to common commercial systems and use six ring flash units to light the face. However, instead of the usual process of firing all the flashes and cameras at once, each flash is fired sequentially with a subset of the cameras, with the exposures packed milliseconds apart for a total capture time of 66ms, which is faster than the blink reflex [Bixler et al. 1967]. This arrangement produces 24 independent specular reflection angles evenly distributed across the face, allowing a shape-from-specularity approach to obtain high-frequency surface detail. However, unlike other shape-from-specularity techniques, our images are not taken from the same viewpoint. Hence, we compute an initial estimate of the facial geometry using passive stereo, and then refine the geometry using separated diffuse and specular photometric detail. The resulting system produces accurate, high-resolution facial geometry and reflectance with near-instant capture in a relatively low-cost setup.

The principal contributions of this work are:

  • A near-instantaneous photometric capture setup for measuring the geometry and diffuse and specular reflectance of faces.
  • A camera-flash arrangement pattern which produces evenly distributed specular reflections over the face with a single photo per camera and fewer lighting conditions than cameras.
  • A novel per-pixel separation of diffuse and specular reflectance using multiview color-space analysis and novel photometric estimation of specular surface normals for geometry refinement.
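
The timing described above can be illustrated with a small scheduling sketch; the even four-cameras-per-flash split and the 11 ms slot length (66 ms divided by six flashes) are assumptions made for illustration, not the system’s published trigger layout.

```python
# Rough illustration of the capture timing described above: six flashes fired
# sequentially, each paired with a subset of the 24 cameras, with the whole
# sequence fitting in 66 ms. The even 4-cameras-per-flash split and the 11 ms
# slot length (66 ms / 6) are assumptions for illustration only.

NUM_CAMERAS = 24
NUM_FLASHES = 6
TOTAL_MS = 66
SLOT_MS = TOTAL_MS / NUM_FLASHES                     # 11 ms per flash slot
CAMS_PER_FLASH = NUM_CAMERAS // NUM_FLASHES          # 4 cameras per flash

schedule = []
for flash in range(NUM_FLASHES):
    cameras = list(range(flash * CAMS_PER_FLASH, (flash + 1) * CAMS_PER_FLASH))
    schedule.append({"flash": flash, "trigger_ms": flash * SLOT_MS, "cameras": cameras})

for slot in schedule:
    print(f"t={slot['trigger_ms']:5.1f} ms  flash {slot['flash']}  cameras {slot['cameras']}")
```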

Virtual reality’s new frontier: peacekeeping, Iraq War therapy, and digital paradises – ICT Research in VOX

An article on Vox.com explores uses for VR beyond gaming. The story describes ICT’s Virtual Iraq and Virtual Afghanistan exposure therapy research and also includes mention of Nonny de la Pena’s immersive journalism work, which has roots at ICT’s Mixed Reality Lab.

What Kind of Stories Should a Virtual Human Swap?

Stories are pervasive in conversation between people [5]. They are used to establish identity, pass on cultural heritage, and build rapport. Stories are often swapped, with one conversational participant replying to a story with a story of their own. Stories are also told by virtual humans [1, 6, 2]. In creating or mining stories for a virtual human (VH) to tell, a number of considerations arise about what kinds of stories should be told and how the stories should relate to the virtual human’s identity, such as whether that identity should be human or artificial, and whether the stories should be about the virtual human or about someone else. We designed a set of virtual human characters who can engage in a simple form of story-swapping. Each of the characters can engage in simple interactions such as greetings and closings and can respond to a set of “ice-breaker” questions that might be used on a first date or similar “get to know you” encounter. For these questions the character’s answer includes a story. We created four character response sets, covering all combinations of identity (human or artificial) and perspective (first-person stories about the narrator, or third-person stories about someone else). We also designed an experiment to explore the collective impact of the above principles on people who interact with the characters. Participants interact with two of the above characters in a “get to know you” scenario. We investigate the degree of reciprocity, where people respond to the character with their own stories, and also compare participants’ rapport with the characters as well as their impressions of the characters’ personality.

These Bots Are Ready To Give You The Business – Jon Gratch’s Negotiation Research in Buzzfeed

Buzzfeed covered ICT and USC Marshall School research teaching a computer to negotiate with humans in the hopes of someday using it as a training tool for everything from business to diplomacy.

In the story, ICT’s Jonathan Gratch says (normal) people often feel — understandably! — weird about using tactics like feigned anger or lying in a negotiation with another person. In a way, the point of his program is to desensitize students to that natural human discomfort by letting them practice such methods on a virtual human first.

“People find it awkward if someone is coming at them with a tough negotiation tactic,” Gratch explained. “They want to be collaborative and concede. A lot of negotiation is learning to stand up for your interests, which requires you to be confrontational.”

The story states that Gratch’s virtual humans can be calibrated to be more or less truthful. They can even be programmed to lie outright, saying they don’t want an item they actually do want in order to make it appear less valuable to an opponent. They can also bluff, using human body language (crossed arms, furrowed brow) to intimate frustration, disinterest or disgust.

“If you’re practicing negotiation, you want to start with a straightforward honest person,” said Gratch. “Then you want to ratchet up the difficulty by making them more Machiavellian.”

The Sigma Cognitive Architecture and System

Sigma (Σ) is a nascent cognitive system—i.e., the beginnings of an integrated computational model of intelligent behavior—built around an eponymous cognitive architecture (a hypothesis about the fixed structure underlying cognition).   As such, it is intended ultimately to support the real-time needs of intelligent agents, robots and virtual humans.   Its development is driven by four desiderata—grand unification, generic cognition, functional elegance, and sufficient efficiency—plus a unique blend of ideas from over thirty years of independent work in the areas of cognitive architectures and graphical models.  Work to date on Sigma has covered aspects of learning and memory, perception and attention, reasoning and problem solving, speech and language, and social and affective cognition.  It has also involved the development of multiple distinct types of intelligent agents and virtual humans.  This tutorial covers the rationale behind Sigma, the basics of its design and operation, and a number of the results that have been generated to date with it.

“Do As I Say, Not As I Do:” Challenges in Delegating Decisions to Automated Agents

There has been growing interest, across various domains, in computer agents that can decide on behalf of humans. These agents have the potential to save considerable time and help humans reach better decisions. One implicit assumption, however, is that, as long as the algorithms that simulate decision-making are correct and capture how humans make decisions, humans will treat these agents similarly to other humans. Here we show that interaction with agents that act on our behalf or on behalf of others is richer and more interesting than initially expected. Our results show that, on the one hand, people are more selfish with agents acting on behalf of others, than when interacting directly with others. We propose that agents increase the social distance with others which, subsequently, leads to increased demand. On the other hand, when people task an agent to interact with others, people show more concern for fairness than when interacting directly with others. In this case, higher psychological distance leads people to consider their social image and the long-term consequences of their actions and, thus, behave more fairly. To support these findings, we present an experiment where people engaged in the ultimatum game, either directly or via an agent, with others or agents representing others. We show that these patterns of behavior also occur in a variant of the ultimatum game – the impunity game – where others have minimal power over the final outcome. Finally, we study how social value orientation – i.e., people’s propensity for cooperation – impact these effects. These results have important implications for our understanding of the psychological mechanisms underlying interaction with agents, as well as practical implications for the design of successful agents that act on our behalf or on behalf of others.
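
The two game protocols mentioned above differ only in what a rejection costs, which the toy sketch below makes explicit; the payoffs and the low offer are invented for illustration.

```python
# Toy sketch of the two protocols mentioned above. In the ultimatum game a
# rejected offer leaves both players with nothing; in the impunity game the
# responder's rejection costs only the responder. Payoffs are illustrative.

def ultimatum(pot, offer, accept):
    """Proposer keeps pot - offer; rejection wipes out both payoffs."""
    return (pot - offer, offer) if accept else (0, 0)

def impunity(pot, offer, accept):
    """Proposer keeps pot - offer regardless; rejection only zeroes the responder."""
    return (pot - offer, offer if accept else 0)

pot, offer = 10, 2          # a fairly low offer
for game in (ultimatum, impunity):
    for accept in (True, False):
        proposer, responder = game(pot, offer, accept)
        print(f"{game.__name__:9s} accept={accept!s:5s} -> proposer {proposer}, responder {responder}")
```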

The Impact of POMDP-Generated Explanations on Trust and Performance in Human-Robot Teams

Researchers have observed that people will more accurately trust an autonomous system, such as a robot, if they have a more accurate understanding of its decision-making process. Studies have shown that hand-crafted explanations can help maintain effective team performance even when the system is less than 100% reliable. However, current explanation algorithms are not sufficient for making a robot’s quantitative reasoning (in terms of both uncertainty and conflicting goals) transparent to human teammates. In this work, we develop a novel mechanism for robots to automatically generate explanations of reasoning based on Partially Observable Markov Decision Problems (POMDPs). Within this mechanism, we implement alternate natural-language templates and then measure their differential impact on trust and team performance within an agent-based online testbed that simulates a human-robot team task. The results demonstrate that the added explanation capability leads to improvement in transparency, trust, and team performance. Furthermore, by observing the different outcomes due to variations in the robot’s explanation content, we gain valuable insight that can help lead to refinement of explanation algorithms to further improve human-robot interaction.
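
As a rough illustration of the mechanism (not the paper’s algorithm), the sketch below maintains a belief over a hidden state, updates it with Bayes’ rule after a noisy observation, and renders the quantitative reasoning through a natural-language template; the scenario, probabilities, and wording are invented.

```python
# Minimal sketch: keep a belief over a hidden state, update it with Bayes'
# rule after each observation, and render the quantitative reasoning as a
# natural-language explanation. The scenario, probabilities, and template are
# hypothetical; a full POMDP would also plan over actions and transitions.

STATES = ("area_safe", "area_dangerous")
PRIOR = {"area_safe": 0.5, "area_dangerous": 0.5}
# P(observation | state) for a noisy sensor
OBS_MODEL = {
    "no_threat_detected": {"area_safe": 0.8, "area_dangerous": 0.3},
    "threat_detected":    {"area_safe": 0.2, "area_dangerous": 0.7},
}

def update(belief, observation):
    unnormalized = {s: belief[s] * OBS_MODEL[observation][s] for s in STATES}
    z = sum(unnormalized.values())
    return {s: p / z for s, p in unnormalized.items()}

def explain(belief, recommendation):
    conf = belief["area_safe"]
    return (f"I recommend we {recommendation}. My camera reported no threat, "
            f"but it is not fully reliable, so I am {conf:.0%} confident the area is safe.")

belief = update(PRIOR, "no_threat_detected")
recommendation = "proceed" if belief["area_safe"] > 0.6 else "hold position"
print(explain(belief, recommendation))
```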

Advances in photoreal digital humans in film and in real-time

Invited Speaker- The USC Virtual Care Clinic: Solutions for Delivering Borderless Health Care

New technique fools your brain to make VR more immersive – Tech Coverage of Mahdi Azmandian’s Recent Research

Yahoo highlighted a research collaboration with Mahdi Azmandian of the USC Institute for Creative Technologies and colleagues at Microsoft and the University of Waterloo on virtual reality touch interfaces. Engadget and TechRadar also featured the results of the collaboration, which describes a perceptual trick, or  hack, that brings haptics — the sense of touch — to VR environments. Read more in this Microsoft blog.

 

The Next Big Things in Technology: A Preview

Uber is 7 years old. Twitter is 10. Facebook and Google are just a few years older. These companies, now embedded in our economy and culture, went from Bright Idea to Next Big Thing to Really, Really Big Deal in almost a flash. Each has changed the way people live and conduct business across the planet. Innovation and new technologies continue to outpace our expectations — and our ability to keep up. So what’s the next big thing? Will virtual reality, for example, live up to its promise? Equally important, what’s the next big disappointment? Google Glass? Perhaps Apple Watch? No one knows for sure, but our panel of Silicon Valley experts will provide an inside look at promising technologies that could be part of our lives within the next decade.

Unlocking Intelligence: Education within Technology and Virtual Worlds – VR-AR-Immersive

Holocaust survivors: Never forget genocide of World War – NDT in the Desert Sun

A story in the Desert Sun covered a Holocaust Remembrance Day speech by Stephen Smith of the USC Shoah Foundation. The story notes that Smith mentioned that Pinchas Gutter and other Holocaust survivors are now telling their stories by “hologram” through the USC Shoah Foundation and the Institute for Creative Technologies.

Learning Representations of Affect from Speech

There has been a lot of prior work on representation learning for speech recognition applications, but not much emphasis has been given to an investigation of effective representations of affect from speech, where the paralinguistic elements of speech are separated out from the verbal content. In this paper, we explore denoising autoencoders for learning paralinguistic attributes i.e. categorical and dimensional affective traits from speech. We show that the representations learnt by the bottleneck layer of the autoencoder are highly discriminative of activation intensity and at separating out negative valence (sadness and anger) from positive valence (happiness). We experiment with different input speech features (such as FFT and log-mel spectrograms with temporal context windows), and different autoencoder architectures (such as stacked and deep autoencoders). We also learn utterance specific representations by a combination of denoising autoencoders and BLSTM based recurrent autoencoders. Emotion classification is performed with the learnt temporal/dynamic representations to evaluate the quality of the representations. Experiments on a well-established real-life speech dataset (IEMOCAP) show that the learnt representations are comparable to state of the art feature extractors (such as voice quality features and MFCCs) and are competitive with state-of-the-art approaches at emotion and dimensional affect recognition.
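
A minimal sketch of the core model, a denoising autoencoder whose bottleneck activations serve as the affect representation, is given below; the input dimensions, layer widths, noise level, and training loop are illustrative choices, not the architectures evaluated in the paper.

```python
# Minimal sketch of a denoising autoencoder whose bottleneck layer is later
# used as an affect representation. Input size, layer widths, and the noise
# level are illustrative; the paper's stacked/deep/BLSTM variants are richer.

import torch
import torch.nn as nn

N_MELS, CONTEXT = 40, 5            # log-mel bins x context frames (assumed)
INPUT_DIM = N_MELS * CONTEXT

class DenoisingAutoencoder(nn.Module):
    def __init__(self, input_dim=INPUT_DIM, bottleneck=64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(input_dim, 256), nn.ReLU(),
                                     nn.Linear(256, bottleneck), nn.ReLU())
        self.decoder = nn.Sequential(nn.Linear(bottleneck, 256), nn.ReLU(),
                                     nn.Linear(256, input_dim))

    def forward(self, x, noise_std=0.1):
        corrupted = x + noise_std * torch.randn_like(x)   # denoising: corrupt the input
        code = self.encoder(corrupted)                    # bottleneck representation
        return self.decoder(code), code

model = DenoisingAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

features = torch.randn(32, INPUT_DIM)                     # stand-in for real spectrogram windows
for _ in range(5):                                        # a few illustrative steps
    reconstruction, code = model(features)
    loss = loss_fn(reconstruction, features)              # reconstruct the *clean* input
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# The bottleneck activations would then feed an emotion classifier.
print(code.shape)   # torch.Size([32, 64])
```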

Inside North America’s first all-digital hospital – Modern Healthcare Covers the USC Virtual Care Clinic

An article about the move toward digitization in health care features the USC Virtual Care Clinic, which will begin pilots with patients this summer, with a broader rollout by year-end. USC has assembled eight partners for the initiative, including its own Institute for Creative Technologies, which is best known for developing computer-generated “virtual humans.” In the case of the Virtual Care Clinic, the technology allows people to interact with a simulation of a USC specialist. Similar to Siri on an iPhone, it can recognize key parts of a question and provide information, but it can also be personalized for individual patients. The story states that USC’s Roski Eye Institute will be among the first specialty centers to incorporate this virtual reality technology into patient care as part of the Virtual Care Clinic.

Practical Multispectral Lighting Reproduction

This talk discussed how actors can be surrounded by multispectral LED lights in a studio to light them precisely how they would appear in a real-world location or to believably integrate them into a virtual set, with no manual image adjustment required, even with spectrally complex light sources.
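
The talk description does not spell out the algorithm, but a common way to drive multispectral LEDs toward a measured real-world illuminant is to solve a non-negative least-squares fit of the LED channel spectra to the target spectrum; the sketch below uses made-up spectra and is not the presented pipeline.

```python
# Hedged sketch of one common approach to the problem described above: choose
# non-negative intensities for a set of LED channels so their combined spectrum
# approximates a measured real-world illuminant. The spectra below are made up.

import numpy as np
from scipy.optimize import nnls

wavelengths = np.linspace(400, 700, 31)                  # nm, 10 nm steps

def gaussian_spd(center, width=25.0):
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# Columns = emission spectra of the LED channels (hypothetical peak wavelengths)
led_spectra = np.stack([gaussian_spd(c) for c in (450, 500, 550, 590, 630)], axis=1)

# Target: a warm, broadband "real-world" illuminant (made up for the example)
target = 0.6 + 0.4 * (wavelengths - 400) / 300

weights, residual = nnls(led_spectra, target)
print("LED channel weights:", np.round(weights, 3))
print("fit residual:", round(residual, 3))
```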

How the tone of a person’s VOICE can reveal if they’re suicidal – Stefan Scherer in the Daily Mail

Daily Mail (UK) featured research by Stefan Scherer of the USC Institute for Creative Technologies on how to assess if an individual is suicidal. Scherer developed software to help clinicians identify changes in a patient’s voice. “If you want to assess a person’s risk to attempt suicide it’s important to look at what they say, as well as how they say it,” Scherer said.

New Approaches Emerging To Preserve Shoah Memory – NDT in Jewish Week

The Jewish Week covered New Dimensions in Testimony and the recent recording of Holocaust survivor Eva Kor’s testimony for the project.

The story states that Eva Kor’s interview is slated to appear eventually in holographic form, using advanced speech recognition, natural language recognition and audio triggers to allow future viewers to interact with interviewees like Kor by asking questions that the survivors, no longer alive, will be able to answer in recorded fashion. The story notes that the project is a collaboration between the USC Shoah Foundation, the USC Institute for Creative Technologies and Conscience Display.

Read the full story.

Photo: Pinchas Gutter being interviewed in the Light Stage for New Dimensions in Testimony.

To Save Suicidal Teens, Listen to their Voice

Suicide rates among Americans are on the rise, underscoring the need for early intervention. USC researchers tested high-tech acoustic software and found it can identify teens at risk of suicide, helping clinicians intervene before it is too late.

By Andrew Good

The difference between suicidal and non-suicidal patients can be as subtle as the breathiness of their speech and the tension or pitch of their voices.

Through speech analysis of teen patients, researchers at the University of Southern California and Cincinnati Children’s Hospital Medical Center have identified specific vocal cues as indicators of suicide risk. Researchers were surprised to realize that some of the cues they had identified were nonverbal.

“If you want to assess a person’s risk to attempt suicide it’s important to look at what they say, as well as how they say it,” said lead author Stefan Scherer of the USC Institute for Creative Technologies.

The study was published in IEEE Transactions on Affective Computing in January.

Suicide is the third-highest cause of death for American teens; identifying risk is critical among this group in particular, because they are often hesitant to let others know they need help. It’s also a key problem among military veterans who worry that seeking out therapy or admitting to mental health problems might stigmatize them.

“We want to develop software and algorithms to help clinicians objectively measure these changes or have a ‘warning light’ as to suicide risk,” Scherer said.

The researchers used software to analyze the voices of 60 patients at the Cincinnati hospital – 30 of whom were suicidal. The patients, aged 13 to 18, had been interviewed in 2011 for a study by John Pestian, a professor of pediatrics, psychiatry, and biomedical informatics at the University of Cincinnati and CCHMC. A trained social worker interviewed them about their background and family history and asked the patients open-ended questions about their fears, secrets and emotional struggles.

Scherer’s team analyzed the interviews using computer software that identified both verbal and non-verbal cues. Verbal content, such as mentioning death, repeated references to the past or heavy use of first-person pronouns (I, me, myself) were all common in the speech of suicidal patients. But what was surprising to researchers were the nonverbal cues.

Especially significant were the differences between suicidal and non-suicidal subjects. Suicidal subjects had breathier speech, differences in pitch and other subtle changes in the tenseness or harshness of their voices.

All of these cues are significant precisely because they are nonverbal, Scherer said.

Nonverbal cues are much easier to identify because you can ask the patient anything. In the case of the clinical interviews, some of the open-ended questions weren’t even specifically about emotion. Patients were asked about their sleep habits and Internet usage, as well as more direct questions about past traumas.
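
As a purely illustrative sketch of the verbal side of such an analysis, the snippet below counts the cues named in the story, first-person pronouns and mentions of death, in a transcript; the word lists are stand-ins, and the study’s key findings involved acoustic, nonverbal cues that this sketch does not model.

```python
# Purely illustrative sketch of the verbal cues named in the story: heavy use
# of first-person pronouns and mentions of death in a transcript. Word lists
# are stand-ins, not the study's actual measures (which also relied on
# acoustic, nonverbal cues such as breathiness and pitch).

FIRST_PERSON = {"i", "me", "my", "myself", "mine"}
DEATH_TERMS = {"die", "dying", "death", "dead", "kill"}

def verbal_cues(transcript: str) -> dict:
    tokens = [t.strip(".,!?").lower() for t in transcript.split()]
    n = max(len(tokens), 1)
    return {
        "first_person_rate": sum(t in FIRST_PERSON for t in tokens) / n,
        "death_mentions": sum(t in DEATH_TERMS for t in tokens),
    }

print(verbal_cues("I feel like everything I do goes wrong and I think about death a lot."))
```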

Scherer characterized the study as part of a wave of innovative mental health research that utilizes technology in new ways.

“Technology brings a different set of eyes and ears to the field so doctors can focus on their actual work with an individual,” Scherer said. “Doctors don’t have a lot of time, and they’re trying to assess an individual’s risk. It enables them to go back into the nitty gritty details of what the behavior was really like with an individual and maybe make better assessments.”

The study also marks the latest effort by USC researchers to use voice analysis for signs of psychological trouble.

Last year, researchers at the USC Viterbi School of Engineering developed algorithms analyzing speech between couples – even predicting whether couples will stay together more accurately than relationship experts.

The study was sponsored by the U.S. Army Research Laboratory and CCHMC’s Innovation Fund.

See this story on USC News.

Holocaust Museum, new 3-D technology bring survivor stories to life: NDT in the Chicago Tribune

Chicago Tribune featured work by the USC Shoah Foundation to host interactive survivor stories in a museum setting. The pilot program, hosted by the Illinois Holocaust Museum, was developed by ICT and the  USC Shoah Foundation, in partnership with Conscience Display. The project merges ultra-sophisticated filming techniques, innovative speech-recognition technology and three-dimensional presentation.

“I would expect five years from now many Holocaust museums would have these (interactive) testimonies embedded into their current exhibitions,” said Stephen Smith of the USC Shoah Foundation.

Virtual Reality — Real Or Still Virtual

Virtual reality is the newest – and hottest – medium to take the world by storm.
Find out what it’s all about from those who practice it – and live it.
Will it replace other forms of communication, as Google supplanted newspapers?
What is the historical perspective compared to the printing press, movies, radio and television?

Learn how it’s becoming THE medium for narrative storytelling and historical capture:
For health…
For tackling post-traumatic stress and other disorders…
For educational instruction…
…And more

Presenting Tech Demos

New Army interactive game provides sexual assault, harassment training – ELITE SHARP CCT in the Leavenworth Times

An article in the Leavenworth Times covered the Army-wide release of the ELITE SHARP CCT, a new interactive game developed for the Army that is designed to train command teams in how to respond to reports of sexual harassment and sexual assault.

Members of the Army can download the interactive game for free.

“It gives them a good overview of how to respond to sexual assault, sexual harassment incidents,” said Maj. Greg Pavlichko, chief of the Games for Training Program.

The story notes that the game was developed by ICT.

The Topeka Capital Journal covered the story as well.

Where Virtual Reality Prototypes Are Hatched – David Nelson and MxR in Documentary Magazine

An article in Documentary Magazine featured the role of universities in incubating the new medium of virtual reality.

“I’d say when you ask where we are now, I liken it to the end of the 19th century in the early days of cinema,” says David Nelson, special project manager of the MxR Lab, at the University of Southern California’s Institute for Creative Technologies, and producer and director of the documentaries Naked States and Positively Naked. “We’re Thomas Edison shooting films for the kinetoscope, or we’re the Lumières shooting a train pulling into a station right now. There’s the element of novelty. People are experiencing it for the first time.”

Nelson also discussed VR’s role as a facilitator for documentary storytelling. “I think the documentary form is working quite well in VR now,” he explains. “Maybe better than fiction because the rules of fiction are maybe a bit more formal than the rules of nonfiction.”

Nelson cited Nonny de la Pena’s 2012 work Hunger in Los Angeles, which was commissioned by the Institute for Creative Technologies, as an early example of the successful use of VR documentary storytelling.

 

USC Virtual Reality Program Helping Treat Vets with PTSD

ABC 7 reporter Leslie Lopez covered Bravemind, ICT’s virtual reality therapy for treating post-traumatic stress. Her story featured ICT’s Skip Rizzo and U.S. Marine veteran Chris Merkle, who was treated with the technology at the Long Beach VA.

Watch the world’s first surgery streamed in virtual reality live from London – The Telegraph Quotes Skip Rizzo

The Telegraph (UK) highlighted research by Skip Rizzo of the USC Institute for Creative Technologies on using virtual reality to treat veterans with PTSD and others who experience trauma. According to Rizzo, virtual reality has immense potential in the healthcare sector. “This could be a real revolution in clinical care,” Rizzo said.

Virtual Reality Explodes Onto Consumer Market, But Will it Disappoint? – Todd Richmond on Voice of America

Voice of America covered the consumer virtual reality market and interviewed ICT’s Todd Richmond about VR content.

“I think what you’re going to find is that right now there’s very little compelling content,” said Todd Richmond, director of Advanced Prototypes and Transition at the University of Southern California Institute for Creative Technologies.  “We’re really in the Wild Wild West of content creation for virtual environments.”

 

 

Virtual interviews help people with autism land a job

USC Institute for Creative Technologies, Dan Marino Foundation partner on technology to improve social skills

By Orli Belman

Sergio Cano recently landed his first job — he’s a busboy at a restaurant in Fort Lauderdale, Fla. The hard-working 29-year-old never doubted his ability to rapidly clean tables and keep dishes stocked, but he did have concerns about getting through his job interview.

“I was nervous that I would fidget or that it would take me a long time to answer questions, especially if the manager asked me something I didn’t understand,” he said.

Like an estimated 3.5 million Americans, Cano was diagnosed with autism spectrum disorder (ASD). He’s intelligent and capable, but certain social situations — like job interviews — can be challenging.

A new program called the Virtual Interactive Training Agent, or ViTA DMF, is working to change that.

Practicing virtual interviews put Sergio Cano at ease.

Developed by the USC Institute for Creative Technologies in partnership with the Dan Marino Foundation (DMF), the program leverages more than 15 years of U.S. Army-funded research and development in the creation of virtual humans to give students with ASD a technology-based solution to help them prepare for job interviews.

“Our main objective is to prepare young adults for what comes in life,” said Dan Marino, the former pro football quarterback who chairs the namesake foundation with his wife, Claire. “This unique technology can help people experience certain life-skill situations, and we are excited to offer it to students.”

At the Marino campus, DMF’s postsecondary school in Fort Lauderdale for young adults with ASD and other developmental disabilities, students go through a minimum of four practice sessions with the virtual interviewers in ViTA DMF. These computer-generated characters are programmed to present varied races, genders and personality types. Students like Cano can find themselves facing softball questioners or hard-nosed interrogators.

“Practicing with ViTA DMF helped me to be less nervous and to answer questions better,” said Cano, who graduated from the Marino campus’ school-to-career track in 2014 and has been working at the restaurant for more than a year.

Teaching with technology
Reports show that young adults with ASD face high rates of unemployment, up to 66 percent in some cases. The hope is that practice with ViTA DMF will help bring those numbers down.

“Until ViTA DMF, there really hasn’t been any innovative tools for young people to practice interview skills, the first obstacle they face in getting a job,” said Mary Partin, DMF’s chief executive officer. “VITA DMF offers something new. It is a cost-effective technology, easily distributed even to areas with limited resources.”

Partin said the majority of students who used ViTA DMF have successfully interviewed and been hired for jobs.

Built with natural-language capabilities, the virtual interviewers pose questions and then reply to an interviewee’s answers. Students interact with ViTA DMF over a computer monitor, and each exchange is video-recorded and reviewed afterward, with instructors providing feedback about an interviewee’s responses, body language and eye contact.

A recent DMF evaluation found that improvement in a student’s interview performance occurred after each interaction with the ViTA DMF system, with the most substantial gains occurring after the second and third sessions. Formal studies to document the efficacy are being planned.

Future work could also include incorporating sensing software so that the virtual interviewer can quantify behaviors — like whether a student is making eye contact or speaking fluidly — and provide automated performance feedback to improve job interviewing.

Widening the reach
DMF is now making the virtual reality system available via download to other nonprofit organizations and schools through the ViTA DMF Community. DMF was just named a recipient of a Google Impact Challenge: Disabilities grant from Google.org to enhance the system with new responsive scripts and a user-based logging and feedback system. Along with these enhancements, the grant will fund a national distribution campaign to have ViTA DMF housed in at least 100 organizations nationwide.

“After seeing ViTA DMF in action, we realized there is limitless potential to help in many of the soft skill areas where folks on the autism spectrum struggle, both in and out of the workplace,” said psychologist Skip Rizzo, ICT’s director of medical virtual reality, who co-leads the project. “We can provide experiential practice with a virtual human to help students practice a range of social and vocational skills, including how to take turns properly in a discussion, how to respond when someone says something inappropriate or even how to make small talk.”

Rizzo also noted that a modified version of ViTA will soon be applied to help veterans practice their job-interviewing skills. The software is currently being set up at the Long Beach VA and at the downtown Los Angeles U.S. Vets facility.

Filling a role
ViTA DMF’s virtual interviewers are the latest in a line of immersive tools developed at ICT, a Los Angeles-based research and development lab that specializes in immersive technology to help improve human interactions. Other institute projects include virtual reality therapy for treating post-traumatic stress and virtual human role players that help train junior military leaders in how to address real-world counseling issues, such as financial troubles, alcohol abuse and sexual harassment and assault.

“It has been really exciting to work with the U.S. Army on developing technologies that benefit our soldiers and veterans and then see these platforms expand to help other groups, such as young adults with ASD,” said Matthew Trimmer, co-lead of the ViTA project at ICT.

Virtual role players offer the benefit of being available at any time and providing consistency for the interviews. For people with ASD, they also offer a relief from some of the social anxieties that can accompany talking to a real person.

“With ViTA DMF, the fact that students are practicing with a virtual character allows them to feel more comfortable than if they were facing a live interviewer,” said Robert Ahlness, ViTA DMF manager. “With reduced anxiety, they can focus more on learning and building skills to the point where what they have practiced can transfer to the real world.”

Ahlness added that Marino campus students reported that they preferred ViTA DMF and felt they learned more than with a live role player.

Most autism funding and programs are focused on helping school-age children and students. ViTA DMF is geared toward young adults wishing to enter the work force and lead a more independent life. If Cano’s story is any indication, this focus benefits their potential employers as well.

“Sergio impressed me enough to want to hire him, and I am extremely glad I did,” said Kevin Sheahan, manager of the Fort Lauderdale restaurant. “He takes pride in his work, he’s never late and he asks for help when he needs it. He’s an exceptional employee and a great person. Everyone here is so happy that he joined our staff.”

See this story on USC News.

Army Cuts Ribbon on West Coast Research Lab at USC ICT

Declaring it a great day for USC, the Army and the nation, leaders from USC, the U.S. Army, the Department of Defense and the White House marked the official opening of ARL West with a ribbon cutting ceremony at the USC Institute for Creative Technologies in Playa Vista, the new home of ARL West.

“I’m excited that today we open a facility dedicated to the art of the possible where the next ‘ah ha’ moment could happen, leading to game-changing discoveries and innovations that will forever change mankind,” said Dr. Thomas Russell, director of ARL, in his ribbon cutting remarks.

Joining Dr. Russell in cutting the official ribbon were Lt. Gen. Larry Wyche, deputy commanding general of the U.S. Army Materiel Command; Maj. Gen. John F. Wharton, commanding general of the U.S. Army Research, Development and Engineering Command; C.L. Max Nikias, USC President; Dr. Melissa Flagg, Deputy Assistant Secretary of Defense for Research; Dr. Chris Fall, White House Office of Science and Technology Policy Assistant Director for Defense Programs;  and Dr. Randall W. Hill, Jr., executive director of ICT.

ARL West, the Army’s largest university research lab and the first one west of the Mississippi, represents a new model for Army, academic and industry partnerships that aims to spur innovation by opening access to Army labs. It is also part of a larger DoD strategy to increase collaboration with top scientists and engineers in California.  ARL West will leverage ICT, USC and regional expertise in areas including virtual reality, data visualization and human-robot interaction.

“ARL West’s open campus means USC’s outstanding scientists and engineers will work side by side with the best and brightest from the Army and from the regional tech industry,” said USC President C.L. Max Nikias in his remarks. “Whether through data analysis, robotics, wearable electronics, or virtual reality headsets, we expect this joint venture to carve new roads that no map-maker could anticipate.”

Approximately 300 people from the military, academia and industry attended the opening day activities, which included an open-house with talks, tech demos and research presentations.

See photos of the day on our Facebook page.

LA Tech Digest and ABC 7 also covered the event.

ARL West Open House and Ribbon Cutting

ARL West, part of the open campus initiative, is an effort to co-locate Army research and development personnel on the West Coast in order to gain access to subject matter experts, technical centers and universities not well represented on the East Coast. Closer collaboration with universities, start-ups and established companies working in simulation and training, electronics, information science, intelligent systems, human-system interaction and related areas will directly benefit the Soldier and enhance educational pursuits. This initiative ensures our nation’s future strength and competitiveness in critical scientific, engineering and creative fields.

While technology has enabled telecommuting and working together despite distance and different time-zones, this digital collaboration cannot replace the deeper connectivity and in-the-moment brainstorming that happens when people work side-by-side. Housing ARL researchers, related staff, and others at the USC ICT facility will provide a foundation to fuel innovative teamwork on these tasks. The co-location will create more face-to-face collaboration between researchers, academic institutions, students and local industry experts, where this interaction leads to better solutions and faster impact.

A diverse set of groups will be represented at ARL West, with backgrounds in areas such as computational sciences, sensors and human factors. The Department of Defense has had limited engagement with the small technology start-ups and venture capitalists on the West Coast. ARL’s relationships with the University of Southern California Institute for Creative Technologies, the University of California, Santa Barbara and Stanford University open up opportunities to collaborate with a new set of innovators.

An open house and ribbon-cutting will be held on Wednesday, April 13. This is an invitation-only event.

These Machines Want to Make You a Deal

Future business leaders could learn how to negotiate from virtual humans.

By Andrew Good

At USC, researchers are studying how to train the next generation of negotiators – and doing so will require teaching machines how to convincingly lie. Using training programs called virtual humans, computer scientists want to help tomorrow’s leaders realize when the person sitting across from them is bluffing their way to a better deal. Virtual humans already exist to train users in leadership and communication skills; someday soon, they could be a normal part of a business education.

Jonathan Gratch, director of the USC Institute for Creative Technologies Virtual Humans Research team, will present a conference paper this May outlining one of the challenges of building successful negotiation programs. “The Misrepresentation Game: How to Win at Negotiation While Seeming Like a Nice Guy” will be presented at the Autonomous Agents and Multiagent Systems International Conference in Singapore. The paper was co-authored by doctoral students Zahra Nazari and Emmanuel Johnson, and sponsored by the National Science Foundation and the U.S. Army.

As the title suggests, the negotiation technique the study explored is all about bluffing while seeming fair. In negotiation, there’s a technique known as the “fixed-pie lie.” The idea is that people arrive at a negotiation expecting a win-lose outcome; they don’t think to ask what their opponents are willing to compromise on, and will cede more than they have to if their opponent keeps turning down each deal.

In a study Gratch led, participants were fooled into accepting worse deals when their computer opponent expressed disappointment. Gratch and his colleagues recruited 75 study participants from Amazon’s Mechanical Turk and asked them to negotiate over baskets of fruit. The computer would claim to want all the fruit – though in reality it only cared about certain kinds. When the participants gave in and split the fruit evenly, the computer would begrudgingly accept, saying “I’m not happy, but I want to be fair.”

That “concession” tricked the human participants into thinking the computer was giving up more than it really was.

“People tend to believe we’re fighting over the same things, so you’re inclined to believe the fixed-pie lie,” Gratch said. “With this technique, if you realize early on that you can grow the pie, you can pretend that it’s fixed to make your opponent believe they got half of the pie.”
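
To make the arithmetic behind the fixed-pie lie concrete, here is a toy sketch in Python. The item values and allocations are illustrative and not taken from the study: an agent that privately cares about only one item type can claim to want everything, “concede” the items it never wanted, and still appear to split the pie evenly.

    # A toy illustration of the "fixed-pie lie" described above -- not the study's
    # actual software. Item names and point values are hypothetical.
    agent_values = {"apples": 10, "bananas": 1, "oranges": 1}   # agent truly wants only apples
    human_values = {"apples": 1,  "bananas": 5, "oranges": 5}   # human wants everything else
    inventory    = {"apples": 4,  "bananas": 4, "oranges": 4}

    def score(values, allocation):
        """Points a side earns from the items it receives."""
        return sum(values[item] * n for item, n in allocation.items())

    # The agent claims every item matters, then "concedes" half the bananas and
    # oranges while keeping all the apples.
    agent_take = {"apples": 4, "bananas": 2, "oranges": 2}
    human_take = {item: inventory[item] - agent_take[item] for item in inventory}

    max_agent = score(agent_values, inventory)   # 48 points if it had taken everything
    print("Agent's real cost of the 'concession':",
          max_agent - score(agent_values, agent_take), "of", max_agent, "points")
    print("Human's score under that deal:", score(human_values, human_take))

    # The pie was never fixed: letting each side take only what it values
    # improves the human's outcome without costing the agent anything it wants.
    print("Human's score in the integrative deal:",
          score(human_values, {"apples": 0, "bananas": 4, "oranges": 4}))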

In future experiments, he said, participants should be taught how and when to make counteroffers. These could force the computer opponents to reveal that they don’t really want the same things as the human participants. It could also highlight the risks of misrepresentation: you look untrustworthy, which hurts your ability to create future deals.

Gratch is working closely with USC Marshall School faculty who teach negotiation skills. Currently, these skills are taught through classroom lectures and pen-and-paper roleplaying. Virtual humans, which are already used by agencies like the U.S. Army to teach leadership and communication, could provide believable negotiation scenarios in a consistent way.

Virtual humans are also useful because negotiation is an inherently anxiety-provoking task, Gratch said.

“Many people are anxious about a salary negotiation,” he said. “You feel safer in a scenario like this. You don’t worry about getting things wrong. And it provides scaffolding: you learn the easy stuff before you get to the harder stuff.”

As more courses move online and negotiation happens in more virtual spaces, being able to access training programs from anywhere in the world could make these skills easier to develop.

“The thing I’m excited about is, you can really put this in a concrete mathematical framework,” Gratch said. “We can start proving things and covering different negotiation scenarios. The next step is putting virtual human agents on the web.”

Virtual humans mimic realistic social behavior in customizable, replayable scenarios. Users can see how their interactions with co-workers, employees – or in this case, negotiators – can be modified for a more desirable outcome.

Read the story at USC News.

Invited Speaker – The Innovation Partnership Program (IPP)

Robots are Learning to Fake Empathy – Motherboard Features SimSensei

Vice’s Motherboard features ICT’s SimSensei project and Skip Rizzo and Stefan Scherer.

The article states that developers at the University of Southern California Institute for Creative Technologies (ICT) have turned their attention to adding emotional intelligence to the AI they install in their virtual agents—animated, human-like interfaces that engage a user in conversation. The result is “empathic” virtual agents that can read, understand, and respond to human behavior.

“So we learned from that, that the appearance of a character is less important than the level of interaction. And therein is the kernel of the whole thing about AI,” said Rizzo.

The article notes that Stefan Scherer leads the speech analysis part of the SimSensei project. He says there are a number of acoustic indicators of depression, such as a lack of variation in volume and pitch and increased tension in the vocal tract and folds. But those markers can be easily missed by the human ear, making this one area in which Ellie excels compared to her human colleagues.
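
As a rough illustration of what “variation in volume and pitch” means computationally, the sketch below uses the open-source librosa library to measure frame-level loudness and pitch variability from a recording. It is not SimSensei’s actual pipeline, and the input file name is hypothetical.

    # Rough sketch of measuring the prosodic variation described above (volume and
    # pitch variability). This is NOT the SimSensei pipeline -- it is a minimal
    # example using the open-source librosa library, and "speech.wav" is a
    # hypothetical input file.
    import numpy as np
    import librosa

    y, sr = librosa.load("speech.wav", sr=16000)

    # Volume: spread of frame-level RMS energy.
    rms = librosa.feature.rms(y=y)[0]
    volume_variability = float(np.std(rms))

    # Pitch: spread of F0 over voiced frames (pyin marks unvoiced frames as NaN).
    f0, voiced_flag, voiced_probs = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr)
    pitch_variability = float(np.nanstd(f0))

    print(f"RMS std: {volume_variability:.4f}  F0 std: {pitch_variability:.1f} Hz")
    # Lower values on both measures would be consistent with the "flattened"
    # prosody the article describes, though clinical use requires far richer
    # features and careful validation.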

The Power And Problem Of Grit – Gale Lucas on NPR’s Hidden Brain

NPR’s “Hidden Brain” featured research by Gale Lucas of the USC Institute for Creative Technologies on the downsides of “grit.” People with strong perseverance are often highly accomplished, Lucas said, but can also become fixated on a task even when they’re failing at it. This is especially important for gritty students, who can become stuck on SAT tests. Lucas’ interview begins at 20:50.

Is Virtual Reality Real or Still Virtual? – Times of San Diego Features Talk by Arno Hartholt

A story in the Times of San Diego notes that ICT’s Arno Hartholt will take part in “Virtual Reality — Real Or Still Virtual?”, a half-day conference on April 22, sponsored by the Asian Heritage Society and San Diego Press Club on the campus of the University of San Diego.

Hartholt will join leading experts in the field as they explore how virtual reality is used in narration, not only in covering news events but in providing historical perspective; how it is already being used in the operating room and in tackling psychological problems such as post-traumatic stress; and how it is already on the way to replacing teachers in the classroom. Opportunities will be available to experience virtual reality firsthand, as well.

The story notes that Hartholt, an ICT computer scientist, is the project leader for the Integrated Virtual Humans group, whose work includes applications for tackling post-traumatic stress.

Army and USC to Cut Ribbon for Opening of ARL West

Media are invited to attend ceremony, tour facility and see cutting-edge tech demonstrations. See below for more information on what demos will be on display.

Contact: Joyce M. Conant at (301) 394-3590, cell: (443) 221-9801, email: .

Contact: Orli Belman at (310) 301-5006, cell: (310) 709-4156, .

WHAT: The U.S. Army Research Laboratory and the University of Southern California will hold a ribbon cutting ceremony and tour of ARL West. Part of the open campus initiative, ARL West is an effort to co-locate Army research and development personnel on the West Coast in order to gain access to subject matter experts, technical centers and universities not well represented on the East Coast. The new facility, based at the USC Institute for Creative Technologies, will house up to 70 researchers and is expected to be the laboratory’s largest university research outpost.

WHEN: 3:00-4:00 p.m. Wednesday, April 13 (tours and demos will run until 6:00 p.m.)

WHERE: 12015 Waterfront Drive, Playa Vista, California

WHO: Speakers include Lt. Gen. Larry Wyche, deputy commanding general of the U.S. Army Materiel Command, Maj. Gen. John F. Wharton, commanding general of the U.S. Army Research, Development and Engineering Command, Dr. Thomas P. Russell, ARL Director, C.L. Max Nikias, USC President, and White House Office of Science and Technology Policy Assistant Director for Defense Programs, Dr. Chris Fall.

RSVP: All media planning to attend must RSVP by noon, Tuesday, April 12 to ensure access to the event. All media must also be credentialed and wear identification badges with their media affiliation.

RSVP here: Orli Belman at .

MORE: ARL Director Thomas P. Russell said, “ARL West is part of our hub and spoke approach to increasing the science and technology ecosystem the Army needs to maintain its technological edge in the future that will ensure continued dominance and overmatch for our Armed Forces in a complex and ever-evolving world.”

USC President C. L. Max Nikias said, “USC takes great pride in being selected by the Army Research Laboratory as the home of ARL West. This new partnership marks a key extension of our collaboration with the U.S. Army and enhances our capacity to deliver impactful results for those who serve our country, and ensure our security.”

Read the earlier ARL/USC press release announcing ARL West.

Demos: Technology demos will include the latest ICT research in virtual and augmented reality, the latest discoveries from the Army Research Lab and more.

SimSensei: A virtual human interviewer that can identify signals of depression and other mental health issues by providing real-time tracking and analysis of a person’s facial expressions, eye gaze, body posture and voice.

Digital Actors: Academy Award-winning facial scanning technology, used in Avatar, Spider-Man 2, Benjamin Button and more, that transforms a real person into a convincing digital double.

Rapid Avatar Generator: A two-minute process that creates realistic, video game-ready versions of people using commodity hardware and ICT software.

ELITE: A virtual human-based training system for teaching leadership and counseling skills.

3-D “Hologram-like” Projections: A glasses-free demo that projects a perspective-correct 3-D image of a real person.

New Dimensions in Testimony: A collaboration with USC’s Shoah Foundation that aims to preserve the ability to ask questions of a Holocaust survivor long into the future.

Bravemind: An immersive virtual environment for use by trained therapists to help treat combat-related post-traumatic stress disorder.

Terrain: An effort to create the most realistic, accurate and informative 3-D representations of the physical and non-physical landscape for use in training and simulations.

###

About U.S. Army Research Laboratory:
The U.S. Army Research Laboratory of the U.S. Army Research, Development and Engineering Command is the Army’s corporate laboratory. For more information, visit www.arl.army.mil. There you can link to the ARL Facebook page, Twitter feed, and the ARL News Channel on YouTube.

About the University of Southern California Institute for Creative Technologies:
Established in 1999, ICT is a DoD-sponsored University Affiliated Research Center (UARC) working in collaboration with the U.S. Army Research Laboratory. ICT brings film and game industry artists together with computer and social scientists to study and develop immersive media for military training, health therapies, education and more. www.ict.usc.edu

Achieving Photoreal Digital Actors

New Army game trains leaders to handle sexual assault in ranks – Army Times Covers ELITE SHARP

The Army Times covered the Army-wide release of the ICT-developed ELITE-SHARP Command Team Trainer. The story noted that it will be available April 1 on the Army’s MILGAMING website alongside the ELITE Lite counseling tool, which was the basis for the game.

The story states that Soldiers using the ELITE-SHARP Command Team Trainer will interact or train with a standardized avatar. The game provides command teams with animated scenarios on sexual assault and harassment and highlights the right and wrong ways to handle such situations, according to information from the Army. There also is a portion of the game where commanders interact with virtual soldiers who have been victims of sexual assault or harassment.

The game moves away from the typical PowerPoint or classroom training soldiers usually get, Maj. Greg Pavlichko, chief of the Army’s Games for Training program at the Combined Arms Center, Fort Leavenworth, Kansas, said in a statement.

“We’re getting away from non-professional role players and just getting beaten to death with slide shows, and making it more engaging,” Pavlichko said. “Plus, for a lot of younger people, gaming is kind of innate and organic to them, so they understand it right away.”

Virtual reality ho! Startups race to stake a claim in new field – Todd Richmond in the San Francisco Chronicle

The San Francisco Chronicle quoted Todd Richmond of the USC Institute for Creative Technologies about the hype behind virtual reality. (Subscription required to access article.)

Army to debut new game-based SHARP training tool developed by ICT

A story on Army.mil announced the April release of a new ICT-developed Army training video game that will help prepare company, battalion and brigade commanders to deal with sexual assault and harassment in their ranks.

The article explains that ELITE-SHARP Command Team Trainer is an interactive video game that will make its debut April 1, just in time for Sexual Assault Prevention Month. The game will post on the Army’s MILGAMING website at milgaming.army.mil, alongside the already successful ICT-developed ELITE Lite counseling tool on which it was based.

Unlike traditional slide show-based training, the ELITE-SHARP CTT game provides command teams with animated scenarios regarding sexual assault and harassment that illustrate both the right way and the wrong way to handle such situations, and then moves into an interactive portion where commanders meet face-to-face with virtual Soldiers who have been victims of a sexual assault or sexual harassment.

“We hear feedback that using slide shows for training is very ineffective,” said Monique Ferrell, director of the Army SHARP Program. “This is an avatar-based platform. When a new commander takes command of a unit, by regulation there is a requirement for them to meet with their SHARP professional, their SARC, within the first 30 days. What this tool does, the ELITE-SHARP CTT, is it facilitates that discussion between the Sexual Assault Response Coordinator and the commander.”

Beware the perils of ‘Oculus face’ – Evan Suma Quoted on Sim Sickness in Daily Mail

An article in the UK’s Daily Mail noted that Evan Suma spoke about the issue of simulation sickness at the Game Developers Conference in San Francisco.

“The challenge is that people’s sensitivity to motion and simulator sickness varies wildly,” he said.

USC’s Professor Mark Bolas Talks Future of VR at FMX 2015

As part of AnimationWorld Network’s Professional Spotlight series – a series of exclusive video interviews shot during FMX 2015 – Mark Bolas, professor and researcher at USC’s Institute for Creative Technologies, discusses applications for virtual reality beyond entertainment and the challenges that virtual reality faces along the path to widespread consumer adoption. Watch the videos here.

Five things to do with latest 3D ‘toy’ – The Independent Features ICT’s VR Therapy and Skip Rizzo

New Zealand’s The Independent featured the work of ICT’s Skip Rizzo in an article exploring uses for virtual reality headsets, like the newly available Oculus Rift.

The article noted that when speaking to The Verge in 2013, Dr Albert Rizzo from the University of Southern California’s Institute for Creative Technologies said: “I have no question that Oculus will revolutionise virtual reality for clinical purposes. This system is going to be about so much more than playing games.”

Rizzo has used VR to treat veterans suffering from post-traumatic stress disorder (PTSD) by putting them through exposure therapy. He uses the Rift to virtually place soldiers at the scene of a traumatic battlefield event in a controlled situation, helping them process their anxious feelings and overcome the worst of the disorder, stated the story.

Army Sees Lasers, Hoverbikes and Nano Drones in Future Force – Military.Com Notes ARL West Opening at ICT

A story on Military.com noted the Army’s Open Campus program and covered the April opening of ARL West at ICT.

“The initial focus of ARL West will be in the area of Human Information Interaction, which involves research into how humans generate and interact with data to make decisions more effectively and efficiently,” said Thomas Russell, director of ARL, in 2015 when the West Coast campus was first announced.

“As we develop this new campus, we will also be establishing new relationships on the West Coast with other researchers in academia and industry to augment work ongoing there and work at our other sites. This collaboration, in an open work environment, will further develop the work we do for our service members and the nation.”

Army technology leaders hope those collaborations will help the service spur new innovation in the cyber, directed energy and electronic warfare fields.

The article also covered a speech by Mary Miller, the deputy assistant secretary for Research and Technology, at the Association of the U.S. Army’s Global Force Symposium & Exposition, in which she said the Army has remained committed to its modernization plan in the face of continual budget cuts by protecting early-stage research funding. Miller mentioned ARL West at ICT in her remarks.

Commander says stakes are high for Army research-and-development arm – ICT and Randy Hill in Inside Defense Article

An Inside Defense article covered how RDECOM is working to support the so-called “Third Offset Strategy” outlined by Deputy Secretary of Defense Bob Work, which emphasizes autonomous systems, robotics, and manned-unmanned teaming, and aims to “augment the human dimension.” The article noted the opening of ARL West at ICT and quoted Maj. Gen. John F. Wharton.

It also states that one of Wharton’s allies in this work is Randall Hill, executive director of ICT. As Hill explained to Inside the Army, “we listen to what TRADOC says, we work closely with the Army Research Laboratory and [the] research community . . . and we listen to what those needs are; and we look out on the frontiers of where the science is and where it’s going, and what’s possible.”

ICT’s mission, he said, is “to do the research and development and the prototyping — advanced prototype development — for immersive technologies.” These can include such innovations as virtual reality headsets used for ground vehicle training, for example.

Hill described the concept of “cyber humanism,” which he said focuses on “amplifying human capabilities,” as closely aligned to DOD’s Third Offset strategy. The concept focuses on “reducing what we think of as the impedance between humans and machines,” making their interactions more intuitive. To that end, his team is working to “push the boundaries on virtual reality and augmented reality,” and is using artificial intelligence to craft “virtual humans” for use in training.

12 of the most impressive students at Stanford right now – Former ICT MxR Intern Makes the Business Insider List

Former ICT summer intern Aashna Mago was named one of the 12 most impressive Stanford students by Business Insider. The article noted that Mago landed a summer internship with virtual-reality expert Mark Bolas in the Mixed Reality Lab at ICT, where she honed skills in programming, 3D modeling and printing, and design.

VoiceBox Announces Scientific Advisory Board to Focus Aggressive Investments in Advanced Technology – Includes ICT’s David Traum

VoiceBox Technologies, a global provider of contextual voice technologies and natural language understanding (NLU) for automotive, mobile and IoT products, announced that ICT’s David Traum has been named to their newly created Scientific Advisory Board.

Keynote Talk – Beyond Video Games: A Virtual Reality Revolution in Behavioral Health

Over the last 20 years, virtual reality has moved from being perceived as a “failed technology” to being touted as “the next big thing” in media consumption! While multiple forecasters predict that entertainment (computer games, etc.) will garner the largest VR market share, many place VR healthcare right behind in second place. Now that the technology has caught up with the vision, two decades of established research on Clinical VR stand poised to drive the availability of scientifically informed therapeutic consumer products. From VR systems designed to treat phobias and PTSD to virtual human role-play systems for teaching social and job interview skills to persons on the autism spectrum, VR will have a significant impact on the future of behavioral healthcare. The talk will touch on the history leading up to this point and where we are heading in the future!

Automated Path Prediction for Redirected Walking Using Navigation Meshes

Head Mounted Projection for Enhanced Gaze in Social Interactions

Lessons to Game Developers from IEEE VR

Ethics for a Combined Human-Machine Dialogue Agent

Can Virtual Reality Cure Phobias – Skip Rizzo and VR Therapy in the Guardian

An article in the UK’s Guardian covered research showing that VR headsets are proving to be a useful therapeutic tool.

The story stated that though clinical use of VR is in its infancy in the UK, the US has been applying this technology for years, specifically to treat servicemen returning from Iraq and Afghanistan who are suffering from post-traumatic stress disorder.

Albert “Skip” Rizzo, director of medical virtual reality at the University of Southern California’s Institute for Creative Technologies, has worked with many soldiers, and explains how it works. “Traditional exposure therapy to treat PTSD relies on the person imagining the situation related to the trauma. But one of the key symptoms of PTSD is avoidance of the cues and reminders of the trauma. So it’s hard to expect someone to create a vivid mental image of something they’re trying to avoid.

“We place the person in VR simulations that the clinician can control in real time, and customise based on that person’s experience, but in a safe environment.” To do this, Rizzo and his team created 14 virtual worlds, varying from a large Middle Eastern city to remote outposts.

A Realistic Walking Model for Enhancing Redirection in Virtual Reality

Redirected walking algorithms require the prediction of human motion in order to effectively steer users away from the boundaries of the physical space. While a virtual walking trajectory may be represented using straight lines connecting waypoints of interest, this simple model does not accurately represent typical user behavior. This paper presents a more realistic walking model for use in real-time virtual environments that employ redirection techniques. We implemented the model within a framework that can be used for simulation of redirected walking within different virtual and physical environments. Such simulations are useful for the evaluation of redirected walking algorithms and the tuning of parameters under varying conditions. Additionally, the model can also be used to animate an artificial humanoid “ghost walker” to provide a visual demonstration of redirected walking in virtual reality.
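
The contrast between straight waypoint-to-waypoint segments and more human-like curved paths can be sketched with a simple turn-rate-limited walker, as below. This generic steering rule only illustrates the idea; it is not the model proposed in the paper, and all parameters are arbitrary.

    # A generic sketch of the contrast described above: instead of straight
    # segments between waypoints, simulate a walker with a limited turning rate,
    # which yields the curved paths real users tend to produce. This is NOT the
    # model from the paper -- the steering rule and parameters are illustrative.
    import math

    def simulate_walk(waypoints, speed=1.0, max_turn=math.radians(90), dt=0.1,
                      arrive_radius=0.75):
        """Return (x, y) samples for a turn-rate-limited walker visiting waypoints.

        arrive_radius should exceed speed / max_turn (the minimum turning radius)
        so the walker cannot end up orbiting a waypoint forever.
        """
        x, y = waypoints[0]
        heading = 0.0
        path = [(x, y)]
        for wx, wy in waypoints[1:]:
            while math.hypot(wx - x, wy - y) > arrive_radius:
                desired = math.atan2(wy - y, wx - x)
                # Smallest signed angle between current and desired heading.
                err = (desired - heading + math.pi) % (2 * math.pi) - math.pi
                # Turn toward the waypoint, but no faster than max_turn rad/s.
                heading += max(-max_turn * dt, min(max_turn * dt, err))
                x += speed * dt * math.cos(heading)
                y += speed * dt * math.sin(heading)
                path.append((x, y))
        return path

    curved = simulate_walk([(0, 0), (4, 0), (4, 4), (0, 4)])
    print(len(curved), "samples, ending near", curved[-1])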

For virtual reality creators, motion sickness a real issue – AP Quotes Evan Suma

An Associated Press story about the issue of motion sickness in VR quoted ICT’s Evan Suma.

“The challenge is that people’s sensitivity to motion and simulator sickness varies wildly,” said Evan Suma, an assistant professor who studies VR at the University of Southern California, during a talk at the recent Game Developers Conference in San Francisco.

The New York Times and ABC News also ran this story.

A Visit to the Future of Therapy Practice

Since the 1990s, virtual reality (VR) has rapidly evolved from an expensive computer toy into an affordable, increasingly popular clinical tool for assessing, managing, and treating such conditions as anxiety disorders, PTSD, acute pain, autism spectrum disorder, and ADHD. This comprehensive overview of the therapeutic applications of VR, including a hands-on demo of the PTSD treatment system, shows how to:

  • Enhance exposure therapy for anxiety disorders and PTSD by immersing clients in simulated experiences
  • Add a new treatment dimension to cognitive and motor rehabilitation, as well as pain distraction
  • Use VR for highly interactive clinical training and as an online healthcare support
  • Advance clinical research and practice by leveraging the immersive and interactive components of the technology in such a fashion as to be considered “the ultimate Skinner box.”

People are our Advantage: The Impacts of Leader Development on Readiness

Panel Moderator: BG Peter Palmer, United States Army (Retired), Director, EDGE Innovation Network, General Dynamics

Panel Chair: LTG Robert B. Brown, Commanding General, United States Combined Arms Center

Panel Members:

Randy Hill, Ph.D., Executive Director, Institute for Creative Technologies, University of Southern California

MG John F. Wharton, Commanding General, United States Army Research, Development and Engineering Command

High School Students: Apply Now for the ICT Research Academy

ICT is launching a new summer program for high school students.

The ICT Research Academy is designed for Los Angeles area high school students (rising sophomores through seniors) who already have a track record of STEM commitment and engagement. Research Academy students will be placed in a research lab and integrated into ongoing research projects, participating in many different facets of research while gaining valuable insight and experience. Each student will be paired with a lab based on the student’s interests and availability within the lab. Additional activities include weekly seminars by ICT researchers.

Applications are due April 18. Don’t miss this great opportunity.

View our flyer.

Virtual Physicians: The Future of Healthcare

In the not-too-distant healthcare future, “virtually cloned” physicians will operate in AI-enabled, mixed reality clinics that offer 24/7 access and consultation. Aided by IoT-enabled wearable devices and implantable and ingestible sensors, virtually cloned physicians will collect on-demand diagnostic data to fuel proactive, seamless patient care extending beyond the walls of clinics, labs and hospitals. This will enable doctors to extend their expertise anywhere, to anyone, at any time, removing global barriers to healthcare access and improving quality of care. Join us to explore the revolutionary role virtual humans may play in your healthcare future! Part of the IEEE Tech for Humanity Series.

Presenters

Andrew Thompson, CEO, Proteus Digital Health

Jay Iorio, Director of Innovation, IEEE

Leslie Saxon, Executive Director and Founder, USC Center for Body Computing

Todd Richmond, Director of Advanced Prototypes and Transition, USC Institute for Creative Technologies

Lighting Hollywood’s Real and Virtual Actors

How does photoreal digital actor technology work and what does it mean for the future of acting and filmmaking? This talk will describe several projects using USC ICT’s Light Stage facial scanning systems, which have created photoreal digital actors for movies, video games, and cultural heritage, including The Curious Case of Benjamin Button, Avatar, Maleficent, Furious 7, and the pioneering research projects Digital Emily and Digital Ira. Recently, a mobile Light Stage helped create a 3D Portrait of President Barack Obama, and a full-body Light Stage is being used to three-dimensionally record and project testimony of survivors of the Holocaust.

Virtual reality therapy is coming. One academic’s predictions of how this will transform health care – Skip Rizzo in the Washington Post

A Washington Post article on the expected demand for virtual reality health care applications featured Skip Rizzo.

“What clinical VR provides is the opportunity to put a patient into a world that is different from where they’re sitting — one that is designed to have some therapeutic ingredient built into it,” said Albert “Skip” Rizzo, director for medical virtual reality at the University of Southern California’s Institute for Creative Technologies.

The story noted that Rizzo has worked on VR applications for post-traumatic stress disorder, autism, physical rehabilitation, emotional coping and even job interview training.

Rizzo said perhaps the most important advantage of VR therapy is that it lowers the barrier of access to mental health care for patients.

Trust Calibration within a Human-Robot Team: Comparing Automatically Generated Explanations

Researchers Personalize In-Game Character Design With Free Tools – Tech Times Covers Tools for Character Animation and Simulation

Tech Times covered the ICT Character Simulation and Animation group’s release of tools to make digital versions of people in under four minutes.

“We’re trying to foster innovation,” said Ari Shapiro, one of the project’s leads, adding that he and his colleagues are interested in releasing the software in the public domain to see what kinds of uses people will come up with for it.

How a Virtual Human could be your Coach – BBC News

The BBC News program Click featured a demonstration with Rachel, an ICT virtual human who can interact over a mobile phone. David Krum was interviewed for the story and discussed his research looking into whether engaging with virtual humans in this way can help change behavior for the better.

ELITE SHARP CTT

Download a PDF overview.

The Emergent Leader Immersive Training Environment Sexual Harassment/Assault Response & Prevention Command Team Trainer (ELITE SHARP CTT) is a laptop training application that gives United States (US) Army Command Teams the knowledge, skills, confidence, and ability to successfully execute the SHARP program. Developed under the guidance of the SHARP Program Management Office and in collaboration with the US Army SHARP Academy, the ELITE SHARP CTT content incorporates US Army-approved SHARP doctrine (AR 600-20) and evidence-based instructional design methodologies. ELITE SHARP CTT also leverages USC ICT research technologies such as virtual humans, story-based scenarios, and intelligent tutoring technology to help create a challenging yet engaging training experience.

The ELITE SHARP CTT software features 13 scenarios based on relevant real-world incidents that Command Teams may face on the job. Each scenario offers the user a chance to practice skills required by Command Teams to successfully respond to incidents of Sexual Harassment, Sexual Assault, and retaliatory behavior. The training experience includes three phases: Up-front Instruction, Practice Environment, and an After Action Review (AAR).

The total training time for ELITE SHARP CTT is flexible based on user and training needs. For example, time may vary depending on user experience level, performance, and engagement. The software allows users to review missed concepts based on how well they respond to quiz questions, and offers demonstration support through training vignettes and step-by-step skills comparisons. There is also the option for users to practice all three scenarios, and engage in the AAR after each practice environment.

ELITE SHARP CTT offers US Army Command Teams a unique opportunity to learn and practice SHARP skills so they’re better prepared for the incidents they may encounter. Upon completion of the ELITE SHARP CTT training, users will be able to demonstrate their understanding of the critical Command Team roles and responsibilities when leading the successful implementation of the US Army SHARP Program in their organizations.

ELITE SHARP CTT is available for download on the MilGaming web portal: https://milgaming.army.mil/

ELITE Counseling

Download a PDF overview.

The Emergent Leader Immersive Training Environment (ELITE) Counseling is a laptop training application used to teach interpersonal skills to United States (US) Army junior leaders by presenting real-world instructional scenarios in an engaging, self-reinforcing manner. The purpose of the training experience is to provide junior leaders with an opportunity to learn, practice and assess interpersonal communication skills for use in basic counseling. The ELITE Counseling content incorporates Army-approved leadership doctrine (ATP 6-22.1), evidence-based instructional design methodologies, and USC ICT research technologies, such as virtual humans and intelligent tutoring, to create a challenging yet engaging training experience.

The ELITE Counseling software has five scenarios. Users may choose the role of an Officer or NCO when playing through each scenario. All of the scenarios are based on real-world counseling issues such as financial troubles, post-deployment readjustment, and alcohol-related performance issues, as well as scenarios dealing with pre-Sexual Harassment/Assault Response and Prevention (SHARP) encounters. These scenarios offer users a chance to practice the interpersonal communication skills they learn during the ELITE Counseling instruction. The package includes three phases: Up-front Instruction, Practice Environment, and an After Action Review (AAR).

The total training time for ELITE Counseling is anywhere from one to two hours depending on a user’s proficiency. Time will vary depending on user experience level, performance, and engagement. Some users may take time to review missed concepts based on how well they respond to quiz questions. Some users may choose to watch all suggested training vignettes and comparisons. Some users may thoroughly engage in the AAR after an interaction in the practice environment. Some users may choose to practice all five scenarios for their given rank.

In the contemporary US Army, leaders must be prepared not only for the tactical side of leadership but also for the personal, softer side. Effective communication between leaders and their subordinates is paramount to maintaining the combat effectiveness of the force, and ELITE Counseling offers young US Army leaders a unique opportunity to learn and practice interpersonal skills so they are better prepared for the interactions they will encounter.

ELITE Counseling is available for download on the MilGaming web portal: https://milgaming.army.mil/

INOTS Counseling

Download a PDF overview.

The Immersive Naval Officer Training System (INOTS) Counseling is a laptop training application used to teach interpersonal skills to United States (US) Navy junior leaders by presenting real-world instructional scenarios in an engaging, self-reinforcing manner. The purpose of the training experience is to provide junior leaders with an opportunity to learn, practice and assess interpersonal communication skills for use in basic counseling. The INOTS Counseling content incorporates Navy-approved leadership doctrine (Division Officer Leadership Course), evidence-based instructional design methodologies, and USC ICT research technologies, such as virtual humans and intelligent tutoring, to create a challenging yet engaging training experience.

The INOTS Counseling software has nine scenarios. All of the scenarios are based on real-world counseling issues such as fraternization, workforce management, peer-to-peer communications, Sexual Harassment and physical fitness issues. These scenarios offer users a chance to practice the interpersonal communication skills they learn during the INOTS Counseling instruction. The package includes three phases: Up-front Instruction, Practice Environment, and an After Action Review (AAR).

The total training time for INOTS Counseling is anywhere from one to two hours depending on a user’s proficiency. Time will vary depending on user experience level, performance, and engagement. Some users may take time to review missed concepts based on how well they respond to quiz questions. Some users may choose to watch all suggested training vignettes and comparisons. Some users may thoroughly engage in the AAR after an interaction in the practice environment. Some users may choose to practice all nine scenarios.

In the contemporary US Navy, leaders must be prepared not only for the tactical side of leadership but also for the personal, softer side. Effective communication between leaders and their subordinates is paramount to maintaining the combat effectiveness of the force, and INOTS Counseling offers young US Navy leaders a unique opportunity to learn and practice interpersonal skills so they are better prepared for the interactions they will encounter.

Recognizing Human Actions in the Motion Trajectories of Shapes

People naturally anthropomorphize the movement of nonliving objects, as social psychologists Fritz Heider and Marianne Simmel demonstrated in their influential 1944 research study. When they asked participants to narrate an animated film of two triangles and a circle moving in and around a box, participants described the shapes’ movement in terms of human actions. Using a framework for authoring and annotating animations in the style of Heider and Simmel, we established new crowdsourced datasets where the motion trajectories of animated shapes are labeled according to the actions they depict. We applied two machine learning approaches, a spatial-temporal bag-of-words model and a recurrent neural network, to the task of automatically recognizing actions in these datasets. Our best results outperformed a majority baseline and showed similarity to human performance, which encourages further use of these datasets for modeling perception from motion trajectories. Future progress on simulating human-like motion perception will require models that integrate motion information with top-down contextual knowledge.
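
For readers curious what a spatial-temporal bag-of-words pipeline over such trajectories can look like in practice, the sketch below quantizes simple per-frame motion features into a learned vocabulary and trains a linear classifier. It illustrates the general technique, not the paper’s implementation; the features, vocabulary size, and toy data are assumptions.

    # Sketch of a spatial-temporal bag-of-words pipeline over trajectory clips.
    # NOT the paper's implementation: the per-frame features, vocabulary size,
    # and toy data below are illustrative assumptions.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LogisticRegression

    def frame_descriptors(traj_a, traj_b):
        """Per-frame motion features for two shapes' (T, 2) trajectories."""
        va, vb = np.diff(traj_a, axis=0), np.diff(traj_b, axis=0)
        speed_a = np.linalg.norm(va, axis=1)
        speed_b = np.linalg.norm(vb, axis=1)
        dist = np.linalg.norm(traj_a[1:] - traj_b[1:], axis=1)   # relative distance
        approach = -np.diff(dist, prepend=dist[0])               # > 0 when closing in
        return np.column_stack([speed_a, speed_b, dist, approach])

    def bag_of_words(clips, codebook):
        """Histogram of quantized frame descriptors for each clip."""
        hists = []
        for a, b in clips:
            words = codebook.predict(frame_descriptors(a, b))
            hists.append(np.bincount(words, minlength=codebook.n_clusters))
        return np.array(hists, dtype=float)

    def train(clips, labels, vocab_size=8):
        """clips: list of (traj_a, traj_b) pairs; labels: action names."""
        all_frames = np.vstack([frame_descriptors(a, b) for a, b in clips])
        codebook = KMeans(n_clusters=vocab_size, n_init=10).fit(all_frames)
        clf = LogisticRegression(max_iter=1000).fit(bag_of_words(clips, codebook), labels)
        return codebook, clf

    # Tiny synthetic demo (random trajectories stand in for annotated clips).
    rng = np.random.default_rng(0)
    demo_clips = [(rng.random((50, 2)), rng.random((50, 2))) for _ in range(6)]
    demo_labels = ["chase", "flee", "chase", "flee", "chase", "flee"]
    codebook, clf = train(demo_clips, demo_labels)
    print(clf.predict(bag_of_words(demo_clips[:2], codebook)))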

Conversing with Computers – National Science Foundation Features David DeVault’s Research

The homepage and blog of the National Science Foundation showcased recent natural language advances that let machines understand speech with human-like speed and accuracy.

In a recent research paper, DeVault and his students, Ramesh Manuvinakurike and Maike Paetzel, described the creation and evaluation of a high-performance game-playing agent called Eve.

In the game, users describe the pictures they see on their computer screen and the agent tries to guess which picture they are talking about as fast and accurately as it can. By using “incremental” (word-by-word) speech processing algorithms, the agent’s speed of understanding and response is so fast that its game performance rivals that of human-human teams playing the same game.

When compared to alternative versions of the agent that wait until a user’s speech is finished to try to understand and respond, users rate their interactions with the more incremental version of Eve as more efficient, more natural, and having better shared understanding.

“These findings underscore the importance of enabling systems to not only understand what users are saying, but to do so as quickly as a human would,” DeVault said.

The research received a Best Paper Award at the 16th Annual SIGdial Meeting on Discourse and Dialogue (SIGDIAL 2015). It points toward the creation of voice interfaces for other applications that users may find more natural and efficient to use. Read the full post.
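
The word-by-word idea can be illustrated in a few lines of Python: update the best guess after every incoming word and commit as soon as confidence clears a threshold, rather than waiting for the utterance to end. This toy sketch is not Eve’s actual algorithm; the picture descriptions and threshold are invented.

    # A minimal sketch of the word-by-word ("incremental") idea: update the best
    # guess after every word and respond as soon as confidence clears a threshold,
    # instead of waiting for the end of the utterance. NOT Eve's actual algorithm;
    # the picture descriptions and threshold are invented.
    pictures = {
        "p1": {"red", "car", "small"},
        "p2": {"red", "truck", "large"},
        "p3": {"blue", "car", "large"},
    }

    def incremental_guess(words, threshold=0.6):
        """Yield (word, best_picture, confidence) after each incoming word."""
        scores = {pid: 0.0 for pid in pictures}
        for word in words:
            for pid, keywords in pictures.items():
                if word in keywords:
                    scores[pid] += 1.0
            total = sum(scores.values()) or 1.0
            best = max(scores, key=scores.get)
            confidence = scores[best] / total
            yield word, best, confidence
            if scores[best] > 0 and confidence >= threshold:
                return  # commit early -- no need to hear the rest

    for word, best, conf in incremental_guess(["the", "red", "small", "car"]):
        print(f"after '{word}': best guess {best} (confidence {conf:.2f})")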

Watch the video: https://www.youtube.com/embed/emCgqbeatJs

ICT Distinguished Lecture Series: The synchronous massive online course (SMOC) and new model of online education

Abstract:
Growing out of research on social and personality psychology and computerized text analysis, Pennebaker and his colleague Sam Gosling have developed a live online class that relies on daily testing, small online discussion groups, and a TV talk show format. Across four classes ranging from 800 to 1,500 students, significant gains in performance were seen over previous courses. More striking, students taking the SMOC do better in courses in subsequent semesters and achieve large reductions in the achievement gap.

Bio:
Pennebaker is the Regents Centennial Professor of Psychology and Executive Director of Project 2021 at the University of Texas at Austin. Author of over 300 publications and 9 books, his work on physical symptoms, expressive writing, and language psychology is among the most cited in psychology and the social sciences. He has received multiple research and teaching awards.
Light refreshments will be served.

Calling all Undergrads: Apply Now for our REU Summer Research Program

The USC Institute for Creative Technologies (ICT) offers a 10-week summer research program for undergraduates in interactive virtual experiences. A multidisciplinary research institute affiliated with the University of Southern California, the ICT was established in 1999 to combine leading academic researchers in computing with the creative talents of Hollywood and the video game industry. Having grown to encompass over 130 faculty, staff, and students in a diverse array of fields, the ICT represents a unique interdisciplinary community brought together with a core unifying mission: advancing the state-of-the-art for creating virtual experiences so compelling that people will react as if they were real.

Reflecting the interdisciplinary nature of ICT research, we welcome applications from students in computer science, as well as many other fields, such as psychology, linguistics, art/animation, interactive media, and communications. Undergraduates will join a team of students, research staff, and faculty in one of several labs focusing on different aspects of interactive virtual experiences, with projects focusing on virtual reality, human-computer interaction, graphics, machine learning, natural language understanding, and other areas.

In addition to participating in seminars and social events, students will present their projects to the rest of the institute at an end-of-summer research fair.

Students will receive $5,000 over ten weeks, plus an additional $3,450 stipend for housing and living expenses. Non-local students can also be reimbursed up to $600 for travel. The ICT is located in West Los Angeles, just north of LAX and only 10 minutes from the beach.

This Research Experiences for Undergraduates (REU) site is supported by a grant from the National Science Foundation.

Students can apply online at: http://ict.usc.edu/reu/

Application deadline: March 7, 2016 (extended)

For more information, please contact Evan Suma at reu@ict.usc.edu.

Keynote Talk – Virtual Reality: The History, Hype and the Holodeck

Personal Assistant for Life Long Learning (PAL3)

Download a PDF overview.

PAL3 is a system for delivering engaging and accessible education via mobile devices. It is designed to provide on-the-job training and support lifelong learning and ongoing assessment.

The system features a library of curated training resources containing custom content and pre-existing tutoring systems, tutorial videos and web pages. PAL3 helps learners navigate learning resources through:

  • An embodied pedagogical agent that acts as a guide
  • A persistent learning record to track what students have done, their level of mastery, and what they need to achieve
  • A library of educational resources that can include customized intelligent tutoring systems as well as traditional educational materials such as webpages and videos
  • A recommendation system that suggests library resources for a student based on their learning record
  • Game-like mechanisms that create engagement (such as leaderboards and new capabilities that can be unlocked through persistent usage)

PAL3 allows students to find and suggest new content, which is then vetted by instructors.

A customizable interactive agent, “Pal,” is designed specifically to engage and motivate students through amusing animations and natural language dialog (voice and text).

The initial PAL3 prototype addresses knowledge decay and retention as sailors move from one schoolhouse to another. The PAL3 platform integrates with military education, training and career management systems and can serve as a guide to many different types of learning experiences designed to meet learners where they are and help them get where they need to go.

This project is a collaboration between ICT, Arizona State University and the University of Memphis and is funded by the Office of Naval Research.
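
As an illustration of how the recommendation step in the feature list above could combine the persistent learning record with the resource library, here is a minimal sketch; the topics, mastery scores, and data structures are hypothetical and do not reflect PAL3’s actual recommender.

    # A minimal sketch of how the recommendation step mentioned above could
    # combine the learning record with the resource library. This is an
    # illustration, not PAL3's actual recommender; topics, mastery scores, and
    # the data structures are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Resource:
        title: str
        topic: str
        kind: str          # "tutoring system", "video", "webpage", ...

    # Persistent learning record: topic -> estimated mastery in [0, 1].
    learning_record = {"electronics": 0.35, "navigation": 0.80, "first aid": 0.55}

    library = [
        Resource("Ohm's Law tutor", "electronics", "tutoring system"),
        Resource("Chart plotting basics", "navigation", "video"),
        Resource("CPR refresher", "first aid", "webpage"),
    ]

    def recommend(record, resources, n=2, mastery_goal=0.9):
        """Suggest the n resources whose topics are furthest from the mastery goal."""
        gap = lambda r: mastery_goal - record.get(r.topic, 0.0)
        return sorted(resources, key=gap, reverse=True)[:n]

    for r in recommend(learning_record, library):
        print(f"{r.title} ({r.kind}) -- topic: {r.topic}")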

Go Inside Virtual Reality – ABC’s Good Morning America Features Skip Rizzo and Bravemind

ABC News’ “Good Morning America” featured an interview with Skip Rizzo of the USC Institute for Creative Technologies about the technology behind virtual reality. The report highlights the institute’s work treating veterans with post-traumatic stress disorder.

Conference: Commonsense Interpretation of Triangle Behavior

The ability to infer intentions, emotions, and other unobservable psychological states from people’s behavior is a hallmark of human social cognition, and an essential capability for future Artificial Intelligence systems. The commonsense theories of psychology and sociology necessary for such inferences have been a focus of logic-based knowledge representation research, but have been difficult to employ in robust automated reasoning architectures. In this paper we model behavior interpretation as a process of logical abduction, where the reasoning task is to identify the most probable set of assumptions that logically entail the observable behavior of others, given commonsense theories of psychology and sociology. We evaluate our approach using Triangle-COPA, a benchmark suite of 100 challenge problems based on an early social psychology experiment by Fritz Heider and Marianne Simmel. Commonsense knowledge of actions, social relationships, intentions, and emotions are encoded as defeasible axioms in first-order logic. We identify sets of assumptions that logically entail observed behaviors by backchaining with these axioms to a given depth, and order these sets by their joint probability assuming conditional independence. Our approach solves almost all (91) of the 100 questions in Triangle-COPA, and demonstrates a promising approach to robust behavior interpretation that integrates both logical and probabilistic reasoning.
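
The flavor of this probability-ordered abduction can be shown with a small sketch: enumerate candidate assumption sets, keep the ones whose rules entail all observed behaviors, and rank them by joint probability under independence. The rules, priors, and observations below are invented for illustration and are not the Triangle-COPA axioms.

    # A toy illustration of probability-ordered abduction: enumerate candidate
    # assumption sets, keep the ones whose rules entail every observed behavior,
    # and rank them by joint probability under independence. The rules, priors,
    # and observations are invented and are NOT the Triangle-COPA axioms.
    from itertools import combinations
    from math import prod

    # Each rule: a set of assumptions that (defeasibly) entails an observable literal.
    rules = [
        ({"angry(bigTriangle)"},                      "hit(bigTriangle, circle)"),
        ({"playing(bigTriangle)", "playing(circle)"}, "hit(bigTriangle, circle)"),
        ({"afraid(circle)"},                          "runToCorner(circle)"),
        ({"tired(circle)"},                           "runToCorner(circle)"),
    ]
    priors = {"angry(bigTriangle)": 0.2, "playing(bigTriangle)": 0.4,
              "playing(circle)": 0.4, "afraid(circle)": 0.3, "tired(circle)": 0.1}
    observations = {"hit(bigTriangle, circle)", "runToCorner(circle)"}

    def entails(assumptions):
        derived = {head for body, head in rules if body <= assumptions}
        return observations <= derived

    # Search assumption sets up to a small size (the "depth" of the search).
    candidates = []
    for k in range(1, 4):
        for combo in combinations(sorted(priors), k):
            if entails(set(combo)):
                candidates.append((prod(priors[h] for h in combo), combo))

    for p, combo in sorted(candidates, key=lambda c: c[0], reverse=True)[:3]:
        print(f"p = {p:.3f}  assume {list(combo)}")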

Comforting a virtual ‘child’ can help people with depression – Wired UK Highlights ICT’s Virtual Reality Therapy

ICT’s virtual reality exposure therapy for PTSD is included in a Wired article on ways virtual reality experiences are being used to treat mental illness and neurological damage.

Survios’ First-Person Shooter Shows How Addictive VR Will Be – MxR Lab in TechCrunch

TechCrunch mentioned that Survios, a virtual reality startup that raised $4.2 million in funding, began when its founders worked at the USC Institute for Creative Technologies’ Mixed Reality Lab.

Now You REALLY Can Get into the Game – Daily Mail Features Rapid Avatar Generator

The UK’s Daily Mail featured a set of digital tools by Ari Shapiro of the USC Institute for Creative Technologies and colleagues that will allow anyone to scan and animate themselves in just four minutes.

Shapiro, who led the project along with Evan Suma and Andrew Feng, explained that the researchers plan to make the software free for the public, to open the door for creativity.

‘We’re giving everyone the ability to scan and animate themselves for free,’ said Ari Shapiro, head of the Character Animation and Simulation research group.

‘We’re trying to foster innovation,’ Shapiro said.

The Tech Times also covered the story.

Machine Learning and the Profession of Medicine – SimSensei in JAMA Viewpoint

A viewpoint article in the Journal of the American Medical Association cites Ellie, the clinical interviewer in ICT’s SimSensei project, as an example of a machine learning technology. The article states that the profession of medicine has a tremendous opportunity and an obligation to oversee the application of machine learning technology to patient care, noting that the world has entered a period of unprecedented innovation, bringing a wealth of possibilities to clinical medicine. With this extraordinary opportunity, however, comes the obligation of the medical profession to serve the human good, the writers state.

Come on, Science. Give Us Animal Superpowers Already – Skip Rizzo in Wired

An article about science journalist Kara Platoni’s new book We Have the Technology: How Biohackers, Foodies, Physicians, and Scientists Are Transforming Human Perception, One Sense at a Time quotes her discussing Skip Rizzo and his work treating post-traumatic stress with virtual reality exposure therapy.

“So the scientist that I met, that I went to Colorado with, is Dr. Skip Rizzo, and he had taken this idea and said, ‘Let’s take it to a digital native generation of soldiers. Let’s create virtual Iraq and virtual Afghanistan.’ So they had made these simulations that I was just describing, these very wrap-around experiences where they would use sight and smell and audio—all of these things—to recreate this virtual combat environment. And once again the soldier would be in the company of his therapist, and he’d be talking about what happened, and she would be manipulating the environment along with his memories, and over time they found, yes, people got better.”