‘We Have the Technology,’ by Kara Platoni – SF Chronicle Book Review Notes ICT Research

A San Francisco Chronicle review of Kara Platoni’s “We Have the Technology” includes mention of the book’s coverage of Skip Rizzo’s medical virtual reality work.

A subscription may be required to view the full article.

KPCC-FM’s “AirTalk” Features ICT Grit Research and Interviews Gale Lucas

KPCC-FM’s “AirTalk” featured research by ICT’s Gale Lucas and colleagues about perseverance. People with “grit” – the quality that leads them to persevere in the face of challenges – are often so overtaken by the need to engage with a challenge that they ignore simpler solutions to their problems, Lucas said.

Creative Commons photo by cliff1066 https://www.flickr.com

Nonny de la Peña: Journalist. Virtual reality pioneer. Occasional painter – TechRepublic Features ICT Virtual Reality Tech

A story about Nonny de la Peña notes that Hunger in Los Angeles, her first immersive journalism piece, was developed with a prototype at USC’s Institute for Creative Technologies, working with Mark Bolas, now-Oculus founder Palmer Luckey, Thai Phan, and Evan Suma. The piece premiered at the Sundance Film Festival in 2014.

USC Research Looks a Lot Like Star Wars

USC News covered USC researchers who are developing technologies reminiscent of those found in Star Wars.

When it comes to holograms, for instance, Paul Debevec, chief visual officer at the Institute for Creative Technologies, said the public will always remember one hologram above all others: “Princess Leia coming out of R2-D2 and telling Obi-Wan Kenobi that he was her only hope sticks in the public consciousness of what a hologram is supposed to be.”

That one depiction, from a movie 30 years back – which wasn’t a hologram at all, but a composited video image – has shaped what people expect from holographic displays.

At ICT, Debevec and his colleagues are working on three-dimensional displays created from a spinning surface. The effect is that of an object that looks real, floating in space, without the need for virtual-reality goggles.

They are also working to create a 3-D video conferencing system, one Debevec said was actually inspired by Star Wars. “There’s a moment when Yoda is supervising some operations on the Wookiee planet but he still has to go to the Jedi Council meeting, so he shows up at the meeting as a hologram,” he said.

How virtual reality is going to change our lives – The World Weekly Covers ICT Virtual Reality Therapy

A feature story about virtual reality in the World Weekly highlights the virtual reality work of Skip Rizzo, noting that his largely military-funded systems have been used to treat more than 2,000 veterans in hospital sites around the country, and are now being tailored to treat other sorts of trauma, such as the type experienced in the wake of terrorist attacks like the Paris bombings or the World Trade Center tragedy.

“This could be a real revolution in clinical care,” says Rizzo.

FX Guide Story on How Weta Digital Created a CG Paul Walker Mentions USC ICT Work

FX Guide interviews Weta Digital senior visual effects supervisor Joe Letteri about the sensitive task of re-creating Paul Walker’s performance in James Wan’s Furious 7 after the actor passed away mid-production. In the interview Letteri discusses scanning done in ICT’s Light Stage and mentions Koki Nagano’s skin stretch work.

Keynote Talk – Making Small Spaces Feel Large: Practical Illusions in Virtual Reality

The Telegraph Highlights Skip Rizzo and Virtual Reality Therapy in a Story about How VR Will Change Lives

The Telegraph (U.K.) highlighted research by Skip Rizzo of the USC Institute for Creative Technologies about the use of virtual reality in the treatment of PTSD. Rizzo’s systems have been used to treat more than 2,000 veterans and are being tailored to treat other kinds of trauma, including terrorist attacks. “Essentially, it helps the patient repeatedly confront and process very difficult emotional memories, while they narrate the scene they experienced in real life,” Rizzo said.

Invited Speaker – Machine Learning for Spoken Language Understanding and Interaction

Virtual reality gives peek inside New England Patriots’ locker room – ICT Tech in Christian Science Monitor

An article exploring virtual reality’s expanding role noted that Palmer Luckey’s background with virtual reality traces back to the University of Southern California’s Institute for Creative Technologies where, as Wired reported, therapists were using the technology to transport veterans back to the battlefield in an effort to treat post-traumatic stress disorder.

As it becomes more sophisticated, the technology has applications beyond gaming and sports immersion that industries from healthcare to journalism are starting to explore, stated the story.

Agent-Based Modeling of Life or Death Decisions Following an Urban Biological Catastrophe

To make informed policy decisions on how to both prepare for and respond to disasters in urban environments, we need to understand the decision making of the local residents who are affected. Their responses to both the disaster and the policies implemented will shape both the immediate impact and the long-term prognosis for recovery. Simulation provides a powerful method for analyzing the possible outcomes of hypothetical scenarios, where analysts can explore the interactions among possible disasters, policy decisions, and resident behaviors. Agent-based modeling offers the promise of simulations built on computational models that capture the relevant uncertainties and objectives that drive resident decision-making in these scenarios. Unfortunately, constructing models that represent the complexity of human decision-making with any fidelity is often a time-consuming, trial-and-error process. In this work, we present an automated pipeline that constructs agent models capturing the beliefs, objectives, and behaviors represented in people’s responses to surveys within a simulated disaster scenario. We use data gathered in a prior study using a video simulation of news reports on an evolving biological terrorist attack scenario. Our model construction process takes this data and extracts decision models in the form of PsychSim agents, using a multiagent framework for social simulation. PsychSim agents represent the beliefs and objectives of the survey respondents in decision-theoretic terms. We can thus compare the extracted PsychSim agent models against the original decision models manually extracted from the data. More importantly, we can then use the agents to conduct simulations and profile possible outcomes under hypothetical conditions not included in the original simulated scenario.
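To make the decision-theoretic framing concrete, here is a minimal sketch (not the actual PsychSim API) of an agent that picks the action maximizing expected utility over uncertain outcomes. The action names, outcome probabilities, and utility values below are hypothetical.

```python
def expected_utility(action, beliefs, utility):
    """Sum the utility of each outcome, weighted by the agent's belief it occurs."""
    return sum(p * utility[outcome] for outcome, p in beliefs[action].items())

def choose_action(actions, beliefs, utility):
    """Return the action with the highest expected utility."""
    return max(actions, key=lambda a: expected_utility(a, beliefs, utility))

# A resident deciding whether to evacuate or shelter during a biological attack.
beliefs = {
    "evacuate": {"safe": 0.7, "exposed": 0.3},
    "shelter":  {"safe": 0.5, "exposed": 0.5},
}
utility = {"safe": 1.0, "exposed": -1.0}

best = choose_action(["evacuate", "shelter"], beliefs, utility)
```

Fitting such an agent to survey data then amounts to finding beliefs and utilities under which the agent’s chosen actions match the respondents’ reported choices.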

New Dimensions in Testimony Project and ICT Research Featured in Australia’s Saturday Paper

The Saturday Paper, an Australian long-form journalism publication, featured New Dimensions in Testimony, a collaboration between the USC Shoah Foundation and ICT. The article describes how ICT’s highly realistic digital display and state-of-the-art speech recognition systems are preserving the ability to ask questions of Holocaust survivors far into the future.

“This is a really critical time,” said ICT’s David Traum. “Children today will be the last generation to hear from living survivors.”

The story explains how ICT’s speech recognition and natural language technologies enable the video version of Holocaust survivor Pinchas Gutter to answer questions.

This year, the story noted, Traum’s team reported that the video version of Gutter has answers to more than 95 per cent of the questions people tend to ask him.

The story states that it was a challenge to get the system to understand spoken queries in order for it to find the right answer in its bank of responses.

“It’s a very hard problem,” said ICT’s Kallirroi Georgila.

The article cites ICT research finding that the system gets it right about two thirds of the time and notes that a small study of children who met with a survivor in person as well as the video projection of Gutter showed no “meaningful” differences between their experiences. For future plans the story mentions an ICT prototype that can display people in 3-D and reports that three other Holocaust survivors have been interviewed for the project.

It also discussed other uses for the technology. Traum believes that in time we may all be creating virtual effigies of our loved ones. “The cameras on your phone could record a sick grandfather, so that future generations could have an interactive dialogue with him,” he said. Georgila proposes a virtual companion for the elderly. “If you don’t have anyone to talk to, you could talk to a dialogue system,” she said.

Rolling Stone Quotes Paul Debevec about Celebrity “Holograms”

A Rolling Stone story covered the trend and technology behind celebrity “holograms,” including a planned projection of deceased singer Whitney Houston. The story quotes ICT’s Paul Debevec explaining the technique used to create the effect and the role of high-resolution scans.

Most holograms we see are actually just a modern twist on an optical illusion known as “Pepper’s Ghost,” which dates back 150 years, and involves little more than the reflection of a 2D image through an angled piece of translucent plastic — what one expert in 3D projection, USC professor Paul Debevec, described as “a giant piece of Saran Wrap,” stated the story.

For now, the main obstacle to bringing someone back to life is that a decent 3D image is needed to start with. “If I were one of these folks concerned about their legacy,” said Debevec, “I would say, ‘Before you get a day older, get yourself scanned in high resolution. Preserve yourself!'”

Read more: http://www.rollingstone.com/music/features/inside-the-celebrity-hologram-trend-20151204#ixzz3txVrO0DV

Creative Help: A Story Writing Assistant

We present Creative Help, an application that helps writers by generating suggestions for the next sentence in a story as it is being written. Users can modify or delete suggestions according to their own vision of the unfolding narrative. The application tracks users’ changes to suggestions in order to measure their perceived helpfulness to the story, with fewer edits indicating more helpful suggestions. We demonstrate how the edit distance between a suggestion and its resulting modification can be used to comparatively evaluate different models for generating suggestions. We describe a generation model that uses case-based reasoning to find relevant suggestions from a large corpus of stories. The application shows that this model generates suggestions that are more helpful than randomly selected suggestions at a level of marginal statistical significance. By giving users control over the generated content, Creative Help provides a new opportunity in open-domain interactive storytelling.
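The evaluation idea above can be sketched in a few lines: score a suggestion’s helpfulness by the edit distance between it and the user’s final modification, with fewer edits meaning a more helpful suggestion. This uses a standard Levenshtein implementation; the example strings are invented for illustration.

```python
def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance between strings a and b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

suggestion = "The door creaked open."
user_final = "The door creaked open slowly."
score = edit_distance(suggestion, user_final)  # lower score = more helpful suggestion
```

Averaging this score over many suggestions gives a simple way to compare generation models, as the abstract describes.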

New Dimensions in Testimony: Digitally Preserving a Holocaust Survivor’s Interactive Storytelling

We describe a digital system that allows people to have an interactive conversation with a human storyteller (a Holocaust survivor) who has recorded a number of dialogue contributions, including many compelling narratives of his experiences and thoughts. The goal is to preserve as much as possible of the experience of face-to-face interaction. The survivor’s stories, answers to common questions, and testimony are recorded in high fidelity, and then delivered interactively to an audience as responses to spoken questions. People can ask questions and receive answers on a broad range of topics, including the survivor’s experiences before, during and after the war, and his attitudes and philosophy. Evaluation results show that most user questions can be addressed by the system, and that audiences are highly engaged with the resulting interaction.
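As a hedged illustration (not ICT’s actual pipeline), the core retrieval step can be thought of as matching a spoken question against a fixed bank of recorded answers and playing back the best match. Here a toy token-overlap scorer stands in for the real language-understanding models, and the question/answer texts are invented.

```python
def tokenize(text):
    """Lowercase and split into a set of words, dropping simple punctuation."""
    return set(text.lower().replace("?", "").replace(".", "").split())

def best_response(question, response_bank):
    """Pick the recorded response whose trigger question overlaps most with the query."""
    q = tokenize(question)
    return max(response_bank, key=lambda r: len(q & tokenize(r["question"])))

bank = [
    {"question": "where were you born", "answer": "I was born in Lodz, Poland."},
    {"question": "what gives you hope", "answer": "Young people give me hope."},
]
match = best_response("Where were you born?", bank)
```

A real system must handle speech-recognition errors and paraphrases, which is why the abstract treats robust question understanding as the hard part.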

Procedural Reconstruction of Simulation Terrain Using Drones

Photogrammetric techniques for constructing 3D virtual environments have previously been plagued by expensive equipment and imprecise, visually unappealing results. However, with the introduction of low-cost, off-the-shelf (OTS) unmanned aerial systems (UAS), lighter and more capable cameras, and more efficient software techniques for reconstruction, the modeling and simulation (M&S) community now has available to it new types of virtual assets that are suited for modern-day games and simulations. This paper presents an approach for fully autonomously collecting, processing, storing and rendering highly-detailed geospecific terrain data using these OTS techniques and methods. We detail the types of equipment used, the flight parameters, the processing and reconstruction pipeline, and finally the results of using the dataset in a game/simulation engine. A key objective of the research is procedurally segmenting the terrain into usable features that the engine can interpret – i.e. distinguishing between roads, buildings, vegetation, etc. This allows the simulation core to assign attributes related to physics, lighting, collision cylinders and navigation meshes that not only support basic rendering of the model but introduce interaction with it. The results of this research are framed in the context of a new paradigm for geospatial collection, analysis and simulation. Specifically, the next generation of M&S systems will need to integrate environmental representations that have higher detail and richer metadata while ensuring a balance between performance and usability.
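For illustration only, here is a toy heuristic for the segmentation step described above: labeling reconstructed terrain points as road, building, or vegetation from height above ground and the verticality of the surface normal. The thresholds are invented; real pipelines use far richer geometric and photometric features.

```python
def classify_point(height_m, normal_up):
    """Classify a terrain point. normal_up is the vertical component of the
    unit surface normal (1.0 = perfectly horizontal surface)."""
    if height_m < 0.3 and normal_up > 0.95:
        return "road"          # low, flat, horizontal surface
    if height_m > 2.5 and normal_up > 0.8:
        return "building"      # elevated, roof-like flat surface
    return "vegetation"        # everything else: irregular, mid-height

labels = [classify_point(0.1, 0.99),   # pavement-like point
          classify_point(6.0, 0.90),   # rooftop-like point
          classify_point(1.5, 0.40)]   # bush-like point
```

Once points carry such labels, the engine can attach the physics, collision, and navigation attributes the paper mentions.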

Toward Acquiring a Human Behavior Model of Competition vs Cooperation

One of the challenges in modeling human behavior is accurately capturing the conditions under which people will behave selfishly or selflessly. Researchers have been unable to craft purely cooperative (or competitive) scenarios without significant numbers of subjects displaying unintended selfish (or selfless) behavior (e.g., Rapoport & Chammah, 1965). In this work, rather than try to further isolate competitive vs. cooperative behavior, we instead construct an experimental setting that deliberately includes both, in a way that fits within an operational simulation model. Using PsychSim, a multiagent social simulation framework with both Theory of Mind and decision theory, we have implemented an online resource allocation game called “Team of Rivals”, where four players seek to defeat a common enemy. The players have individual pools of resources which they can allocate toward that common goal. In addition to their progress toward this common goal, the players also receive individual feedback, in terms of the number of resources they own and have won from the enemy. By giving the players both an explicit cooperative goal and implicit feedback on potential competitive goals, we give them room to behave anywhere on the spectrum between these two extremes. Furthermore, by moving away from the more common two-player laboratory settings (e.g., Prisoner’s Dilemma), we can observe differential behavior across the richer space of possible interpersonal relationships. We discuss the design of the game that allows us to observe and analyze these relationships from human behavior data acquired through this game. We then describe decision-theoretic agents that can simulate hypothesized variations on human behavior. Finally, we present results of a preliminary playtest of the testbed and discuss the gathered data.

LA Times Features USC President Nikias Speech at Silicon Beach Event at ICT

The Los Angeles Times featured outreach by USC President C. L. Max Nikias to Silicon Beach tech companies. He spoke to executives from SpaceX, Google, Belkin and more. Nikias said that USC is hiring professors into several academic units at once in order to increase interdisciplinary thinking. The story highlighted several related USC initiatives, including the Iovine Young Academy and the World Bachelor in Business program. “Innovation isn’t something that you make the investment and you anticipate,” he said. “It comes when you’re being surprised, because you have provided this environment and the right network of contacts.”

USC News also covered the event, which also included Nikias being scanned in ICT’s Light Stage and demonstrations of several ICT virtual reality technologies.

(USC Photo/Gus Ruelas)

Building Trust in a Human-Robot Team with Automatically Generated Explanations

Technological advances offer the promise of robotic systems that work with people to form human-robot teams that are more capable than their individual members. Unfortunately, the increasing capability of such autonomous systems has often failed to increase the capability of the human-robot team. Studies have identified many causes underlying these failures, but one critical aspect of a successful human-machine interaction is trust. When robots are more suited than humans for a certain task, we want the humans to trust the robots to perform that task. When the robots are less suited, we want the humans to appropriately gauge the robots’ ability and have people perform the task manually. Failure to do so results in disuse of robots in the former case and misuse in the latter. Real-world case studies and laboratory experiments show that failures in both cases are common. Researchers have theorized that people will more accurately trust an autonomous system, such as a robot, if they have a more accurate understanding of its decision-making process. Studies show that explanations offered by an automated system can help maintain trust with the humans in case the system makes an error, indicating that the robot’s communication transparency can be an important factor in earning an appropriate level of trust. To study how robots can communicate their decision-making process to humans, we have designed an agent-based online test-bed that supports virtual simulation of domain-independent human-robot interaction. In the simulation, humans work together with virtual robots as a team. The test-bed allows researchers to conduct online human-subject studies and gain better understanding of how robot communication can improve human-robot team performance by fostering better trust relationships between humans and their robot teammates. In this paper, we describe the details of our design, and illustrate its operation with an example human-robot team reconnaissance task.

Discover Magazine Covers ICT’s Virtual Reality Exposure Therapy for PTSD

An article in Discover Magazine cited ICT’s virtual reality exposure therapy for treating post-traumatic stress as an example of the potential for virtual reality to trigger personal memories of a particular place or experience.

Keynote Talk – Do’s and Don’ts for Software Companions

ABC News Los Angeles Features ICT’s Virtual Reality Therapy for PTSD

ABC News Los Angeles affiliate KABC-TV featured virtual reality research by Skip Rizzo of the USC Institute for Creative Technologies and colleagues to treat PTSD in military veterans.

Yahoo News Features ICT’s Research In Defense of (Sometimes) Giving Up

Yahoo News featured research by ICT’s Gale Lucas and colleagues on perseverance, popularly described as “grit.” While it’s an important trait for driving success, grit can also cause people to be unwilling to reevaluate unsuccessful strategies.

Creative Commons photo by cliff1066 on Flickr.

Public Speaking Training with a Multimodal Interactive Virtual Audience Framework

We have developed an interactive virtual audience platform for public speaking training. Users’ public speaking behavior is automatically analyzed using multimodal sensors, and multimodal feedback is produced by virtual characters and generic visual widgets depending on the user’s behavior. The flexibility of our system allows us to compare different interaction mediums (e.g. virtual reality vs normal interaction), social situations (e.g. one-on-one meetings vs large audiences) and trained behaviors (e.g. general public speaking performance vs specific behaviors).

Detection and Computational Analysis of Psychological Signals of Distress Using a Virtual Human Interviewing Agent

Facial expressions, body gestures, and vocal parameters play an important role in human communication and the signaling of emotion. Advances in low-cost computer vision and behavioral sensing technologies (webcams, depth-sensing cameras, and 3D microphones) can now be applied for capturing human behavior and subsequently providing probabilistic inferences regarding the psychological state of a person when they interact with a live or virtual human interviewer. The SimSensei Virtual Human (VH) interviewing system provides participants with the opportunity to engage in a private conversational interview with a VH character who asks an interviewee open-ended questions that are responded to with natural language. During the course of the interview, cameras and a microphone capture the user’s facial/body and vocal communication “signals” for quantification and interpretation. We will present SimSensei interview data from a National Guard unit (n=25) obtained prior to and immediately following a combat deployment to Afghanistan. Initial analyses suggest that differences between pre- and post-deployment interviews were found on automatic measures of facial expression related to displays of “joy” and “fear”. Interviewees were also more likely to endorse the presence of adverse psychological symptoms to the virtual human (Ellie) compared to their responses on the PDHA questionnaire.

Asian Age Quotes Paul Debevec on Current and Future Uses of Holograms

Asian Age quoted Paul Debevec of the USC Institute for Creative Technologies on holograms of dead singers being used in concerts.

Debevec said that creating holograms of dead people was much more challenging than the actual concert design. “The technology that is new and interesting is the ability to create a human likeness to someone that is dead — Tupac, Michael Jackson,” he said. “The technology of putting it on stage is completely uninteresting.”

Paul Debevec Named to 3D World Hall of Fame

3D World Magazine announced that Paul Debevec is this year’s entrant to the 3D Hall of Fame. A post on the magazine’s website noted that Debevec is a visionary computer graphics researcher with a long history of innovation. It noted, in particular, that his work on image-based lighting and HDR techniques has resonated around the CG and VFX world, and the ‘Light Stages’ he created are still widely used today.

LA Times Covers Army Research Lab Plan to Open Largest Ever University Outpost at ICT

The LA Times covered the establishment of ARL West at ICT. The Army Research Lab will maintain up to 70 researchers at ICT starting next year. The story states that this largest U.S. Army Research Laboratory (ARL) outpost outside of Maryland aims to become a key source of ideas on data analysis, robotics and virtual reality.

According to the article, USC was a natural spot for the Army group to land in Southern California. ICT’s work with the military and Hollywood already has had a role in big developments, including stress therapy and disaster response apps, graphics technology used in the film “Avatar” and the recent reemergence of virtual reality headsets.

“We believe that establishing West Coast research teams at USC and ICT will lead to new research directions, new perspectives, new industry partners and ultimately new discoveries that will help the Army solve current and future challenges,” said Army Research Lab Director Thomas Russell.

USC News and the Army Research Lab both ran articles announcing ARL West.

Army’s New Campus to House 70 Researchers at USC Institute for Creative Technologies

USC First West Coast University to Attract Army Research Laboratory

Contact: Orli Belman, USC Institute for Creative Technologies, (310) 301-5006, belman@ict.usc.edu

Joyce Conant, Army Research Lab Public Affairs, (410) 278-8603, joyce.m.conant2.civ@mail.mil

As part of an initiative to spur scientific breakthroughs, the U.S. Army Research Laboratory (ARL) plans to recruit up to 70 researchers to be based at the University of Southern California Institute for Creative Technologies (ICT) in Playa Vista.

ARL West will be the laboratory’s largest university outpost and the first one west of the Mississippi. It will leverage USC and regional expertise to broaden its abilities for the discovery, innovation and transition of science and technology. ARL, part of the U.S. Army Research, Development and Engineering Command based in Maryland, is the Army’s central laboratory for internal and external fundamental research.

“USC’s federally-funded computer science research and development has led to technological advances that improve lives for active service members and veterans,” said USC President C. L. Max Nikias. “Today’s announcement serves as recognition that ICT has established itself at the national forefront of this important work, and that we have the opportunity to achieve even greater societal benefit through the establishment of this new model of government-university collaboration.”

ARL West was announced today at ARL’s open house at Aberdeen Proving Ground, MD. Part of ARL’s recently launched Open Campus program, it will expand upon ICT virtual reality and immersive technology research through co-location and collaboration.

“The initial focus of ARL West will be in the area of Human Information Interaction, which involves research into how humans generate and interact with data to make decisions more effectively and efficiently,” said Dr. Thomas Russell, director of ARL. “As we develop this new campus, we will also be establishing new relationships on the West Coast with other researchers in academia and industry to augment work ongoing there and work at our other sites. This collaboration, in an open work environment, will further develop the work we do for our service members and the Nation.”

ICT was founded in 1999 with a U.S. Army contract to partner with film and game studios to advance the art and science of simulations and computer-supported training. It is estimated that hundreds of thousands of service members and veterans have used ICT training systems and health therapies, which include:

Bravemind, a virtual reality therapy for treating soldiers and veterans with post-traumatic stress.
MILES, a virtual veteran that provides training for social workers who treat service members.
DisasterSim, a tool to prepare people for planning and conducting natural disaster response and relief.

“Our partnership with USC leads to solutions for soldiers,” said Russell. “We believe that establishing West Coast research teams at USC and ICT will lead to new research directions, new perspectives, new industry partners, and ultimately new discoveries that will help the Army solve current and future challenges.”

The impacts of ICT’s Army-funded research extend far beyond the military, including:
• The latest low-cost, high-fidelity virtual reality headsets, like Oculus Rift, Samsung Gear and Google Cardboard.
• Award-winning computer graphics in movies like Avatar.
• Hologram-like projections of Holocaust survivors who answer questions about their experiences as part of New Dimensions in Testimony, an educational collaboration with the USC Shoah Foundation.
• Virtual humans that identify signs of depression or post-traumatic stress.

The area around ICT has been dubbed Silicon Beach for the concentration of creative and technology companies now based near Playa Vista, Venice and Santa Monica. ARL West will capitalize on LA’s culture of innovation by bringing scientists and engineers together in this important research hub.

ARL West research will not be classified. Researchers, expected to begin arriving in early 2016, will include locally recruited scientists, students, and representatives from other Army organizations.

“This new collaboration will further advance research in emerging areas like robotics, haptics and data visualization and provides a model for building open and diverse teams to tackle complex problems,” said Randolph Hall, USC’s vice president of research. “It also provides another way for USC to support those who serve our country.”

In addition to the groundbreaking work at ICT, USC has maintained a decades-long commitment to veterans and the military.

The Playa Vista area has a long history of defense-inspired research, much of it developed at the former Hughes Aircraft facilities, at USC’s Information Sciences Institute in Marina del Rey and at ICT.

“Bringing experts from different fields together for a common research goal has been a hallmark of research at ICT and has led to many important technology transitions,” said Randall W. Hill, Jr., executive director of ICT. “We are honored that the Army Research Laboratory selected us to expand this type of collaboration and to help them achieve their goals.”


Download a PDF overview.

CounterNet is a single-player, web-based counter-terrorism game that teaches how to identify, track, counter and thwart online terrorist activity.

A player takes on the role of a government professional trying to prevent an attack by a fictional eco-terrorist group called Terala. The player must prevent Terala from utilizing social media platforms (e.g., Twitter, Instagram, Vine), content creation and dissemination platforms such as YouTube, payment systems (e.g., PayPal) and email. The game centers around 10 fundamental uses of the internet that cyber-terrorists can exploit to their advantage:
• Propaganda
• Financing
• Training
• Planning
• Execution
• Recruitment
• Incitement
• Radicalization
• Public Information
• Secret Communication

Learning Objectives include:
• Knowing when and how to take action and how to mobilize the broader society
• Understanding tradeoffs and costs (2nd and 3rd order effects) that need to be anticipated and managed
• Learning how to work alongside other government organizations, the private sector and the international community to keep one step ahead of cyber-terrorists

CounterNet is part of the Global Ecco suite of online games from the Naval Postgraduate School (NPS). It was funded by NPS and developed for the George C. Marshall European Center for Security Studies as part of their Program on Cyber Security Studies (PCSS) core curriculum. It has been used in two previous PCSS courses at the center and continues to be used by a broader audience of cyber-security professionals and students.

A Multimodal Predictive Model of Successful Debaters or How I Learned to Sway Votes

Interpersonal skills such as public speaking are essential assets for a large variety of professions and in everyday life. The ability to communicate in social environments often greatly influences a person’s career development, and can help resolve conflict, gain the upper hand in negotiations, or sway public opinion. We focus our investigations on a special form of public speaking, namely public debates of socioeconomic issues that affect us all. In particular, we analyze performances of expert debaters recorded through the Intelligence Squared U.S. (IQ2US) organization. IQ2US collects high-quality audiovisual recordings of these debates and publishes them online free of charge. We extract audiovisual nonverbal behavior descriptors, including facial expressions, voice quality characteristics, and surface-level linguistic characteristics. Within our experiments we investigate whether it is possible to automatically predict if a debater or his/her team is going to sway the most votes after the debate using multimodal machine learning and fusion approaches. We identify unimodal nonverbal behaviors that characterize successful debaters, and our investigations reveal that multimodal machine learning approaches can reliably predict which individual (∼75% accuracy) or team (85% accuracy) is going to win the most votes in the debate. We created a database consisting of over 30 debates with four speakers per debate suitable for public speaking skill analysis and plan to make this database publicly available for the research community.
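A minimal sketch of the multimodal fusion idea, under assumed inputs: suppose each modality (face, voice, language) yields a per-debater “win” probability from its own model, and a simple weighted late fusion combines them. The weights and scores here are invented for illustration; the paper’s actual models and fusion scheme are not reproduced.

```python
def fuse(scores, weights):
    """Weighted average of per-modality win probabilities for one debater."""
    total = sum(weights.values())
    return sum(scores[m] * weights[m] for m in scores) / total

# Hypothetical per-modality probabilities that this debater sways the most votes.
debater_scores = {"face": 0.62, "voice": 0.55, "language": 0.71}
weights = {"face": 1.0, "voice": 1.0, "language": 2.0}  # trust language cues more

win_prob = fuse(debater_scores, weights)
predicted_winner = win_prob > 0.5
```

Late fusion like this keeps each modality’s model independent, which makes it easy to compare unimodal against multimodal performance, as the abstract describes.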

Innovative Technology, Opening the Way to the Future

Paul Debevec, a U.S. researcher in CG technology famous for his R&D contributions to “Spider-Man” and “Avatar,” will talk about current trends in CG technology and how he arrives at new technology ideas. He recently developed “An Auto-multiscopic Projector Array for Interactive Digital Humans” at USC. As it received the “DC EXPO 2015 Special Prize,” he will also explain that invention and its future applications.

Later, panelists from Japanese academic, business, and governmental circles will join to offer suggestions on how to create an environment in which new technology ideas emerge continuously, and on the approaches to industrial application of technology that future generations will need in order to work effectively in research and development and bring technology into industry.

San Antonio Express Quotes David Krum on Virtual Reality for Education

ICT’s David Krum was quoted in a story about the use of virtual reality in schools.

Virtual reality devices that can track users’ head movement and adjust the picture accordingly have been around since the 1960s, said David Krum, associate director of the Mixed Reality Lab at the University of Southern California’s Institute for Creative Technologies.

The technology has improved over the years but until recently, its expense made it impractical for widespread use. Now, with companies like Facebook, Google and Samsung entering the virtual reality sphere, devices are available for as little as the cost of a smartphone and a cardboard headset.

That matters for education because being in a place — or a realistic simulation of one — can help people put what they’re learning in context, Krum said.

The Atlantic and Christian Science Monitor and USC News Cover ICT Grit Research

Several outlets covered ICT research finding that grit is not always good.

In a story headlined “Give Up”, the Atlantic quoted study lead author Gale Lucas.

“Right now, there’s an effort to push everyone to be more gritty,” said Lucas. “There’s no reason not to make people grittier, but it’s important to know when to quit and reevaluate rather than blindly push through.”

The Christian Science Monitor noted that knowing when to change course when a strategy is not working is relevant for the military, and that the Army and Air Force provided funding for the research, which Lucas conducted with ICT’s Jonathan Gratch and colleagues from Northeastern University.

USC News and the Sydney Morning Herald also covered the research.

Photo/Adam Kubalica

Wall Street Journal Covers New Dimensions in Testimony in Story about Museums of the Future

A Wall Street Journal article looking at the future of museum exhibits featured the New Dimensions in Testimony collaboration between ICT and the USC Shoah Foundation. The story noted that the project is being piloted at the Illinois Holocaust Museum, where visitors ask questions of a video image of survivor Pinchas Gutter and natural language technology software retrieves an appropriate response, as if he were in the room. It also described plans to continue this work with more survivors, with the goal of using ICT’s Light Stage and 3-D multistereo projection system to create a hologram-like image.

Keynote Talk: AI in Education as Distributed Services: A New Ecosystem for Education and Research

Artificial Intelligence in Education (AIED) is scaling to large numbers of persistent users, which offers new opportunities and new difficulties. As AIED scales, it will need to follow recent trends in service-oriented and ubiquitous computing: breaking AIED platforms into distinct services that can be composed for different platforms (web, mobile, etc.) and distributed across multiple systems. This will represent a move from learning platforms to an ecosystem of interacting learning tools. Such tools will enable new opportunities for both user-adaptation and experimentation. Traditional macro-adaptation (problem selection) and step-based adaptation (hints and feedback) will be extended by meta-adaptation (adaptive system selection) and micro-adaptation (event-level optimization). The existence of persistent and widely-used systems will also support new paradigms for experimentation in education, allowing researchers to understand interactions and boundary conditions for learning principles. The ability to perform this type of ongoing and powerful research studies will completely change the landscape of educational research. New central research questions for the field will also need to be answered due to these changes in the AIED landscape. This talk will describe recent efforts to tackle some of these research questions, including three recent integrations of multiple separate intelligent tutoring systems. Successes and challenges will be discussed, from the perspective of technological, social, and economic factors.
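The "meta-adaptation" idea above, adaptively selecting which tutoring system a learner should use, can be sketched as a simple selection policy. This is a toy illustration under invented assumptions (service names, a single scalar "learning gain" signal, a greedy best-mean rule); real AIED service ecosystems would use richer learner models and exploration strategies.

```python
# Toy sketch of meta-adaptation: pick the tutoring service with the best
# observed average learning gain for a given topic. Names and numbers are
# hypothetical, purely to illustrate the selection layer.

from collections import defaultdict

class MetaAdapter:
    def __init__(self):
        self.gains = defaultdict(list)  # (service, topic) -> observed gains

    def record(self, service, topic, gain):
        """Log a learning gain observed after using a service on a topic."""
        self.gains[(service, topic)].append(gain)

    def select(self, services, topic):
        """Greedily choose the service with the highest mean gain so far."""
        def mean_gain(s):
            g = self.gains[(s, topic)]
            return sum(g) / len(g) if g else 0.0
        return max(services, key=mean_gain)

adapter = MetaAdapter()
adapter.record("tutor_a", "algebra", 0.30)
adapter.record("tutor_b", "algebra", 0.55)
adapter.record("tutor_b", "algebra", 0.45)
print(adapter.select(["tutor_a", "tutor_b"], "algebra"))  # prints tutor_b
```

A purely greedy rule like this never revisits a poorly rated service; a production system would add exploration (e.g., epsilon-greedy or a bandit algorithm) so that experimentation across the ecosystem, as the talk envisions, remains possible.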

If the going gets tough, when should the tough give up?

Perseverance is essential for success, but researchers caution that “gritty” individuals need to know when to quit.

Contact: Andrew Good at (213) 740-8606 or .

If there’s a “secret sauce” for success, it just might be grit. As defined by psychologist Angela Duckworth, it refers to a kind of persistence that only gets stronger in the face of adversity. It’s a hallmark of overachievers, whether in the office or the classroom, and drives them on even when they’re staring down failure.

But a new study suggests there are risks in having too much grit to quit. Researchers found the trait can be hard to switch off – and those who have it sometimes set themselves up for failure by overreaching. The study results have big implications for how to best help students prepare for college and the SAT.

“Grit is something the U.S. Department of Education is promoting, and people are looking at how to increase it among students,” said lead author Gale Lucas of the USC Institute for Creative Technologies. “Grit is great, but not a lot of research has been done to identify the downsides. This study helps ensure that when we’re evaluating grit, we’re doing it in a round and complete way.”

The study was co-authored by Jonathan Gratch, also of ICT; Lin Cheng, formerly of ICT and now with Northeastern University; and Stacy Marsella of Northeastern University. It was published in the Journal of Research in Personality on September 8.

Read the full story at USC News.

Fusion Features SimSensei

A video on Fusion features ICT’s SimSensei project in a story about virtual humans that, for some patients, may prove more effective than real-life therapists in managing depression. In this episode of Asking for a Friend, Cleo Stiller speaks to Skip Rizzo and explores the technologies being developed and who could benefit most.

Overnight hackathon shows the promise of virtual reality as a health care tool

Event sponsored by USC Center for Body Computing, USC Institute for Creative Technologies and IEEE Standards Association delivers digital tools to improve health

They didn’t sleep. But they did dream big.

During an overnight event held at the USC Institute for Creative Technologies, teams of biomedical engineers, computer scientists, physical therapists and even toy designers took part in “Hacking Virtual Medicine,” a design, development and coding marathon that challenged participants to create virtual reality health care solutions for patients and providers.

Using game design software and smartphones enhanced with cardboard 3-D viewers on Oct. 3-4, six teams addressed problems of disease management, social isolation, on-the-job training and doctor-patient communication.

“Virtual reality provides the perfect medium to explore these areas, and I was blown away by what the participants produced,” said Leslie Saxon, a Keck Medicine cardiologist and executive director of the USC Center for Body Computing (CBC), which organized the hackathon. “This event showed the promise of virtual reality as a health care tool and also the power of technologists, scientists, artists and health care professionals joining forces to tackle problems in unexpected ways.”
Presenting prototypes

To build empathy — and encourage treatment compliance — a USC team simulated what diabetic patients might experience if they lose their eyesight due to disease progression. California Institute of Technology contestants captured the chaos of a code blue emergency in a hospital room to prepare budding physicians before they experience the real thing. And a trio with day jobs at the Walt Disney Co. designed a way to immersively display internal organs so doctors can better explain what is going on inside of bodies.

“Virtual reality used to be out of reach in terms of cost and capabilities, but that is no longer the case,” said Todd Richmond, director of advanced prototype development at ICT. “The challenge now is discovering how to craft content and create experiences that matter. Hacking medicine is a great place to start.”

Read the full story at USC News

Hacking Virtual Medicine

Let’s hack healthcare.

With the advancement of Virtual Reality, we now have the potential to unlock incredible new ways to share medical experiences by allowing patients and physicians to interact remotely without sacrificing the benefits of being physically present.

Imagine an immersive walk through a surgical procedure, or potential outcomes of a difficult medical decision; a simulated role reversal between physician and patient; a VR tool for smoking cessation…the possibilities are endless!

Virtual Reality has already brought a new personal level of interaction to communication and immersion in our lives, with endless implications for the rapidly evolving digital health space.

We’ve already seen VR’s effective uses for years in training, diagnosing, and treating medical conditions, with examples including exposure therapy, PTSD treatment, pain management, and surgical training. But how might we push these experiences and applications further?

The answers are up to you.

Join the USC Center for Body Computing, USC Institute for Creative Technologies & IEEE Standards Association for an inspiring weekend of brainstorming and building innovative solutions with like-minded engineers, clinicians, designers, developers and business visionaries as we hack medicine and see what extraordinary new possibilities come to life! A $10,000 prize will be awarded.

Submissions at the end of the hackathon will be narrowed down to 4 finalists. The finalists will have 4 more days to continue hacking their solutions into the best possible products, to be presented to a panel of judges on Thursday, October 8. The winner will be awarded the cash prize and added to the Body Computing Conference program as a presenter on Friday, October 9.

Captivating Virtual Instruction for Training (CVIT)

Download a PDF overview.

CVIT aims to shape the future of distributed learning (dL) through the delivery of course material that is not only informative and educational but engaging and stimulating for participants.

The goal is to develop a comprehensive strategy for educators and courseware developers when designing, producing and deploying remote course material. The foundation to this strategy is a technique-to-technology mapping that aligns successful instructional techniques used by live classroom instructors (humor, motivation, tone, pace) with core enabling technologies (virtual humans, casual games, augmented reality, intelligent tutors and narrative-driven experiences) so that dL content may be delivered with not just educational value, but emotional impact and user captivation.

For this multi-year research effort, ICT is working with the Maneuver Center of Excellence at Ft Benning, Georgia to develop a prototype centered around the Army’s Advanced Situational Awareness – Basic (ASA-B) training program. The principles, ontology, assessment criteria and overall instructional design for CVIT are intended to be domain independent and relevant to inform other efforts attempting to ‘digitize’ influential and quality instructors and courses.

Building on the CVIT principles developed for ASA-B, two additional courses are being developed: a course on the Army Information Architecture (CVIT-IA) for Ft. Huachuca and the Supervisor Development Course Refresher (SDC-R) for the Army Management Staff College at Fort Leavenworth. CVIT-IA leverages game-like scenarios in which the learner role-plays different intelligence missions, to help soldiers understand the connections and roles of military intelligence sections and systems. CVIT-SDC-R reinvents existing courseware to feature engaging instructional videos and to allow learners to practice their supervisory interpersonal skills and procedures in game-like interactions with virtual employees.

The CVIT prototype is intended as refresher training for the application of ASA-B course skills, and as a tool for trainers to use back at their home stations for training those unable to attend a resident course. The CVIT methodology and technology platform is not intended to replace training courses or content, but rather to provide a framework and set of recommendations for how to digitally deliver the material in a convincing manner.

This project is funded by the U.S. Army Research Lab through ICT’s UARC contract.

The Verge: Inside USC’s crazy experimental VR lab

The Verge featured virtual reality research from ICT, focusing on “redirected walking,” which fools users into thinking they’re in a big virtual space. The story notes how drones are being incorporated into current research. It also mentions the ICT Light Stage, Project Blueshark, research by Skip Rizzo on using VR to treat PTSD, and Mark Bolas, head of ICT’s Mixed Reality Lab. The article notes that ICT is sponsored by a contract with the U.S. Army, which allows researchers to focus on hard problems and evaluate technologies like VR.

“We’re really struggling to figure out augmented and virtual and mixed reality, and what those spaces mean and how we’re going to have interactions that are meaningful,” says ICT’s Todd Richmond. “And the only way to do it is by doing it.”

The Verge also uploaded this video to YouTube.

The Guardian Features ICT Virtual Humans for Mental Health Support

The Guardian featured ELLIE, a virtual human interviewer developed by the USC Institute for Creative Technologies. In ICT’s SimSensei project, ELLIE is designed to respond to a patient’s social cues and make it easier for a real therapist to make a diagnosis. The story cites ICT research suggesting people are more willing to share personal details with a computer character than with a real human being.

TechCrunch Cites ICT’s VR Health Care Work

An article on innovation in health care virtual reality notes that ICT is developing prototypes and conducting research in many areas of medical virtual reality.

Huffington Post Cites ICT Work Advancing Treatment for Post-Traumatic Stress and Traumatic Brain Injury

A Huffington Post story on advances in treatment for PTS and TBI notes Skip Rizzo’s virtual reality research.

USC News Features Evan Suma’s Redirected Walking Research

An article on USC News features Evan Suma and his research aiming to create whole-body VR experiences.

“There is a cognitive cost to using joysticks and sitting in front of a monitor that can prevent you from really feeling like you are in another place,” Suma said. “Moving around in a virtual environment is still an unsolved problem, but people are now paying attention in a way they have not before.”

French TV ARTE Covers ICT Virtual Humans Research

The French program Future Mag features ICT research on virtual humans and emotion. Jump to 22 minutes in to see Jon Gratch and coverage of ICT.

Army News Service Covers ELITE Trainer for Sexual Assault and Harassment Prevention

A story on the Army homepage featured a recent beta test of the ELITE SHARP Command Team Trainer, or CTT, a spin-off of the ICT-developed ELITE counseling trainer. The story noted that participants felt the new application would provide current and future command teams with a useful tool to better prepare for dealing with a SHARP incident.

It also stated that the test was the result of collaboration between the U.S. Army SHARP Program Office, the U.S. Army SHARP Academy, Army Research Laboratories, the University of Southern California Institute for Creative Technologies, and the National Simulation Center – Army Games for Training program. It explains that feedback from the testing will be incorporated into program revisions.

Executive.Gov also covered the story.

Fortune Interviews Todd Richmond about Virtual and Augmented Reality

Fortune quoted Todd Richmond of the USC Institute for Creative Technologies about the role virtual reality and augmented reality could play in video games.

“There is no doubt that these immersive technologies will be the most important innovation of the next gaming generation, and will also impact every other aspect of our lives,” said Richmond.

Exploring Feedback Learning Strategies to Improve Public Speaking: An Interactive Virtual Audience Framework

Good public speaking skills convey strong and effective communication, which is critical in many professions and used in everyday life. The ability to speak publicly requires a lot of training and practice. Recent technological developments enable new approaches for public speaking training that allow users to practice in a safe and engaging environment. We explore feedback strategies for public speaking training that are based on an interactive virtual audience paradigm. We investigate three study conditions: (1) a non-interactive virtual audience (control condition), (2) direct visual feedback, and (3) nonverbal feedback from an interactive virtual audience. We perform a threefold evaluation based on self-assessment questionnaires, expert assessments, and two objectively annotated measures of eye-contact and avoidance of pause fillers. Our experiments show that the interactive virtual audience brings together the best of both worlds: increased engagement and challenge as well as improved public speaking skills as judged by experts.

New Scientist Features ICT Advances Using Virtual Humans for Mental Health

New Scientist highlighted research by Jonathan Gratch and Gale Lucas of the USC Institute for Creative Technologies finding that people are more likely to engage with a virtual versus a human therapist. The study found that people are more comfortable admitting psychological problems to an avatar than a human being, though that doesn’t replace the need for human therapists.

“When you’re faced with a human, all these social processes activate. You’re more likely to cooperate with a person, more likely to apply racial stereotypes, more likely to feel judged,” said Gratch.

Re/Code Quotes Evan Suma on Consumer VR

Re/code quoted Evan Suma of the USC Institute for Creative Technologies about virtual reality as a consumer product.

“I’ve been doing this for over a decade, and now when I put on a display at a convention or a conference and am just sitting down — I don’t have a reaction to that any more,” said Evan Suma, a research assistant professor at USC’s Institute for Creative Technologies. “The novelty effect wore off.”

Virtual Reality LA Summer Expo

2:35pm – The Art of Live Action VR, presented by IM360

How can VR be used to tell stories? What different types of narrative can be enabled by the technology? What techniques can you use, and which ones will the future bring? These questions and more will be addressed in a panel presented by IM360, which recently announced a new app, TFF: Sinatra at 100. Featuring Andre McGovern of IM360, Sajid Sadi of Samsung Research America’s ‘Think Tank’ Team and Vrandom Zamel, founder and CEO of Springbol Entertainment. Moderator duty will be taken on by David Nelson of the USC Institute for Creative Technologies.

4:25pm – Light Fields and Photoreal Virtual Actors for Virtual Reality

A talk by Paul Debevec, Chief Visual Officer at ICT, that will look at the essential components for engaging VR content: characters with realism and reactive viewpoints. Debevec will look into how technology being developed at ICT helps with these topics.

Calling all coders: USC Center for Body Computing hosts ‘Hacking Virtual Medicine’ 2-day hackathon with ICT and IEEE Standards Association

October 3- 4 event to create a new medical experience evolution using Virtual Reality

Information for potential participants can be found at the event website.

Sherri Snelling at (949) 887-1903 or sherri.snelling@med.usc.edu (USC CBC)
Orli Belman at (310) 709-4156 or belman@ict.usc.edu (USC ICT)

As an academic leader in the digital health revolution, the University of Southern California (USC) Center for Body Computing (CBC), part of the Keck School of Medicine of USC, announced today its first hackathon, “Hacking Virtual Medicine,” using Virtual Reality (VR) tools to create a consumer experience evolution in health and medicine. Software and hardware programmers, developers, and designers will join innovative engineers, clinicians, business people, and others for a 2-day marathon brainstorming and building event October 3-4. The event will be held at the USC Institute for Creative Technologies (ICT) in Playa Vista, on the university’s Silicon Beach campus; five finalists will be chosen to continue hacking until October 8, when a panel of expert judges will award the top prize winner a $10,000 grant to further develop the concept.

The USC CBC hackathon, sponsored by USC and IEEE Standards Association (IEEE-SA) with support from Akido Labs, USC D-Health @HTE and Doctor Evidence, will utilize the Google Cardboard smartphone viewer. Training and on-site support will be provided by the MxR Lab. MxR, part of ICT and the USC School of Cinematic Arts, is at the forefront of developing low-cost VR solutions, including the FOV2G0, a low-cost DIY smartphone viewer that predates Google Cardboard. The challenge is to take VR smartphone capability and change consumer behavior when it comes to health.

The USC CBC has developed provocative questions that need answers such as, “What will I look like in 20 years if I don’t stop smoking?” “Why do I need a colonoscopy?” or “Can a VR game teach my child about diabetic glucose monitoring?” Participants are challenged to think about role reversals between physicians and patients and to develop concepts that create an immersive experience that can help with difficult medical procedures or other health issues.

“Our digital health mission is to make the world more open and connected while also providing continuous monitoring and personalized care delivered remotely,” said Leslie Saxon, M.D., a Keck Medicine cardiologist and executive director of the USC CBC.

“Virtual reality is the new prescription in how we can treat patients – it enables consumers to adopt healthier behaviors and gain knowledge through their own experience. It’s an exciting time for anyone at the intersection of technology and medicine, and this is just the starting point for a whole new approach to health care,” added Dr. Saxon.

Virtual reality is an emerging technology tool for physicians, surgeons, and other health care professionals to change the doctor-patient dynamic and empower patients to become engaged, to communicate, to connect, to share, and ultimately to be healthier – all through personalized virtual experiences. Diverse fields including medicine, psychology, neuroscience, and physical and occupational therapy are seeing evidence-based patient benefits from the advancement of VR.

“Immersive technologies can break down barriers – both geographically and emotionally – with some patients by allowing them to feel more empowered and in control,” said Todd Richmond, director of advanced prototype development at USC ICT. “By putting on a headset, consumers are transported anywhere without having to leave their physical location. We’ve seen success in using VR to treat patients with PTS and other mental health issues, to manage pain, to overcome phobias and even to enhance surgical training. With VR, patients can be more engaged, ultimately helping health care professionals become more successful in patient outcomes.”

“Virtual reality and body-computing technologies provide tremendous opportunities to deliver groundbreaking services by removing geographical boundaries, completely changing patient-doctor dynamics,” said Jay Iorio, director of innovation, IEEE Standards Association. “This approach requires new types of content, and providing an open environment for content creators from disparate industries to collaborate and innovate very early in the development process will ensure VR-enabled personalized health care reaches its full potential.”

Participants for the USC Center for Body Computing “Hacking Virtual Medicine” need to apply by the deadline: Monday, September 21 at 11:59 p.m. (PST). Entrants should have the imagination to dream up big solutions. The hackathon will provide sample clinical data, medical expertise, and some technical/programming tutorials from ICT’s MxR Lab developer team. Individuals/teams who register should bring their ideas, design, art, and programming skills (Unity 3D game engine).

Each year the USC CBC offers a competition for designers, programmers, clinicians, and others to use digital tools to promote imaginative, innovative, disruptive, next-generation digital health products that empower consumers. Last year’s SLAM contest, sponsored by Skullcandy, was won by SingFit, a mobile music app that aids in therapy for those with everything from autism and depression to chronic pain and Parkinson’s disease. Previous years’ competition winners have included LumoBack, a start-up with a posture sensor, and Vampire Rancher, a mobile social gaming platform for children with diabetes.

About the USC Center for Body Computing Conference
The 9th Annual USC Center for Body Computing Conference is an event designed to bring digital and life sciences executives, sensor and mobile app inventors, strategists, designers,
investors and visionaries from health care, entertainment, fashion and technology together
for one day to showcase an international array of digital and mobile health ideas. Founded by Leslie Saxon, an international digital health guru and trained USC cardiologist who has spoken at TED MED and Wired international conferences, the USC CBC conference is her promise to make consumers “the heroes of their own health stories.” The conference will be held on the USC campus in downtown Los Angeles on October 9. For more information or to register for the conference, visit: uscbodycomputing.org

About USC Institute for Creative Technologies (ICT)
At the University of Southern California Institute for Creative Technologies (ICT), leaders in artificial intelligence, graphics, virtual reality and narrative are advancing low-cost immersive techniques and technologies to solve problems facing service members, students and society. Established in 1999, ICT is a DoD-sponsored University Affiliated Research Center (UARC) working in collaboration with the U.S. Army Research Laboratory. ICT brings film and game industry artists together with computer and social scientists to study and develop immersive media for military training, health therapies, education and more.

About the IEEE Standards Association
The IEEE Standards Association, a globally recognized standards-setting body within IEEE, develops consensus standards through an open process that engages industry and brings together a broad stakeholder community. IEEE standards set specifications and best practices based on current scientific and technological knowledge. The IEEE-SA has a portfolio of over 1,100 active standards and more than 500 standards under development.

Towards Adaptive, Interactive Virtual Humans in Sigma

Sigma is a nascent cognitive architecture/system that combines concepts from graphical models with traditional symbolic architectures. Here an initial Sigma-based virtual human (VH) is introduced that combines probabilistic reasoning, rule-based decision-making, Theory of Mind, Simultaneous Localization and Mapping, and reinforcement learning in a unified manner. This non-modular unification of diverse cognitive, robotic and VH capabilities provides an important first step towards fully adaptive and interactive VHs in Sigma.

MIT Technology Review Quotes Mark Bolas on Real-World Objects in Virtual Worlds

MIT Technology Review quoted Mark Bolas of the USC Institute for Creative Technologies about the interplay between virtual reality environments and real-world objects.

Bolas, an associate professor at the University of Southern California and director of mixed reality research at its Institute for Creative Technologies, says such intersections of real-world objects with virtual reality are “grounding moments,” making it more likely that people will believe that the virtual experience is real.

“Every time you have one of those you let go of the real world a little bit more,” he says.


DisasterSim

Download a PDF overview.

DisasterSim is a game-based training tool focused on international disaster relief. Trainees take on the role of a joint task force staff member coordinating the US Department of Defense’s (DoD) humanitarian aid and disaster relief efforts in a foreign country following a natural disaster.

Among the tasks, trainees must attempt to restore essential services, reconstruct civil infrastructure and provide humanitarian assistance, all while managing interactions with local civil authorities, non-governmental organizations, and other US government relief organizations. They must use their judgment to prioritize and execute lifesaving tasks while operating within DoD limits related to medical relief and infrastructure repairs. Trainee actions in the exercise can impact future interactions and may also influence the overall scenario.

DisasterSim is built upon the UrbanSim platform. UrbanSim is an ICT-developed cognitive trainer widely used in Army classroom and operational settings to practice executing the “Art of Mission Command” in asymmetric or irregular warfare environments, including counterinsurgency and stability operations.

Both training games are driven by a novel story engine that interjects events and situations based on real-world experiences and lessons learned. They include an intelligent tutoring system, which provides guidance to trainees during execution, as well as after action review capabilities.

UrbanSim officially transitioned to the Army as part of two programs of record, Games for Training and the Low Overhead Training Toolkit. It is available at the Army’s MilGaming portal.

DisasterSim is funded by PEO STRI and is developed in partnership with the US Agency for International Development’s (USAID) Office of Foreign Disaster Assistance (OFDA). Final delivery is expected by fall 2015.

Opponent Modeling for Virtual Human Negotiators

Comparing behavior towards humans and virtual humans in a social dilemma

Virtual agents have become more realistic over time, but does this mean that the behavior of humans toward virtual agents is similar to their behavior toward real humans? In this study, the social behavior of participants was examined within an “iterated prisoner’s dilemma” scenario, played against either human or “virtual human” opponents. The virtual humans used in these studies are three-dimensional embodied agents rendered on a computer, capable of displaying facial expressions and other animations. The behavior displayed toward these virtual humans was compared with that shown toward other humans. The study was conducted at the USC Institute for Creative Technologies, using an existing framework that allows participants to play the prisoner’s dilemma against each other. This framework was adapted to let participants play the prisoner’s dilemma against an autonomous virtual human following a specific strategy. The studies showed a significant difference in how people treat a virtual human compared to a real human: people display less social behavior toward virtual human opponents than toward human opponents. These results suggest several directions to explore for improving the behavior of virtual humans so that they receive more social treatment.
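The experimental setup, a participant playing an iterated prisoner's dilemma against an agent with a fixed strategy, can be sketched as below. The tit-for-tat strategy and the standard payoff values (T=5, R=3, P=1, S=0) are illustrative assumptions; the study does not specify which strategy its virtual human used.

```python
# Minimal iterated prisoner's dilemma sketch: a participant's moves are
# played against an agent using tit-for-tat. "C" = cooperate, "D" = defect.
# Strategy and payoffs are assumed for illustration, not taken from the study.

PAYOFF = {  # (my_move, their_move) -> my points
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(opponent_history):
    """Cooperate first, then copy the opponent's previous move."""
    return "C" if not opponent_history else opponent_history[-1]

def play(participant_moves):
    """Return (participant_score, agent_score) over the whole game."""
    p_score = a_score = 0
    p_history = []  # participant's past moves, visible to the agent
    for p_move in participant_moves:
        a_move = tit_for_tat(p_history)
        p_score += PAYOFF[(p_move, a_move)]
        a_score += PAYOFF[(a_move, p_move)]
        p_history.append(p_move)
    return p_score, a_score

print(play(["C", "C", "D", "C"]))  # prints (11, 11)
```

A framework like the one described in the study would swap the scripted `participant_moves` for live input, and render the agent's moves through an embodied virtual human rather than returning raw scores.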

A Platform for Building Mobile Virtual Humans

We describe an authoring framework for developing virtual humans in mobile applications. The framework abstracts many elements needed for virtual human generation and interaction, such as the rapid development of nonverbal behavior, lip syncing to speech, dialogue management, access to speech transcription services, and access to mobile sensors such as the microphone, gyroscope and location components.

Poster Presentation: Smart Mobile Virtual Humans: Chat with Me!

In this study, we are interested in exploring whether people in everyday life would talk with 3D animated virtual humans on a smartphone for a longer amount of time, as a sign of feeling rapport, compared to non-animated or audio-only characters. Previous studies found that users prefer animated characters in emotionally engaged interactions when the characters are displayed on mobile devices, but those findings came from lab settings. We aimed to reach a broad range of users outside the lab, in natural settings, to investigate the potential of our virtual human on smartphones to facilitate casual yet emotionally engaging conversation. We also found that the literature has not reached a consensus regarding the ideal gaze patterns for a virtual human. One thing researchers do agree on is that inappropriate gaze can negatively impact conversations, at times even more than receiving no visual feedback at all. In everyday life, continuous mutual gaze can feel awkward or uncomfortable; on the other hand, gaze aversion can make a speaker think their partner is not listening. Our work further aims to address the question of what constitutes appropriate eye gaze in emotionally engaged interactions.

Virtual Human Toolkit Tutorial

In order to reach the full potential that intelligent virtual agents offer, they not only require a range of individual capabilities, but also for these to be integrated into a single framework that allows us to efficiently create characters that can engage users in meaningful and realistic social interactions. This integration requires in-depth, inter-disciplinary understanding that few individuals, or even teams of individuals, possess. The ICT Virtual Human Toolkit helps to address this challenge by offering a flexible framework for exploring a variety of different types of virtual human systems. Due to its modularity, the Toolkit allows researchers to mix and match provided capabilities with their own, lowering the barrier to entry for this multi-disciplinary research challenge.

This tutorial focuses on 1) the general architectural concepts supporting the Toolkit, 2) how to install and run the Toolkit main example, 3) how to create your own basic character, and 4) pointers on how to extend the Toolkit and/or use its capabilities within your own work.

USA Today Notes Pioneering VR Work of MxR Lab and Mark Bolas

USA Today’s “College” section featured USC student Cosmo Scharf, co-founder of VRLA, an expo event for virtual reality developers, enthusiasts and pioneers. The story noted that USC’s MxR Lab, run by Mark Bolas of the USC Institute for Creative Technologies, has pioneered many innovations in the field. The story noted that Bolas was instrumental in mentoring Oculus Rift founder Palmer Luckey. Vice also covered the story.

Wired Features Evan Suma and Redirected Walking

An article in Wired covered redirected walking research and the potential for virtual reality to allow people to explore immersive virtual spaces—like buildings or even whole cities—on foot in head-mounted displays.

“This problem of how you move around when in VR is one of the big unsolved problems of the VR community,” said Evan Suma.

The article noted that Suma has taken advantage of a phenomenon called change blindness: if we don’t see an object move, we don’t really notice its position change. For example, he rearranges the virtual world behind a participant’s back by rotating the location of a door by 90 degrees. It states that he tested 77 participants who each went through a dozen such rotated doors—and only one person noticed anything funky going on. By using this technique of rotating doors, he’s able to create a virtual building of several thousand by several thousand feet within a 16-by-16-foot room.
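The geometry behind the trick can be sketched in a few lines: if the world silently rotates 90 degrees at every doorway, a user who believes they are walking a long straight route actually circles inside a small physical space. This toy model is illustrative only, not Suma’s implementation:

```python
# Sketch of the change-blindness redirection idea: each time the user
# walks through a door, the virtual world behind them is rotated 90
# degrees, so a long virtual path folds back into a small physical room.
# (Illustrative geometry only; segment length and layout are assumptions.)

def physical_positions(n_segments, segment_len=4.0):
    """Physical path: heading turns 90 degrees after each doorway,
    so the walker loops around inside the room."""
    x = y = 0.0
    heading = 0  # 0=E, 1=N, 2=W, 3=S
    path = [(x, y)]
    for _ in range(n_segments):
        dx, dy = [(1, 0), (0, 1), (-1, 0), (0, -1)][heading]
        x += dx * segment_len
        y += dy * segment_len
        path.append((x, y))
        heading = (heading + 1) % 4  # the hidden 90-degree world rotation
    return path

# Virtually, the user believes they walked straight the whole time:
virtual_distance = 12 * 4.0              # 12 doors, 4 m per segment
phys = physical_positions(12)
room_extent = max(max(abs(x), abs(y)) for x, y in phys)
print(virtual_distance, room_extent)     # long virtual walk, small room
```

After a dozen doorways the virtual path is 48 m long, yet the physical walker never leaves a 4 m square.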

Virtual Acquisition Career Guide (VACG)

Download a PDF overview.

The Virtual Acquisition Career Guide provides a digital forum for quickly and accurately obtaining career advice and guidance. Designed for Army Acquisition Workforce members, the web-based system features an integrated, interactive ICT virtual character, known as ELLIE, to serve as a personalized career mentor and a virtual help desk.

VACG specializes in helping people monitor their individual progress in specific areas such as certification status, professional development courses and individual development plans. It can assist workforce members with applying for certification, editing their Acquisition Career Record Brief, and registering for Defense Acquisition University courses. ELLIE has the ability to answer 20,000 variations of commonly asked questions. Users can either type into the free-text entry box, or click one of the common question buttons to retrieve an answer.

ELLIE can provide personal checks to all users by reading their data and identifying any delinquencies in their career records. Users will be able to access career models and guidelines that help them understand opportunities available for education, self-development, and experience.

The long-term objectives of the VACG are to:

  • Realize cost and time savings by reducing the load on U.S. Army Acquisition Support Center personnel
  • Supplement the face-to-face relationship of mentors and protégés with an online capability for workforce members to obtain answers to career-related questions and assistance in navigating various Army online resources
  • Provide users with immediate access to their personal career information and records.

VACG is built upon ICT’s SimCoach and RoundTable platforms, which include support for web-based virtual humans and comprehensive web-based tools for interaction design. It uses Ellie, an ICT virtual human who also serves as the interviewer for ICT’s SimSensei health care support project.

This project is funded by the U.S. Army Acquisition Support Center (ASC) and was developed in partnership with ASC and the U.S. Army Research Laboratory, Simulation & Training Technology Center.

One World Terrain (OWT)

Download a PDF overview.

The objective of the One World Terrain (OWT) research effort is to provide a set of 3D global terrain capabilities and services that can replicate the coverage and complexities of the operational environment.

This is important because the current geospatial ecosystem is vast and fragmented, with a multitude of collection methods, databases, and organizations within it. Manual terrain reconstructions can take months, the fidelity of databases is nowhere near current rendering capabilities, and the DoD has spent significant amounts of money to generate terrain for training and simulation systems. We have to ensure that decision-makers receive accurate, up-to-date geospatial data when and where they need it.

The OWT project has begun to address this need. Researchers have done so by diving deep into every phase of geospatial data science: Collection, Processing, Storage, Distribution, and Application.

OWT research has successfully transitioned a number of its advanced prototypes to the DoD, most recently its rapid 3D terrain capture and reconstruction pipeline, which is a pillar of the Marine Corps’ Tactical Decision Kit (TDK). The OWT team is currently making key advances in Processing, Storage, and Distribution. Using classification, segmentation and modeling approaches, they are making geospecific data simulation-ready, and with custom cloud solutions, they are providing processed data to decision-makers efficiently and securely.

OWT is also a pillar of the Army’s Synthetic Training Environment, which will provide collective training to the point-of-need using the latest in immersive technologies. This aligns with the geospatial capabilities of the future, which must be intuitive, authoritative, and able to fuse disparate data types into a singular experience.

Ultimately, the solutions created with the One World Terrain project will enable warfighters at different echelons with different requirements to experience a seamless, realistic geospatial foundation when executing their training, at a level of interactive quality never seen before.

View all OWT videos here.


Simulated Threat Assessment Trainer (STAT)

Download a PDF overview.

The Simulated Threat Assessment Trainer (STAT) provides a laptop training capability to help K-12 school administrators and psychologists practice conducting threat assessment interviews with students and responding if needed. STAT presents real-world instructional scenarios and incorporates best practices and evidence-based threat assessment response processes. Developed in collaboration with the Santa Ana Unified School District (SAUSD) and the University of Southern California Institute for Creative Technologies (USC ICT), STAT features research technologies such as virtual humans and intelligent tutoring to create a challenging yet engaging learning experience.

In STAT, users assume the role of an administrator, then a psychologist as they play through scenarios in which they must respond to a threat, conduct threat assessment interviews with a virtual human student, and decide how to handle the situation.

The STAT experience will take place after administrators and psychologists have learned threat response techniques and the following threat management principles: Calm, Ask, Listen, and Motivate (CALM). These skills are the foundation for the STAT system design, and are tracked during the interaction with the virtual human student. The CALM skills drive the performance feedback provided to users, and were developed by SAUSD and USC ICT.

STAT leverages a software platform previously developed for the Department of Defense under the direction of the U.S. Army Research Lab Simulation and Training Technology Center (ARL STTC).

The STAT collaborative effort between USC ICT and SAUSD was made possible through a grant from the Alliance of Schools for Cooperative Insurance Programs (ASCIP).

ELITE Lite Sexual Harassment/Assault Response & Prevention (SHARP)

The Verge, Gizmodo, Motherboard and CNET Cover Realistic CG Skin Advances from ICT Graphics Lab

The Verge, in a story about this year’s SIGGRAPH conference, highlighted the ICT Graphics Lab’s digital skin stretch research, which was presented at the conference. The demo showed off the highly-detailed texture of skin as photo-realistic, digital faces changed expression. “It’s just a matter of time before our video game heroes and villains become that extra bit more human,” the story noted.

This research was also featured in Gizmodo, Motherboard, CNET, FxGuide and more.

Near-Field VR Wins Immersive Realities Contest at SIGGRAPH 2015

The MxR Lab has been hard at work creating a unique immersive experience called “Discovering Near-Field VR: Stop Motion with a Touch of Light-Fields and a Dash of Redirection,” which just won the Immersive Realities AR/VR Contest at SIGGRAPH 2015.

The Immersive Realities AR/VR contest had 48 submissions from all over the world. A total of ten pieces were demonstrated at the VR Village, with the contest winner selected from among the top three finalists.

This effort brought together the contributions of numerous people from ICT, the USC School of Cinematic Arts and the USC Viterbi School of Engineering.

Congrats to everyone involved. Read more here.

Scanning and Printing a 3D Presidential Portrait

State-of-the-art technologies, including a custom-built mobile light stage, were used to 3D scan and print President Barack Obama for the nation’s first 3D presidential portrait.

Skin Microstructure Deformation With Displacement Map Convolution

Synthesizing the effects of skin microstructure deformation by anisotropically filtering a high-resolution displacement map to match normal distribution changes in measured skin samples. Facial animation made with the technique shows more realistic and expressive skin under deformation and strain.

Poster: Creating Near-Field VR Using Stop-Motion Characters and a Touch of Light-Field Rendering

This new type of VR experience and pipeline was created using light-field techniques to provide a rich immersive visual quality for near-field VR. The pipeline was integrated with Unity3D and hand-crafted stop-motion puppets were introduced to the VR medium for surrealistically compelling experiences.

Real Time Live: My Digital Face

This project puts the capability of producing a photorealistic face into the hands of nearly anyone, without an expensive rig, special hardware, or 3D expertise.

Using a single commodity depth sensor (Intel RealSense) and a laptop computer, the research team captures several scans of a single face with different expressions. From those scans, a near-automatic pipeline creates a set of blendshapes, which are puppeteered in real time using tracking software. An important stage of the blendshape pipeline is automated to identify and create correspondences between the geometry and textures of different scans, greatly reducing the amount of texture drifting between blendshapes. To expand the amount of control beyond individual shapes, the system can automatically include blendshape masks across various regions of the face in order to mix effects from different parts, resulting in independent control over blinks and lip shapes.
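The core of any blendshape rig of this kind is a weighted sum of per-vertex offsets from a neutral mesh, optionally gated by regional masks as described above. A minimal sketch, with made-up data and function names that are not the project’s actual pipeline:

```python
import numpy as np

# Minimal blendshape evaluation: the animated face is the neutral mesh
# plus a weighted sum of per-vertex offsets toward each expression scan.
# A per-vertex mask restricts a shape's influence to one facial region
# (e.g., eyelids), giving independent control over blinks and lip shapes.

def blend(neutral, shapes, weights, masks=None):
    """neutral: (V,3); shapes: list of (V,3) scans; weights: floats in [0,1]."""
    out = neutral.copy()
    for i, (shape, w) in enumerate(zip(shapes, weights)):
        delta = shape - neutral                    # per-vertex offset
        if masks is not None:
            delta = delta * masks[i][:, None]      # region mask in [0,1]
        out += w * delta
    return out

# Four toy vertices: 0-1 are "eye" region, 2-3 are "mouth" region.
neutral = np.zeros((4, 3))
smile = np.array([[0, 0, 0], [0, 0, 0], [1, 0, 0], [1, 0, 0]], float)
blink = np.array([[0, -1, 0], [0, -1, 0], [0, 0, 0], [0, 0, 0]], float)
mouth_mask = np.array([0, 0, 1, 1], float)
eye_mask = np.array([1, 1, 0, 0], float)

# Half smile plus full blink; masks keep each effect in its own region.
face = blend(neutral, [smile, blink], [0.5, 1.0], [mouth_mask, eye_mask])
```

A puppeteering front end would simply update the weight vector each frame from tracked expressions and re-evaluate the sum.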

The results are photorealistic and sufficiently representative of the capture subjects, so they could be used in social media, video conferencing, business communications, and other places where an accurate representation (as opposed to an artistic or stylized one) is desired or appropriate.

During the demo, the team scans two people who then puppeteer their own faces in real time.

Driving High-Resolution Facial Scans With Video Performance Capture

Animating facial geometry and reflectance using video performances, borrowing geometry and reflectance detail from high-quality static expression scans. Combining multiple optical-flow constraints weighted by confidence maps eliminates drift.

Time Magazine Features Mark Bolas and his Virtual Reality Research

Time highlighted research by Mark Bolas of the USC Institute for Creative Technologies on the potential impact of virtual reality. “I believe we’re in the virtual world now more than the real world already. It’s just that our interface sucks,” Bolas said.

The story notes that Bolas, who is also a professor in the Interactive Media Division at the USC School of Cinematic Arts, open-sourced the FOV2GO, which allowed thousands of people to create $5 homemade virtual reality headsets. More recently he’s been building virtual reality worlds with stop-motion Claymation, trying to get at the interactive, world-bending experiences VR can offer. Because, unlike movies, virtual reality can make you feel dumb or successful by reacting to you.

“Presence has to go both ways. The world has to acknowledge that you’re in it,” he said. “This is what I have film students for. To figure out what I do with this.”

Blendshapes From Commodity RGB-D Sensor

This talk demonstrates a near-automatic technique for generating a set of photorealistic blendshapes from facial scans using commodity RGB-D sensors, such as the Microsoft Kinect or Intel RealSense. The method has the potential to expand the number of photoreal, individualized faces that are available for simulation.

FlashMob: Near-Instant Capture of High-Resolution Facial Geometry and Reflectance

A near-instant method for acquiring facial geometry, diffuse color, specular intensity, specular exponent, and surface orientation. Six flashes are fired in rapid succession with subsets of 24 DSLR cameras, which are arranged to produce an even distribution of specular highlights. Total capture time is below the 67ms blink reflex.

Emerging Technologies: An Auto-Multiscopic Projector Array for Interactive Digital Humans

Automultiscopic 3D displays allow a large number of viewers to experience 3D content simultaneously without the hassle of special glasses or head gear. This display uses a dense array of 216 video projectors to generate images with high angular density over a wide field of view. As users move around the display, their eyes smoothly transition from one view to the next. The display is ideal for displaying life-size human subjects, as it allows for natural personal interactions with 3D cues such as eye-gaze and spatial hand gestures.

The installation presents “time-offset” interactions with recorded 3D human subjects. A large set of video statements was recorded for each subject, and users access these statements through natural conversation that mimics face-to-face interaction. Conversational reactions to user questions are retrieved through speech recognition and a statistical classifier that finds the best video response for a given question. Recordings of answers, listening, and idle behaviors are linked together to create a persistent visual image of the person throughout the interaction. This type of time-offset interaction can support a wide range of applications, from creating entertaining performances to recording historical figures for education.
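As a rough illustration of the retrieval step, a bag-of-words similarity score can stand in for the statistical classifier: transcribed user speech is matched against the questions each recorded statement answers. Clip names and the scoring method here are illustrative assumptions; the actual system uses a trained classifier.

```python
# Toy sketch of time-offset retrieval: match a transcribed user question
# to the best pre-recorded video answer via bag-of-words cosine similarity.
from collections import Counter
import math

RESPONSES = {  # illustrative stand-ins for indexed video statements
    "where were you born": "clip_birthplace.mp4",
    "how did you survive the war": "clip_survival.mp4",
    "what happened to your family": "clip_family.mp4",
}

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    ca, cb = Counter(a.split()), Counter(b.split())
    dot = sum(ca[w] * cb[w] for w in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def best_response(question):
    """Return the indexed question whose recorded answer best matches."""
    return max(RESPONSES, key=lambda q: cosine(question, q))

print(RESPONSES[best_response("tell me where you were born")])
```

Listening and idle clips would then be stitched around the retrieved answer to keep the subject visually persistent between questions.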

Intelligent Agents for Virtual Simulation of Human-Robot Interaction

Wired UK Covers ICT Virtual Reality Advances for Health Care

The July issue of Wired UK highlights the story of Chris Merkle, a U.S. veteran who used ICT’s virtual reality therapy for treatment of his PTSD. Encountering virtual reality in the summer of 2013 was “an amazing thing”, he says. “I was so much calmer with the road rage, and my family. It unleashed a lot. I would never have been able to talk like this a year ago.”

The story quotes Skip Rizzo and covers other ICT mental health VR work, including introducing a clinical trial for victims of sexual trauma in the military, and building Strive, a pre-deployment tool that depicts scenes that PTSD sufferers typically report as significant to their trauma.

The story also notes that Oculus founder Palmer Luckey contacted ICT’s Mark Bolas, who pioneered VR in the ’80s, in 2011. Bolas, now director of Mixed Reality Research at ICT, helped Luckey get a job in his lab, where Luckey worked with Rizzo, who needed a cheaper, better-quality HMD for his PTSD research.

“We knew this guy had talent and vision for how to create these headsets, and Mark helped him and provided guidance on the open-source design,” Rizzo says. “The vision was always there. What we needed was wide-field view, a more comfortable head-mounted display and lower-cost built-in tracking. That’s Oculus.”


San Francisco Chronicle Features Skip Rizzo and Virtual Reality Therapy for PTSD

San Francisco Chronicle highlighted research by Albert “Skip” Rizzo of the USC Institute for Creative Technologies on how virtual reality can be used to combat PTSD.

Science Magazine Features ICT Work Addressing Post-Traumatic Stress

A feature story in Science Magazine highlights ICT’s SimSensei project and how ICT is using virtual humans and technology to break down barriers between computers and humans and address mental health issues like PTSD. The story notes a recent ICT study that suggested people were more willing to open up to a virtual human over a real one.

Because patients interact with a digital system, the project is generating a rare trove of data about psychotherapy itself. The aim, says ICT’s Skip Rizzo in the story, is nothing short of “dragging clinical psychology kicking and screaming into the 21st century.”

Washington Post Features Ellie and ICT Research to Improve Health Care Communication

A story in the Washington Post featured Ellie, an ICT virtual human created to help patients feel comfortable talking about themselves so they’ll be honest with their doctors. The story states that Ellie was born of two lines of findings: that anonymity can help people be more truthful and that rapport with a trained caregiver fosters deep disclosure.

According to the story, technology can help people feel comfortable sharing private information.

“Some people make the mistake when they see Ellie — they assume she’s a therapist and that’s absolutely not the case,” says Jonathan Gratch, director for virtual human research at USC’s Institute for Creative Technologies.

West Point Magazine Features Cadets Using ELITE for Leadership Training

The summer issue of West Point magazine spotlights ELITE and how it has been incorporated into PL300, the required military leadership course. The story notes that ELITE was developed at ICT in collaboration with the U.S. Army Simulation and Training Technology Center (STTC), TRADOC Capability Manager for Gaming, Center for Army Leadership, Sexual Harassment/Assault Response & Prevention Management Office, and the Maneuver Center of Excellence (MCoE) at Fort Benning, Georgia. The story also highlights the relationship between ICT, West Point’s instructors and Sim Center.

“Working with Colonel Schnack and her team has been a great experience. Her approach to employing ELITE to supplement classroom instruction is a model for how this type of instructional technology can be very effective,” said Rich DiNinni, who serves as ICT’s representative at West Point.

To tweet or not to tweet: The question of emotion and excitement about sporting events

Huffington Post Mentions ICT’s VR Therapy for Post-Traumatic Stress

An article about virtual reality for job training mentioned ICT’s VR therapy for treating post-traumatic stress.

Jewish Journal Features New Dimensions in Testimony

An article in the Jewish Journal provided an in-depth look at New Dimensions in Testimony, an effort to preserve the ability to ask questions of Holocaust survivors far into the future. The project is a collaboration between the USC Shoah Foundation and ICT, in partnership with Conscience Display.

The article notes that the project relies on the ICT’s development of a natural language processor, which filters the questions through speech-recognition software and then searches for the most appropriate response. The institute also lent its experience with holographic display to make the survivor look three-dimensional without requiring viewers to wear special eyeglasses.

The Economist Covers ICT Research on Depression Detection

The Economist featured research by Stefan Scherer of the USC Institute for Creative Technologies and a colleague that could identify depression through digital analysis of speech. They had previously developed algorithms for detecting depression through minute facial signals. While a computerized diagnostic tool could never replace a trained therapist, it could help observe a patient during times when a doctor can’t. “Working together, human and machine should bring faster, more reliable diagnosis of depression,” the story noted.

USC Alumni Veterans Visit ICT

Veterans from the Korean War through Operation Enduring Freedom try out technologies helping train and treat service members and vets today

Read the full story at USC News


USC Standard Patient Awarded Gold Medal at 2015 Serious Play Conference

The USC Standard Patient project won a gold medal in the Healthcare category of the 2015 Serious Play Awards. This recognition program honors outstanding titles that deliver a high quality of engagement and measurable training or learning opportunities. The Virtual Standard Patient project allows doctors in training to talk to virtual patients, get realistic responses, and then receive feedback with an automated report on how they can keep improving. The technology works through web browsers and has a free online community where medical and other clinical educators can create new patients. The project is funded by the Defense Department, and it is hoped that this new artificial intelligence-based technology will make medical education less expensive and improve patient safety.

The award was announced at the 5th Annual Serious Play Conference, a leadership conference for professionals who embrace the idea that games can revolutionize learning.

USC News Covers MxR and Viterbi Collaboration Using Drones for VR

Current virtual reality goggles prevent soldiers from seeing anything outside of their virtual world, making teamwork impossible. Using quadrotors, Nora Ayanian, the Gabilan Assistant Professor of Computer Science at USC Viterbi, is working with Mark Bolas and the Mixed Reality Lab to create and capture 3-D imaging for military training, making it more cost efficient and portable. This work can also be used to allow for teamwork in virtual environments. Read the full story here.

CBS News Covers ICT and Playa Vista Tech Scene

In a story about the expanding tech presence in Playa Vista, CBS News Los Angeles affiliate KCBS-TV highlighted ICT’s research into virtual reality therapy to treat PTSD in military veterans.

“I think in the last three years, if you look at this area and Santa Monica, it’s exploded,” said Clarke Lethin, ICT’s managing director. The USC Alumni Veterans Network organized a group of veterans to try out ICT’s virtual reality research.

Fast Company and Gizmodo Feature Graphics Lab Innovations for Creating Realistic Skin

Fast Company featured ICT Graphics Lab research on how to create super-realistic CGI skin. Their innovative technique captures pores and wrinkles of a human face to a new level of detail, making digital avatars much more life-like. Gizmodo also ran a story. The research was conducted by Koki Nagano, Paul Debevec, Graham Fyffe and Oleg Alexander at ICT, along with Hao Li of the USC Viterbi School and colleagues.

BBC Highlights ICT Virtual Reality Research

BBC highlighted virtual reality research by the USC Institute for Creative Technologies. Mark Bolas, director of ICT’s Mixed Reality Research lab, said much of their work is driving the next generation of operations and communications tools for the U.S. military. His lab employed Palmer Luckey, founder of Oculus Rift, until just before Luckey left to develop the VR headset. “I can’t think of an area which [VR] is not going to affect,” Bolas said. BBC also highlighted Bolas’ research in a video.

Marketplace Quotes Mark Bolas on Virtual Reality at E3

Mark Bolas was quoted in a Marketplace story about virtual reality.

Virtual reality grew in the gaming world. But where it will meet the masses is in mobile technology, stated the story.

Mark Bolas, director of the Mixed Reality Lab at USC’s Institute for Creative Technologies, said Facebook, Samsung and Google are already in. “They’re all jumping in in a way that’s slightly skewed toward the thing that they’re best at,” he said.


New York Times Quotes Mark Bolas on Virtual Reality

Mark Bolas was quoted in a New York Times story on the rise of virtual reality and how the medium will affect people both online and off.

“When you’re in a room with somebody and you have a mobile device in front of you, they think they have your presence,” said ICT’s Mark Bolas, also an associate professor of interactive media at the University of Southern California. “In a virtual environment, they know they don’t have your presence. You almost want to leave that person alone.”

Bolas believes social virtual reality games will come up with “bounding boxes” — essentially, force fields that players can place around their avatars to keep others away, stated the story.

“We are going to have new conventions that deal with that,” he said.

Wired Quotes Mark Bolas on Near Field VR

Mark Bolas was quoted in a Wired story about the Oculus Touch, a two-handed interface that brings haptic feedback and a sense of touch to the VR experience.

“There is a special place in virtual reality—we call it Near-Field VR,” says Mark Bolas, director for mixed reality research at USC’s Institute for Creative Technologies. “It is the place that is within arm’s reach of a user, and it is magical, as it provides the best stereoscopic and motion cues of VR. Hands are very important to enable interaction in this region.”

Mashable Features ICT’s VR Treatment for PTSD

Mashable featured Bravemind, ICT’s virtual reality exposure therapy treatment for PTSD, as part of a series on technology that empowers and inspires. The story focused on veteran Chris Merkle, who found relief from his PTSD when he used the system as part of a VR trial. ICT’s Skip Rizzo, the developer of Bravemind, was quoted in the story.

“Done in a safe environment, with a therapist who probes and asks you to re-live and go over and over the trauma, the potency of the fear conditioning diminishes by a process called extinction,” explained Skip Rizzo, a clinical psychologist and the director of Medical Virtual Reality at the University of Southern California’s Institute for Creative Technologies.

“Patients start to feel empowerment as they confront and process,” he added. “They’re empowered to accept the horrible things and see life ahead.”

CBS News Features Bravemind, ICT’s VR Exposure Therapy for PTSD

Austin affiliate KEYE-TV featured Bravemind, a virtual reality program designed by the USC Institute for Creative Technologies to treat PTSD. Veterans can experience virtual wartime environments, 3D sound and even actual smells. “It has been proven that if the veteran experiences that terrified phenomenon again and again and again then the symptoms of PTSD decay with time,” said USC President C. L. Max Nikias.

Combining Distributed Vector Representations for Words

Recent interest in distributed vector representations for words has resulted in an increased diversity of approaches, each with strengths and weaknesses. We demonstrate how diverse vector representations may be inexpensively composed into hybrid representations, effectively leveraging strengths of individual components, as evidenced by substantial improvements on a standard word analogy task. We further compare these results over different sizes of training sets and find these advantages are more pronounced when training data is limited. Finally, we explore the relative impacts of the differences in the learning methods themselves and the size of the contexts they access.
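As a toy illustration of the composition idea, two component representations can be concatenated into a hybrid vector and evaluated on an analogy query via vector arithmetic. The vectors below are made up for illustration; the paper evaluates real embeddings on a standard analogy benchmark.

```python
import numpy as np

# Compose two distributed representations by concatenation (one
# inexpensive hybrid in the family of approaches described above),
# then answer an analogy "a : b :: c : ?" with vector arithmetic.

emb1 = {"king": [1.0, 0.9], "queen": [1.0, -0.9],
        "man": [0.2, 1.0], "woman": [0.2, -1.0]}
emb2 = {"king": [0.8], "queen": [0.7], "man": [0.1], "woman": [0.1]}

def hybrid(word):
    """Concatenate the word's vectors from each component representation."""
    return np.array(emb1[word] + emb2[word])

def analogy(a, b, c, vocab):
    """Return argmax over candidates d of cos(v_b - v_a + v_c, v_d)."""
    target = hybrid(b) - hybrid(a) + hybrid(c)
    def cos(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
    return max((w for w in vocab if w not in (a, b, c)),
               key=lambda w: cos(target, hybrid(w)))

print(analogy("man", "woman", "king", emb1))  # expect "queen"
```

The hybrid dimensionality is just the sum of the components’ dimensions, which is what makes this style of composition inexpensive.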

NPR Covers SimSensei: Using Virtual Humans to Help Detect Depression and PTSD

NPR’s All Things Considered and Planet Money featured ICT’s SimSensei project which uses a virtual human and sensing technologies to help detect signs of depression and PTSD. The stories explained that Ellie, ICT’s virtual human interviewer, listens to answers and analyzes non-verbal behaviors like tone and facial expressions.

“Contrary to popular belief, depressed people smile as many times as non-depressed people,” said ICT’s Skip Rizzo. “But their smiles are less robust and of less duration. It’s almost like polite smiles rather than real, robust, coming from your inner-soul type of a smile.”

The stories note that Ellie is funded by the military as a tool to help detect PTSD and depression.

Virtual Humans and Jon Gratch Highlighted in the Harvard Business Review

A Harvard Business Review article on trust and robots featured ICT research on how to make computers more like humans.

“The more you add lifelike characteristics, and particularly the more you add things that seem like emotion, the more strongly it evokes these social effects,” says Jonathan Gratch, a professor at the University of Southern California who studies human-machine interactions. “It’s not always clear that you want your virtual robot teammate to be just like a person. You want it to be better than a person.”

The article states that, in his own research Gratch has explored how thinking machines might get the best of both worlds, eliciting humans’ trust while avoiding some of the pitfalls of anthropomorphism. In one study he had participants in two groups discuss their health with a digitally animated figure on a television screen (dubbed a “virtual human”). One group was told that people were controlling the avatar; the other group was told that the avatar was fully automated. Those in the latter group were willing to disclose more about their health and even displayed more sadness. “When they’re being talked to by a person, they fear being negatively judged,” Gratch says.

Making machines more humanoid might create too much faith in their abilities.

Gratch hypothesizes that “in certain circumstances the lack of humanness of the machine is better.” For instance, “you might imagine that if you had a computer boss, you would be more likely to be truthful about what its shortcomings were.” And in some cases, Gratch thinks, less humanoid robots would even be perceived as less susceptible to bias or favoritism.

VICE Creators Project on Digital Humans Features Paul Debevec and Ari Shapiro

The VICE Creators Project’s new documentary, Hollywood’s Digital Humans, highlights the digital double work of Paul Debevec and the ICT Graphics Lab. The story explains how Debevec teamed up with Activision R&D to create Digital Ira, a real-time 3D rendering of ICT researcher Ari Shapiro’s own motion-captured face. It also covers Debevec’s pioneering work on feature films, including Avatar.

NBC Today Show Features New Dimensions in Testimony

NBC News’ “Today” featured the New Dimensions in Testimony project, which creates interactive holograms of Holocaust survivors through a collaboration between ICT and the USC Shoah Foundation. The story covered ICT research in natural language and 3D projections to preserve the ability to ask questions of survivors far into the future.

“Pinchas and the Holocaust survivors are able to talk to people now,” said ICT’s David Traum. “This is the last generation that has that opportunity. We need a way to preserve this interactive experience.”

ICT’s Paul Debevec was also quoted in the story.


An Effective Conversation Tactic for Creating Value over Repeated Negotiations

Daily Mail, Defense Systems, Soldier Technology Magazine and Army.Mil Cover ICT Virtual Humans

Articles about ICT’s work creating virtual humans that are used throughout the Army ran in the Daily Mail, Defense Systems and Soldier Technology Magazine and on the Army’s homepage Army.mil.

The stories noted that ICT, an Army-sponsored university affiliated research center, has a body of research in creating virtual humans and related technologies that are focused on expanding the ways Soldiers can interact with computers, optimizing performance in the human dimension, and providing low-overhead, easily accessible and higher-fidelity training.

The mission of the Los Angeles-based institute is to conduct basic and applied research and create advanced immersive experiences that leverage research technologies and the art of entertainment and storytelling to simulate the human experience to benefit learning, education, health, human performance and knowledge.

Toward that goal, much effort focuses on how to build computers – virtual humans and also robots – that can interact with people in meaningful ways, including through natural language.

Current applied projects using ICT natural language research include the Virtual Standard Patient, or VSP, and Emergent Leader Immersive Training Environment.

Towards Rapid Creation of Pervasive, Controllable 3D Characters

The 3D character will continue to be an integral part of 3D games, movies, and other simulated environments. Recently, the tools and technologies to produce characters have improved and become more affordable. In this talk, I describe my experiences in attempting to rapidly build 3D characters using sensing technology, behavior and modeling algorithms.

The Science Behind VR: Light Fields for Virtual Reality

Today’s VR headsets track the user’s head to produce low-latency viewpoint shifts in the virtual environment to allow you to believe that you are really present in the scene. Unfortunately, live action recordings of concerts, news events, and sports are recorded from fixed eye positions and cannot take advantage of providing this crucial cue for presence. Even recording left-eye and right-eye panoramas for 3D stereoscopic VR fails to record motion parallax and misses the most compelling aspect of virtual reality, which is otherwise easily achieved with game environment scenes. Light field photography may provide a solution by recording arrays of images from different perspectives in a way which allows a continuous range of viewpoints to be generated after recording, including viewpoints never before photographed. This presentation will introduce the concept of light field capture and rendering, provide a history of its applications in computer graphics, and discuss the opportunities and challenges in using light field capture and rendering to record and play back live-action content with the ability to move your head around and see the appropriate shifts in perspective.
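To make the idea concrete, here is a minimal sketch of the core light-field operation; the function and variable names are hypothetical, and this is an illustrative blend, not the Light Stage or ICT pipeline. Given a 2D grid of camera positions, a novel eye position inside the grid can be approximated by bilinearly weighting the four surrounding captured views.

```python
import numpy as np

def light_field_weights(grid_x, grid_y, eye_x, eye_y):
    """Bilinear blend weights for the four cameras surrounding an eye position.

    grid_x, grid_y: sorted 1D arrays of camera positions along each axis (meters)
    eye_x, eye_y: desired viewpoint inside the capture grid
    Returns a dict {(i, j): weight} over the four surrounding cameras.
    """
    # Find the grid cell containing the eye position.
    i = int(np.clip(np.searchsorted(grid_x, eye_x) - 1, 0, len(grid_x) - 2))
    j = int(np.clip(np.searchsorted(grid_y, eye_y) - 1, 0, len(grid_y) - 2))
    # Fractional position within the cell.
    tx = (eye_x - grid_x[i]) / (grid_x[i + 1] - grid_x[i])
    ty = (eye_y - grid_y[j]) / (grid_y[j + 1] - grid_y[j])
    return {
        (i, j): (1 - tx) * (1 - ty),
        (i + 1, j): tx * (1 - ty),
        (i, j + 1): (1 - tx) * ty,
        (i + 1, j + 1): tx * ty,
    }
```

The final image for that eye position would then be the weighted sum of the four camera images; real light-field renderers add per-ray reprojection and depth correction on top of this basic interpolation.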

VR Inventors: A Conversation with Virtual Reality Pioneer Mark Bolas

Mark Bolas has for decades been a pioneering and groundbreaking innovator in virtual reality. Many of what we think of as recent advances in the field were actually developed by Mark and his collaborators a surprisingly long time ago. The continuing work of his research group at USC through the years has now become highly relevant to today’s rapidly changing landscape of computer mediated content, as virtual reality is poised to make the transition from research tool to consumer level experience. Ken Perlin, a long time follower and fan of Mark’s work, talks with him about the fascinating journey of VR, and its profound implications for the future evolution of our shared experience of media and culture.

Evolving Narratives: Losing Control

Marketplace Features VITA, ICT’s Virtual Reality Job Interview Trainer

Marketplace featured ICT’s Virtual Interactive Training Agent, or VITA, and an interview with Skip Rizzo.

The project was developed in partnership with the Dan Marino Foundation and aims to help people on the autism spectrum navigate a job interview.

“We can set them to have three different behavioral dispositions. They can be that really nice, light-touch job interviewer, the neutral interviewer or the real stress interviewer…,” Rizzo said. “We can have people practice how they will respond to these types of characters under a range of conditions.”

Aaron Brown-Coats went through the VITA program as part of the curriculum at the Dan Marino Foundation, and says working with the virtual characters helped change the way he interacts at his own job.


MedCity News Covers ICT Work Creating Virtual Doctors with USC’s Center for Body Computing

MedCity News interviewed Dr. Leslie Saxon, director of the Center for Body Computing at the University of Southern California, about work she is doing with ICT.

“We’re developing an army of virtual doctors,” Saxon said, noting that ICT created a virtual representation of Saxon that even she admitted looks surprisingly real. (The human version is on the right in this short video.)

“I could be doing a procedure and a virtual me could be taking a history,” said Saxon, chief of cardiovascular medicine at USC Keck School of Medicine.

Patients can autonomously answer questions without feeling like they are pressuring their doctors. “I feel like we are getting better information,” Saxon said. “We don’t want patients to feel ashamed or inhibited.”

The virtual Dr. Saxon was created by ICT’s Ari Shapiro in an extension of his 3D digital faces work. The advancement is that the digital character took less than two days to construct, making it an economically viable option for many medical applications.


The use of Virtual Standardized Patients in Medical Education

Single-Shot Reflectance Measurement from Polarized Color Gradient Illumination

Achieving Photoreal Digital Actors

We have entered an age where even the human actors in a movie can now be created as computer generated imagery. Somewhere between “Final Fantasy: the Spirits Within” in 2001 and “The Curious Case of Benjamin Button” in 2008, digital actors crossed the “Uncanny Valley” from looking strangely synthetic to believably real. This talk describes how the Light Stage scanning systems and HDRI lighting techniques developed at the USC Institute for Creative Technologies have helped create digital actors in a wide range of recent films. For in‐depth examples, the talk describes how high‐resolution face scanning, advanced character rigging, and performance‐driven facial animation were combined to create “Digital Emily”, a collaboration with Image Metrics (now Faceware) yielding one of the first photoreal digital actors, and 2013’s “Digital Ira”, a collaboration with Activision Inc., yielding the most realistic real‐time digital actor to date. A recent project with USC’s Shoah Foundation is recording light field video of interviews with survivors of the Holocaust to allow interactive conversations with life-size automultiscopic projections.

Early Synthetic Prototyping (ESP)

Download a PDF overview.

Early Synthetic Prototyping (ESP) is a research project sponsored by ARCIC and ASA(ALT) looking at ways to leverage emerging synthetic immersive environments to foster innovative design and testing. ESP seeks to bring the Soldier (i.e. the end user) into the design and testing process at very early stages, helping to connect those that design/build (engineers) and those that employ (Soldiers). ESP also is being designed to enable testing of very nascent concepts and explore not only the art of the possible for today, but tomorrow as well.

ESP is very different from existing game/simulation engines. At the core of ESP is a new generation of metrics and analytics that focus on the wants and needs of the user, tracking not only their performance – *what* they did – but also *how* and *why* they did things. Current synthetic environments track fairly traditional metrics, giving data largely as scores with easily quantifiable outcomes. In order to provide useful information back to a designer/engineer, ESP will need to assess a number of softer metrics, such as user frustration. In addition, deeper granularity will be tracked as well – e.g. source of frustration (equipment design, team members, opponents, system performance, etc.).

ESP is currently in the early prototype stage, and in fact is using the working ESP schema to facilitate understanding the requirements that enable creativity and innovation through ESP. These exploratory environments are multi-player and are looking at not only the design of next-generation vehicles, but also their use in a variety of contexts. Users can make modifications on-the-fly, and help find new ways to not only build but also employ the systems.

Right now the ESP effort is focused on four main areas of research:

1. Idea ingest – how you bring an idea or concept into the ESP environment
2. Emerging interfaces – wearable sensors, AR/VR/MR and how/why to use them effectively
3. Analytics – next-generation soft metrics that are user-focused
4. Community – how to include a larger number of users to leverage a wide body of expertise

The broader ESP effort also includes research work at Naval Postgraduate School, TARDEC, and other Army partners. The initial prototypes are undergoing testing and will inform the ESP design and requirements, with FY16 efforts focused on building a v1 system along with ongoing research into the four vectors listed above.

Symposium: Unified Theories of Cognition: Newell’s Vision after 25 Years

It has been 25 years since Unified Theories of Cognition was published (Newell, 1990). In it, Newell outlines a vision to inspire generations of cognitive scientists and cognitive modelers; a quest for theories that provide comprehensive accounts of the human mind. As he put it: “A single system (mind) produces all aspects of behavior. It is one mind that minds them all. Even if the mind has parts, modules, components, or whatever, they all mesh together to produce behavior… If a theory covers only one part or component, it flirts with trouble from the start. It goes without saying that there are dissociations, interdependencies, impenetrabilities, and modularities… But they don’t remove the necessity of a theory that provides the total picture and explains the role of the parts and why they exist.” (Newell, 1990, pp. 17-18).

Poster: Supraarchitectural Capability Integration: From Soar to Sigma

Daily Breeze Covers ICT Work

The Daily Breeze featured Paul Debevec of the USC Institute for Creative Technologies and his work in 3-D scanning and digital animation. The story noted that ICT has worked with game developers and special effects studios in Hollywood to help create “some of the most visually arresting films of the past two decades, including ‘The Matrix,’ ‘The Curious Case of Benjamin Button,’ ‘Gravity’ and ‘Avatar.’”

The paper also covered New Dimensions in Testimony, ICT’s collaboration with the USC Shoah Foundation to preserve interactive testimonies of Holocaust survivors.


Ellie and ICT Researchers in the LA Times

The Los Angeles Times featured “Ellie,” a virtual interviewer that can help diagnose PTSD in veterans through facial and voice analysis. Gale Lucas of the USC Institute for Creative Technologies said that Ellie has to be the “best of both worlds.” “She has to be human enough but also look like a machine. If she was just like us, it wouldn’t work,” she said.

The article noted that Ellie cannot respond to questions. Early in the session, she acknowledges she is “not a therapist” but is there to listen.

“If someone says something terrible, we want them to understand why she isn’t asking them more about it,” said Giota Stratou, a research programmer at the institute.

Foshay Tech Academy Students Visit ICT

Students from the Foshay Learning Center’s Tech Academy recently toured the USC Institute for Creative Technologies on a field trip designed to explore the possible. Read about their visit at USC News.

The MxR Lab at the USC Institute for Creative Technologies

USA Today Quotes Todd Richmond on Rise of Virtual Reality

A story about the rise of virtual reality quotes Todd Richmond, director of prototype development at ICT. The article, which states that VR is now poised to challenge reality’s stranglehold on the way we engage with life, also notes that Palmer Luckey once worked at ICT.

“VR has been around for decades, but it will stick this time because there’s enough computational power and the price point will just keep going down,” said Richmond, a VR group member with the Institute of Electrical and Electronics Engineers who is helping the Army and Navy design new equipment that can be demoed by human operators in VR before being built.

Mental Health Channel Features ICT VR Work for PTSD

Two segments on the Mental Health Channel’s Dr. Brain series feature ways that ICT virtual reality technologies are helping address post-traumatic stress and depression.

Watch the Virtual Reality Therapy story here.

Watch the Ellie/SimSensei story here.

Wired Notes Skip Rizzo’s VR Work for Treating PTSD

Wired mentioned that Skip Rizzo of the USC Institute for Creative Technologies regularly uses virtual reality to treat soldiers suffering from PTSD.

Panel Discussion: Next Gen Evaluation of VR Interfaces

Much of the work on evaluating the usability of VR systems over the past 15 years has focused on fairly low-level tasks, mainly based on Bowman et al.’s so-called “Big Five” basic tasks of object Selection, Manipulation, Navigation, Symbolic Input, and System Control. Some additions to this have been discussed, including a) adding Avatar Control, due to the emergence of low-cost body part tracking systems (e.g., Kinect, Leap Motion), b) combining Selection and Manipulation into one, as they are so closely related, and c) splitting Navigation into two, Travel and Wayfinding, since many solutions exist for each of these individually. Even with these tweaks, however, it could be argued that research into interaction has matured to such a point that many viable solutions to each of these tasks exist, and that while we should not abandon this low-level research effort, greater impact could be made more rapidly by shifting focus to higher-level tasks and topics. Also, studying the low-level tasks in isolation ignores the fact that applications require users to physically and mentally switch between tasks, and that studying multiple low-level interface solutions together, along with the required transitions between them, is vital to user acceptance. In this panel, we explore several possible lines of evaluation, in the hopes of encouraging researchers and practitioners to think more impactfully about designing and evaluating their systems. Some of the work can be classified as “Fielded Studies,” where VR has been introduced into traditional workflow settings (e.g., medical student training), and evaluation has focused on how results from such systems relate to traditional approaches. Another tack is to design and evaluate from a user experience (UX) perspective. One possible future use of VR as posited in many works of popular fiction (e.g., Neuromancer, Ready Player One) is that we will spend most of our time wearing VR headsets and input gloves.
Well, why not try it now, using today’s technology? Long-term, multi-person exposure approaches are now well within reach of most research budgets. In particular, gaming has been driving VR-related technology advancements for more than a decade, however it is not until recently that VR researchers have begun to focus some effort on formally designing for gaming experiences.

Poster: Towards Context-Sensitive Reorientation for Real Walking in Virtual Reality

Redirected walking techniques help overcome physical limitations for natural locomotion in virtual reality. Though subtle perceptual manipulations are helpful, it is inevitable that users will approach critical boundary limits. Current solutions to this problem involve breaks in presence by introducing distractors, or freezing the virtual world relative to the user’s perspective. We propose an approach that integrates into the virtual world narrative to draw users’ attention and cause them to temporarily alter their course to avoid going off bounds. This method ties together unnoticeable translation, rotation, and curvature gains, efficiently reorienting the user while maintaining the user’s sense of immersion.
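The translation, rotation, and curvature gains described above can be sketched in a few lines; the gain values and function names here are illustrative assumptions, not the authors' calibrated perceptual thresholds. Each frame, the tracked head motion is subtly scaled before driving the virtual camera, so the user walks a real-world arc while perceiving a straight virtual path.

```python
# Hedged sketch of gain-based redirected walking: real head motion is
# remapped to virtual motion with small, ideally unnoticeable gains.
# All constants are illustrative, not published detection thresholds.

ROTATION_GAIN = 1.2      # virtual turns 20% faster than real turns
TRANSLATION_GAIN = 1.1   # virtual steps 10% longer than real steps
CURVATURE_GAIN = 0.05    # extra virtual yaw (radians) per meter walked

def redirect(delta_yaw_real, delta_dist_real):
    """Map one frame of real head motion to virtual motion.

    delta_yaw_real: change in head yaw this frame (radians)
    delta_dist_real: distance walked this frame (meters)
    Returns (virtual yaw change, virtual distance walked).
    """
    # Rotation gain scales real turns; curvature gain injects rotation
    # proportional to distance walked, bending the real-world path.
    delta_yaw_virtual = (delta_yaw_real * ROTATION_GAIN
                         + delta_dist_real * CURVATURE_GAIN)
    delta_dist_virtual = delta_dist_real * TRANSLATION_GAIN
    return delta_yaw_virtual, delta_dist_virtual
```

A reorientation system such as the one proposed in the abstract would additionally modulate these gains over time, raising them only while the user's attention is drawn by the in-narrative distractor.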

One Hundred Challenge Problems for Logical Formalizations of Commonsense Psychology

Toward Natural Turn-Taking in a Virtual Human Negotiation Agent

Skip Rizzo Receives Pioneer in Medicine Award from the Society for Brain Mapping and Therapeutics (SMBT)

The award, presented at the Society’s annual meeting, recognized Rizzo for his role in the field of virtual reality medicine and his impact on treatment of patients with obesity, diabetes and posttraumatic stress disorder.

Rizzo, who is also a research professor at the USC Davis School of Gerontology and the department of Psychiatry and Behavioral Sciences of the USC Keck School of Medicine, conducts research on the design, development and evaluation of virtual reality (VR) systems for clinical assessment, treatment rehabilitation and resilience.

Keynote Talk: Achieving Photoreal Digital Actors

We have entered an age where even the human actors in a movie can now be created as computer generated imagery. Somewhere between “Final Fantasy: the Spirits Within” in 2001 and “The Curious Case of Benjamin Button” in 2008, digital actors crossed the “Uncanny Valley” from looking strangely synthetic to believably real.

This talk describes how the Light Stage scanning systems and HDRI lighting techniques developed at the USC Institute for Creative Technologies have helped create digital actors in a wide range of recent films. For in‐depth examples, the talk describes how high‐resolution face scanning, advanced character rigging, and performance‐driven facial animation were combined to create “Digital Emily”, a collaboration with Image Metrics (now Faceware) yielding one of the first photoreal digital actors, and 2013’s “Digital Ira”, a collaboration with Activision Inc., yielding the most realistic real‐time digital actor to date.

Building a Life-Size Automultiscopic Display Using Consumer Hardware

Automultiscopic displays allow multiple users to experience 3D content without the hassle of special glasses or head gear. Such displays generate many simultaneous images with high angular density, so that each eye perceives a distinct and different view. This presents a unique challenge for content acquisition and rendering. In this talk, we explain how to build an automultiscopic display using off-the-shelf projectors, video-splitters, and graphics cards. We also present a GPU-based algorithm for rendering a large number of views from a sparse array of video cameras.
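As a toy illustration of rendering in-between views from a sparse camera array (a CPU sketch with hypothetical names, not the GPU algorithm presented in the talk), a novel view can be approximated by blending the two angularly nearest captured views:

```python
import numpy as np

def interpolate_view(camera_angles, images, target_angle):
    """Blend the two captured views nearest to target_angle.

    camera_angles: sorted 1D array of camera azimuths (degrees)
    images: list of HxWx3 float arrays, one per camera
    target_angle: desired viewpoint azimuth (degrees)
    """
    idx = np.searchsorted(camera_angles, target_angle)
    # Clamp to the outermost cameras when the target lies outside the array.
    if idx == 0:
        return images[0]
    if idx >= len(camera_angles):
        return images[-1]
    left, right = camera_angles[idx - 1], camera_angles[idx]
    # Linear blend weight based on angular distance to each neighbor.
    w = (target_angle - left) / (right - left)
    return (1.0 - w) * images[idx - 1] + w * images[idx]
```

A real automultiscopic renderer would do this per projector view and add depth-dependent reprojection to reduce ghosting, but the angular-weighting idea is the same.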

Mixed Reality Habits: The New Wired Frontier

What happens when the world as we know it combines with virtual reality, streams of content, personal data, sensors, cameras and implantable or wearable devices? People will experience the world within a wired habitat, where computers mediate our perceptions of reality, objects talk to each other and experiences are hyper-customized based on carefully assembled, personal datasets. On the web of the future, Main Street won’t look the same to everyone. Mixed reality will revolutionize the world as we know it, creating an augmented dimension within which people live, work, communicate, collaborate, build, buy and sell. Without question, this pervasive environment will radically alter how we view people, environments, objects and illusion. However, with the unprecedented possibilities come obvious and not-so-obvious risks and threats. As mixed reality converges with life as we know it, how can it be structured to best serve humanity without jeopardizing all we hold dear?

Organized by IEEE.

Mark Bolas Discusses Virtual Reality on KPCC

Mark Bolas joined KPCC’s Larry Mantel on Air Talk to discuss the rising popularity of virtual reality technologies, current uses and future trends.

Poster: Rapid Photorealistic Blendshapes from Commodity RGB-D Sensors

The downside of grit: Increased effort and failure to disengage when losing

Poster Presentation:

Research Question
Are there some costs to being high in grit?

Despite its benefits, researchers suggest there may be costs to grit, including performance costs (Duckworth & Eskreis-Winkler, 2013).
Grit may predict effort when failing at tasks, even if they are not relevant to long-term goals.
Because gritty people might not want to give up on solving more difficult questions – to the detriment of answering simpler questions or completing the test – they might perform more poorly on the test than predicted from baseline.
Grittier people might persist even when doing so risks a monetary loss.
They might do so because they have more positive emotions and expectations for tasks.

CNN Features Paul Debevec and ICT’s Light Stage Scanning Systems

CNN featured Paul Debevec in a demonstration of the technology used to capture an actor’s facial expressions for CGI. He explained how seven sports photography cameras take a series of sixteen photos of actors under specialized lighting conditions, capturing fine skin detail and even the sheen of sweat.

Andrew Gordon on Bubble Guppies, the Demise of the Desktop and the Future of Human-Computer Interaction

A USC News story explored how we will communicate with computers in the future. The article, based on Andrew Gordon’s recent talk, “Mind-Reading for Robots,” predicts that human-computer interaction of the future will look a lot like the human-human interactions of today.

“The only interface that makes sense for this future era of computing is anthropomorphism,” Gordon said. “A driver doesn’t need to understand how the fuel injection system works to operate a car. If the car is having problems, he might tell the mechanic: ‘The engine is temperamental, or sluggish. Or mad.’”

Mechanics learn to interpret such anthropomorphic statements as code for any number of engine problems. So, too, might a computer learn to interpret references to psychological state as code for a spectrum of possible desired computing tasks, the story stated.

It’s a little like interpreting the behavior of Bubble Guppies on the popular Nickelodeon cartoon series, Gordon said. Using a fairly limited palette of motions and facial expressions, animators can convey to their young viewers a wide range of emotions, desires and intentions. When a Bubble Guppy shivers, laughs or frowns, kids understand what she is feeling. They need no elaborate theory of psychology to figure this out, just some common-sense models of everyday human mental life, stated the story.

“We need to equip computers with the same common-sense theories that we all use to understand each other in social interactions,” Gordon said.

The story also noted that Gordon and co-investigator Jerry Hobbs, of USC’s Information Sciences Institute, have put the finishing touches on a book that encodes commonsense theories of human psychology as a set of 1,600 logical axioms.

ICT 2015 Summer Intern Program: Applications Due Feb 15

ICT offers select interdisciplinary paid internships for creative and technical students wishing to pursue careers in simulation, interactive media and virtual reality fields.

Visit our internship page for more info. February 15, 2015 is the application deadline.

Apply now: ICT offers a 10-week summer research program for undergraduates in interactive virtual experiences

The USC Institute for Creative Technologies (ICT) offers a 10-week summer research program for undergraduates in interactive virtual experiences.

Reflecting the interdisciplinary nature of ICT research, we welcome applications from students in computer science, as well as many other fields, such as psychology, art/animation, interactive media, linguistics, and communications. Undergraduates will join a team of students, research staff, and faculty in one of several labs focusing on different aspects of interactive virtual experiences. In addition to participating in seminars and social events, students will also prepare a final written report and present their projects to the rest of the institute at the end-of-summer research fair.

This Research Experiences for Undergraduates (REU) site is supported by a grant from the National Science Foundation.

The application deadline has been extended. Applications are now due March 7, 2015. Click here for all the details.


Panel: Introducing modelling and simulation for autonomous systems

PC World Covers Virtual Reality at Sundance, Recalls 2012 Demo with Nonny de la Pena and Palmer Luckey

PC World noted that Nonny de la Peña, a senior research fellow at the USC Annenberg School, and Palmer Luckey, formerly of the USC Institute for Creative Technologies, presented a virtual reality experience at the 2012 Sundance Film Festival. Luckey went on to found Oculus VR four months later.

Wired on How Skating Superstar Rodney Mullen Became a Tech Thought Leader, with a Little Help from our own Randy Hill

Wired writer Brendan Koerner profiles skating legend Rodney Mullen and his unlikely rise as an influential tech thought leader. The story traces how Mullen’s first talk – at the inaugural TEDxUSC – began with an introduction by ICT Executive Director Randall W. Hill and led to Mullen’s prominence on the speaker circuit. That talk explored Mullen’s observations that both hacker and skate culture are proudly open source, filled with innovations that improve upon the nonproprietary works of generations past.

The story explains that Hill had consulted with Mullen as part of his ICT research into human resilience and then connected him with Krisztina Holly, who was organizing TEDxUSC.

“He [Hill] had asked Mullen to contemplate what it took to excel as an elite skateboarder; the response he received was surprisingly sophisticated, with elements of philosophy and neuroscience woven throughout. One topic Mullen covered, for example, was how he had trained his brain to enter a semi-hypnotic state prior to contests, so that he wouldn’t dwell on the hundreds of minute variables that can ruin a trick. Hill felt certain that Holly would enjoy a meal with an athlete who was conversant in concepts such as tacit knowledge and executive motor function,” stated the story.

Watch Mullen’s TEDxUSC talk here.

Mystic Isle

Mystic Isle is a virtual reality game-based technology designed to improve performance and participation for persons receiving rehabilitation services.

Mystic Isle was developed from the Jewel Mine prototype and has evolved into a more flexible, customized system. The game within Mystic Isle places the player on a three-dimensional virtual island. The backgrounds, game objects and tasks vary depending on the player’s location on the island. Through the game tasks, the player moves around the island collecting objects. The same underlying concepts from Jewel Mine were incorporated into Mystic Isle. The flow of the menu was updated to allow for more intuitive use by clinicians. The tasks within the game were also modified to include cognitive tasks and structured balance tasks.

The design and development process of Mystic Isle has also been in partnership with rehabilitation sites across the US and internationally. Although the game for clinical populations is still a prototype, researchers are attempting to better understand how the game can best be utilized in clinical practice. Sites include a specialized balance and vestibular disorders clinic, an outpatient day treatment program for TBI, and military TBI clinics. The data and feedback from those specialty sites will help shape the game into an efficacious, applicable tool for lower extremity rehabilitation.

To further investigate its application, Mystic Isle has also been deployed in the home setting rather than the clinic. All participants who played the game in their home reported a high level of satisfaction with the game and the intervention as a whole. Understanding the feasibility of using the game in the home setting is key to integrating the game into the full cycle of rehabilitation. Ultimately, Mystic Isle has potential to be used in acute, inpatient, outpatient and community-based settings for people with stroke and other neurological or orthopedic conditions.

The project is funded by ARO/TATRC and a CDRMP Grant in collaboration with the Kessler Foundation.

SimSensei Demonstration: A Perceptive Virtual Human Interviewer for Healthcare Applications

An Integrated Evaluation of Perception, Interpretation, and Narration

Mark Bolas Explains Virtual Reality for Time.Com

Time, in an article on how virtual reality headsets work, featured Mark Bolas of the USC Institute for Creative Technologies. The story noted his lab “is a major reason why we’re seeing a boom in virtual reality today” and that his team has included Scott Fisher, who directed NASA’s earliest VR efforts, and Palmer Luckey, who founded Oculus VR. The story also mentioned a 2011 breakthrough at Bolas’ lab that allowed VR displays to expand their fields of view. The story covered FOV2GO, a free template for constructing VR eyepieces using cardboard and a smartphone.

New York Times Quotes Mark Bolas on Virtual and Augmented Reality

A New York Times article about the Microsoft HoloLens cites Mark Bolas as an expert on virtual and augmented reality.

“The question is when we can mix virtual and real worlds seamlessly, what are we going to want to do?” Bolas asked. “I don’t think anyone has an answer to that.”

STRIVE Project Receives 2014 Army Modeling and Simulation Award

The ICT Stress Resilience in Virtual Environments (STRIVE) project received an Army Modeling and Simulation Award in the Army-wide Training Team category. The award recognized outstanding work on the immense challenge of developing human dimension resilience skills for Soldiers. Skip Rizzo, the principal investigator on the STRIVE project, stated that the importance of resilience training in advance of a military deployment has become more recognized by the Army and could lead to lower rates of posttraumatic stress in Service Members when they return home.

The award was presented to an interdisciplinary team of experts at the Interservice/Industry Training, Simulation and Education Conference held in Orlando, Fla., in December 2014.

USC Standard Patient Plenary Talk

Scientists at the USC Institute for Creative Technologies have taken the concept of standardized patient actors used for training doctors to a new level. ICT has succeeded in replacing human actors with virtual human patients that listen, talk and act out roles as people with a variety of ailments. Called the “USC Standard Patient,” the technology allows doctors in training to talk to virtual patients, get realistic responses, and then receive feedback with an automated report on how they can keep improving. The technology works through web browsers and has a free online community where medical and other clinical educators can create new patients. Funded by the Defense Department, the new artificial intelligence-based technology is intended to make medical education less expensive and improve patient safety. Dr. Thomas Talbot, one of the inventors, is unveiling the technology at the International Meeting on Simulation in Healthcare in New Orleans on January 14th in a plenary talk from 10:00 a.m. to 10:30 a.m.

USC Standard Patient Named Best New Innovation at 2015 International Meeting on Simulation in Healthcare

USC’s Standard Patient won the Serious Games & Virtual Environments contest at the International Meeting on Simulation in Healthcare. The USC Standard Patient allows doctors in training to talk to virtual patients, get realistic responses, and then receive feedback with an automated report on how they can keep improving. The system works through web browsers and has a free online community where medical and other clinical educators can create new patients. Funded by the Department of Defense, the artificial intelligence-based technology is intended to make medical education less expensive and improve patient safety.

San Francisco Chronicle Quotes Mark Bolas on Growth of Virtual Reality

The San Francisco Chronicle quoted Mark Bolas about the development of virtual reality technology.

“Television came out, and then the big thing was a television in every home,” said Bolas. “I think virtual reality is going to go beyond that — I think every person is going to have a device,” he said. “You’ll have early adopters in 2015, and then I think it’s just going to grow exponentially from there.”

The story noted Bolas was an advisor to Palmer Luckey, founder of Oculus VR.

Paul Debevec and 3D Obama Make the Tonight Show with Jimmy Fallon

In his monologue, Jimmy Fallon showed a video clip of Paul Debevec discussing the scanning process used in the Smithsonian-led 3D Presidential Portrait project. Fallon then showed his version of the final result.

Watch it here.

Forbes Covers Pioneering, Open-Source Innovations of Mark Bolas and the Mixed Reality Lab

Forbes noted that Oculus Rift founder Palmer Luckey worked in the lab of Mark Bolas of the USC Institute for Creative Technologies. The story quoted virtual reality pioneer Jaron Lanier as saying that, “Without Mark, there would be no Oculus.”

It also called Bolas a virtual reality pioneer. “He and his students had spent years refining VR headsets, and all their innovations were open-sourced; Luckey absorbed their wisdom and technology, and quickly applied them to his own work,” stated the story.