The Next Technology Shift: The Internet of Actions

GP-Bullhound released its 2018 technology predictions in early December 2017. At the top of that list was the changing relationship between technology and politics, followed by cybersecurity, mobile over TV in China, translation technology, no more emails, international labor, the software suite, Industry 4.0, and the rise of regulators in blockchain and Initial Coin Offerings. Coming in last was Augmented Reality (AR).

But, according to Todd Richmond, IEEE Member and Director of USC’s Mixed Reality Lab, the next impactful technology shift is the Internet of Actions (IoA). IoA is a vision of digital technology becoming a more effective partner for humans as we navigate through our increasingly mixed-reality worlds.

Richmond believes that the key enabling technology is artificial intelligence (AI) which will drive the personalization that is part and parcel of IoA.

Continue reading the full article in Forbes.

We Visited Thomas Jefferson with VR!

Alyssa Smith of Cheddar was hanging out with Thomas Jefferson thanks to USC’s Institute for Creative Technologies. You’ll never guess what his favorite hobby is. Watch the full segment on Cheddar.

How Mixed Reality Helps PTSD Patients

Todd Richmond, Director of Advanced Prototype Development for the USC Institute for Creative Technologies, speaks with Cheddar about how emulating the environment of trauma helps treat victims.

Watch the full interview on Cheddar TV.

Becoming an Avatar

Alyssa Julya Smith of Cheddar visited the USC Institute for Creative Technologies in December, where she learned about research being conducted in virtual reality from audio lead Jamison Moore and assistant professor Ari Shapiro.

Alyssa became an avatar herself, placed in a virtual world with the nation’s third president, Thomas Jefferson. They discuss how virtual reality is being implemented in educational settings and how the technology can add to a curriculum.

Watch the full segment on Cheddar.

How Mixed Reality Could “Profoundly” Change the World

Mixed reality is set to make a huge impact on people’s lives. IEEE Member and USC ICT Director of Advanced Prototype Development Todd Richmond explains how he researches the ways this technology will change how things are done across industries.

“Mixed reality is going to profoundly change our world,” says Richmond. Cheddar Anchor Alyssa Julya Smith explores the autonomous drone lab inside USC. Richmond says this lab is looking to understand the relationship between humans and autonomous objects.

The drones are trained to follow the people controlling them. A big question for the future of autonomous objects is whether humans can trust the technology. Richmond says the project at USC looks at how to build interactions between machines and humans to advance command and control.

Watch the full segment on Cheddar.

Can Magic Leap Deliver on Its Big Hardware Reveal?

Magic Leap announces its One system, a head-mounted display and wearable processing unit that connects to a handheld controller. With few details available, WIRED turned to ICT’s David Nelson for some thoughts on what kind of technology might be behind the One system.

Continue reading the full article on WIRED.com.

Magic Leap: Founder of Secretive Start-Up Unveils Mixed-Reality Goggles

Magic Leap today revealed a mixed reality headset that it believes reinvents the way people will interact with computers and reality. Unlike the opaque diver’s masks of virtual reality, which replace the real world with a virtual one, Magic Leap’s device, called Lightwear, resembles goggles, which you can see through as if wearing a special pair of glasses. The goggles are tethered to a powerful pocket-sized computer, called the Lightpack, and can inject life-like moving and reactive people, robots, spaceships, anything, into a person’s view of the real world.

Creative Director of ICT’s Mixed Reality Lab David Nelson chatted with Brian Crecente of Rolling Stone about the technology and how it is moving us toward a new medium of human-computing interaction.

Read the full article in Rolling Stone.

Learning a Language in VR is Less Embarrassing than IRL

Will virtual reality help you learn a language more quickly? Or will it simply replace your memory?

Quartz investigates and chats with ICT’s Jonathan Gratch about the positive effects VR can have on people who might be wary of the technology.

Read the full article on Qz.com.

Creating a Life-sized Automultiscopic Morgan Spurlock for CNN’s “Inside Man”

Download a PDF overview.

We present a system for capturing and rendering life-size 3D human subjects on an automultiscopic display. Automultiscopic 3D displays allow a large number of viewers to experience 3D content simultaneously without the hassle of special glasses or headgear. Such displays are ideal for human subjects as they allow for natural personal interactions with 3D cues such as eye gaze and complex hand gestures. In this talk, we will focus on a case study where our system was used to digitize television host Morgan Spurlock for his documentary show “Inside Man” on CNN.

Automultiscopic displays work by generating many simultaneous views with high angular density over a wide field of view. The angular spacing between views must be small enough that each eye perceives a distinct and different view. As the user moves around the display, the eye smoothly transitions from one view to the next. We generate multiple views using a dense horizontal array of video projectors. As video projectors continue to shrink in size, power consumption, and cost, it is now possible to closely stack hundreds of projectors so that their lenses are almost continuous. However, this display presents a new challenge for content acquisition: it would require hundreds of cameras to directly measure every projector ray. We achieve similar quality with a new view interpolation algorithm suitable for dense automultiscopic displays.
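
The view-density constraint above can be sketched with some back-of-the-envelope arithmetic. The numbers below (interocular distance, viewing distance, field of view) are illustrative assumptions, not the specs of the actual display:

```python
# Back-of-the-envelope sketch (our own numbers, not the display's specs):
# the angular spacing between adjacent views must be smaller than the
# angle the viewer's interocular distance subtends at the viewing
# distance, or both eyes see the same view and the depth cue collapses.
import math

def max_view_spacing_deg(viewing_distance_m, interocular_m=0.065):
    """Largest allowable angular gap between adjacent views, in degrees."""
    return math.degrees(2.0 * math.atan(interocular_m / (2.0 * viewing_distance_m)))

def min_projector_count(fov_deg, viewing_distance_m):
    """Views (one projector each) needed to cover fov_deg at that density."""
    return math.ceil(fov_deg / max_view_spacing_deg(viewing_distance_m)) + 1

spacing = max_view_spacing_deg(2.0)        # viewer two meters away
count = min_projector_count(110.0, 2.0)    # a wide field of view
```

At a two-meter viewing distance this works out to a maximum spacing of just under two degrees, so covering a wide field of view already demands dozens of projectors; packing views far more densely than this minimum is what makes the transitions between views appear smooth.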

While the display has many applications, from video games to medical visualization, we are currently working on a much larger project to record the 3D testimonies of Holocaust survivors. This project, “New Dimensions in Testimony” or NDT, is a collaboration between the USC Shoah Foundation and the USC Institute for Creative Technologies, in partnership with exhibit design firm Conscience Display. NDT combines ICT’s Light Stage technology with natural language processing to allow users to engage with the digital testimonies conversationally. NDT’s goal is to develop interactive 3D exhibits in which learners can have simulated educational conversations with survivors through the fourth dimension of time. Years from now, long after the last survivor has passed on, the New Dimensions in Testimony project can provide a path to enable youth to listen to a survivor and ask their own questions directly, encouraging them, each in their own way, to reflect on the deep and meaningful consequences of the Holocaust. NDT follows the age-old tradition of passing down lessons through oral storytelling, but with the latest technologies available.

Multi-View Stereo on Consistent Face Topology

Download a PDF overview.

We present a multi-view stereo reconstruction technique that directly produces a complete high-fidelity head model with consistent facial mesh topology. While existing techniques decouple shape estimation and facial tracking, our framework jointly optimizes for stereo constraints and consistent mesh parameterization. Our method is therefore free from drift and fully parallelizable for dynamic facial performance capture. We produce highly detailed facial geometries with artist-quality UV parameterization, including secondary elements such as eyeballs, mouth pockets, nostrils, and the back of the head. Our approach consists of deforming a common template model to match multi-view input images of the subject, while satisfying cross-view, cross-subject, and cross-pose consistencies using a combination of 2D landmark detection, optical flow, and surface and volumetric Laplacian regularization. Since the flow is never computed between frames, our method is trivially parallelized by processing each frame independently. Accurate rigid head pose is extracted using a PCA-based dimension reduction and denoising scheme. We demonstrate high-fidelity performance capture results with challenging head motion and complex facial expressions around eye and mouth regions. While the quality of our results is on par with the current state-of-the-art, our approach can be fully parallelized, does not suffer from drift, and produces face models with production-quality mesh topologies.

Our objective is to warp a common template model to a different person in arbitrary poses and different expressions while ensuring consistent anatomical matches between subjects and accurate tracking across frames. The key challenge is to handle the large variations of facial appearances and geometries, as well as the complexity of facial expression and large deformations. We propose an appearance-driven mesh deformation approach that produces intermediate warped photographs for reliable and accurate optical flow computation. Our approach effectively avoids image discontinuities and artifacts often caused by methods based on synthetic renderings or texture reprojection.
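
As a toy illustration of the Laplacian regularization idea, much simplified from the paper's actual surface and volumetric formulation, the following sketch deforms a one-dimensional chain of vertices to hit two landmark targets while a Laplacian smoothness term spreads the deformation over the remaining vertices. All numbers here are hypothetical:

```python
# Illustrative toy (not the paper's optimizer): landmark constraints plus
# Laplacian regularization, solved as one linear least-squares problem.
import numpy as np

n = 10
landmarks = {0: 0.0, n - 1: 3.0}     # vertex index -> target position
lam, w_lm = 1.0, 10.0                # smoothness and landmark weights

# Graph Laplacian of the chain (discrete second-difference operator).
L = 2.0 * np.eye(n)
L[0, 0] = L[-1, -1] = 1.0
for i in range(n - 1):
    L[i, i + 1] = L[i + 1, i] = -1.0

# Stack two objectives: lam * L x ~ 0 (smooth deformation) and
# w_lm * x_k ~ w_lm * t_k (heavily weighted landmark matches).
rows = [lam * L]
rhs = [np.zeros(n)]
for k, t in landmarks.items():
    e = np.zeros((1, n))
    e[0, k] = w_lm
    rows.append(e)
    rhs.append(np.array([w_lm * t]))
A = np.vstack(rows)
b = np.concatenate(rhs)
x, *_ = np.linalg.lstsq(A, b, rcond=None)
```

The solver spreads the landmark displacement evenly along the chain; in three dimensions the analogous term is what keeps the warped template smooth between detected facial landmarks and optical-flow correspondences.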

Practical Multispectral Lighting Reproduction

Download a PDF overview.

We present a practical framework for reproducing omnidirectional incident illumination conditions with complex spectra using an LED sphere with multispectral LEDs. For lighting acquisition, we augment standard RGB panoramic photography with one or more observations of a color chart. We solve for how to drive the LEDs in each light source to match the observed RGB color of the environment and to best approximate the spectral lighting properties of the scene illuminant. Even when solving for non-negative intensities, we show that accurate illumination matches can be achieved with as few as four or six LED spectra for the entire ColorChecker chart for a wide gamut of incident illumination spectra.
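
The non-negative solve described above can be sketched as a small least-squares problem. This is an illustrative stand-in, assuming made-up response data rather than real measurements, and a generic projected-gradient solver rather than whatever solver the authors used:

```python
# Minimal sketch (not the paper's actual pipeline): choose non-negative
# LED intensities so the LED-lit color chart matches its appearance
# under the target illuminant. The matrix A and target b are made up.
import numpy as np

def nnls_pg(A, b, iters=5000):
    """min ||A x - b||^2 subject to x >= 0, by projected gradient."""
    lr = 1.0 / np.linalg.norm(A, 2) ** 2   # safe step from spectral norm
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = np.maximum(0.0, x - lr * (A.T @ (A @ x - b)))
    return x

rng = np.random.default_rng(0)
n_patches, n_leds = 24, 6
# A[:, j]: stacked RGB responses of all 24 chart patches under LED j
# alone at full power (72 values = 24 patches x 3 channels).
A = rng.uniform(size=(n_patches * 3, n_leds))
# b: stacked RGB appearance of the chart under the target illuminant,
# synthesized here from known non-negative weights so a match exists.
true_w = np.array([0.2, 0.0, 0.7, 0.1, 0.0, 0.5])
b = A @ true_w
w = nnls_pg(A, b)
```

Non-negativity matters because LEDs cannot be driven at negative intensity; an unconstrained least-squares fit could demand physically impossible drive values.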

A significant benefit of our approach is that it does not require the use of specialized equipment (other than the LED sphere) such as monochromators, spectroradiometers, or explicit knowledge of the LED power spectra, camera spectral response curves, or color chart reflectance spectra. We describe two useful and easy-to-construct devices for multispectral illumination capture, one for slow measurements with detailed angular and spectral detail, and one for fast measurements with coarse spectral detail.

We validate the approach by realistically compositing real subjects into acquired lighting environments, showing accurate matches to how the subject would actually look within the environments, even for environments with mixed illumination sources, and demonstrate real-time lighting capture and playback using the technique.

Augmented Reality and Virtual Reality to Enhance US Military Training

AR Post remembers important conversations about the use of mixed reality in the Army held during the 2017 Body Computing Conference.

Read the full article here.

Oscars: How Top VFX Pros Brought Baby Groot, Wonder Woman’s Golden Lasso to Life

Hollywood Reporter runs down the technology behind 2017 blockbusters. Read the full article to learn more about ICT’s involvement in films like Logan, Valerian and Blade Runner 2049.

Immerse Yourself in New Worlds With Augmented and Virtual Reality

IEEE features a piece on IEEE Transmitter and how immersive technology can be used to explore outer space and to treat medical conditions. To get a closer look at an archeological dig or to take a trip to Mars, visit the immersive experiences on IEEE Transmitter. There you can learn about the many applications of augmented and virtual reality, including how they’re being used for health care.

To continue reading, visit The Institute, IEEE’s news source.

Interactive Exhibit Lets Visitors ‘Talk’ with Nanjing Massacre Survivor in Mandarin

The Nanjing Massacre Memorial Hall in China today debuted its permanent exhibition of New Dimensions in Testimony — interactive survivor testimony technology developed by USC Shoah Foundation — The Institute for Visual History and Education. The event marks the 80th anniversary of the massacre in Nanjing.

This is the first permanent museum exhibition of New Dimensions in Testimony, or NDT, outside the United States and the first exhibit anywhere featuring the Mandarin-language testimony of Madame Xia Shuqin, a survivor of the massacre as a child. She is the only non-Holocaust survivor who has been interviewed for NDT.

Continue reading the full article in USC News.

First Mandarin-Language New Dimensions in Testimony Exhibit to Premiere at Nanjing Massacre Memorial Hall

As part of the commemoration of the 80th anniversary of the Nanjing Massacre, USC Shoah Foundation will introduce its first Mandarin-language New Dimensions in Testimony interactive survivor testimony at the Nanjing Memorial Hall in Nanjing, China on Dec. 13.

It is the first permanent exhibition of NDT outside the United States and features Madame Xia Shuqin, a child survivor of the Nanjing Massacre. She is the only non-Holocaust survivor who has been interviewed for the project so far.

The Tianfu Bank and Tianfu Group generously funded the creation of this NDT testimony.

The exhibit is a featured part of the newly reconstructed Nanjing Massacre Memorial Hall’s core exhibit space. NDT uses groundbreaking natural language software to allow audiences to interact with the recorded image of a genocide survivor, who responds to questions in real time, powered by complex algorithms providing realistic conversation.

Xia traveled to Los Angeles in 2016 to film the NDT interview. The interview took five days and was filmed on the 360-degree “light stage” at the USC Institute for Creative Technologies. High definition cameras and lights captured the interview from all angles as Xia answered hundreds of questions about her life before, during and after the Nanjing Massacre.

Continue reading the full press release in Markets Insider, part of Business Insider, here.

Q&A with Skip Rizzo

Beyond Standards of the IEEE Association sits down with ICT’s Dr. Skip Rizzo to discuss everything from how mixed reality differs from virtual and augmented realities, to how these technologies are used in his line of work.

Read the full Q&A here.

CVMP

CVMP
December 11-12, 2017
London, England
Keynote Presentation

A New Virtual PTSD Screening is Helping Veterans Disclose Symptoms

Medical Training Magazine features Ellie and how AI can be used as a tool to detect symptoms of PTS.

Read the full article here.

The Near Future of AI Media

CableLabs is developing artificial intelligence (AI) technologies that are helping pave the way to an emerging paradigm of media experiences. We’re identifying architecture and compute demands that will need to be handled in order to deliver these kinds of experiences, with no buffering, over the network.

Click here to continue reading about the CableLabs and USC Institute for Creative Technologies partnership.

2017 Neural Information Processing Systems (NIPS)

2017 Neural Information Processing Systems (NIPS)
December 4-9, 2017
Long Beach, CA
Presentations

Virtual Reality to Treat PTSD: Interview with Todd Richmond, Director of USC’s Mixed Reality Lab

While PTSD is a significant issue for many of those serving in the military and others who work in traumatic situations, it also affects huge numbers of ordinary people who experience traumatic events such as assaults or natural disasters. Nearly 24 million Americans suffer from PTSD at any given time, and women are twice as likely as men to develop the condition. PTSD can sometimes be overlooked and is reportedly underdiagnosed, but anxiety disorders still cost society approximately $40 billion per year in treatment costs and loss of productivity.

A relatively new option for PTSD therapy involves virtual reality, with the goal of creating multisensory, immersive environments and experiences to treat the condition. The technique can be controlled by a clinician to suit a patient’s needs, and the results so far are promising.

Medgadget had the opportunity to ask Todd Richmond, Director of the Mixed Reality Lab at the University of Southern California and IEEE Member, some questions about the concept of using VR for PTSD and how it has worked so far.

Continue reading in medGadget.

Liquid Science: Virtual Reality

Red Bull TV hosts a show featuring GZA “The Genius” of Wu-Tang Clan as he travels to the USC Institute for Creative Technologies to try VR prototypes in our Mixed Reality Lab and even creates his own virtual avatar.

View the full episode here.

This New Robotic Avatar Arm Uses Real Time Haptics

According to Todd Richmond, IEEE Member and Director of USC’s Mixed Reality Lab, Internet of Actions (IoA) is a vision of digital technology becoming a more effective partner for humans as we navigate through our increasingly mixed-reality worlds.

“Technology development will need to move from the past and current focus on the “device” and the “capability” and more towards the human at the end of the technology,” adds Richmond.

Continue reading on Forbes.com.

Computing Clout Helps Hollywood Tap Potential of Immersive Content

In the worlds of AI, AR and VR, Dell delivers the power and reliability that can stitch complex data into a seamless experience.

Hollywood also stands to benefit from Dell’s non-entertainment work. Dell’s partnership with headset maker Meta and Ultrahaptics to develop AR shoe design systems for Nike has the potential to change the way production designers, set decorators and costume designers work. And its work with Albert “Skip” Rizzo at the USC Institute for Creative Technologies, to explore how VR can help service members deal with PTSD and autistic teenagers overcome the stress of job interviews, could one day be used by actors and writers for research and character development.

Continue reading on Variety.com.

SIGGRAPH ASIA

SIGGRAPH Asia
November 27-30, 2017
Bangkok, Thailand
Presentations

I/ITSEC

I/ITSEC 2017
November 27 – December 1, 2017
Orlando, FL
Presentations

Using VR-Based Psychotherapy for PTSD Helps Traditional Therapy Effects

The use of virtual reality technology is currently concentrated on gaming, but it has many other applications, including use in medical technology. VR may offer an alternative or even an accompaniment to traditional psychotherapy for post-traumatic stress disorder patients.

International Business Times sat down with ICT’s Todd Richmond for more insight into using VR as a tool in helping treat PTSD. Visit ibtimes.com to read the full article.

ICAT-EGVE 2017

ICAT-EGVE 2017
November 22-24, 2017
Adelaide, Australia
Presentations

Six USC Professors Named Fellows of Esteemed Scientific Society

Five USC scientists and one Keck School of Medicine of USC physician have been elected fellows of the American Association for the Advancement of Science, an honor awarded to AAAS members by their peers.

Founded in 1848, the nonprofit organization is the world’s largest general scientific society. The group began the AAAS Fellows tradition in 1874 and publishes the journal Science.

This year, 396 members will be named fellows because of their scientifically or socially distinguished efforts to advance science or its applications. Of the six USC professors included, ICT’s Paul Rosenbloom, also a computer science professor at Viterbi, was recognized for his focus on the mechanisms that enable thought and how they combine to yield minds.

Read the full article in USC News.

Facing Down PTSD

“We tell patients it’s going to get harder before it gets easier. We’re not bullshitting anybody,” says Dr Albert ‘Skip’ Rizzo on the phone from his base at the University of Southern California. Rizzo is at the forefront of innovative research into how virtual reality ‘exposure therapy’ – exposing a patient to virtual reconstructions of a traumatic event – can be used to treat patients suffering from post-traumatic stress disorder (PTSD).

The foundations of this work are in the treatment of military veterans, with a ‘Virtual Vietnam’ developed in 1997 to help veterans still suffering with PTSD four decades after the end of the conflict. Virtual reality was first seriously looked at as a tool to treat terror-related PTSD victims four years later, in the month following the September 11 World Trade Center attacks.

Continue reading the full article in The Big Issue.

Confessions of a Technology Evangelist

ICT’s Todd Richmond contributes a piece to EdTech Digest about how VR and AR will (really) transform education to create a meaningful future.

Read the full article here.

VR Brings Dramatic Change to Mental Health Care

Skip Rizzo, associate director for medical virtual reality at the USC Institute for Creative Technologies, has been working with the U.S. Army on ways to use Virtual Reality (VR) to treat soldiers’ Post-Traumatic Stress Disorder for over a decade. His system, “Bravemind,” initially funded by the Department of Defense in 2005, can accurately recreate an inciting incident in a war zone, like Iraq, to activate “extinction learning,” which can deactivate a deep-seated “fight-or-flight” response, relieving fear and anxiety. “This is a hard treatment for a hard problem in a safe setting,” Rizzo told me. Together with talk therapy, the treatment can measurably relieve the PTSD symptoms. The Army has found “Bravemind” can also help treat other traumas like sexual assault.

Continue reading the full article in Huffington Post.

Building the World of ‘Blade Runner 2049’

Digital Media World covers the technology behind the blockbuster, including ICT’s Light Stage.

Read the full article on Digitalmediaworld.tv.

LA CoMotion

LA CoMotion
November 16-19, 2017
Los Angeles, CA
Panel Discussion

Siri is My Agony Aunt – but is Telling Big Tech My Innermost Feelings a Bad Idea?

In the same way we feel free to write what we really think in the pages of a book, we now tend to disclose more when discussing our innermost feelings with AI than with humans, according to a study conducted by the Institute for Creative Technologies in Los Angeles.

Continue reading on TheGuardian.com.

Dipping a Toe in the US Synthetic Training Environment

In the last few years, training grounds have become increasingly virtual, but it’s only the beginning. A new Synthetic Training Environment, built to fuse the real world with the virtual world into one giant training platform, could change soldier training forever.

Read more about the Synthetic Training Environment in Army-Technology.

SIGGRAPH Asia Tech Papers Preview: 3D Avatars From a Single Photo

Ian Failes of VFX Blog interviews Hao Li in anticipation of SIGGRAPH Asia.

Read the full article featuring what to expect from Hao Li and team here.

3 Ways AR/VR Are Improving Autonomous Vehicles

By Todd Richmond, Director, USC Mixed Reality Lab November 15, 2017

Virtual, Augmented and Mixed Reality (VAMR) aren’t just for games and entertainment, despite what many think. VAMR will touch every aspect of our lives, from how we shop for clothes to the way we consume media.

Autonomous vehicles, another one of the most exciting disruptive technologies, will also be impacted by VAMR in several ways. Here’s a look at three ways VAMR is improving autonomous vehicles.

Simulated Testing Ground
Mixed Reality (MR) Prototyping will provide a safe testing ground for autonomous vehicles, which are yet to be perfected. The Mixed Reality Lab at the University of Southern California (USC) has been using MR Prototyping to explore human-machine teaming; as of writing, the lab has successfully paired people with autonomous drones.

Traditional research, design, and development would put real drones and real people in very real danger while algorithms and other aspects are sorted out. Instead, the USC lab has successfully virtualized all aspects of the pairing within a game engine – virtual drones and virtual humans, together, in virtual environments.

Within this simulation, the level of “reality” can vary, and geo-specific terrain can be used if desired. Many of the design parameters and algorithmic problems associated with human-drone pairing can be solved in the virtual space. Afterwards, reality gets “mixed,” with some of the elements (e.g. autonomous drones) flying in both the virtual and physical space, and the humans remaining exclusively in the virtual space. More of the problems are resolved, and then the system can go fully physical when the algorithms are well-behaved.

Testing human-drone pairing in a VAMR space doesn’t differ that much from testing autonomous vehicles in a wholly virtual space – engineers can test, tweak, and ultimately educate autonomous vehicles in a virtual space, removing all danger to humans. Additionally, since the virtual and physical can be tied together in real time, that allows for remote collaboration as well as connection to other virtual systems, further removing the danger to humans and improving testing outcomes.

Visual Displays Will Improve Situational Awareness
New forms of visual displays, combined with aural and haptic feedback, will be designed to improve driver situational awareness and increase safety; combining these with other active systems based on computer vision, such as lane departure warning and auto-braking, promises fewer accidents, lower fatality rates, and more.

There is perhaps a romantic notion of the traditional sedan dashboard evolving into something akin to an F-35 cockpit, but that is not the way autonomous vehicle interfaces should be designed. Pilots, especially those who fly fighter jets, are highly trained and are a far cry from the normal driver in terms of reaction time, decision-making ability and much more. For normal drivers, less is probably more.

Portable Environment for Riders
AI will end up being a major player for autonomous vehicles, likely as a combination of highly optimized computer vision algorithms, next-generation path planning, and traffic flow monitoring and metering. In the case of fully autonomous vehicles, VAMR may end up as the “portable environment” for riders.

For example, as telepresence capabilities improve, the ability to “take a meeting” while riding to work may become a viable reality. While we run the risk of personal isolation, VAMR combined with autonomous travel could provide the productivity increase technology has long promised, while also providing a bit of an escape from mundane commutes. Additionally, VAMR combined with autonomous travel could replace taking the traditional phone call while driving, which would significantly reduce accident rates, fatality rates, and more.

Autonomous vehicles are upon the world, and there’s much to be excited about. But before we start hailing our automated Ubers and Lyfts, whizzing around cities in the backseat of a car with no driver, a lot of work needs to be done. VAMR will play a larger role than many think in making the (safe) driverless vehicles of tomorrow a reality.

Content and images via Robotics Trends.

USC Collaboration to Test VR for Stroke Treatment

The Neural Plasticity and Neurorehabilitation Laboratory at USC is exploring the intersection of virtual reality and neuroscience with a new program that aims to help stroke victims.

The laboratory aims to implement VR technology in health care under the acronym REINVENT, according to Sook-Lei Liew, head of the NPNL.

Liew’s laboratory is partnering with the USC Institute for Creative Technologies’ MxR lab for the REINVENT project, which focuses on virtual reality and stroke rehabilitation.

“In all areas of human interaction with any kind of sophisticated technology, you want to develop things that can amplify human functioning and learning,” said Albert Rizzo, who works at the MxR lab.

The MxR lab is part of USC’s Institute for Creative Technologies and hosts a variety of research at the forefront of virtual development, with focuses on optical physical therapy and game apps.

Rizzo explained that in using VR as a medium for rehabilitation or as a tool for data collection, there are three measurements that must be considered.

Continue reading on DailyTrojan.com.

Intelligent Cognitive Assistants Workshop

Intelligent Cognitive Assistants Workshop
November 14-15, 2017
San Jose, CA
Panel Discussions

VR, 360-degree video help journalism students tell stories at USC Annenberg

USC News explores how VR is making its way into journalism and USC’s role in the cutting-edge technology.

Read the full story in USC News.

Victor Luo is Making NASA Cool for Coders

In a piece for Bloomberg about coding with NASA, ICT’s Skip Rizzo gives his perspective on using AR and VR to design virtual space shuttles in 3D and then assist astronauts on the real shuttles orbiting outside the atmosphere.

Read the full article on Bloomberg.com.

This USC Lab is Pioneering the Next Big Thing in Experience Design

Todd Richmond and his team at USC Institute for Creative Technologies are exploring techniques and technologies to improve the fluency of human-computer interactions.

Watch the full video segment on Fastcodesign.com.

Are Humans Actually More ‘Human’ than Robots?

In a recent report, the Pew Research Center found that Americans are more worried than they are enthusiastic about automation technologies when it comes to tasks that rely on qualities thought to be unique to humans, such as empathy. They’re concerned that, in lacking certain sensibilities, robots are fundamentally limited in their ability to replace humans at those jobs; they don’t, according to the report, trust “technological decision-making.”

Human drivers don’t seem all that “human” when it comes to thoughtful decision-making. Federal fatal-crash data show that despite reductions in the number of deaths due to distracted or drowsy driving, those related to other reckless behaviors—including speeding, alcohol impairment, and not wearing seatbelts—have continued to increase. Roughly 37,000 of last year’s fatal crashes were attributed to poor decision-making.

Humans aren’t necessarily better than robots at caregiving, either. The American Psychological Association in 2012 estimated that 4 million older Americans—or about 10 percent of the country’s elderly population—are victims of physical, psychological, or other forms of abuse and neglect by their caregivers, and that figure excludes undetected cases.

Nor do they inherently excel at interpersonal skills. Humans incessantly use “strategic emotions”—emotions that don’t necessarily reflect how they actually feel—to achieve social goals, protect themselves from perceived threats, take advantage of people, and adhere to work-environment rules. Strategic emotions can help relationships but, if they’re detectable, they can harm them, too.

As an example, Jonathan Gratch, the director of emotion and virtual human research at the University of Southern California’s Institute for Creative Technologies, pointed to customer-service representatives, who tend to follow a script when speaking with people. Because they rarely express genuine emotions, they aren’t, according to Gratch, “really being human.” In fact, these rules surrounding professional conduct make it easier to program machines to do that sort of work, especially when Siri and Alexa are already collecting data on how people talk, such as their intonations and speech patterns. “There’s this digital trace you can treat as data,” he said, referring to the scripts on which customer-service reps rely, “and machines learn to mimic what people do in those tasks.”

Read more in The Atlantic.

ICMI 2017 – 19th ACM International Conference on Multimodal Interaction

ICMI 2017 – 19th ACM International Conference on Multimodal Interaction
November 13-17, 2017
Glasgow, Scotland
Presentations

New Post-Traumatic Stress Disorder Treatments for Veterans Focus on Technology

It turns out that military veterans are willing to talk about stress. They just haven’t been getting access to the right confidantes and a comfortable setting.

Ellie is an example. Her ability to press veteran interview subjects to reveal information about their feelings and mental state worked better than the primary method used by the Department of Veterans Affairs.

Her secret? Ellie isn’t a bland questionnaire. She isn’t a human, either.

Developed by the University of Southern California’s Institute for Creative Technologies, Ellie is a virtual PTSD screening and diagnostic tool that provides patients with an anonymous, unrecorded interview session. A recent study of Ellie’s interactions with veterans showed that they are more willing to report symptoms of post-traumatic stress disorder to the program than through a traditional assessment method.

Read more on CNBC.com.

What Are Holographic Calls? Technology May Replace Voice Calling In The Future

Even though the technology seems far-fetched right now, it has the potential to become a consumer technology, according to Todd Richmond, IEEE member and director of the mixed reality lab at the Institute for Creative Technologies at the University of Southern California.

He outlined the challenges involved in an email to IBT.

“In our labs we have shared virtual environments between east and west coast using a combination of VR and AR (Vive and Hololens). For that to move into the widespread consumer market, there are technical and user experience challenges that need to be solved. Current AR/VR hardware is still rather clunky and is not particularly comfortable for long-term use. The technology needs to approach something closer to reading glasses or perhaps large sunglasses. Or projection (“hologram”) technology needs to improve (probably a combination of both),” he stated.

The biggest challenges for the technology, according to Richmond, are the portrayal of a person in VR and how these virtual environments will be navigated. If you want to text in such an environment, how do you draw letters?

Richmond says that the technology could be available for commercial use around 2020, though it might take a decade to become a consumer technology.

Read more on IBT.com.

Army Veteran, a USC Administrator, Uses Storytelling to Support the Military

By Ron Mackovich, USC News

Randall Hill ’78 was born on an Army base, and his life story revolves around the military.

“My dad was in the Army, and he encouraged me to pursue my application to West Point,” Hill recalled. “He said, ‘Why don’t you just see if you can get in?’”

Hill got in, graduated and served six years as a commissioned officer with assignments in field artillery and intelligence.

After earning a PhD in computer science from USC, Hill went on to become executive director of the Institute for Creative Technologies, where the entertainment and gaming fields converge with research to build training and simulation platforms. ICT’s interactive and virtual reality programs go beyond military training, helping veterans find jobs and cope with trauma from combat and sexual assault.

For Hill, it’s all about the story.

“We use the art of storytelling to support the military,” Hill said. “We’re in Hollywood. USC has the best cinema school, so we’re in the right place.”

Dedicated to authenticity
One of ICT’s newest interactive platforms is being rolled out at Fort Leavenworth to help victims of sexual assault.

“We interviewed a male soldier who suffered a sexual assault, and integrated that into an interactive media program using artificial intelligence,” Hill said. “Other soldiers can ask him questions, and we created a database that will generate the best response. They’re not just fact-based questions. It’s more like ‘What was your experience,’ and it comes back with a story.”

The interactive program grew in part from a project involving the USC Shoah Foundation – The Institute for Visual History and Education that created an interactive experience with a Holocaust survivor.

“They were intent on authenticity, so we brought that same authenticity to the sexual assault program,” Hill said. “Part of the goal is prevention because some sexual assaults happen during hazing. We want to expose this to say ‘hazing is not OK, it has long-term consequences, it has a huge impact on people’s lives.’”

Going forward, Hill plans to stick to the story as ICT develops new programs.

“Storytelling is one of the oldest ways people have communicated, and your brain lights up when you hear a story,” Hill said. “You always remember a good movie, even decades after you see it.”

AAAI 2017 Fall Symposium

AAAI 2017 Fall Symposium on A Standard Model of the Mind
November 9-11, 2017
Arlington, VA
Presentations

DS2A

Download a PDF overview.

The Digital Survivor of Sexual Assault (DS2A) system allows Soldiers to interact with a digital Sexual Harassment/Assault Response and Prevention (SHARP) guest speaker and hear their stories. First-person stories are one of the most powerful ways people share information, connect with, and learn from each other. As part of the ongoing SHARP training, survivors of sexual assault often speak to large groups of Soldiers. Unfortunately, not every Soldier who would benefit from interacting with a survivor will have this opportunity. DS2A allows more Soldiers to interact with a speaker and preserves the emotional impact of hearing about the speaker’s experience. Soldiers can interact with the digital survivor and hear the speaker’s stories as direct responses to their own questions.

DS2A is a powerful new tool for instructors at the Army SHARP Academy. The system enables new SHARP personnel, as well as selected Army leaders, to participate in conversations on SHARP topics through the lens of a survivor’s firsthand account. DS2A can play an important role in the prevention of sexual harassment and sexual assault by enabling new Sexual Assault Response Coordinators (SARCs) and Victim Advocates (VAs) to interact with a sexual assault survivor and hear the survivor’s stories in a non-confrontational environment. The experience may help SHARP professionals understand how to better support victims, and perform their prevention and response duties. It can also help Army leaders understand the impact that incidents of sexual assault and retaliation can have on an individual Soldier and unit readiness. The Army SHARP Academy plans to use DS2A in its resident courses of instruction, directed by instructors trained in the proper use of the system.

The DS2A system is based on the New Dimensions in Testimony (NDT) project, a collaborative effort between the USC Shoah Foundation and USC ICT. Development of DS2A leveraged research technologies previously created for the Department of Defense under the direction of the Army Research Lab Simulation and Training Technology Center (ARL STTC). These technologies include the Light Stage, to facilitate recordings of survivors, and natural language dialogue technology to enable conversational engagement with survivors. DS2A is the first system of its kind to be used in an Army classroom.

AECT 2017

AECT 2017
November 6-11, 2017
Jacksonville, FL
Presentations

How Technology is Keeping Holocaust Testimony Alive

The life-size image of Pinchas Gutter on a video screen, fidgeting, blinking and tapping his foot, seemed present and alive in the way portraits do in the magical world of Harry Potter. The Holocaust survivor, who lives in Toronto, was nowhere near the Museum of Jewish Heritage on the day I visited, but by stepping up to a podium, clicking on a mouse and speaking into a microphone, I was able to ask Gutter questions. His image responded with answers—speech quirks, pauses and gestures included. He spoke to me about religion and sports; he shared his favorite Yiddish joke; I hear he sometimes sings. Gutter also told me that he was a happy child until September 1, 1939, when Hitler’s armies invaded Poland and World War II began. Soon after, his father was taken away and beaten nearly to death. After that, he said, “I knew that life wouldn’t be the same.”

Read more on MSN.com.

The Ultimate Escapism

Is VR addiction really something we need to worry about now? Currently, eye strain, cybersickness, and a lack of sense of touch in VR make it far less immersive than portrayed in sci-fi. You can’t yet plug in for hours and hours. “It’s not at a holodeck level yet. I don’t think you’re seeing a public health challenge,” says Albert “Skip” Rizzo, director of medical virtual reality at USC’s Institute for Creative Technologies. In fact, experts have been speculating and researching how VR technologies could be used to treat addiction. For example, alcoholics immersed in a virtual bar or a virtual party can be taught to manage their cravings and develop coping and refusal skills so that they can prevent a relapse when they’re near the real thing.

Continue reading on Slate.com.

Hologram Technology in Holocaust Museum Exhibit Immortalizes Survivors’ Stories

Illinois Holocaust Museum CEO Susan Abrams said the museum helped advance the project — New Dimensions in Testimony — a collaboration between the USC Shoah Foundation and the USC Institute for Creative Technologies.

“The survivors were filmed in a studio in L.A., of which there are only three in the world,” Abrams said. “The survivors were surrounded by over a hundred cameras.”

She called the technology “future-proof,” meaning that one day the recordings may be able to be shown in a 360-degree venue as technology advances. But for now, survivor testimonies are expressed through a three-dimensional hologram that is as close to the real thing as technology gets, she said.

“It prepares us for the day when our survivors will not be here,” Abrams said. “Right now, the 60,000 students and educators who come through plus tens of thousands of general visitors have the incredible privilege to hear directly from a survivor.”

In addition to the holograms, the Take a Stand Center highlights 40 historical and contemporary “upstanders” who have fought against injustice and cruelty in various ways.

Continue reading on ChicagoTribune.com.

UX and the Psychology of Storytelling

Stories are effective because they appeal to a hardwired way that the human mind works. It’s our natural impulse to impose order and attach meaning to our observations.

In a 1944 psychology experiment, participants watched a short animated film in which three geometric figures — a large triangle, a small triangle, and a small circle — move around and within a rectangle shape with a ‘door’. Participants in this study then described what they saw.

The researchers, Fritz Heider and Marianne Simmel, discovered that participants assigned all kinds of personality characteristics and motives to these simple shapes, generating compelling plots about an ‘aggressive’ large triangle, the ‘helpless’ circle, and the ‘hero’ small triangle. Sometimes the plot centered on love, or cheating, or sometimes it was a parenting saga.

More recently, seven comedians interpreted this short film for USC Institute for Creative Technologies, which is a very entertaining watch… The film simply depicts lines and shapes in motion, yet our brains fill in so much more.

Continue reading on WhatUsersDo.com.

Metafocus: Personalized Lifelong Learning

The University of Southern California (USC) Institute for Creative Technologies has developed “virtual humans” that look, move, and speak like real humans, albeit on large screens. These virtual humans employ MBE science and ITS technology to create learning experiences in schools, museums, and medical research facilities. Virtual humans “add a rich social dimension to computer interaction,” answering questions at any time of day so students never feel completely stuck.

Continue reading in Learning Solutions Magazine.

U.S. Military Seeking Technology to Better Prepare for War

Voice of America’s Elizabeth Lee investigates cutting-edge technologies to better prepare the military for war. Visiting the USC Center for Body Computing’s annual Body Computing Conference and ICT’s Mixed Reality Lab, the piece explores technologically advanced solutions aimed at helping the U.S. military.

Virtual Technology Allows SHARP Academy Students Opportunity to Interview Survivor

United States Army Spc. Jarett Wright was hazed and sexually assaulted while deployed to Iraq in 2010. Students from the Sexual Harassment/Assault Response and Prevention Academy had the opportunity to interview Wright Oct. 12, even though he wasn’t speaking to them at the time.

The Digital Survivor of Sexual Assault (DS2A) project uses the latest technology to educate U.S. Army personnel and raise awareness of the horrific realities of sexual assault. The project replicates the experience of an in-person interaction with a survivor of sexual assault. Using Google voice recognition software, DS2A allows Soldiers and Department of the Army Civilians to have an immersive and interactive conversation with a survivor of sexual abuse using a virtual avatar in place of the victim.

“This initiative represents a great collaborative effort between the SHARP Academy, the Army Research Laboratory, and the University of Southern California, Institute for Creative Technologies (USC-ICT), and leverages innovative technology to enhance SHARP education and training,” said Col. Christopher Engen, director of the U.S. Army SHARP Academy.

Though the program has been tested many times, this session was the first time an entire SHARP Academy class was allowed to interact with the DS2A.

Continue reading on TRADOC’s news site.

Found in Translation: USC Scientists Map Brain Responses to Stories in Three Different Languages

New brain research by USC scientists shows that reading stories is a universal experience that may result in people feeling greater empathy for each other, regardless of cultural origins and differences.

And in what appears to be a first for neuroscience, USC researchers have found patterns of brain activation when people find meaning in stories, regardless of their language. Using functional MRI, the scientists mapped brain responses to narratives in three different languages — Americanized English, Farsi and Mandarin Chinese.

The USC study opens up the possibility that exposure to narrative storytelling can have a widespread effect on triggering better self-awareness and empathy for others, regardless of the language or origin of the person being exposed to it.

“Even given these fundamental differences in language, which can be read in a different direction or contain a completely different alphabet altogether, there is something universal about what occurs in the brain at the point when we are processing narratives,” said Morteza Dehghani, the study’s lead author and a researcher at the Brain and Creativity Institute at USC.

Dehghani is also an assistant professor of psychology at the USC Dornsife College of Letters, Arts and Sciences, and an assistant professor of computer science at the USC Viterbi School of Engineering.

The study was published in the journal Human Brain Mapping.

Making sense of 20 million personal anecdotes

The researchers sorted through more than 20 million blog posts of personal stories using software developed at the USC Institute for Creative Technologies. The posts were narrowed down to 40 stories about personal topics such as divorce or telling a lie.

They were then translated into Mandarin Chinese and Farsi, and read by 90 American, Chinese and Iranian participants in their native language while their brains were scanned by MRI. The participants also answered general questions about the stories while being scanned.

Using state-of-the-art machine learning and text-analysis techniques, and an analysis involving over 44 billion classifications, the researchers were able to “reverse engineer” the data from these brain scans to determine the story the reader was processing in each of the three languages. In effect, the neuroscientists were able to read the participants’ minds as they were reading.

Continue reading the full article in USC News.

M+DEV Conference

M+DEV (Madison Game Development) Conference
October 27, 2017
Madison, WI
Presentations

U.S. Military Looks to Solve Old Problems with New Solutions

“When you combine performance capture that is autonomously driven with a lot of this biodata, it is going to change the way athletes train. It’s going to change the way that the military trains and operates, and it is going to change the way that we interact with the world,” said Todd Richmond, director of Advanced Prototype Development at the University of Southern California Institute for Creative Technologies.

Read the full article on i-HLS.com.

VR Brings Dramatic Change to Mental Health Care

Skip Rizzo, associate director for medical virtual reality at the USC Institute for Creative Technologies, has been working with the U.S. Army on ways to use virtual reality (VR) to treat soldiers’ post-traumatic stress disorder for over a decade. His system, “Bravemind,” initially funded by the Department of Defense in 2005, can accurately recreate an inciting incident in a war zone, like Iraq, to activate “extinction learning,” which can deactivate a deep-seated fight-or-flight response, relieving fear and anxiety. “This is a hard treatment for a hard problem in a safe setting,” Rizzo told me. Together with talk therapy, the treatment can measurably relieve the PTSD symptoms. The Army has found “Bravemind” can also help treat other traumas like sexual assault.

Read more on Forbes.com.

All the Face-Tracking Tech Behind Apple’s Animoji

WIRED examines Apple’s iPhone X, Animoji and the researchers behind the new technology. Elizabeth Stinson talks with Hao Li; read the full article on WIRED.com.

Affective Computing and Intelligent Interaction (ACII) 2017

ACII 2017
October 23-26, 2017
San Antonio, TX
Presentations

Interactive Holocaust Project Opens Thursday

The USC Shoah Foundation and USC Institute for Creative Technologies will open the first permanent installation of their interactive Holocaust project on Thursday after over five years of work.

The installation, New Dimensions in Testimony, features extensive interviews with Holocaust survivors through interactive technology that allows the public to have conversations with the individuals.

The Holocaust survivors were selected from a variety of backgrounds that included a large range in ages, experiences and locations during the war. One of the 15 participants in the project was Eva Schloss, Anne Frank’s stepsister, whose interactive work is currently being displayed in New York at a temporary installation.

The goal of the project was to recreate the intimacy of learning from Holocaust survivors, which the team working on the project attempted to do by allowing the public to ask the interactive displays any question they wished.

Read the full article on DailyTrojan.com.

NATO Modeling and Simulation Group MSG-149 Symposium

NATO Modeling and Simulation Group MSG-149 Symposium
October 19-20, 2017
Lisbon, Portugal
Presentations

SHARP Academy Tests Virtual Victim

Army Spc. Jarett Wright was hazed and sexually assaulted while deployed to Iraq in 2010. Students from the Sexual Harassment/Assault Response and Prevention Academy had the opportunity to interview Wright Oct. 12, even though he wasn’t speaking to them at the time.

The Digital Survivor of Sexual Assault project uses the latest technology to educate Army personnel and raise awareness of the horrific realities of sexual assault. The project replicates the experience of an in-person interaction with a survivor of sexual assault. Using Google voice recognition software, DS2A allows soldiers and Department of the Army civilians to have an immersive and interactive conversation with a survivor of sexual abuse using a virtual avatar in place of the victim.

Read more in Ft. Leavenworth Lamp.

‘Raw Data’: An Oral History

Rolling Stone covers Survios’ first game as a studio, interviewing its founding members and discussing their experience with ICT’s MxR Lab.

Read the full article on Rollingstone.com.

Experimental Virtual and Mixed Reality Technologies Can be Applied to Military of the Future

Virtual reality, augmented reality and mixed reality projects are being developed that can have military applications. One mixed reality project at the University of Southern California Institute for Creative Technologies (USC ICT) involves drones small enough to fit in the palm of a hand. The drones can follow and capture a person’s movements so they can be analyzed under a training simulation.

“When you combine performance capture that is autonomously driven with a lot of this biodata, it is going to change the way that athletes train. It is going to change the way that the military trains and operates, and it is going to change the way that we interact with the world,” said Todd Richmond, director of Advanced Prototype Development at the University of Southern California Institute for Creative Technologies.

Read more in Voice of America or VOANews.com.

How Technology is Keeping Holocaust Survivor Stories Alive Forever

Pinchas Gutter was the first Holocaust survivor to participate in the New Dimensions in Testimony project, a collaboration between the USC Shoah Foundation, the USC Institute for Creative Technologies (ICT) and Conscience Display.

Read the full article about New Dimensions in Testimony on Newsweek.com.

5th International Conference on Human Agent Interaction (HAI 2017)

5th International Conference on Human Agent Interaction (HAI 2017)
October 17-20, 2017
Bielefeld, Germany
Presentations

Virtual Therapists Help Veterans Open Up About PTSD

When US troops return home from a tour of duty, each person finds their own way to resume their daily lives. But they also, every one, complete a written survey called the Post-Deployment Health Assessment. It’s designed to evaluate service members’ psychiatric health and ferret out symptoms of conditions like depression and post-traumatic stress, so common among veterans.

But the survey, designed to give the military insight into the mental health of its personnel, can wind up distorting it. Thing is, the PDHA isn’t anonymous, and the results go on service members’ records—which can deter them from opening up. Anonymous, paper-based surveys could help, but you can’t establish a good rapport with a series of yes/no exam questions. Veterans need somebody who can help. Somebody who can carry their secrets confidentially, and without judgement. Somebody they can trust.
Or, perhaps, something.

“People are very open to feeling connected to things that aren’t people,” says Gale Lucas, a psychologist at USC’s Institute for Creative Technologies and first author of a new, Darpa-funded study that finds soldiers are more likely to divulge symptoms of PTSD to a virtual interviewer—an artificially intelligent avatar, rendered in 3-D on a television screen—than in existing post-deployment health surveys. The findings, which appear in the latest issue of the journal Frontiers in Robotics and AI, suggest that virtual interviewers could prove to be even better than human therapists at helping soldiers open up about their mental health.

Read the full article on WIRED.com.

VR Could Trick Stroke Victims’ Brains Toward Recovery

Researchers at the University of Southern California are examining how virtual reality could promote brain plasticity and recovery.

Read all about the research on CNET.com.

DocOn

The DocOn application currently being developed by USC’s Center for Body Computing (CBC) and USC’s Institute for Creative Technologies (ICT) brings the convenience and reassurance of a personal doctor right to your smartphone. With DocOn, no matter where you are, expert medical advice is only one click away.

By combining ICT’s Rapid Avatar scanning technology and artistic creative development with the medical expertise of cardiac electrophysiologist Dr. Leslie Saxon and her team, DocOn takes internet searches to a more personal, accessible, mobile level.

Not only will DocOn bring expert medical advice to smartphone users 24/7, but it will also broaden the reach of exceptional medical care to anyone, anywhere in the world, regardless of language or proximity to a staffed clinic.

CLOVR

The CLOVR responsive system, delivered conveniently to users at home, school, or anywhere it’s needed, enables users to have meaningful, personal interactions with virtual humans. Like an empathetic listener, CLOVR analyzes the user’s emotional state and responds adaptively while also providing conversational feedback loops and non-verbal behavior.

Phase I of the project focused on creating a User Perceived State model based on a lexical analysis of direct user input. This emotional model, called the User Perceived State, or UPS, in combination with a new dialogue management and natural language processing editor, drives the system’s response. Responses include not only what the virtual agent says, but their gestures, posture, and tone. Reactive responses can also include changes to the environment or multimedia selections.

Phase II is slated to expand the UPS model by integrating indirect input such as the user’s facial expressions, vocal tone, pulse rate, and posture to create a more holistic analysis of their emotional state.

As in human-to-human communication, CLOVR will include these implicit, non-verbal cues in determining its responses.

RAM Replay

RAM Replay aims to improve memory retention, skill acquisition and rule learning for soldiers. The application consists of three Virtual Reality testbeds, which can be authored by researchers to contain a range of events and rules related to specific tasks. Through EEG readings and stimulation during sleep, neuroscientists explore methods to improve subject performance on these tasks.

RAM Replay is funded by the Defense Advanced Research Projects Agency (DARPA) and is a collaboration with HRL Laboratories, Rutgers University, the University of New Mexico and Cardiff University.

VR on the Lot

VR on the Lot
October 13-14, 2017
Los Angeles, CA
Presentations

Virtual Interviewer Prods Veterans to Reveal Post-Traumatic Stress

Talking – to a computer-generated interviewer named Ellie – appears to free soldiers and veterans who served in war zones to disclose symptoms of post-traumatic stress, a new study from USC Institute for Creative Technologies finds.

Read the full article on Reuters.com.

PTSD Treatment: How AI Is Helping Veterans with Post-Traumatic Stress Disorder

The Institute for Creative Technologies at USC got lots of buzz for its original research and for introducing the world to Ellie, a digital diagnostic tool that strongly resembles, but cannot replace, a human therapist. Ellie, an avatar of a woman in a cardigan with olive-toned skin and a soothing voice, listens to the people who come to her and does what any human sounding board does. She listens to the content of their speech and scans their facial expressions, tone, and voice for cues that hint at meanings beyond speech. Ellie’s design was decided upon by the research group’s art team. As for how Ellie sounds, “she has a very comforting voice,” Lucas told Newsweek.

Continue reading on Newsweek.com.

Dr. Skip Rizzo and the Rise of Medical VR Therapy

Once thought to be a technology exclusively for entertainment, virtual reality applications pioneered by Albert “Skip” Rizzo, Ph.D. have provided life-changing therapeutic results for clients with serious anxiety disorders and members of the military in particular.

As the Director of Medical Virtual Reality at USC’s Institute for Creative Technologies (ICT) and Research Professor at USC’s Department of Psychiatry and School of Gerontology, Dr. Rizzo has been at the forefront of dramatic innovations in clinical research and care for more than two decades, and his application of VR as a valuable tool in medical treatment underscores the broadening growth of VR beyond entertainment.

Read the full article in VFX Voice.

Our Search for Meaning Produces Universal Neural Signatures

In an era dominated by heartbreaking headlines and divisive political rhetoric, a pioneering state-of-the-art brain imaging study reminds us of our human commonality and the universality of our search for meaning in the stories we read.

Read the full article in Psychology Today.

Reading Stories Creates Universal Patterns in the Brain

New research shows that when we hear stories, brain patterns appear that transcend culture and language. There may be a universal code that underlies making sense of narratives.

Read the full article in Medical News Today.

Scientists Find There is Something Universal About What Occurs in the Brain When It Processes Stories

New brain research by USC scientists shows that reading stories is a universal experience that may result in people feeling greater empathy for each other, regardless of cultural origins and differences.

Read the full article in Gears of Biz.

Reading Makes You Feel More Empathy for Others, Researchers Discover

This University of Southern California study, which used ICT software alongside brain imaging, found that reading stories fosters greater empathy for others.

Full article available on DailyMail.com.

AAAI AI and Interactive Digital Entertainment Conference

AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE 2017)
October 5-9, 2017
Salt Lake City, UT
Presentations

Academy’s Tech Council Adds 7 New Members

Established in 2003 by the Academy’s Board of Governors, the Science and Technology Council provides a forum for the exchange of information, promotes cooperation among diverse technological interests within the industry, sponsors publications, fosters educational activities, and preserves the history of the science and technology of motion pictures.

The returning Council co-chairs for 2017–2018 are two members of the Academy’s Visual Effects Branch: Academy governor Craig Barron, an Oscar-winning visual effects supervisor; and Paul Debevec, a senior staff engineer at Google VR, adjunct professor at the USC Institute for Creative Technologies and a lead developer of the Light Stage image capture and rendering technology, for which he received a Scientific and Engineering Award in 2009.

Read the full press release on Oscars.org.

Virtual Reality Teaches Veterans to Develop Interview Skills

IEEE JobSite features ICT’s VITA4VETS collaboration with ARL, Dan Marino Foundation, Google.org and U.S. Vets in a recent piece about the project aimed at helping returning service members land jobs.

Read the full article on IEEE JobSite.

How to Improve Customer Experience with VR

Knowledge Center reports on how VR can help the customer experience, speaking with ICT’s Skip Rizzo about how the technology is already being used for mental health rehabilitation.

Read the full article here.

The Last Human Job

What abilities will set humans apart from machines?

It’s been a question at the center of decades of science fiction, and one that’s taken on increasing real-world urgency as we try to anticipate how the advancing artificial intelligence revolution will transform the way we work and live.

This piece has been published in Slate and is adapted from an essay that originally ran in the New America Weekly.

Read the full article here.

Reuters Ranks USC #20 in World’s Most Innovative Universities

The annual World’s Most Innovative Universities ranking from Reuters has been published, placing USC in the 20th spot. Reuters specifically called out ICT’s work in studying how people engage with technology through virtual characters and simulations, as well as our collaborations with studios including Warner Bros. and Sony Pictures Entertainment to develop ever more realistic computer-generated characters in movies.

See the ranking in Reuters here.

Will Technology Transform Mental Health Care? A Future Tense Event Recap

On Sept. 28, Future Tense convened leading researchers in the field to discuss the ways technology is changing approaches to psychiatric study and care. The question at the heart of the discussion was: Are we on the verge of a new era in psychiatric care, or will these treatments go the way of other now-condemned methods?

Visit Slate.com for the full recap.

2017 IEEE Visualization Conference

2017 IEEE Visualization Conference
September 29-October 7, 2017
Phoenix, AZ
Presentations

Can Gaming & VR Help You With Combatting Traumatic Experiences?

Trauma affects a great many people in a variety of ways. Some suffer from deep-seated trauma such as post-traumatic stress disorder caused by war or abuse, while others suffer from anxiety and phobias caused by traumatic experiences such as an accident, a loss or an attack.

Each needs its own unique, tailored regimen to lessen the effects and to help individuals regain some normalcy in their lives. Often these customized treatments are very expensive and difficult to obtain.

In a world of ubiquitous technology and rapid advances in visually based treatments, these personalized therapies are becoming more accessible to the average sufferer.

Gamasutra explores more, read the full article here.

Metafocus: Why I Don’t Want You to Know About Robo-Teachers

Learning Solutions Magazine explores the world of avatars and virtual humans, citing a few of ICT’s projects in the piece.

Read the full article here.

Vets Prep for Careers with Virtual Job Interviews

Government Computer News (GCN) covers VITA4VETS, the ICT prototype designed to help veterans prepare for rejoining the workforce.

Read the full article here.

From Cancer Screening to Better Beer, Bots are Building a Brighter Future

VentureBeat explores the latest era of artificial intelligence, a period marked by the proliferation of intelligent virtual assistants and robots with specific skill sets.

Read the full article featuring a mention of ICT’s SimSensei project.

5 Angelenos Who Have Fascinating L.A. Jobs

L.A. Weekly’s Jessica Ogilvie talks with ICT’s Arno Hartholt about his work with the institute and what makes it so fascinating.

Read the full article in L.A. Weekly.

Virtual Reality, Real Medicine: Treating Brain Injuries with VR

ICT’s Dr. Skip Rizzo visits the Kessler Foundation to track progress of a trial measuring executive function performance in traumatic brain injury patients.

Streaming Media covered the news; you can read the full article here.

Virtual Reality Helps Veterans Prepare for New Jobs

ABERDEEN PROVING GROUND, Md. — The U.S. Army Research Laboratory and its partners recently developed a new way for veterans to seek employment.

The Virtual Training Agent for Veterans, or VITA4VETS, is a virtual simulation practice system designed to build job interviewing competence and confidence, while reducing anxiety. Although Army researchers and developers at the University of Southern California’s Institute for Creative Technologies, Google.org and the Dan Marino Foundation originally developed the training system to help those with autism prepare for job interviews, they soon realized its potential to help veterans.

While several companies advertise that they hire vets, transitioning from military service to a civilian workplace can be challenging. One day they are a Soldier, Sailor, Airman or Marine — then the next day, they are back to being “just a citizen.” The prevalence of militarisms in speech and thought can override civilian ways of conceptualizing the world.

The researchers and developers said they understand returning home can be arduous in itself, but preparing to find employment can be even more taxing.

That’s where they believe VITA4VETS can help improve one’s interviewing skills and instill a sense of discipline.

Juan Gutierrez, a 33-year-old Navy veteran with experience in aviation electronics, was satisfied with the new style of interview.

“Answering questions with a virtual human rather than a real human helped me feel less nervous, and I could practice different responses and there were no repercussions with the avatar,” Gutierrez said.

Gutierrez said he had more confidence and the experience was as much an interview for a potential employer as it was for him.

“I learned I could ask questions too. Instead of feeling nervous — like I am being tested, it was a way for me to be honest and learn if it (the job) is something I’d like to do. Overall, VITA helped me feel confident with my interview,” said Gutierrez.

In 2016, the Bureau of Labor Statistics reported that 20.9 million men and women were veterans, accounting for about nine percent of the civilian non-institutional population age 18 and over. Of those 20.9 million, more than 450,000 were unemployed.

The military provides transition training, but when one considers the unemployment statistics and challenges servicemembers face, it underscores the urgency for creating methods to better prepare veterans for civilian employment.

“Although many veterans have the necessary talent and temperament for vocational achievement, they may find it challenging to express the ways in which their skills and experience are able to translate to the private sector,” said Matthew Trimmer, project director for VITA4VETS at USC ICT.

Currently available through U.S. VETS in Los Angeles, VITA4VETS leverages virtual humans that can support a wide-range of interpersonal skill training activities. It uses six characters that span different genders, ages and ethnic backgrounds. Each character is capable of three behavioral dispositions or interview styles and can be placed in a variety of interchangeable background job contexts, all controllable from an interface menu.

According to Trimmer, offering a variety of possible job interview roleplay interactions supports practice across a range of challenge levels and allows for customizable training geared to the needs of the user. Trimmer also said the approach has been known to produce positive results, indicating increased confidence with practice and high job acquisition rates.
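The roleplay variety described above amounts to a small configuration space: six characters, each with three behavioral dispositions, placed in interchangeable job contexts, all selectable from a menu. As a rough illustration, that space can be enumerated like so (the character, disposition and context names below are hypothetical placeholders, not details from the actual VITA4VETS system):

```python
from itertools import product

# Hypothetical stand-ins for the six characters, three interview styles,
# and a few of the interchangeable job contexts the article describes.
CHARACTERS = ["char_1", "char_2", "char_3", "char_4", "char_5", "char_6"]
DISPOSITIONS = ["supportive", "neutral", "challenging"]
CONTEXTS = ["office", "warehouse", "retail"]

def interview_scenarios():
    """Every selectable roleplay combination, as an interface menu might expose it."""
    return [
        {"character": c, "disposition": d, "context": x}
        for c, d, x in product(CHARACTERS, DISPOSITIONS, CONTEXTS)
    ]
```

Even with only three contexts, that is 54 distinct interview setups, which is what lets the training scale across challenge levels for different users.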

“If focusing on one portion of said issue can provide any support to those that have served us, then it is one step closer to better assisting the overall transition process,” Trimmer said.

According to U.S. VETS in Los Angeles, 93 percent of veterans who used the VITA4VETS application have obtained employment.

Read the full article on the U.S. Army website.

The 11th Annual USC Body Computing Conference

The 11th Annual USC Body Computing Conference
September 22, 2017
USC Town & Gown Ballroom

How Medical Care Benefits from VR/AR and Virtual Humans

Gamasutra sits down with Arno Hartholt to discuss the role of VR, AR and virtual humans in healthcare.

Read the full article in Gamasutra.

Museum of Jewish Heritage in New York City Allows Virtual ‘Interviews’ with Holocaust Survivors

An exhibit at the Museum of Jewish Heritage in Manhattan called “New Dimensions in Testimony” uses hours of recorded high-definition video and language-recognition technology to create just that kind of “interview” with Eva Schloss, Anne Frank’s stepsister, and fellow survivor Pinchas Gutter.

Read more about the New Dimensions in Testimony project in Newsday.

MYiHealth

MYiHealth
September 20-21, 2017
Stockholm, Sweden
Keynote Presentation

Army Research Center Maps LA Coliseum in 3-D for Homeland Security

ABERDEEN PROVING GROUND, Md. (Sept. 20, 2017) — The US Army Research Laboratory’s university partner – the University of Southern California Institute for Creative Technologies, in collaboration with the Aerospace Corporation and Department of Homeland Security, created a three-dimensional reconstruction of the Los Angeles Memorial Coliseum to help ensure the safety of its visitors.

They used commercial, off-the-shelf unmanned aerial systems and photogrammetric software to create the 3-D reconstruction of the LA Coliseum for use by the Department of Homeland Security in infrastructure protection and security planning.

The Department of Homeland Security visited the ICT for demonstrations of the One World Terrain project, specifically the collection and 3-D reconstruction of areas of interest useful for terrain visualization, walk-through, planning and mission rehearsal. Officials thought it held promise for their infrastructure protection group, specifically where large crowds gathered and may be soft targets.

The aircraft were flown autonomously using an Android app built by ICT known as RAPTRS. Thousands of high-resolution still photographs of the structure were collected by the drones. Commercial photogrammetry software, in combination with classification algorithms developed at ICT, were used to reconstruct the structures in three dimensions and prepare the models for visualization and analysis. The visualization and analysis occurs in ICT’s Aerial Terrain Line of sight Analysis System, or ATLAS. In ATLAS, users can visualize sight-lines and plan tactical movements on and around reconstructed structures.

This addresses an enduring Army challenge. Terrain remains one of the most pressing challenges, if not the most pressing, when it comes to training, preparing and planning for all aspects of combined arms operations.

The UAS-to-3-D-model pipeline used at LA Coliseum gives Army units the ability to launch organic UAS assets on automated imagery acquisition flights and use the acquired imagery to reconstruct the terrain of their areas of interest. It provides leaders a georeferenced, up-to-date, high detail (5cm or better) 3-D model that can be used for mission rehearsal, simulation and situational awareness.
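The “5cm or better” figure corresponds to the ground sample distance (GSD) of the captured imagery, which follows from standard photogrammetric geometry relating sensor size, focal length, image resolution and flight altitude. A minimal sketch of that relation (the camera specifications in the example are illustrative assumptions, not details from the article):

```python
def gsd_cm_per_px(sensor_width_mm, focal_length_mm, image_width_px, altitude_m):
    """Ground sample distance: the width of one image pixel projected onto the ground."""
    return (sensor_width_mm * altitude_m * 100.0) / (focal_length_mm * image_width_px)

def max_altitude_m(target_gsd_cm, sensor_width_mm, focal_length_mm, image_width_px):
    """Highest flight altitude that still achieves the target GSD."""
    return (target_gsd_cm * focal_length_mm * image_width_px) / (sensor_width_mm * 100.0)

# Example: a 13.2 mm-wide sensor, 8.8 mm lens and 5472 px-wide images (typical of a
# small commercial UAS camera) flown at 50 m yield roughly 1.4 cm per pixel, well
# inside a 5 cm requirement; the same camera could fly up to about 182 m and still
# meet it.
```

This kind of calculation is what lets an automated flight-planning app trade altitude (and thus coverage per flight) against the resolution of the resulting terrain model.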

The RAPTRS pipeline is user-friendly and more cost-effective than many terrain capture methods. If Soldiers need to train in an area with insufficient or outdated geospatial data, they can rapidly collect and create or update terrain models.

ICT’s research and the next-generation process used at the Coliseum is one of several ways the Army can regain terrain/geospatial overmatch and reduce cost and time for creating geo-specific datasets for modeling and simulation.

World Conference on Information Technology

World Conference on Information Technology
September 20-22, 2017
Beijing, China

VRDC Fall 2017

VRDC Fall 2017
September 20 – 22, 2017
San Francisco, CA
“The Science of Engineering of Redirected Walking” with Mahdi Azmandian
Presentations by Arno Hartholt

The Remembering Machine

Davina Pardo writes an opinion piece for the New York Times about preserving Holocaust survivor stories and the New Dimensions in Testimony project.

Read the full article in the New York Times.

Gordon & Hobbs Book Celebration

Gordon & Hobbs Book Celebration
Tuesday, September 19, 2017
Andrew Gordon and Jerry Hobbs invite you to join them in celebrating the completion of their book, entitled “A Formal Theory of Commonsense Psychology: How People Think People Think” (Cambridge University Press).
We will be enjoying food, drinks, and live jazz with friends and colleagues at the USC Town & Gown Ballroom, near the center of campus. 
Pre-party lecture: Andrew and Jerry will deliver a public lecture on the topic of the book in Cammilleri Hall at the USC Brain & Creativity Institute, 3620A McClintock Avenue, at 4:30pm on September 19.
The favor of a reply is requested by Wednesday, September 13 online at usc.edu/esvp (code: bookparty)
Click here to RSVP:

Exhibit Allows Virtual Interviews with Holocaust Survivors

What was it like in a Nazi concentration camp? How did you survive? How has it affected your life since?

Technology is allowing people to ask these questions and many more in virtual interviews with actual Holocaust survivors, preparing for a day when the estimated 100,000 Jews remaining from camps, ghettos or hiding under Nazi occupation are no longer alive to give the accounts themselves.

Karen Matthews of the Associated Press investigates more. Read the full article on AP.

IEEE International Conference on Image Processing (ICIP 2017)

ICIP 2017
September 17-20, 2017
Beijing, China
Presentations

Holocaust Survivor Holograms Give History New Depth

KCET in California explores the ICT and USC Shoah Foundation’s collaborative project, New Dimensions in Testimony. Read all about it here.

What to Expect When You’re Expecting the New Apple iPhone

Apple officially unveiled the iPhone 8, iPhone 8 Plus and iPhone X on September 12, 2017, the latest iterations of the little personal computing device that changed the world. USC experts discuss whether this is indeed just another minor iteration in Apple’s incremental but successful corporate philosophy under CEO Tim Cook, or a release that finds a way to transform our collective relationship with technology like the original did 10 years ago.

ICT’s Todd Richmond comments on Apple’s integration of Augmented Reality. To read his thoughts, visit the USC News Press Room.

So, About Those iPhone Animojis

Apple’s animated emojis – Animojis – announced today for the iPhone X are getting lots of attention, partly because the tech behind them likely extends from the company’s acquisition of Faceshift in 2015.

While that’s certainly not been officially confirmed, Faceshift was at the time doing some very cool things with driving animated avatars directly (i.e., in real time) from video of your own face, coupled with depth-sensing tech – effectively the same thing that happens with these Animojis via the iPhone’s cameras.

Several tools have of course also been developed elsewhere that use input video and facial performance to drive animated characters, but for fun, I thought it might be interesting to go back to specific pieces of computer graphics research from 2009, 2010 and 2011 that each partly served as the origins of Faceshift.

Other continued research efforts also played a part in the development of Faceshift, but these papers below (which also have accompanying videos), were key and show how the facial animation of CG avatars would be driven in real-time from video captured of human performances.
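Systems in this lineage typically drive a linear blendshape rig: the tracker estimates per-frame weights from the video and depth input, and those weights mix a set of vertex-offset expression shapes on top of a neutral face. A toy sketch of that blending step (the meshes and weights here are illustrative, not from any actual system):

```python
def apply_blendshapes(neutral, deltas, weights):
    """Linear blendshape model: vertices = neutral + sum_i(weight_i * delta_i).

    neutral -- flat list of base-mesh vertex coordinates
    deltas  -- one vertex-offset list per expression shape (e.g. smile, jaw-open)
    weights -- per-frame activations, in practice estimated from tracked video/depth
    """
    out = list(neutral)
    for w, delta in zip(weights, deltas):
        for j, d in enumerate(delta):
            out[j] += w * d
    return out
```

The research challenge in the papers below is not this blend, which is trivial, but robustly estimating the weights in real time from a noisy, partially occluded face capture.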

FACE/OFF: LIVE FACIAL PUPPETRY
Thibaut Weise, Hao Li, Luc Van Gool, Mark Pauly
Proceedings of the Eighth ACM SIGGRAPH / Eurographics Symposium on Computer Animation 2009, 08/2009 – SCA ’09

Visit VFX Blog for more.

Speaker Q&A: Arno Hartholt Discusses the Use of Virtual Humans and VR/AR for Clinicians

Arno Hartholt is Director for R&D Integration at USC Institute for Creative Technologies and will be at VRDC 2017 to present his talk Immersive Medical Care with VR/AR and Virtual Humans, which will discuss how to apply VR/AR and other powerful capabilities to heal, inform, and teach in the medical domain. Here, Arno gives us some information about himself and his work.

Read more in Gamasutra.

Why Augmented Reality Is About to Take Over Your World

In preparation for Apple’s big announcement, BuzzFeed News talks with ICT’s Todd Richmond about Augmented Reality and what to expect in the not-too-distant future.

Read the full article in BuzzFeed News.

Digital Taipei 2017

Digital Taipei 2017
September 9-12, 2017
Taipei City, Taiwan
Panelist Participation

Keeping Holocaust Survivor Testimonies Alive – Through Holograms

As Holocaust survivors die out, many museums and study centers are scrambling to figure out just how to preserve testimonies in ways that will engage young people of the future.

One answer may be holography.

Illinois’ Holocaust Museum & Education Center will be unveiling its long-awaited, multi-million-dollar Take A Stand Center this October, which combines high-definition holographic interview recordings and voice recognition technology to enable Holocaust survivors to tell their personal stories and respond to questions from the audience, inviting a one-on-one ‘conversation’.

“The idea, there, is to carry on the dialogue,” David Traum, a lead researcher on the project, told the Forward. “It’s beyond what you can get from a static recording or documentary.”

Read more in The Forward.

Everything a Computer Wanted to Know About Humans (but was too afraid to ask)

Computers can outperform the greatest minds in many challenges, but while new approaches in machine learning are making huge strides, from classifying images to understanding language, machines still fall short in other areas.

Despite excelling in chess and solving complex computations, when it comes to understanding the pain of heartbreak or recognizing the emotional power of a Rothko painting, humans still come out on top.

But will that always be the case?

In a new book, dubbed “a computer’s guide to humans,” USC Information Sciences Institute chief scientist Jerry Hobbs and Institute for Creative Technologies director for interactive narrative research Andrew Gordon provide a linguistic framework to help computers understand our mysterious human ways, from emotions and beliefs to planning and memory.

Read the full article on USC’s Information Sciences Institute’s website.

New Dimensions in Testimony

As part of its 20th-anniversary commemoration, the Museum of Jewish Heritage is proud to pilot this interactive testimony installation — the first of its kind in the greater New York area — and to present the world premiere of the testimony of Eva Schloss and the New York premiere of the testimony of Pinchas Gutter.

For more information, visit the Museum of Jewish Heritage.

VR For Good: How The Virtual Medicine Conference Wants To Better VR Healthcare

Simply saying that VR is good for healthcare is too broad a statement. As we all know, there are thousands of different strands of subjects that fall under that umbrella, and identifying which ones VR is well-suited for is a little trickier. But an upcoming conference aims to unify these various strands, providing a VR medical conference that charts the future of the technology’s impact on the health sector.

Virtual Medicine, as the event is called, is organized by Dr. Brennan Spiegel, the Director of Health Services Research for Cedars-Sinai Health System, and Co-Chair of the VR/AR Association Digital Health Committee. Taking place from March 28th – 29th 2018 at the Cedars-Sinai Medical Center in Los Angeles, the event gathers various leaders from the world of medical VR for two days of talks, sessions, and workshops.

19 speakers have been lined up so far from groups like the USC Institute for Creative Technologies, Osso Health, Children’s Hospital LA and even Samsung. In fact, Samsung is a partner for the event as are the VR/AR Association and AppliedVR, a VR platform designed for healthcare.

Check out some of the work Cedars-Sinai itself is doing with VR already.

“Often conversations happen in isolation about how to make VR successful,” AppliedVR’s Josh Sackman said of the conference, “but it truly takes a village to make something like VR work in such a complex space like healthcare. It requires trial and error, constant feedback and communication with a cross-functional team, establishing best practices, and most importantly open dialogue from a wide range of stakeholders ranging from clinicians to content creators to investors to researchers to hardware companies.”

By gathering figureheads together to share successes and failures as well as data and research, Virtual Medicine hopes to stimulate VR’s use within the medical community and defy claims of gimmicks.

“Ultimately, we do see this as something in every patient room, operating room, imaging center, emergency room, surgery center, infusion center and other places where patients experience something scary or painful,” Sackman adds. “And we are starting to see some really powerful stories of patients using this in their homes, which is leading to safer and possibly more effective management strategies for those with chronic pain and other chronic conditions.”

If you’re interested in attending Virtual Medicine then it’d be a good idea to act fast; super early bird tickets are available until the end of the month and offer general admission for $299. After that, GA tickets will be priced at $399 for the rest of the year and $499 leading up to the event in March.

Via Upload VR.

20 Best 3D Animation Software Tools

All3DP lists out the 20 Best 3D Animation Software Tools, including ICT’s SmartBody prototype. Read the full article in All3DP here.

SMPTE 2017 Dives Into the Tech Behind Next-Gen Media

For more than a century, the people of SMPTE have sorted out the details of many significant advances in media and entertainment technology, and the programme for the SMPTE 2017 Annual Technical Conference & Exhibition (SMPTE 2017) keeps that tradition.

SMPTE 2017 will take place 23-26 October, and it will fill two exhibit halls and multiple session rooms at the Hollywood & Highland Center in Los Angeles. In fact, this is the final year the event will take place in its long-standing Hollywood location; the conference and exhibition will move to a larger venue in downtown Los Angeles next year.

Paul Debevec will be honored this year, read the full article in 4RFV for more information.

Tech Talk: Long-Term VR Side Effects Are Still a Big Unknown

There may be a lot of hype around VR technologies, but many researchers remain undecided on whether that’s a good thing. Although studies focusing on possible side effects have been conducted for decades, the most recent research suggests that more data is needed about what happens to users in the long term.

Android Headlines explores possible side effects and challenges facing VR today. Read the full article here.

How Digital Healthcare is Changing Everything: Thought Leaders Meet to Discuss Innovation in Digital Health at USC’s 11th Annual Body Computing Conference

LOS ANGELES, Aug. 28, 2017 /PRNewswire-USNewswire/ — On Friday, September 22, the University of Southern California (USC) Center for Body Computing (CBC), part of the Keck School of Medicine of USC, will curate conversations to provide a comprehensive understanding of how digital health is touching every aspect of our lives – from performance, behavior and decision-making to medicine, cybersecurity, the military, sports and public policy –at its 11th annual Body Computing Conference (click link to register).

Thought leaders across a broad spectrum come together for the one-day summit to offer local, national and global perspectives on the evolving convergence of health and digital technology. Speakers will shed light on a wide range of topics including the California-led cybersecurity initiative in health IT and L.A.’s 2028 Olympics plan to transform the Olympic Village into a connected health space. The event also includes demonstrations and discussions on the impact of unique wearable sensors to track personal fitness, enhance elite athletic performance and train the next generation of warfighters. And panelists will debate and advocate for the power of digital health tools to be able to combat the critical global issue of diabetes or build on-demand transportation safety nets to address the rapid rise in our aging population.

“Digital tools are making healthcare omnipresent, they are no longer a spoke in the wheel of our lives – they are the hub,” said Leslie Saxon, MD, founder and executive director of the USC Center for Body Computing. “We’re proud to be one of the only digital health conferences that brings together such an eclectic mix of global thought leaders to demonstrate, debate, and introduce the latest products, research, and investments that are accelerating the integration of digital health into every aspect of our lives.”

The exclusive 250-guest capacity crowd includes digital health start-ups and venture capitalists, small and large company executives, non-profit and government organizations, students, academic leaders and media.

Speakers from the California Governor’s Office of Business and Economic Development (GO-Biz), AARP Foundation, Abbott, an Academy Award-winning producer, Boston Celtics, Brent Scowcroft Center on International Security, ESPN, GE Software, Goldman Sachs, Joslin Diabetes Center, Karten Design, Lyft, NBA, NFL Players Association, Tastemade, UnitedHealthcare, U.S. Army, U.S. Department of Homeland Security, U.S. Food and Drug Administration, U.S. Marines, VSP Global and others will take center stage. Hosted on USC’s main Los Angeles campus, these experts join the innovators from across USC schools including: Annenberg School of Communications, Brain & Creativity Institute, Institute of Creative Technologies and the medical experts at Keck Medicine of USC. Click here for full list of speakers.

“Whether you are an elite athlete using biometrics to achieve peak performance where milliseconds can make a million-dollar difference, a military commander making crucial choices based on collective team health dynamics, a healthcare system protecting patient privacy or an individual looking to be empowered to enhance personal health outcomes, this event showcases today’s realities and tomorrow’s promise of digital health,” added Saxon.

According to Rock Health, digital health companies raised $4.2 billion in 2016 – double the amount raised in 2013 – with wearables and biosensors representing $312 million. GMI Insights projects the digital health market will grow to $379 billion by 2024. Last year, virtual reality reached an almost $1 billion market growth in health care specific applications and the adoption of artificial intelligence, apps and mHealth tools are poised for promise when it comes to individual and population health management. The USC CBC serves as a hub at the convergence of a fast-paced and growing digital technology revolution when it comes to medicine, acting as the research project lead and product design partner for small and large companies who want to maximize doctor efficiency, increase access, decrease costs and increase patient empowerment, engagement and health outcomes.

About the USC Center for Body Computing
The USC Center for Body Computing is the digital health innovation center for the Keck Medicine of USC medical enterprise. Collaborating with inventors, strategists, designers, investors and visionaries from health care, entertainment and technology, the USC CBC serves as an international leader on digital health and wearable technology. Founded in 2006 by Leslie Saxon, a cardiologist, the CBC was one of the nation’s first academically-based centers to focus on digital health solutions.

Dr. Saxon, an internationally renowned digital health guru has spoken at TEDMED, SXSW and WIRED international conferences as well as participates on the Food and Drug Administration (FDA) advisory group on global medical app regulations and recently served on a panel at the Bipartisan Policy Center to discuss medical apps and health IT cybersecurity. She was recognized as the nation’s “Most Tech Savvy Doctor” by Rock Health. For more information about the USC CBC: uscbodycomputing.org.

Media interested in attending the conference, please contact: Sherri.Snelling@med.usc.edu

17th Annual International Conference on Intelligent Virtual Agents

Intelligent Virtual Agents Conference
August 27-30, 2017
Stockholm, Sweden
Presentations

PTSD Exposure Therapy in VR: Importance of Storytelling & Emotional Presence in Healing from Trauma

Voices of VR sits down with ICT’s Dr. Albert ‘Skip’ Rizzo to discuss VR Exposure Therapy.

Read all about it in Voices of VR.

USC Joins Alliance to Shape SoCal Into the Next Global Tech Hub

USC has joined the new Alliance for Southern California Innovation, a nonprofit coalition of universities, research institutions and corporations aiming to unify Southern California’s tech and biotech industries.

Read the full article in USC News.

Paul Debevec to Be Honored by Motion Picture and TV Engineers

The Society of Motion Picture and Television Engineers will present Paul E. Debevec with its most prestigious award, the Progress Medal, at its Oct. 26 awards ceremony during the SMPTE Technical Conference & Exhibition at Loews Hollywood Hotel.

Read the full article in Hollywood Reporter.

We Need to Look More Carefully Into the Long-term Effects of VR

When you think about it, virtual reality is such a step change in immersion that mediums like TV and video games seem abstract by comparison. What we’re not asking enough is what impact this might have in the long run.

This is virtual reality‘s second coming; we’ve been tinkering with the idea since Morton Heilig built the Sensorama in the 1960s, but it wasn’t until the ’90s that VR got its first “boom”. Sadly, Hollywood’s promises of breathtaking alternate universes were beyond what the technology of the era could reach, dooming it to failure. But even back then, people had concerns about what long-term exposure to VR could do to the human mind. A study carried out at Michigan State University concluded that VR rewired the brain, but was unable to determine whether longer-term effects were possible.

Now we’re in 2017, VR is back (again), and still we’ve done little to interrogate whether our brains are even ready for this next level of human-machine interfacing. But it’s coming: various researchers have revealed to Wareable that work is underway to look further into the impact virtual reality could have on our brains and eyes.

Read the full article featuring commentary from Dr. Albert ‘Skip’ Rizzo on Wareable.

Augmented Reality is the Potential Future of US Military Training

The Synthetic Training Environment (STE) is an augmented reality training endeavor designed to improve soldier readiness in a variety of environments.

“Due to the rapidly expanding industrial base in virtual and augmented reality, the Army is moving out to seize an opportunity to augment readiness,” Col. Harold Buhl, Army Research Lab Orlando and Information and Communications Technology program manager, told taskandpurpose.com. “With STE, the intent is to leverage commercial advances with military technologies to provide commanders with unit-specific training options to achieve readiness more rapidly and sustain readiness longer.”

Read the full article in Military Training & Simulation.

IJCAI 2017

International Joint Conference on Artificial Intelligence
August 19-25, 2017
Melbourne, Australia
Presentations

How VR is Changing the Way We Think About Therapy

VR Scout explores a few methods in which virtual reality enhances therapy. Read the full article here.

Don’t Miss Out On All the Great AR Talks at VRDC Fall 2017

Gamasutra teases what to expect at this year’s VRDC Fall 2017 Conference.

Click here for more information about the show and especially Arno Hartholt’s talk on immersive medical care.

BEING THERE: Virtual Reality Lets Therapy Patients Return to the Scene of Their Fear

The Herald Tribune covers Cade Metz’s piece for the New York Times about virtual reality exposure therapy.

Visit The Herald Tribune to read the full article featuring Dr. Albert ‘Skip’ Rizzo.

SIGdial 17

SIGdial 17
August 15-17, 2017
Saarbrücken, Germany
Presentations

43rd National Organization for Victim Assistance

43rd National Organization for Victim Assistance (NOVA)
August 14-17, 2017
San Diego, CA
Presentations

SIGKDD 2017

SIGKDD 2017
August 13-17, 2017
Halifax, Nova Scotia, Canada
Presentations

How the Army is Using Augmented Reality to Bolster Troop Readiness

Task & Purpose covers the recent ARL and ICT STE news.

Read the full article in Task & Purpose.

Augmented Reality May Revolutionize Army Training

Orlando Echo covers news of the joint effort between the U.S. Army Research Laboratory and several entities — the University of Southern California Institute for Creative Technologies, the Combined Arms Center-Training and the Program Executive Office for Simulation, Training and Instrumentation — working to research, prototype and eventually deliver the Synthetic Training Environment, otherwise known as STE.

Read the full article in Orlando Echo.

How Virtual Reality is Transforming Public Health and Medicine

Did you know that virtual, mixed, and augmented reality content can have therapeutic and educational benefits? Pixvana explores noteworthy studies that show how X-Reality improves users’ lives.

Read the full article featuring Bravemind here.

Augmented Reality May Revolutionize Army Training

By Joyce M. Conant, ARL Public Affairs and Sara Preto, ICT

ABERDEEN PROVING GROUND, Md. — The development of advanced learning technologies for training is underway. Linking augmented reality with live training will enable units to achieve the highest levels of warfighting readiness and give valuable training time back to commanders and Soldiers.

The U.S. Army must train to win in a complex world that demands adaptive leaders and organizations that thrive in ambiguity and chaos. To meet this need, Force 2025 and Beyond, the Army’s comprehensive strategy to change and deliver land-power capabilities as a strategic instrument of the future joint force, requires a new training environment that is flexible, supports repetition, reduces overhead and is available at the point of need.

A joint effort between the U.S. Army Research Laboratory, the University of Southern California Institute for Creative Technologies, the Combined Arms Center-Training and the Program Executive Office for Simulation, Training and Instrumentation is working to research, prototype and eventually deliver the Synthetic Training Environment, otherwise known as STE.

STE is a collective training environment that leverages the latest technology for optimized human performance within a multi-echelon mixed-reality environment. It provides immersive and intuitive capabilities to keep pace with a changing operational environment and enable Army training on joint combined arms operations. The STE moves the Army away from facility-based training, and instead, allows the Army to train at the point of need — whether at home-station, combat training centers or at deployed locations.

“Due to the rapidly expanding industrial base in virtual and augmented reality, and government advances in training technologies, the Army is moving out to seize an opportunity to augment readiness,” said Col. Harold Buhl, ARL Orlando and ICT program manager. “With STE, the intent is to leverage commercial advances with military specific technologies to provide commanders adaptive unit-specific training options to achieve readiness more rapidly and sustain readiness longer.”

Buhl said that, in parallel, the intent is to immerse Soldiers in the complex operational environment and stress them physically, mentally and iteratively. As retired Gen. Martin Dempsey, who served as the 18th Chairman of the Joint Chiefs of Staff from October 1, 2011 until September 25, 2015, has said, the goal is to ‘make the scrimmage as hard as the game.’

This training environment delivers the next generation of synthetic collective trainers for armor, infantry, Stryker and combat aviation brigade combat teams. These trainers are being developed to lower overhead, be reconfigurable and use advanced learning technologies with artificially intelligent entities to simultaneously train BCT-level and below. This multi-echelon collective training will be delivered to geographically distributed warfighters, at the point of need, for both current and future forces.

“As the Army evolves with manned and unmanned teams and other revolutionary battlefield capabilities, STE will be flexible enough to train, rehearse missions and experiment with new organization and doctrine,” Buhl said.

Leveraging current mixed reality technologies, STE blends virtual, augmented and physical realities, providing commanders and leaders at all levels with multiple options to guide effective training across active and dynamic mission complexities. STE will provide intuitive applications and services that enable embedded training with mission command workstations and select platforms.

“This capability coupled with the immersive and semi-immersive technologies that bring all combat capabilities into the same synthetic environment, add to this quantum leap in training capability, the geo-specific terrain that STE will use in collaboration with Army Geospatial Center and you have the opportunity to execute highly accurate mission rehearsal of a mission and multiple branches and sequels,” Buhl said.

STE adaptive technology supports rapid iterations and provides immediate feedback — allowing leaders to accurately assess and adjust training — all in real-time. With a single open architecture that can provide land, air, sea, space and cyberspace synthetic environment, with joint, interagency, intergovernmental, and multi-national partners, Army multi-domain operations are inherent with STE.

An increasingly complex element of the land domain is the expansion of megacities. In the coming decades, an increasing majority of the world’s population is expected to reside in these dense urban areas. Technologies in development by ARL for STE will provide the realism of complexity and uncertainty in these dense and stochastic environments. STE is intended to evolve and enhance readiness in megacities by replicating the physical urban landscape, as well as the complex human dynamics of a large population.

“It enables our formations to train as they fight using their assigned mission command information systems, and all other BCT and echelons above BCT warfighting capabilities,” Buhl said. “Operational informative systems and the training environment systems will share an identical common operating picture; enabling seamless mission-command across echelons.”

Ryan McAlinden, director for Modeling, Simulation and Training at ICT said his team has been working with ARL, the TRADOC capabilities manager, Combined Arms Center for Training, and PEO STRI for the past year to help inform the requirements process for the STE.

“The team has been researching and prototyping techniques and technologies that show feasibility for the one world terrain part of the program,” McAlinden said. “The hope is that these research activities can better inform the materiel development process when the STE is formally approved as a program of record.”

By leveraging technology to provide the means to train in the complex operating environment of the future, integrate technologies to optimize team and individual performance, provide tough realistic training that is synchronized with live capstone events and give commanders options for accelerated and sustained readiness, STE is transforming Army training to achieve readiness and win in a complex world.

“As we develop, demonstrate and transition technologies across the U.S. Army Research Development and Engineering Command that provide solutions to tough Army problems, we never lose sight of focus on Soldiers and commanders,” Buhl said. “These men and women deserve the very best in technology and more importantly in our respect for their leadership, initiative and ingenuity in the use of that technology. STE has tremendous opportunity for the Army if we develop and deliver with that focus.”

—–

The U.S. Army Research Laboratory, currently celebrating 25 years of excellence in Army science and technology, is part of the U.S. Army Research, Development and Engineering Command, which has the mission to provide innovative research, development and engineering to produce capabilities that provide decisive overmatch to the Army against the complexities of the current and future operating environments in support of the joint warfighter and the nation. RDECOM is a major subordinate command of the U.S. Army Materiel Command.

Augmented Reality May Revolutionize Army Training

The development of advanced learning technologies for training is underway. Linking augmented reality with live training will enable units to achieve the highest levels of warfighting readiness and give valuable training time back to commanders and Soldiers.

The U.S. Army must train to win in a complex world that demands adaptive leaders and organizations that thrive in ambiguity and chaos. To meet this need, the Army has developed Force 2025 and Beyond, a comprehensive strategy to change and deliver land-power capabilities as a strategic instrument of the future joint force. The successful implementation of this strategy requires a new training environment that is flexible, supports repetition, reduces overhead and is available at the point of need.

A joint effort between the U.S. Army Research Laboratory and several entities — the University of Southern California Institute for Creative Technologies, the Combined Arms Center-Training and the Program Executive Office for Simulation, Training and Instrumentation — is working to research, prototype and eventually deliver the Synthetic Training Environment, otherwise known as STE.

Read the full article on the U.S. Army website.

Disney’s ‘Magic Bench’ Fixes AR’s Biggest Blind Spot

ICT’s David Nelson and Todd Richmond talk with Brian Barrett of WIRED about Disney’s use of augmented reality in ‘Magic Bench’.

Read the full story in WIRED.

Real-time Digital Human Avatar Rendering Leaves SIGGRAPH 2017 Attendees Stunned

Attendees at SIGGRAPH 2017 were treated to a look at the future of real-time digital human rendering. SIGGRAPH’s VR Village hosted an experience that featured interviews by a digital avatar being “driven” in real-time (yes, like in Ready Player One) by the human Mike, who wore a special rig that captured his motions and expressions.

Read the full article by Alex Wall in Medium.

Researchers Showcase Impressive New Bar for Real-time Digital Human Rendering in VR

A broad team of graphics researchers, universities, and technology companies are showcasing the latest research into digital human representation in VR at SIGGRAPH 2017. Advanced capture, rigging, and rendering techniques have resulted in an impressive new bar for the art of recreating the human likeness inside of a computer in real-time.

MEETMIKE is the name of the VR experience being shown this week at the SIGGRAPH 2017 conference, which features a wholly digital version of VFX reporter Mike Seymour being ‘driven’ and rendered in real-time by the real-life Seymour. Inside the experience, Seymour plays host, interviewing industry veterans and researchers inside of VR during the conference. Several additional participants wearing VR headsets can watch the interview from inside the virtual studio.

Read more in Road to VR.

How Artificial Intelligence Could Benefit Those in Empathy-Centric Professions

Pacific Standard asks the question, ‘If care jobs become the last human jobs, could that encourage employers and policymakers to recognize and value them as the economically critical work that they are?’

Read the full article here.

Annual Meeting of Association for Computational Linguistics (ACL)

Annual Meeting of Association for Computational Linguistics (ACL)
July 30, 2017 – August 4, 2017
Vancouver, Canada
Presentations

A New Way for Therapists to Get Inside Heads: Virtual Reality

The New York Times explores the use of VR in exposure therapy, speaking with Dr. Albert “Skip” Rizzo for insight into the process.

Read the full article here.

SIGGRAPH 2017

SIGGRAPH 2017 Los Angeles
July 30, 2017 – August 3, 2017
Los Angeles, CA
Presentations

The Last Human Job?

New America explores care professions and the potential threat of Artificial Intelligence.

Read the full article featuring commentary from ICT’s Albert “Skip” Rizzo here.

Virtual Reality: New Frontiers for Psychiatric Disorders

Italian blog ‘State of Mind’ explores the role virtual reality plays in the psychiatric field. Digging deep into research, including Dr. Albert “Skip” Rizzo’s work, the article features in-depth information about bringing the field and the technology together.

Read the full article here.

International Society for Research in Emotion (ISRE)

ISRE
July 25-30, 2017
St. Louis, MO
Keynote Speaker: Jonathan Gratch

This is What the Future of Health Care Looks Like

Fast Company talks the future of health care with ICT’s Todd Richmond and the USC Center for Body Computing. See the full video here.

CVPR 2017

CVPR 2017
July 22-25, 2017
Honolulu, HI
Presentations

ICCM 2017

15th Annual Meeting of the International Conference on Cognitive Modeling
July 22-25, 2017
University of Warwick, UK
Presentations

The Therapeutic Value of Virtual Reality

Move over, psychedelics; VR is coming to a clinic near you.

AlterNet sat down with Dr. Albert “Skip” Rizzo for a download on what VR can bring to the table therapeutically.

Read the full article in AlterNet.

2017 Naval Future Force Science and Technology Expo

2017 Naval Future Force Science and Technology Expo
July 20-21, 2017
Washington, D.C.
Presentations

Star Wars Resembles Advancements in USC Research

Famous icons of the Star Wars saga, such as lightsabers, holograms, and futuristic robotics, can be seen not only in the films by USC Alum George Lucas, but also in the real-life work at USC.

Researchers at USC are creating real-life revolutionary advancements in technology that resemble the effects in the imaginary intergalactic world of Star Wars: The Force Awakens.

Read the full article in USC’s International Academy, here.

Can a Machine be Alive?

Julien Crockett of LA Review of Books discusses a panel discussion and meeting with ICT’s Jonathan Gratch.

“On the evening I met Actroid-F, Jonathan Gratch, from USC’s Institute for Creative Technologies (ICT), sat with her creator, Yoshio Matsumoto, both, or should I say all three of them, part of a panel to discuss the uncanny valley, defined as a repulsive tendency toward too much human-ness in something that is not fully human. Gratch, whose background is in the nascent field of affective computing, focused his remarks on the artificial intelligence they are building to bring Actroid-F to “life.” He mused on “if and why and how a machine could ‘have’ an emotion and what good that could be.” Motioning up to the screen, he then introduced “Ellie,” a human-like software agent and the prototype for Actroid-F’s AI. Both of them endowed with a strong posture and calm poise, it is easy to see their similarities.”

To read the full article, visit LA Review of Books here.

Enlisting Virtual Reality to Ease Real Pain

Virtual reality technology engages a person in a 360-degree visual experience. It has been used in medical research for more than two decades, to treat trauma, anxiety and even burn pain. The fact that it can now be accessed with headsets and mobile phones is fueling hospitals’ interest.

The Wall Street Journal explores ways in which VR can help ease pain, and speaks with Dr. Skip Rizzo for more insight. Read the full article here.

A Hologram of a Holocaust Survivor Will Answer Any Question You Have About the Genocide

Soon you’ll be able to ask a hologram of a Holocaust survivor any question you want, and he will answer it.

That’s thanks to the New Dimensions in Testimony project from the USC Shoah Foundation and the USC Institute for Creative Technologies (ICT).

Circa explores more, see the video and read the full article here.

HCI International 2017

HCI International 2017 (Human-Computer Interaction International Conference)
July 11-13, 2017
Vancouver, Canada
Presentations

ESRI User Conference

ESRI User Conference
July 10-14, 2017
San Diego, CA
Presentations

13 Secrets Your Smile Can Reveal About You

Smiles are often used to cover up another emotion. “For example, someone might start to frown then cover this with a smile,” says Jonathan Gratch, director for virtual human research at USC’s Institute for Creative Technologies in Playa Vista, California. “The nature of a smile also communicates subtle information about its authenticity.” Another telltale sign: a smile that starts and ends too quickly is seen as not genuine, he says.

Reader’s Digest explores 13 secrets your smile reveals about you, talking with ICT’s Jonathan Gratch for more insight. Read the full article here.

Using Artificial Intelligence for Mental Health

Innovative technology is offering new opportunities to millions of Americans affected by different mental health conditions.

Advancements in artificial intelligence (AI) are bringing psychotherapy to more people who need it. Nonetheless, the benefits of these methods need to be carefully balanced against their limitations. The long-term efficacy of the AI approach regarding mental health is yet to be tested, but the initial results are promising.

Very Well takes a deeper look into this area, featuring ICT’s MultiSense and SimSensei technologies. Read the full article here.

Get Expert Advice on Mixed-Reality Game Design at VRDC Fall 2017

Organizers of the Virtual Reality Developers Conference would like to quickly let you know about two standout sessions taking place later this year at VRDC Fall 2017, which takes place September 21-22 at a bigger, better venue in San Francisco!

Notably, XEODesign president Nicole Lazzaro will be at the show to give a cutting-edge talk on mixed-reality game design. Her “‘Matrix’ vs. Pokemon Go: The Mixed Reality Battle for the Holodeck” session will feature 3 compelling future MR scenarios to illustrate 5 core MR design techniques.

Lazzaro will draw on her 20+ years of interactive experience design as she dives deep into the design requirements for compelling MR that takes advantage of virtual world overlays, depth maps of existing terrain, NPC and object interaction, and character customization.  If you have any interest in creating effective, impactful mixed-reality games and experiences, don’t miss it!

Also at VRDC Fall 2017, Arno Hartholt (Director of Research and Development Integration at the University of Southern California Institute for Creative Technologies) will present a session on “Immersive Medical Care with VR/AR and Virtual Humans,” which aims to break down how VR/AR offers unique capabilities for health-related research and treatment.

Check it out, and you’ll learn how VR/AR and virtual humans can be applied to worthy causes beyond gaming and entertainment, particularly within the medical domain. You’ll walk away able to discuss examples of how these capabilities allow researchers to study human behavior and specific ailments, and how they can lead to treatments for conditions such as PTSD or chronic pain. You’ll also get a practical overview of how these systems can be designed, developed and assessed.

And of course, VRDC Fall 2017 organizers look forward to announcing many more talks for the event in the weeks to come. Don’t forget to register early at a discounted rate!

Since tickets sold out for the first three VRDC events, VRDC Fall 2017 will offer more sessions and move to a bigger location at the Hilton Union Square in San Francisco, CA September 21-22.

For more information on VRDC Fall 2017, visit the show’s official website and subscribe to regular updates via Twitter and Facebook.

Via Gamasutra.

Find Out What Your Smile Tells About You

AAJ News covers new research from ICT, read the full article here.

Can VR Make You Feel and Heal Better?

Albert “Skip” Rizzo remembers the first wave of VR excitement back in the nineties. Primed by cyberpunk and films like Lawnmower Man, the public became infatuated by the potential of this seemingly futuristic technology.

Visit Transport for the full interview.

IAIED (Artificial Intelligence in Education) 2017

IAIED 2017
Wuhan, China
June 28, 2017 – July 2, 2017
Presentations

Army’s Virtual Reality Therapy Helps Soldiers Combat PTSD

The Army Research Laboratory’s Public Affairs office features a piece about PTSD in honor of June’s PTSD Awareness Month.

Read the full article about Bravemind here.

Is Clinical Virtual Reality the Future of Therapy?

Between eight and 18 percent of American veterans returning from wars in Iraq or Afghanistan have some form of PTSD. Stigma against seeking care remains, and among those who do seek it, the results aren’t great: a recent editorial in the Journal of the American Medical Association—Psychiatry called out an “ongoing crisis” in PTSD care. An average of 28 percent, and as many as 40 percent, of patients drop out of treatment programs, and those who do finish have no guarantee their symptoms will decrease.

Into this comes Bravemind, a virtual reality therapy program led by Albert “Skip” Rizzo at the University of Southern California’s Institute for Creative Technologies, which has helped thousands of veterans.

Thrive Global talked with Dr. Skip Rizzo about VR exposure therapy and what it means for both clinicians and patients. Read the full article here.

International Conference on Artificial Intelligence in Education (AIED 2017)

AIED 2017
June 28, 2017 – July 2, 2017
Wuhan, China
Rapid Avatar Presentation

The Quarry that Formed the Virtual Reality Geniuses of Facebook and Microsoft

El Diario explores ICT’s MxR Lab. Read the full article here.

Virtual Reality: The New Game in Mental Health Care to Improve Outcomes

Virtual reality is seeing an upsurge in use by mental health practitioners for treating conditions such as post-traumatic stress disorder, panic disorders and anxiety in a safe and controlled manner. With the advent of affordable VR headsets and technological advances, companies and researchers worldwide are seizing on the opportunity to bring such techniques as VR exposure therapy and cognitive behavioral therapy to telemedicine, specialty clinics and directly to consumers to improve outcomes and better lives.

Pharma Intelligence, via MedTech Insight features Bravemind in this piece. Read the full article here.

On the Couch? Your Therapist May Soon be Replaced by a Robopsychiatrist

Factor Daily learns about SimSensei and the possibility of robopsychiatrists in the future.

Read the full article here.

So We Never Forget, Holograms Deliver First-Person Holocaust Survivor Testimony

A look inside New Dimensions in Testimony with Fast Company.

Read the full article here.

How Artificial Intelligence Will Disrupt Your Life

We are on the verge of a technological revolution that will fundamentally alter the way we live, work, and relate to one another unlike anything humankind has experienced before. The main driver for this technological revolution is Artificial Intelligence (AI).

Ray Williams explores more for Psychology Today. Read the full article here.

Digital Technology to Engage Patients: Ensuring Access for All

We are in a new communication age in which change has happened so fast that the technology we use every day was unimaginable to most of us 20 years ago. Data show that 77% of Americans now own a smartphone and that ownership is ubiquitous in the 18- to 29-year-old age group. Older citizens are catching up, with 74% of 50- to 65-year-olds connecting in this way; however, this rate decreases to 42% among those over the age of 65.

Digital technology and much of the “disruptive innovation” that is needed in health care delivery is likely to be driven by start-ups, which are often run by educated 20- to 30-year-olds. However, these people are not the same as those who may be the greatest users of health care, such as the oldest old, lower-income patients, and individuals with chronic health care problems. To ensure that we engage the patient groups who have much to gain from the more flexible health care interactions that digital innovation can provide, we must consider issues of computer literacy, access, and trust.

To learn more, read the full article about USC Center for Body Computing and its partnership with ICT to develop the Virtual Care Clinic.

How VR is Transforming the Workplace As We Know It

IoT Now explores the various ways in which VR is entering the workforce. As part of the piece discussing VR’s role in the workplace, ICT’s Bravemind is featured as an example of technology ‘changing lives’.

Read the full article here.

Why Virtual Reality Could Create a Danger for Actual Reality

At the Electronic Entertainment Expo, consumers get to learn all about new video games and the hardware that drives them.

One piece of hardware considered to be the next frontier for gaming and other media is virtual reality.

VR headsets create a new level of immersion – one that takes a user right inside an artificial world.

But some say that intense experience can also create a level of danger that hasn’t been fully recognized.

“The difference between virtual reality in a headset and a screen based game is the level of embodiment that you have,” Todd Richmond told Take Two’s A Martinez.

Listen to the full interview on 89.3 KPCC.

Website Displays Capabilities for Using Drones to Develop 3D Maps

A new website showcases how camera-equipped drones create three-dimensional models that can help Army leaders plan training exercises.

Dronemapping.org was created by the Institute for Creative Technologies (ICT) at the University of Southern California; the Combined Arms Center — Training (CAC-T), Fort Leavenworth, Kan.; Army Research Laboratory, Orlando, Fla., and the U.S. Military Academy, West Point, N.Y. ICT is a Defense Department University Affiliated Research Center.

The U.S. Army sat down with the team to discuss Dronemapping.org, read the full article here.

Virtual Reality: A by-the-numbers look at how this perception-bending tool is changing medicine

Proto Magazine investigates how VR is used.

Read the full article here.

USC Researcher Uses Virtual Reality — and Her Mother’s Sewing Machine — to Treat Stroke Survivors

For people recovering from a stroke, even the simplest motions can become a struggle. To lift a hand, for example, requires a signal from the brain that travels all the way down an arm to the hand. That’s a lot of moving parts — and when something is damaged, it makes regaining those skills an arduous and slow process.

That could all change, though, with the help of some innovation and advances in virtual reality.

It was almost by chance that USC researcher Sook-Lei Liew started thinking about virtual reality. She was a neuroscientist; so was her husband. When she became a USC faculty member in 2015, her husband got a job in the Mixed Reality Lab at the USC Institute for Creative Technologies — and, between the two, a brain trust bridging VR and stroke rehab was born.

For Liew, the light bulbs really started to flash when she attended the Neurotech conference — a big industry-academic partnership featuring the latest in tech advances. Liew had already been working on stroke rehab for a while and studying brain-feedback interfaces — devices that essentially allow patients to see what is going on inside their brains.

At the Neurotech conference, something clicked.

Read the full article in USC News.

The Future of Virtual Reality

Start Replay sat down with Dell to discuss the future of virtual reality. In an interview with Gary Rayburn, Director of Workstation Virtualization, Commercial VR and AR at Dell, Joshua Ball learns more about Bravemind, Dell’s partnership with ICT and the DoD.

Read the full article in Start Replay.

25 Individuals of Influence

PTSD Journal names 25 leaders in the study and treatment of the disorder to symbolize the increasing work done nationwide to diminish the effects of trauma. This group of individuals recognizes the need for support, outreach and innovation to defeat the condition.

ICT’s Skip Rizzo makes the list, read the full article in PTSD Journal here.

Smug Losers: DON’T Smile When You Win Say Scientists Because it Only Inspires Your Opponent to Beat You Next Time

A smile might be all that stands between victory and defeat while playing a game, according to a new study.

New research done by the USC Institute for Creative Technologies indicates that the simple act of smiling while winning can actually decrease chances of succeeding against the same person when matched up again.

The Daily Mail investigates further, read the full article here.

Smiling When You Lose May Increase Your Odds of Winning Later On

You may want to save the smile for when you’re losing, not winning, according to a study done at the University of Southern California.

The study, funded by the U.S. Army Research Laboratory and presented by the USC Institute for Creative Technologies, found that smiling decreases your odds of continuing your winning streak against the same opponent.

Read the full article in Men’s Fitness.

Your Smile Gives You Away

Smiling during victory could hurt future chances of cooperation, USC researchers find.

By Ian Chaffee, USC News

Smile and the whole world smiles with you? Well, not necessarily.

In a winning scenario, smiling can decrease your odds of success against the same opponent in subsequent matches, according to new research presented by the USC Institute for Creative Technologies and sponsored by the U.S. Army Research Laboratory.

People who smiled during victory increased the odds of their opponent acting aggressively to steal a pot of money rather than share it in future gameplay, according to a paper presented in May at the International Conference on Autonomous Agents and Multiagent Systems by USC ICT research assistant Rens Hoegen, USC ICT research programmer Giota Stratou and Jonathan Gratch, director of virtual humans research at USC ICT and a professor of computer science at the USC Viterbi School of Engineering.

Conversely, researchers found smiling during a loss tended to help the odds of success in the game going forward.

The study is in line with previous research published by senior author Gratch, whose main interest lies both in how people express these tells — unconscious actions that betray deception — and in using this data to create artificial intelligence that can discern, and even express, the same emotional cues as a person.

“We think that emotion is the enemy of reason. But the truth is that emotion is our way of assigning value to things,” said Gratch. “Without it, we’d be faced with limitless choices.”

Gratch and other ICT researchers hope to imbue virtual humans and even robots with value-based assessment using emotional pattern recognition and reaction to form what might be called intuition or gut level decision-making.

Grin and bear it, but don’t gloat

Part of this research is accounting for the kind of emotion-based reasoning that might lead someone to act against their rational self-interest for the short-term satisfaction of “payback” — that is, cutting off their nose to spite their opponent’s smiling face.

For the AAMAS study, 370 participants played a version of the British television game show Golden Balls, where participants decide to “split” or “steal” a pot of money. If both participants choose “split,” they do just that — split the pot. If one player chooses to split with the other stealing, the latter gets the whole thing. If both choose to steal, neither wins.
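The split-or-steal payoff rules described above can be sketched as a small function. This is a hypothetical illustration of the game logic only; the function name and default pot size are not taken from the study.

```python
def golden_balls_payoff(choice_a: str, choice_b: str, pot: float = 100.0):
    """Return (payout_a, payout_b) for one round of split-or-steal."""
    if choice_a == "split" and choice_b == "split":
        return pot / 2, pot / 2   # both cooperate: share the pot
    if choice_a == "steal" and choice_b == "split":
        return pot, 0.0           # lone stealer takes everything
    if choice_a == "split" and choice_b == "steal":
        return 0.0, pot
    return 0.0, 0.0               # mutual steal: neither wins
```

As the rules make clear, stealing against a splitter maximizes one round’s payout, but — as the researchers found — gloating about it can provoke retaliatory steals in later rounds.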

Each participant was paid $30 and received additional tickets for a $100 lottery based on their total number of successful “steals” and “splits.”

As participants played the game against each other on video Skype, reactions were recorded and encoded using emotion-tracking software that captures muscle movements in the face including cheek, lip and chin raises, dimples, and the compression and separation of lips.

As for the motivations of the players, researchers hypothesize that successful, smiling stealers open themselves to future punishment by the loser, while smiling during such a loss is seen as a gesture toward cooperation and a feeling of mutual success.

Teaching machines the power of a smile

In a similar study Gratch co-authored with ICT senior research associate Gale Lucas and colleagues in 2016, participants often misread honesty when negotiating with each other: reassuring cues like head movement, positive language and even smiling are read as signs of honesty, but in fact more frequently accompany dishonest behavior.

Gratch has worked closely with the USC Marshall School of Business over the last several years to incorporate virtual humans that can understand these types of nuances into the study of negotiation. The Institute for Creative Technologies also works with agencies like the U.S. Army to use virtual humans in negotiation scenarios.

From Arthur Samuel’s checkers-playing AI of the 1950s and 1960s to the Joshua computer’s tic-tac-toe game of mutually assured destruction in the 1983 movie WarGames, artificial intelligence has been depicted as especially well-suited to beating people at their own, somewhat linear and strategy-based games.

IBM’s Deep Blue also famously and successfully battled chess master Garry Kasparov in the 1990s, and the computer system Watson did the same with its human opponents on Jeopardy! in 2011.

In the last year alone, different AIs have beaten top players in both the ancient game of Go and professional poker, the latter relying on bluffing, tells and accurate emotional readings of the opponent.

What It’s Like to Talk to an Adorable Chatbot About Your Mental Health

With the release of Woebot for Facebook Messenger, Mashable investigates what prompted the initiative and talks with Jonathan Gratch about the relationship between mental health and technology.

Read the full article in Mashable.

Virtual Reality Lab Opens New Doors for Students in Carson

The Daily Breeze reports on zSpace’s mobile, virtual-reality education lab parked outside Carson’s Eagle Tree Continuation High School and talks with ICT’s Todd Richmond about how virtual and augmented reality will change education.

Read the full article in Daily Breeze.

USC Institute for Creative Technologies Welcomes 2017 Summer Interns

School is out for the summer, and that means a new crop of interns has entered the building. Students from over 40 schools including Penn State, Stanford, Notre Dame, USC, UCLA, Albany State, UC Berkeley, University of Texas, Tufts, Ohio State, Virginia Tech, UNC, Michigan State, University of Paris, Brandeis and USMA West Point have been accepted for a chance to take part in various internships at the University of Southern California’s Institute for Creative Technologies.

Over the course of four weeks between May and June, 80 interns arrive eager to learn and bond with those who share the same passion for artificial intelligence, augmented reality, modeling, simulation, virtual reality, analytics, 3D visualization, biomechanics, robotics, and dialogue.

“So far it has been a great experience,” said Sai Sree Kamineni, USC Institute for Creative Technologies intern and University of Southern California student. “The project I am in demands me to learn new things that I haven’t tried before, and the work culture here is unique which allows us to research, experiment and be more creative.”

This summer, interns will spend 10 to 12 weeks with varying hours doing research for their field of interest. Students will gain experience with educational hands-on opportunities allowing them to work side-by-side with computer scientists, engineers, psychologists, and more.

USC’s Institute for Creative Technologies aims to create compelling immersive systems for effective learning for military, entertainment and educational purposes. Internships are supported, in part, by the Army Research Laboratory and offer creative and technical students the opportunity to gain hands-on experience in simulation, interactive media and virtual reality fields.

IWSDS 2017

IWSDS 2017 (International Workshop on Spoken Dialogue Systems)
June 6-9, 2017
Farmington, PA
Presentations

Virtual, Augmented, and Mixed Reality Opens Up a World of Possibilities for Publishers

Albert Einstein once said that reality is merely an illusion—albeit a very persistent one. Jump ahead to 2017, and that sounds as if it’s a spot-on prophecy, envisioning the rise of the most persistent illusory reality of them all: a virtual one in which the line between truth and fantasy has been blurred to satiate our endless curiosity and fascination.

All it took to set the wheels in motion was the right technology. But moving this train forward will require creativity and ingenuity in the form of immersive content. Therein lies a golden opportunity for digital publishers and content providers who want to climb aboard—and the risk of becoming irrelevant for those who choose not to, say the experts.

EContent Magazine talks with Todd Richmond about this in more depth. Read the full article here.

Hear Bombs Fall on London with VR Recreation of The Blitz

VR Focus talks with Skip Rizzo about the emotional reactions users have when using Virtual Reality.

Read the full article on VR Focus.

With VR History You Can Hear the Bombs Drop in Trafalgar Square

VR engineers are wading through archival records and code to recreate our most important moments. Motherboard examines and speaks with ICT’s Dr. Skip Rizzo.

Read the full article on Motherboard.

Virtual Reality a Friend and Foe in Terror Fight

BBC News recently suggested that 2017 will be the year of VR movie making, and Rolling Stone featured an article about efforts at the University of Southern California’s Institute for Creative Technologies to use the technology to treat post-traumatic stress disorder (PTSD). The institute is funded by the U.S. Army Research Laboratory and is at the forefront of VR advances.

Read more in Signal Magazine.

‘House of Cards’ Discovers VR, Shows Us it’s Silly and Deadly Serious

Adario Strange of Mashable breaks down Season 5 of ‘House of Cards,’ how the producers incorporated VR into the show, and how its depiction of VR-based PTSD treatment closely mirrors Bravemind.

Read the full article on Mashable.

How Virtual Reality is Helping Soldiers Recover from War

3DVR Central debriefs Skip Rizzo’s TEDx talk from this past May. Read the report here.