Inside the VR Therapy Designed to Help Sexual Assault Survivors Heal by Facing Attackers

ABC News investigates new VR technology used to help treat sexual assault survivors, speaking with ICT’s Dr. Skip Rizzo about virtual reality exposure therapy.

Continue reading the full article here.

Using Virtual Reality to Treat Addiction

The Fix looks at ways in which VR can help support the treatment of addiction.

Spotting Fake News in a World with Manipulative Video

For CBS News, ICT’s Hao Li talks about manipulating images with Carter Evans.

Is the VR Universe in Ready Player One Possible?

For this Giz Asks, Gizmodo reached out to a number of VR experts, including Arno Hartholt and Skip Rizzo, about the plausibility of an OASIS-like platform coming onto the market, and how much computing power would be required to sustain it.

Continue reading the full article in Gizmodo.

Virtual Reality Now Being Used to Treat PTSD

Examining VR treatments for PTS, via News Blaze.

13th International Conference on Persuasive Technology

13th International Conference on Persuasive Technology
April 16-19, 2018
Waterloo, Canada
Presentations

2018 IEEE International Conference on Acoustics, Speech and Signal Processing

2018 IEEE International Conference on Acoustics, Speech and Signal Processing
April 15-20, 2018
Calgary, Alberta, Canada
Presentations

A Holocaust Victim’s Decree: ‘Always Remember Who You Are’

Continued coverage of the New Dimensions in Testimony project from The Jerusalem Post.

At USC’s Dev Night, Nerds and Cinema Geeks Unite

PC Magazine stopped by Dev Night, an after-hours hangout for USC engineering and cinema students, and checked out projects from two filmmakers who fuse programming with art.

Read the full article here.

How Artificial Intelligence Helps Tech Students in the Learning Process

Universities will be among the first to adopt various Artificial Intelligence (AI) technologies; otherwise, they risk becoming irrelevant and ultimately redundant. If done right, the technology can greatly enhance the learning process and empower what universities and students do.

Artificial intelligence has yet to become a standard in schools, but it has the potential to transform the educational field. It is a technology whose time has certainly come, because it can already outperform humans in many ways, and it can be especially helpful for tech students.

StartUs looks at how AI is helping students; to read more, click here.

Synthetic Training Environment (STE)

Download a PDF overview.

The development of advanced simulation technologies for training is underway. Converging live, virtual and constructive experiences will enable units to achieve the highest levels of warfighting readiness and give valuable training time back to Units and their Soldiers.

The U.S. Army must train to win in a complex world that demands adaptive leaders and organizations that thrive in ambiguity and chaos. To meet this need, Force 2025 and Beyond, the Army’s comprehensive strategy to change and deliver land-power capabilities as a strategic instrument of the future joint force, requires a new training environment that is flexible, supports repetition, reduces overhead and is available at the point of need.

The Synthetic Training Environment (STE) will provide training to the point-of-need using the latest in immersive and mobile technologies. STE is a collective training environment that optimizes human performance within a multi-echelon mixed-reality environment. It provides immersive and intuitive capabilities to keep pace with a changing operational environment and enable Army training on joint combined arms operations. The STE moves the Army away from facility-based training, and instead, allows the Army to train at the point of need — whether at home-station, combat training centers or at deployed locations.

Leveraging current mixed reality technologies, STE blends virtual, augmented and physical realities, providing commanders and leaders at all levels with multiple options to guide effective training across active and dynamic mission complexities. STE will provide intuitive applications and services that enable embedded training with mission command workstations and select platforms.

Augmented Reality is Changing How Newspapers (and Readers) are Seeing Things

Editor & Publisher explores how AR technology can be used in newspaper publishing, tapping ICT’s Todd Richmond for insight.

Read the full article here.

Technology, Mind & Society (American Psychological Association)

Technology, Mind & Society (American Psychological Association)
April 5-7, 2018
Washington, D.C.
Presentations

A Whole New World – Virtual Reality in Social Work

“In using exposure therapy in its traditional form, you would use the imagination alone. You are asking them to narrate the experience in the safety of the office and then process those emotions in the office. The use of VR helps take that experience to another level,” Skip Rizzo tells Lindsey Getz for Social Work Today.

Continue reading the full article here.

Asos’s New Technology Lets Shoppers See Clothes on Different Body Types

ICT’s David Krum is quoted throughout this Today.com piece about retailer ASOS using AR technology to show clothing on various models.

Read the full article here.

Artificial Intelligence: Promise and Peril

During the HPA Tech Retreat eXtra (TR-X) last month, Phil Lelyveld, program lead for the AR/VR Initiative at the Entertainment Technology Center @ USC, spoke about “Artificial Intelligence: Immersion, Story, Technology and Ethics.” He started by reminding attendees that although the market has divided “virtual reality” and “augmented reality” into two separate verticals, it’s actually a continuum. “The goal is to create objects or experiences indistinguishable from real experiences, which can impact your brain like a real experience,” he says. Researcher Skip Rizzo, director for medical virtual reality at the Institute for Creative Technologies, describes all of it as “mental stimuli,” noting that “we already live in a mixed reality world.”

As it advances, this world of mixed reality will also be impacted by social media, world building, crowd-sourcing and data from dozens of Internet of Things devices, from smart watches to smart houses. Then comes artificial intelligence. “AI will shape and filter the information you get through AR or VR, so it can have a huge impact on how you view the world,” he says. Lelyveld showed “Eclipse,” a music video commissioned by Saatchi & Saatchi that was made completely in AI systems. “When it was shown in film festivals, side by side with other music videos created by humans, the audience couldn’t tell the difference,” he reports.

Continue reading the full article in Hollywood Professional Association.

Interactive Holocaust Education

Continued coverage of the New Dimensions in Testimony project. Read the full article in Times of Israel.

Army’s Institute for Creative Technologies Advances Research to Reality

Soldiers, Airmen, Marines and Sailors have benefited from the research created by the U.S. Army Research Laboratory and its academic partner, the Institute for Creative Technologies at the University of Southern California. The research collaborators continue to make great strides in the fields of artificial intelligence, simulated graphics, immersion and virtual reality, supporting both military and civilian research.

Continue reading this ARL and ICT feature on the U.S. Army website.

Army’s Institute for Creative Technologies Advances Research to Reality

ABERDEEN PROVING GROUND, MD — Soldiers, Airmen, Marines and Sailors have benefited from the research created by the U.S. Army Research Laboratory and its academic partner, the Institute for Creative Technologies at the University of Southern California. The research collaborators continue to make great strides in the fields of artificial intelligence, simulated graphics, immersion and virtual reality, supporting both military and civilian research.

Established in 1999 as a University Affiliated Research Center, the Army’s Institute for Creative Technologies launched an effort to link Army and academic researchers with the creative talents of Hollywood and the entertainment industry. Researchers believed that bridging these worlds would help influence the trajectory of technological research and advancement in both the military and civilian sectors.

This influence has become a reality. In the civilian sector, industry has adopted low-cost virtual reality methods, and the research and development has enabled citizens at museums from Illinois to Nanjing to have real-time conversations with digitally recreated survivors of genocide.

In the military sector, leaders from every echelon use its technologies to sharpen situational awareness, interpersonal communication and decision-making skills. One example of this is the Team Assessment and Learner Knowledge Observational Network, or TALK-ON.

TALK-ON is a mixed-reality test bed designed to explore questions of simulation fidelity, assessment and feasibility involved in using consumer virtual reality technologies for armored vehicle leadership training. The TALK-ON prototype is focused on cognitive and communication skills training of novice tank platoon leaders, who must accurately assess tactical situations, make quick decisions, and communicate effectively with the tank crew, other tanks within and across platoons and higher command.

“Our work on the TALK-ON project represents a true collaboration across ARL and ICT, where we have engaged in joint data collection with subject-matter experts and trainees to study how they experience the tank simulator technology,” said Dr. Pete Khooshabeh, TALK-ON principal investigator and acting regional lead of ARL West. “If the project team was exclusively comprised of just academic or military researchers, it would have been less successful than our cross-disciplinary combined effort.”

Since its doors opened, researchers have created scientific knowledge about the ways computers can better simulate the human experience. Its developers have built on this research to create immersive capabilities to engage users to explore, create, innovate and validate.

“Through partnerships with the ARL Simulation and Training Technology Center in Orlando, Florida and the national center of modeling, simulation and training in Central Florida, innovations have been integrated into many solutions that serve Army readiness,” said Col. Harold Buhl, deputy director of the Human Research and Engineering Directorate, ARL Orlando.

While the institute is expanding transfer of technology and research to industry and commercial organizations, the relationship continues to serve as a valuable asset to the DOD and military. Troops continue to deploy in complex environments across the globe confronting highly adaptive adversaries, requiring them to make complicated decisions, lead diverse teams of humans and machines, and operate in joint, multi-agency and coalition environments.

To prepare commanders for these complicated decisions and coalition environments, the institute develops user-driven cognitive trainers.

One such trainer is DisasterSim, a game-based tool that teaches members of a joint task force how to respond to a humanitarian assistance/disaster relief mission in a foreign country.

In DisasterSim, trainees must attempt to restore essential services, reconstruct civil infrastructure and provide humanitarian assistance, all while managing interactions with local civil authorities, non-governmental organizations, and other U.S. government relief organizations.

They must use their judgment to prioritize and execute lifesaving tasks while operating within DOD limits related to medical relief and infrastructure repairs. Trainee actions in the exercise can impact future interactions and may also influence the overall scenario. The game is used as part of the U.S. Agency for International Development’s joint humanitarian operations course and was sponsored by U.S. Army South.

To simulate multi-domain battle, future training must converge mixed reality and live training and be delivered to the point of need. The applied research project One World Terrain, or OWT, is moving the Army toward this future and supports the Army modernization priority for a synthetic training environment.

This project assists the U.S. military by creating the most realistic, accurate and informative 3-D representations of the physical and non-physical landscape. The goal is to establish an authoritative 3-D terrain dataset for next-generation modeling and simulation systems, informing the Army modernization priority for a synthetic training environment. This capability will reside in infrastructure. Research for this effort has transitioned to the U.S. Marine Corps tactical decision kit.

“The focus of One World Terrain is to create foundational 3-D geospatial data that can be collected, processed, stored and served to any number of different modeling and simulation end points,” said Ryan McAlinden, director of modeling and simulation at ICT. “We want to give field units the opportunity to own and manage the data that they rely upon for training, rehearsal and operations.”

Soldiers also face significant challenges outside of the strategic and tactical domains. To help them tackle these challenges, researchers and developers created the Emergent Leader Immersive Training Environment, or ELITE.

ELITE targets leadership and basic counseling for junior leaders in the U.S. Army. The experience incorporates a virtual human, classroom response technology and real-time data tracking tools to support the instruction, practice and assessment of interpersonal communication skills.

The training application is also used to educate U.S. Army sexual assault response coordinators and victim advocates (ELITE SHARP POST) and Army command teams on the SHARP program (ELITE SHARP CTT), and to train junior Army leaders to intervene successfully when they observe behavior that could lead to potential incidents of sexual harassment and sexual assault (ELITE SHARP BRAVE).

“Recently the Army Research Institute validated the effectiveness of training with the ELITE platform, as it was shown to increase users’ knowledge of appropriate response to SHARP incidents and users’ confidence in responding to SHARP incidents,” said Matthew Trimmer, project director at ICT. “That was a big win for not just the project members at ICT and ARL, but for the entire ELITE team. It truly has been an honor working on this effort.”

The team also includes the SHARP Academy, Army SHARP Office, U.S. Army Forces Command, U.S. Army Pacific and U.S. Army Training and Doctrine Command Capability Manager Virtual and Gaming, which serves as the Army’s user representative for virtual and gaming capabilities to satisfy Army training requirements.

The Army-funded team collaborates across multiple disciplines to develop cutting-edge technologies to better assist in these scenarios. Together they push educational research forward to achieve readiness faster and sustain it longer.

Researchers have also leveraged this multidisciplinary approach, using virtual reality and virtual human characters, to support service members returning from operations through Bravemind: Virtual Reality Exposure Therapy, or VRET.

Guided by a trained therapist, patients with post-traumatic stress use VRET to confront their trauma memories through a virtual retelling of the experience. The therapy has produced a meaningful reduction in posttraumatic stress symptoms, and recent clinical research supports the conclusion that VR is an effective tool for delivering this form of treatment. VRET is an endorsed, evidence-based treatment that can be found at more than 90 sites, including VA hospitals, military bases and university centers.

Another way the team supports returning service members is through the Virtual Interactive Training Agent for Veterans, or VITA4Vets, a virtual practice system designed to build job-interviewing competence and confidence while reducing anxiety. It was originally developed by the institute with support from the U.S. Army, Google.org, and the Dan Marino Foundation to support young adults with autism spectrum disorder and other developmental disabilities. Because of its success with the Dan Marino Foundation and Google.org, the platform was augmented to support out-processing military service members and veterans.

“Regardless of talent, experience or temperament, some returning service members may find it challenging to express how their skills and experience can translate into the private sector,” Trimmer said.

By conducting interdisciplinary basic research, innovating with applied research, demonstrating with advanced technology development, and creating best-of-breed prototypes to include transitioning and commercializing research and applications, the Army’s ICT team leads the way in maintaining the military’s technological advantage.

“This partnership is revolutionizing the way people understand and prepare for many situations they will face,” said Buhl. “Whether it is a Soldier experiencing combat with confidence gained in tough realistic and iterative immersive training, or a Soldier working through the rigor of post-deployment transition, the technologies are adding value to our Army and to our society.”

Scientists Launch Global Effort to Find the Next Diabetes Drug

USC researchers issue a call to scientists to help them create the first comprehensive model of a cell that is central to diabetes, the pancreatic beta cell.

The challenge of understanding the pancreatic beta cell could be compared to getting to know an unfamiliar city. In a collaboration with the World Building Media Lab at the USC School of Cinematic Arts, and working with professors Alex McDowell and Todd Richmond, the consortium is trying to use world-building tools, similar to those used to craft the film “Minority Report,” to portray the cell as a world.

Continue reading the full article in USC News.

Israeli Tech Helps Detect Depression in People

University of Southern California Institute for Creative Technologies researchers, as a part of their pioneering efforts within DARPA’s Detection and Computational Analysis of Psychological Signals (DCAPS) project, developed a tool, SimSensei, to spot signs of depression and post-traumatic stress disorder.

It uses the k-means algorithm, which groups large data sets into clusters based on average values; the resulting clusters are then compared to “normal” speech patterns.
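
Purely as an illustration of the clustering step described above (not the actual SimSensei code), here is a minimal k-means sketch over made-up prosodic speech features; the feature names and numbers are hypothetical.

```python
# Hypothetical illustration only: k-means clustering over invented prosodic
# speech features (mean pitch, pitch variability, speaking rate, pause ratio).
import numpy as np
from sklearn.cluster import KMeans

features = np.array([
    [180.0, 35.0, 4.2, 0.10],
    [175.0, 30.0, 4.0, 0.12],
    [120.0, 10.0, 2.1, 0.45],
    [118.0, 12.0, 2.3, 0.40],
])

# Group the samples into clusters around their average (centroid) values.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(features)

# A new sample is assigned to the nearest cluster, whose profile can then be
# compared against a reference ("normal") speech-pattern cluster.
new_sample = np.array([[170.0, 28.0, 3.8, 0.15]])
print(kmeans.predict(new_sample), kmeans.cluster_centers_)
```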

Such innovations are intended as a form of healthcare support and not — as in the case of DCAPS — aimed at providing an exact diagnosis. They can be useful in offering a general metric of psychological health.

An alarmingly high number of people across the world suffer from depression and other mental health problems. Studies confirm that if the risk factors contributing to depression are known, and if the most susceptible are assisted early, the condition may be promptly averted.

Continue reading the full article in TechnoChops.

This Emotionally Intelligent Device Is Helping Kids with Autism Form Bonds

An in-depth look at technology helping children with autism, citing ICT virtual human research as early sources of AI use for mental health treatments.

Read the full article in Tonic.

Plenty of New Health Care Technology, but Will Doctors Use It?

The Internet, electronic health records and artificial intelligence can help doctors make the correct diagnosis and make better decisions. And that is just one way that technology can make health care more efficient, more accurate and less costly.

Post-SXSW coverage in the Houston Chronicle featuring ICT’s Skip Rizzo’s commentary on using VR for PTS treatment.

IEEE VR 2018

IEEE VR 2018
March 18-22, 2018
Reutlingen, Germany
Presentations

Fear of Public Speaking Could Be Solved With Virtual Audience

Public speaking can heighten anyone’s anxiety. Cicero, a program named after the famed Roman orator, aims to help people overcome that fear — with the help of a virtual audience.

Continue reading about this new research in USC News.

How the USC Institute for Creative Technologies is Using VR to Treat PTSD

Skip Rizzo, the center’s director for medical virtual reality, spoke with TechRepublic’s Teena Maddox at SXSW about VR’s potential to help those with PTSD process traumatic memories.

Continue reading here.

How Dell Aims to Raise Awareness of Environmental Issues Through VR Partnerships

At SXSW, TechRepublic’s Teena Maddox spoke with Dell’s director of VR and AR about the company’s partnerships, including one with ICT that uses VR-based exposure therapy to help treat and rehabilitate veterans with PTSD.

Continue reading in TechRepublic.

SXSW ’18 – IEEE Tech for Humanity Series

SXSW ’18 – IEEE Tech for Humanity Series
March 9-18, 2018
Austin, TX
Presentations

How Virtual Reality Can Transform PTSD Treatment

University of Southern California’s President C.L. Max Nikias pens a blog post about Bravemind, the VR exposure therapy tool to help treat PTSD, for the Wall Street Journal.

Read the full piece on WSJ.com.

New Tech Lets Users Create Their Own Avatars

While video games have long been at the forefront of virtual reality, the technology is already being used in the fields of gaming, architecture, education and military training, among others. The devices offer high quality displays that provide a wide field of view and the ability to track users’ head movements to create high levels of immersion.

Skip Rizzo, director of medical virtual reality for USC’s Institute for Creative Technologies, tells i-HLS: “We can also put service members with post-traumatic stress disorder into a simulated kind of context in which they were traumatized before to help them better cope with how they handle that.”

Continue reading on i-HLS.com.

Creepy, Real Technologies That Could Be On the Next ‘Black Mirror’

Greg Keraghosian of SF Gate explores AI technology for the future, featuring ICT’s research in virtual humans used as tools for clinicians.

Read the full article here.

ACM IUI 2018

ACM IUI 2018
March 7-11, 2018
Tokyo, Japan
Presentations

13th Annual ACM/IEEE International Conference on Human Robot Interaction

13th Annual ACM/IEEE International Conference on Human Robot Interaction
March 5-8, 2018
Chicago, IL
Presentations

How AI is Disrupting Education

We test students today to see how much they have learned. We measure their test answers against the teacher’s questions. And sometimes we measure their scores against their classmates to see how they did relative to their peers.

But in the world of AI, this is child’s play. There is so much more we can learn about our students than their responses to our questions, and there is so much more students can learn on their own.

AI offers the hope of leading us in the right direction and is already disrupting the field of education and the way we help students learn and grow.

Kids already use AI on their phones, getting answers and directions from Siri, Alexa, and Google Assistant. There aren’t too many questions that these virtual personal assistants can’t answer. And they’re part of a strong growth market for investors.

Continue reading in Disruptor Daily.

Army Launches New SHARP Training Tools

ARL and the Institute for Creative Technologies at the University of Southern California recently launched new training applications to support the U.S. Army’s sexual harassment/assault response and prevention, or SHARP, program.

Click here for DVIDS coverage.

Army Launches New SHARP Training Tools

By Joyce Conant and Sara Preto

The U.S. Army Research Laboratory and the Institute for Creative Technologies at the University of Southern California recently launched new training applications to support the U.S. Army’s sexual harassment/assault response and prevention, or SHARP program.

The Emergent Leader Immersive Training Environment Sexual Harassment/Assault Response and Prevention Command Team Trainer, or ELITE SHARP CTT, is a laptop training application that gives Army command teams the knowledge, skills, confidence and ability to successfully execute the SHARP program.

These applications are part of the ELITE platform and will be used by the Army SHARP Academy, the proponent for this type of training throughout the Army.

ELITE SHARP Bystander Resource Assessment Virtual Exercise, or BRAVE, is a new laptop-based training application. It was developed under the guidance of the SHARP Program Management Office and in collaboration with the Army SHARP Academy and the Army Forces Command.

The application targets junior leaders and Soldiers by providing instruction and strategies on how to successfully intervene when observing behavior that could lead to potential incidents of sexual harassment or sexual assault. It employs techniques that have been used successfully in earlier versions of the ELITE suite of training applications. The techniques include a virtual human instructor to deliver up-front instruction on new key concepts, animated vignettes to provide visual examples of “good” and “bad” responses to SHARP incidents and practice exercises where students can apply their new knowledge in realistic training scenarios.

“We are excited to begin leveraging the additional CTT content and the new BRAVE application in our SHARP education and training efforts,” said Col. Christopher H. Engen, director of the Army SHARP Academy. “These latest innovations enable us to continually improve the breadth and rigor of available learning products for use and benefit Army-wide.”

The new version provides more training content than was available in previous ones. Its content includes specific strategies and recommended actions for commanders dealing with situations of retaliation prevention and response. It also includes several additional practice exercises providing command teams with more opportunities to advance their knowledge and hone their skills in dealing with SHARP-related situations within their units.

The new ELITE SHARP functions continue to build on the robust research and development conducted by ARL and the Army’s ICT. Both applications and associated supporting documentation to include training support packages, user guides and software installation instructions are now available on the Army MilGaming web portal at https://milgaming.army.mil/.

Here Come the Fake Videos, Too

Artificial intelligence video tools make it relatively easy to put one person’s face on another person’s body with few traces of manipulation. I tried it on myself. What could go wrong?

Continue reading Hao Li’s expert commentary in the New York Times.

Hao Li Earns Office of Naval Research Young Investigators Award

ICT’s Hao Li, a computer graphics and virtual reality expert, has received an Office of Naval Research (ONR) Young Investigator Award to create highly realistic computer-generated (CG) humans for immersive training purposes.

Li is among 31 scientists selected from more than 340 applicants to receive this highly competitive award, which supports outstanding early-career scientists and academics. The three-year grant totaling almost $600,000 will support Li’s project, “Complete Human Digitization and Unconstrained Performance Capture.”

Continue reading on USC Viterbi’s news site.

USC Hosts London Hackathon Event for Young Entrepreneurs

Competition among secondary-school kids helps mark the opening of USC’s London office, the university’s ninth international office and its first in Europe.

Continue reading in USC News.

Virtual Human Role Players for Studying Social Factors in Organizational Decision Making

A perspective piece sketching research paradigms for studying how different factors might interact with the differential social power dynamics of individuals in cyber decision-making contexts.

Continue reading in Frontiers in Psychology.

1,000 Reactions – Can virtual audiences make you a better public speaker?

New research is designed to help people overcome the fear and anxiety typically associated with public speaking. Cicero uses virtual reality glasses to immerse the participant fully in a virtual world and make the experience as realistic as possible. In that world, animated avatars that look like real people are programmed to react to the speaker, and their feedback depends on the speaker’s performance. If the speaker is engaging, the audience will appear attentive: leaning forward, displaying interested facial expressions, nodding their heads, and so on. If the speaker fails to engage the assembly, audience members will show their dissatisfaction by leaning back, looking disinterested, and shaking their heads.
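
As a simplified sketch of the feedback loop described above (not the actual Cicero implementation), an engagement score could be mapped to audience behaviors roughly like this; the thresholds and behavior names are hypothetical.

```python
# Hypothetical sketch: map a speaker-engagement score in [0, 1] to audience
# avatar behaviors. Thresholds and behavior names are invented for illustration.

def audience_reaction(engagement: float) -> list:
    """Return a list of avatar behaviors for the given engagement score."""
    if engagement >= 0.7:
        return ["lean_forward", "nod", "interested_expression"]
    if engagement >= 0.4:
        return ["neutral_posture", "occasional_nod"]
    return ["lean_back", "look_away", "shake_head"]

# The engagement score itself might be estimated from gaze, pacing, and vocal
# variety; here we simply pass in a number.
print(audience_reaction(0.8))  # engaged audience behaviors
print(audience_reaction(0.2))  # disengaged audience behaviors
```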

Continue reading about Cicero on USC Viterbi’s news site.

Anne Frank’s Stepsister Shares Holocaust Insights

Continued expansion of the New Dimensions in Testimony project featured in the Star Tribune.

Can Shoot ‘Em Ups Influence Army Recruitment?

The US Army has long used games as a way to teach tactics to troops and recruits, starting with tabletop war games. In the 1980s the US Defense Advanced Research Projects Agency (DARPA) approached video game designers to create games for training purposes. In the following decade, a modified version of Doom II was used to train marines. The 2004 Xbox game Full Spectrum Warrior was developed by the military-funded USC Institute for Creative Technologies as both a commercial product and a training tool. By putting a training game into the wider world, the army must have been aware of the game’s potential for recruitment and for winning hearts and minds.

Continue reading in PlayStation Universe.

Oscar-Nominated VFX Supervisor John Nelson on Blade Runner 2049

Some insight into all of the visual effects behind Blade Runner 2049, from Studio Daily.

Fake Videos Are On the Rise. As They Become More Realistic, Seeing Shouldn’t Always Be Believing

All it takes is a single selfie.

From that static image, an algorithm can quickly create a moving, lifelike avatar: a video not recorded, but fabricated from whole cloth by software.

With more time, Pinscreen, the Los Angeles start-up behind the technology, believes its renderings will become so accurate they will defy reality.

“You won’t be able to tell,” said Hao Li, Director of the Vision and Graphics Lab at USC’s ICT. “With further deep-learning advancements, especially on mobile devices, we’ll be able to produce completely photoreal avatars in real time.”

Continue reading in the Los Angeles Times.

Here’s Where the Pentagon Wants to Invest in Artificial Intelligence in 2019

From Amazon’s Alexa to self-driving cars, artificial intelligence (AI) is rapidly improving and promises to soon transform almost every aspect of life. The Defense Department’s fiscal 2019 budget request offers a window into where leaders see opportunities for artificial intelligence.

Continue reading in Defense News.

How the Army Plans to Use Virtual Humans Powered by Artificial Intelligence

The Human-Virtual Human Interaction program is run out of the Army’s Institute for Creative Technologies (ICT) at the University of Southern California. The goal of the program is to create virtual humans, powered by artificial intelligence, that mimic both the physical appearance and social intelligence of real humans, and then study how they interact. Picture hyper-intelligent video game characters that can carry conversations with users and socialize using human emotions.

Continue reading in Defense News and Army Times.

‘I really never thought that we could heal so many hearts’: Heroes Hall veterans museum marks first anniversary

Heroes Hall offers year-round educational programs, performances and rotating exhibitions. During its first year of operation, it drew more than 52,000 visitors and several tours by school and community groups, according to the Fair & Event Center.

As Heroes Hall enters its second year, its focus will continue to be on public outreach and providing educational opportunities including “Bravemind,” an upcoming exhibit that will use virtual reality technology to examine the issue of post-traumatic stress disorder. The Fair & Event Center is collaborating on that with the USC Institute for Creative Technologies.

Continue reading in the Los Angeles Times.

‘Nothing can replace the real thing’: Panel discusses growth of artificial intelligence

Panelists Caroline Bainbridge, Lisa Joy, Jonathan Gratch, Rose Eveleth, and James Blaylock discussed the relationship between emotion and artificial intelligence, and how humans and robots are changing with the expansion of technology at the “Beyond Human: Emotion and AI” event Feb. 13. The panel answered questions about human influence on artificial intelligence and how it impacts society.

Continue reading in The Panther online.

Will Healthcare Be Where the Killer VR Apps Emerge?

By Skip Rizzo for VR360 News

Virtual reality (VR) technology offers new opportunities for clinical research, assessment, and intervention. Since the mid-1990s, VR-based testing, training, teaching, and treatment approaches have been developed by clinicians and researchers that would be difficult, if not impossible, to deliver using traditional methods.

During this time, a large and ever maturing scientific literature has evolved regarding the outcomes and effects from the use of what we now refer to as Clinical VR. This term subsumes VR application developments targeting cognitive, psychological, motor, and functional impairments across a wide range of clinical health conditions.

Moreover, continuing advances in the underlying enabling technologies for creating and delivering VR applications have resulted in its widespread availability as a consumer product, sometimes at a very low cost. So, when one studies the scientific literature, examines the evolving state of the technology, and observes the growing enthusiasm for VR in the popular culture, it is hard not to see the areas of mental and physical healthcare as the fields from which VR’s future killer apps will sprout.

Continue reading on VirtualReality-News.Net.

TALK-ON

Download a PDF overview.

TALK-ON (Team Assessment and Learner Knowledge Observational Network) is a mixed reality test bed designed to explore questions of simulation fidelity, assessment, and feasibility involved in using consumer virtual reality technologies for armored vehicle leadership training.

The TALK-ON prototype is focused on cognitive and communication skills training of novice tank platoon leaders, who must accurately assess tactical situations, make quick decisions, and communicate effectively with the tank crew, other tanks within and across platoons, and higher command.

The VR simulation positions the trainee in the seat of a tank commander (TC), directly in front of the Commander’s Independent Thermal Viewer (CITV) controls, with a topographic tactical map below the CITV. The trainee’s point of view is a close approximation of what a TC would see in an actual tank and in the Close Combat Tactical Trainer (CCTT), including virtual, animated crewmembers. The TC can issue and receive voice commands via three separate radio channels (Crew, Platoon, and Company), and the communication can be implemented either via a combination of joystick buttons and a USB microphone or via a purpose-built helmet switch and control junction box (JBOX) designed and 3D printed to replicate actual equipment. The trainee can also control the thermal and tactical views in the CITV panels in front of him via a joystick. Lastly, the TC trainee has the ability to survey the terrain around his tank by standing up and looking out of the turret.

A synergistic collaboration between ARL and ICT’s Mixed Reality Lab, the TALK-ON project is part of the Army Science and Technology Objective (STO) on Training Effectiveness for Simulations. The team has collected human-factors usability data and conducted experimental studies with subject matter experts from the Maneuver Center of Excellence (MCoE) to compare the prototype with the current state-of-the-art Armor training system, the CCTT. Lessons learned and knowledge products will inform future immersive training environments.

VR Can Already Help People Heal — and it’s just the beginning

Given the amount of new research showing the potential of VR to heal both emotional and physical conditions, it’s no surprise that many innovative VR companies not bound by traditional methods have stepped up to help find new solutions to old problems. One of the most successful applications is the use of VR to treat PTSD. Virtually Better, a company that Dr. Skip Rizzo and his team founded, developed a simulation that re-created the conditions that Iraq war veterans experienced. “Virtual Iraq” proved successful, helping treat over 70 percent of PTSD sufferers, and it has now become a standard treatment accepted by the Anxiety and Depression Association of America, which also supports applications of VR-based therapy for aerophobia, acrophobia, glossophobia, and substance abuse.

Continue reading in VentureBeat.

VR and AR: The Art of Immersive Storytelling and Journalism

As we move into a world of immersive technologies, how will virtual and augmented reality transform storytelling?

From EDUCAUSE Review:

The Impact of Immersive Storytelling
The sense of presence in VR transforms the storytelling experience. Participants become part of an environment with an incentive to act and respond to the events they encounter. We see this in the early work of the virtual reality pioneer Nonny de la Peña at USC. Her groundbreaking Hunger in Los Angeles experience, which premiered at the Sundance Film Festival in January 2012, places you in the middle of a food bank line in LA. In her TED talk, de la Peña describes the first rule of VR design: Begin by thinking of your body in the space. The focus of VR design is not the camera frame, but the embodied visitor.

Built in Unity 3D with a body-tracking system and audio recordings of the actual event, the experience often triggers intensely emotional responses from the participants. Viewers walk into a virtual re-creation of the event, allowing them to experience and respond to it. The project was funded partially by USC’s Annenberg School for Communication and Journalism and its MxR interaction lab. A more recent experience from 2016, Across the Line, places participants in the middle of anti-abortion protestors outside an abortion clinic. Blending journalism with storytelling, de la Peña refers to her work as “advocacy journalism.”

Virtual and Augmented Reality Put a Twist on Medical Education

Building on nearly three decades of experience using virtual reality to build immersive therapies for patients with posttraumatic stress disorder (PTSD), ICT’s Skip Rizzo is tackling a new challenge—training clinicians to more skillfully handle delicate interactions with patients. To treat patients with PTSD, Rizzo and his colleagues re-create traumatic situations that trigger intense emotional responses and enable therapy. He’s found the approach particularly useful for a younger generation of veterans who have grown up with virtual reality gaming.

Continue reading in JAMA: The Journal of the American Medical Association.

Virtual Humans with the USC Director for Virtual Human Research, Jonathan Gratch | MIND & MACHINE

When you combine Natural Language Interface, Emotion Modeling, CG Animation, and various forms of Artificial Intelligence, you get lifelike computer characters with autonomous interaction.

This new kind of digital actor is called a Virtual Human, and can be deployed across a wide spectrum of functions — enabling authentic human interaction with machines. From psychological counseling to education to customer service to endless other applications, virtual humans will play an increasingly active role in society.

Mind and Machine interviews ICT’s Jonathan Gratch, Director for Virtual Human Research, about his work. Watch the full segment here.

AAAI 2018 Conference

AAAI 2018 Conference
February 2-7, 2018
New Orleans, LA
Presentations

Emotional Rescue Could Refresh Uncertain VR Market

In a follow-up piece about Sundance, SiliconANGLE takes a closer look at the VR market.

Continue reading on SiliconANGLE.com.

Holocaust Hologram History-Making

Continued coverage of the expanding collection of testimonies in the New Dimensions in Testimony project.

Full article in Cleveland Jewish News.

Virtual Reality Expands its Reach

Newly accessible virtual reality systems are delivering results in the therapist’s office, research lab and beyond.

Continue reading the full feature on the American Psychological Association‘s website.

How Businesses Are Using Augmented Reality

From Pokémon Go to IKEA, the National Center for Business Journalism covers a few ways augmented reality is being used in business.

The Artificial Becomes Real

A back-and-forth interplay of government and commercial funding and research has brought AI to the edge of a breakthrough. DVIDS takes a look at the many ways in which the U.S. Army uses this type of technology.

Read the full article in DVIDS.

ARL 32 – Research Assistant, Learning in Enhanced or Virtual Training Environments

Project Name
Learning in Enhanced or Virtual Training Environments

Project Description
This project examines how enhanced training components (such as game elements and virtual reality) affect learning outcomes from training programs. Individual personality and trait differences will be examined, as will physiological and survey-derived state measures.

Job Description
The research intern may be involved in data analysis or data collection for ongoing projects in gamified training and/or training in virtual environments.

Preferred Skills

  • Psychology
  • Human research
  • Data analysis
  • Familiarity with learning science, VR, gaming, EEG, and/or impedance cardiography

ARL 31 – Research Assistant, Device and Materials Simulations

Project Name
Materials and Device Simulations for Army Research Applications

Project Description
The project is part of ongoing emerging materials and device research efforts at the U.S. Army Research Laboratory (ARL). One research focus of the team is to explore and investigate materials and device designs, both theoretically and experimentally, for low-power, high-speed and lighter-weight electronic devices.

Job Description
The research assistant will work with ARL scientists to investigate fundamental materials and device properties of low-dimensional nanomaterials (2D materials, 2D/3D materials, and functionalized diamond surfaces). For this study, various bottom-up materials and device modeling tools based on atomistic approaches, such as first-principles simulation (DFT) and molecular dynamics (MD), will be used. In addition, numerical and analytical modeling will be used to quantify and analyze data from the atomistic simulations and compare them to the group’s experimental findings.

Preferred Skills

  • An undergraduate or graduate student in an electrical engineering, materials science, physics or computational chemistry department
  • Sound knowledge of materials and device physics concepts
  • Experience with high-performance computing (HPC) environment
  • Proficiency with atomistic materials modeling concepts
  • Interest in fundamental materials design and discovery
  • Working knowledge of quantum transport simulation tools will be advantageous
  • Familiarity with experimental characterization techniques
  • Ability to work in a collaborative environment as well as independently

ARL 30 – Research Assistant, Predicting the Effectiveness of Gamified Training

Project Name
Individual Traits and Training Effectiveness: Predicting the Effectiveness of Gamified Training

Project Description
A research focus of the Army Research Laboratory is the study of how individual traits modulate the effectiveness of game-like reward structures to enhance training outcomes. Electroencephalography (EEG) and impedance cardiography were used to track behavior and cognitive states during training in a gamification environment and to monitor trainee motivation in a continuous manner.

Job Description
The research assistant will work with ARL scientists to identify physiological correlates of training performance and state changes based on behavioral, cardiovascular, and neural data within a gamified training environment. The summer project will aim to define components of these state changes relating to performance and trait-based measures.

Preferred Skills

  • A PhD student working in the field of cognitive neuroscience
  • Experience with EEG analysis and/or impedance cardiography preferred
  • Ability to work in a collaborative environment as well as independently
  • Interest in human behavior and training
  • Strong quantitative/programming skills

ARL 29 – Research Assistant, Predicting the Effectiveness of Gamified Training

Project Name
Individual Traits and Training Effectiveness: Predicting the Effectiveness of Gamified Training

Project Description
A research focus of the Army Research Laboratory is the study of how individual traits modulate the effectiveness of game-like reward structures to enhance training outcomes. Electroencephalography (EEG) will be used to track behavior and cognitive states during training in a gamification environment and to monitor trainee motivation in a continuous manner.

Job Description
The research assistant will work with ARL scientists to identify state changes based on behavioral and neural data within a gamified training environment and during the subsequent transfer task. The summer project will aim to define components of these state changes relating to regulatory focus and other trait-based measures by linking behavioral data with conventional EEG analyses.

Preferred Skills

  • A PhD student working in the field of cognitive neuroscience
  • Experience with EEG/MEG analysis preferred
  • Ability to work in a collaborative environment as well as independently
  • Interest in human behavior and training
  • Strong quantitative skills

ARL 28 – Research Assistant, Neuroadaptive Training

Project Name
Building a Brain Computer Interface

Project Description

The Army Research Laboratory is interested in creating a new type of behavioral training paradigm based on physiological measures. One goal is to create a brain-computer interface (BCI) that is able to modify training in real time based on neurophysiological responses. Electroencephalography (EEG) will be used to track behavior and monitor neural activity during a task to build a system for ongoing and future studies.

Job Description

The research assistant will work with ARL scientists to identify neural correlates related to performance during training and subsequently build a brain computer interface that will allow for online training changes based on brain activity. The summer project will aim to not only define the specific components relevant to training but also program a system to process data online in hopes of augmenting performance.

Preferred Skills

  • A PhD student working in the field of cognitive neuroscience
  • Experience with EEG analysis, preferably in a BCI context
  • Ability to work in a collaborative environment as well as independently
  • Interest in human behavior and training
  • Strong quantitative/programming skills

ELITE SHARP BRAVE

Download a PDF overview.

The Emergent Leader Immersive Training Environment Sexual Harassment/Assault Response & Prevention Bystander Resource Assessment & Virtual Exercise (ELITE SHARP BRAVE) is a laptop training application that prepares Army junior leaders to intervene successfully when they observe behavior that could lead to potential incidents of Sexual Harassment and Sexual Assault. ELITE SHARP BRAVE supports the mission of the United States (US) Army’s Intervene, Act, and Motivate Campaign (I. A.M. STRONG) and the Not in My Squad (NIMS) Program. Developed under the guidance of the SHARP Program Management Office, and in collaboration with the US Army SHARP Academy and FORSCOM, the ELITE SHARP BRAVE content incorporates evidence-based instructional design methodologies. ELITE SHARP BRAVE also leverages USC ICT research technologies such as virtual humans, story-based scenarios, and intelligent tutoring technology to help create a challenging yet engaging training experience.

The ELITE SHARP BRAVE software features three scenarios based on relevant real-world incidents that Army junior leaders may encounter. Each scenario offers the user a chance to practice intervention when observing behavior that could lead to potential incidents of Sexual Harassment and Sexual Assault. The training experience includes three phases: Up-front Instruction, Practice Environment, and an After Action Review (AAR).

The total training time for ELITE SHARP BRAVE is flexible based on user and training needs. For example, time may vary depending on user experience level, performance, and engagement. The software allows users to review missed concepts based on how well they respond to quiz questions, and offers demonstration support through training vignettes and step-by-step skills comparisons. There is also the option for users to practice all three scenarios, and engage in the AAR after each practice environment.

ELITE SHARP BRAVE offers US Army junior leaders a unique opportunity to recognize potential problem behavior on the continuum of harm related to Sexual Harassment and Sexual Assault, assume responsibility for intervening, and embrace their role as cultural change agents who uphold Army values. Upon completion of the ELITE SHARP BRAVE training, users will be able to demonstrate their understanding of intervention techniques for preventing potential incidents of Sexual Harassment and Sexual Assault.

ELITE SHARP BRAVE is available for download on the MilGaming web portal: https://milgaming.army.mil/

Bringing Musical Stars Back Via Hologram

Special effects are being employed to help bring beloved performers of the past back, almost as good as new. CBS Sunday Morning’s cover story investigates the technology and ethics behind these holograms and speaks with ICT’s Todd Richmond for insight, just in time for the 2018 Grammy Awards.

Watch the full segment on CBS Sunday Morning.

Getting Real on VR’s Consumer Appeal at Sundance

SiliconANGLE attended this year’s Sundance Film Festival and discovered a huge VR presence. Highlighting uses of the technology, R. Danes looks at how VR can be used for artistic expression and for social good.

Continue reading the full article here.

Imaging Applications Use LEDs to Span the Spectrum

As more color cameras add infrared bands to their existing red, green, and blue channels, the machine vision market is seeing a resurgence in multispectral imaging solutions that were once affordable only to government institutions. Both multispectral and hyperspectral imaging applications require measuring the reflected energy from an object within bands of the electromagnetic spectrum. While multispectral vision systems may only sample between three and 10 different bands, such as RGB and near infrared (NIR), hyperspectral systems may sample as many as 200 or more.

Since hyperspectral systems sample frequencies at narrower spectral bands, the images collected contain more data than multispectral images and thus can reveal subtler differences between features of objects. However, larger data sets require specialized image processing and data analysis to turn spectral measurements into usable information.

Visual computing scientist Chloe LeGendre and her colleagues at the USC Institute for Creative Technologies (Playa Vista, California) have shown that, by analyzing 11 Luxeon LEDs from Lumileds (San Jose, California), as few as five LEDs of distinct spectra can be used for color-accurate multispectral lighting reproduction. These include red, green, and blue LEDs with narrow emission spectra, along with white and amber with broader spectra. To date, a number of companies have taken the approach of using multiple LEDs to produce broad-spectrum illumination products.
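
As a rough sketch of the kind of computation such lighting reproduction involves (not the published ICT method, and with invented spectra), one can solve for non-negative LED drive weights whose combined emission best approximates a target illuminant spectrum:

```python
# Hypothetical sketch: choose non-negative drive levels for a few LEDs so their
# combined emission approximates a target illuminant spectrum. All spectra here
# are invented Gaussians; this is not the published lighting-reproduction method.
import numpy as np
from scipy.optimize import nnls

wavelengths = np.linspace(400, 700, 31)  # nm, 10 nm steps

def band(center, width):
    """Simple Gaussian stand-in for an emission spectrum."""
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# Columns: red, green, blue (narrowband) plus white and amber (broader).
led_spectra = np.column_stack([
    band(630, 15), band(530, 20), band(460, 15), band(560, 80), band(590, 40),
])

target = band(550, 70) + 0.3 * band(450, 30)  # made-up target illuminant

# Non-negative least squares keeps the weights physically meaningful
# (LED drive levels cannot be negative).
weights, residual = nnls(led_spectra, target)
print(weights, residual)
```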

Continue reading in Vision Online.

Survivors’ Holograms Interact with Museum Visitors

The Illinois Holocaust Museum and Education Center, located in this suburb about 15 miles north of Chicago, is the first to permanently showcase the New Dimensions in Testimony oral history project, which has created holographic images from extensive interviews of 15 Holocaust survivors shown on rotation. Seven of the survivors are from Chicago.

The images are produced by the University of Southern California’s Institute for Creative Technologies, along with the USC Shoah Foundation — a nonprofit that famed director Steven Spielberg founded in 1994 to preserve Holocaust and other genocide survivor testimonies. The museum’s new $5 million center, titled Take A Stand, opened in October.

Continue reading the full article in The Jewish Star.

Have Holograms Made The Concept Of ‘Live Performance’ Different?

You may have seen images of the late Tupac Shakur from the 2012 Coachella Music Festival, where a hologram of Shakur was projected on stage. A hologram of the late Roy Orbison will reportedly headline an international tour, 30 years after he passed away.

And that kind of technology is advancing — which could impact the music industry and, possibly, how fans view their favorite performers. KJZZ The Show’s Steve Goldstein — and not a holographic image of him — is here with more.

He spoke with Todd Richmond, director of the Mixed Reality Lab at USC’s Institute for Creative Technologies, to learn about holograms. He started by asking him whether holograms have made the concept of “live performance” mean something very different from what we’re used to.

Listen to the full interview on KJZZ.

ARL 27 – Research Assistant, Human Factors in Smart Battlefield Information Systems

Project Name
Human Factors in Smart Battlefield Information Systems

Project Description
As global threats evolve, it is critical that U.S. Army warfighters can access the most relevant information in military databases. Currently, warfighters, their COs, and C2 analysts are flooded with massive amounts of information, making it difficult to quickly form strategic decisions. AI-inspired algorithmic techniques that can quickly and accurately recommend information have matured; however, establishing understanding and trust in complex information systems remains a significant challenge, and under-utilization plagues systems with improperly engineered human-agent interaction (HAI).

ARL’s Battlefield Information Processing Branch is seeking to:
-Quantitatively model individual, situational, and system-design factors that affect usage of complex information systems and agents and their relationship to mission success.
-Explore novel approaches to AI and Machine Learning that are accurate and useful, but also simple and flexible.

Job Description
ARL is interested in integrating machine learning and recommender systems approaches into the personal information systems carried by each warfighter. Summer interns will assist in the development and design of both algorithmic approaches to information filtering and the development of web-based test-beds for HAI. Interns may opt to experiment with the development of novel visualizations/interactions for AI/ML. Additionally, interns will have the opportunity to contribute their ideas to ARL’s ongoing HAI research.
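
As a loose sketch of what an algorithmic approach to information filtering can look like in this context (a hypothetical illustration, not ARL's system), a simple content-based filter might rank candidate items by similarity to a user profile:

```python
# Hypothetical sketch of a simple content-based information filter; the item
# vectors and user profile are invented and do not reflect any ARL system.
import numpy as np

# Rows: candidate reports described by crude topic weights
# (e.g., logistics, terrain, threat, weather).
items = np.array([
    [0.9, 0.1, 0.0, 0.0],
    [0.1, 0.8, 0.1, 0.0],
    [0.0, 0.2, 0.9, 0.1],
])
user_profile = np.array([0.2, 0.1, 0.8, 0.1])  # inferred interest weights

# Cosine similarity between each item and the user profile.
scores = items @ user_profile / (
    np.linalg.norm(items, axis=1) * np.linalg.norm(user_profile)
)
print(np.argsort(scores)[::-1])  # indices ranked most-to-least relevant
```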

Preferred Skills

  • Programming, especially Java and scripting languages (Python, etc.)
  • Web development, e.g. HTML5, CSS, JavaScript
  • Databases, e.g. MySQL
  • Statistical Analysis Packages, e.g. R or SPSS
  • Algorithms, machine learning, or other AI experience

ARL 26 – Research Assistant, Multi-modal State Classification During Tactical Hand Gesturing

Project Name
Multi-modal State Classification During Tactical Hand Gesturing

Project Description
The goal of the project is to understand what information from multiple sensor systems allows for accurate, robust, and fast classification of changes in a user’s state while they perform tactical hand gestures under high cognitive load. Analysis will include extraction of human-interpretable and literature-driven features from time-series data and the application of supervised machine learning techniques to classify state changes.
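
A minimal sketch of the analysis pipeline described above, with randomly generated stand-in data (this is not the project's actual code): window the time series, extract simple interpretable features, and train a supervised classifier.

```python
# Hypothetical sketch with randomly generated stand-in data: window a single
# physiological channel, extract simple interpretable features per window, and
# train a supervised classifier on (invented) state labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
signal = rng.normal(size=10_000)       # stand-in for one EEG/ICG channel
labels = rng.integers(0, 2, size=100)  # stand-in state label per window

def window_features(x, n_windows=100):
    """Split the signal into windows and compute mean, std, and mean |diff|."""
    windows = np.array_split(x, n_windows)
    return np.array([[w.mean(), w.std(), np.abs(np.diff(w)).mean()] for w in windows])

X = window_features(signal)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, labels, cv=5).mean())  # chance-level on random data
```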

Job Description
The summer intern will be tasked with helping run participants through behavioral experiments that include the administration of physiological measurements such as EEG and impedance cardiography. The intern will also be responsible for analyzing and modeling the collected physiological and behavioral data.

Preferred Skills

  • Matlab Programming
  • Machine Learning toolkits
  • Experience in Unity/C#
  • Familiarity with physiological data is a plus
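As a rough, hypothetical sketch of the analysis pipeline described above (in Python rather than MATLAB, purely for brevity), the example below extracts simple summary features from fixed-length windows of a simulated physiological signal and trains a supervised classifier to distinguish two states. The window size, feature choices, simulated data, and use of a random forest are illustrative assumptions, not the project's actual methods.

    # Hypothetical sketch: window-level feature extraction from a physiological
    # signal followed by supervised state classification.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    def window_features(signal, window=256):
        """Split a 1-D signal into windows and compute simple summary features."""
        n_windows = len(signal) // window
        trimmed = signal[: n_windows * window].reshape(n_windows, window)
        return np.column_stack([
            trimmed.mean(axis=1),      # average level per window
            trimmed.std(axis=1),       # variability per window
            np.ptp(trimmed, axis=1),   # peak-to-peak range per window
        ])

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Simulated signal: the second half has higher variance, mimicking a state change.
        baseline = rng.normal(0.0, 1.0, 256 * 40)
        high_load = rng.normal(0.0, 3.0, 256 * 40)
        X = np.vstack([window_features(baseline), window_features(high_load)])
        y = np.array([0] * 40 + [1] * 40)   # 0 = baseline, 1 = high-load state
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())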

Cartoon Clinicians: Precursors of Virtual Therapy?

Researchers have discovered that people are sometimes more willing to share their fears when they are able to open up to a virtual interviewer, rather than to a real person.

Specifically, researchers at the University of Southern California (USC) Institute for Creative Technologies (ICT) found that soldiers suffering from posttraumatic stress disorder (PTSD) were more likely to reveal their problems to an on-screen cartoon character (as opposed to a live human being) as long as they could be sure that information remained anonymous.

Continue reading the full article in Communications of the ACM.

AR Glasses Have Learned from Glassholes

Augmented reality is one of the technologies making significant progress these days, and the public has been privy to a few of the more novel ideas built on recent advances. ICT’s David Nelson and Todd Richmond talk with VentureBeat about AR and its possibilities for the future.

At This Holocaust Museum, You Can Speak with Holograms of Survivors

The Illinois Holocaust Museum and Education Center, located in a suburb about 15 miles north of Chicago, is the first to permanently showcase the New Dimensions in Testimony oral history project, which has created holographic images from extensive interviews with 15 Holocaust survivors. The images of the seven survivors from Chicago are shown on rotation at the center.

The images are produced by the University of Southern California’s Institute for Creative Technologies, along with the USC Shoah Foundation — a nonprofit that famed director Steven Spielberg founded in 1994 to preserve Holocaust and other genocide survivor testimonies.

Continue reading the full story in the Jewish Telegraphic Agency.

Could New Tech Meant for Plastic Surgeons Change CGI in Films Forever?

An innovative new technology is poised to revolutionize filmmaking, and James Cameron isn’t leading the charge? Times sure have changed.

Ovio, a Newport Beach, California-based digital imaging startup, is shopping its new visual effects operation around Hollywood and generating plenty of buzz. What is this new service, you ask? A faster and more efficient 3D facial scan that film studios can use to generate computer images of actors.

Building on reporting from The Hollywood Reporter, the Observer features research from ICT and its Light Stages. Continue reading here.

3D Face Scanning Start Up In Talks With Studios for Film Use

A digital imaging startup is in talks with several Hollywood studios to offer a visual effects solution: faster 3D facial scans of actors. The Hollywood Reporter looks at various technologies similar to the Newport Beach startup’s, featuring a mention of ICT’s Light Stages.

While 3D scanning an actor’s face is common in Hollywood, systems vary in terms of how they work and are used. For instance, USC Institute for Creative Technologies’ Light Stage facial-scanning tech was used in such films as Gravity.

Continue reading the full article here.

Counseling with Artificial Intelligence

Counseling Today investigates how artificial intelligence will play a role in the future of counseling.

Read the full article featuring ICT research here.

A Blended Environment: The Future of AI and Education

Getting Smart explores the role AI will play in education and the five major shifts schools must make (with examples of how AI is already expediting the process) to maintain a blended environment.

Read the full article featuring ICT research here.

IMSH Conference

IMSH Conference
January 13-17, 2018
Los Angeles, CA
Presentations

ARL 25 – Research Assistant, Simultaneous Image Restoration and Recognition for Low Quality Imagery

Project Name
Simultaneous Image Restoration and Recognition for Low Quality Imagery

Project Description
Most currently available visual recognition algorithms assume sufficient resolution of the region of interest. However, imagery obtained from surveillance videos and autonomous systems is often of lower resolution. Further, in most real-world cases, the images may be affected by noise, haze, and blur. In this project, we plan to explore simultaneous image restoration and recognition for these low-quality images through joint optimization of algorithms.

Job Description
The position will involve integrating currently available joint optimization algorithms (object recognition, de-blurring, and de-hazing) and implementing them on a unified platform for analyzing low quality degraded imagery. The project will also involve developing novel algorithms and architectures for processing low quality visual data.

Preferred Skills

  • Computer vision, Deep Learning, Caffe, Tensorflow
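As a hedged illustration of what joint optimization of restoration and recognition might look like in TensorFlow (one of the toolkits listed above), the sketch below shares a convolutional encoder between a restoration head and a classification head and trains both with a weighted sum of losses. The architecture, layer names, loss weights, and input size are assumptions for demonstration, not the project's actual algorithms.

    # Hypothetical sketch: a shared encoder feeding a restoration head (clean
    # image output) and a recognition head (class logits), optimized jointly.
    import tensorflow as tf

    def build_joint_model(input_shape=(64, 64, 3), num_classes=10):
        inputs = tf.keras.Input(shape=input_shape)
        # Shared feature extractor over the degraded input image.
        x = tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
        x = tf.keras.layers.Conv2D(32, 3, padding="same", activation="relu")(x)
        # Restoration head: predict the restored image.
        restored = tf.keras.layers.Conv2D(3, 3, padding="same",
                                          activation="sigmoid", name="restored")(x)
        # Recognition head: classify from the shared features.
        pooled = tf.keras.layers.GlobalAveragePooling2D()(x)
        logits = tf.keras.layers.Dense(num_classes, name="label")(pooled)
        model = tf.keras.Model(inputs, [restored, logits])
        model.compile(
            optimizer="adam",
            loss={
                "restored": "mse",
                "label": tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
            },
            loss_weights={"restored": 1.0, "label": 0.5},  # assumed weighting
        )
        return model

    if __name__ == "__main__":
        build_joint_model().summary()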

Activision’s Virtual Human ‘Emotion Challenge’

Detailed character animation and facial performance are increasingly important for AAA games, in terms of both rendering realism and animation performance. Activision was founded in 1979 as the world’s first major independent game developer; today, Activision Blizzard is one of the largest AAA game publishers in the world. Inside the company, the Activision Central Technology team recently mounted a project to improve its facial performance pipeline.

The CTX team within Activision’s Central Technology division has focused on developing powerful solutions for facial capture, solvers, and performance reproduction, but it needs to do so across a variety of project pipelines, occasionally involving external vendors. The work was seen as important because it is often challenging to maintain visual consistency, even from character to character within a single project. The CTX team identified the need to research a unified, robust, and scalable pipeline for actor likeness acquisition, performance capture, and character animation.

To tackle this problem, they created an internal research project known as the “Emotion Challenge”. Using the latest technology, including their own acquisition of an ICT Light Stage, the team set out to research and document each stage of a facial pipeline and assess it for production viability. They then set about translating these research efforts into real-world techniques that could augment and extend existing approaches across a range of projects.

Continue reading the full article on FXGuide.com.

Seven Corporate Wellness Trends for 2018

The top corporate wellness trends are being driven by technology — some of which now sounds a bit futuristic. But experts say that in 2018 and beyond, you can expect to see employers try new technologies, such as room sensors and chatbots, in an effort to keep employees healthy, engaged, productive and loyal.

Tech Target looks at the seven trends we should expect to see in 2018 and beyond, featuring research from ICT to drive home one of the trends.

Read the full article here.

Magic Leap’s Unveiling Spurred 2018 AR Race

ICT’s David Nelson pens an op-ed for Rolling Stone about why researching immersive human-computer interaction is critical.

Read the full article here.

A Nanjing Massacre Survivor’s Story Lives On Digitally

Engadget writes about how new technology will allow future generations to hear survivor stories in the future, featuring ICT’s collaboration with the USC Shoah Foundation, New Dimensions in Testimony.

Read the full article here.

Is It Time for a Magic Leap Forward?

A few days before Christmas, closely watched and famously tight-lipped “mixed reality” company Magic Leap finally let us backstage a bit, announcing plans to ship a product in 2018.

In the meantime, though, there remains time to ponder all of those reasons—even as Magic Leap’s news is another sign that long-imagined futures are creeping closer each day, advancing what David Nelson of the USC Institute for Creative Technologies told Rolling Stone represented “a new medium of human-computing interaction” that will mean “the death of reality.”

Continue reading the full story in Barron’s.