Immersive AI for Real-World Impact

Published: August 29, 2025
Category: Essays | News
Dhruv Rajvansh

BYLINE: Dhruv Rajvansh, Master’s Student, Computer Science, USC Viterbi School of Engineering, Student Researcher, Virtual Human Therapeutics Lab, ICT

I came to the University of Southern California with a clear goal: to work where immersive technology and artificial intelligence meet, and to contribute to systems that carry real-world consequences. I am pursuing my Master’s in Computer Science at USC Viterbi School of Engineering, with a concentration in emerging technologies, but for me that title has always been shorthand. What really drives me is building technologies that reshape how people learn, train, and heal.

I had already set my sights on ICT and was determined to join their summer internship program. I kept sending cold emails to various lab directors, expressing my genuine interest in their cutting-edge research. Fortunately, Sharon Mozgai from the Virtual Human Therapeutics Lab responded positively to my outreach. We had an excellent interview where I could share my passion for immersive technologies and their therapeutic applications. I was thrilled when she offered me the opportunity to join ICT as a research volunteer, even though it was unpaid—the chance to work on groundbreaking XR and AI projects was invaluable.

Research Projects

At ICT this summer, I worked on two major projects that shaped my experience:

The first was Mission Visualization Development, where I was assigned to architect and deploy an interactive military planning tool in Unity. Working from existing war plans, such as the OV1 diagrams soldiers use for attack coordination, I developed enhanced visual solutions that conveyed comprehensive operational detail through immersive simulation. By researching realistic terrain systems, including solutions like Cesium with real heightmaps and terrain data, I created authentic geographical representations of actual locations that could accommodate various forms of elevation data. The implementation included detailed UI elements and unit specifications, and the entire system was optimized for WebGL deployment to ensure smooth performance and broad accessibility. This project taught me the critical balance between visual fidelity and technical optimization in mission-critical applications.
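To give a flavor of the elevation-data handling involved, here is a minimal sketch of bilinear sampling from a heightmap grid, the kind of step a terrain pipeline like the one described might perform. The names and layout are my own illustration, not ICT's actual code.

```csharp
using System;

// Illustrative heightmap sampler: heights[row, col] holds elevation samples,
// and (u, v) are normalized coordinates in [0, 1] across the grid.
static class Heightmap
{
    public static float Sample(float[,] heights, float u, float v)
    {
        int rows = heights.GetLength(0), cols = heights.GetLength(1);
        float x = u * (cols - 1), y = v * (rows - 1);
        int x0 = (int)x, y0 = (int)y;
        int x1 = Math.Min(x0 + 1, cols - 1), y1 = Math.Min(y0 + 1, rows - 1);
        float fx = x - x0, fy = y - y0;
        // Interpolate along x on the two bracketing rows, then blend along y.
        float top = heights[y0, x0] * (1 - fx) + heights[y0, x1] * fx;
        float bot = heights[y1, x0] * (1 - fx) + heights[y1, x1] * fx;
        return top * (1 - fy) + bot * fy;
    }
}
```

Smooth interpolation like this is what lets a terrain system accept heightmaps of different resolutions while still producing continuous ground geometry.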

The second project involved modernizing the existing VITA (Virtual Interactive Training Agent) system for Meta Quest 3 deployment. This meant porting the legacy desktop VR platform to Meta's standalone headset while leveraging the All-in-One SDK's passthrough capabilities for mixed-reality integration. I implemented room-scale locomotion, spatial awareness, and adaptive animation controllers so virtual humans could exist seamlessly within users' physical environments. Rather than building speech recognition from scratch, I studied and integrated the existing AI conversational systems. My key contribution was adding three distinct behavioral dispositions to the virtual human characters: neutral, angry, and soft. Varying the interaction style by training scenario and user needs made the scenarios more realistic, which is particularly valuable for job interview practice and social-skills development, since users can face different interviewer personalities and challenging interaction styles in a controlled environment.
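As a rough sketch of what a disposition system might look like, the snippet below maps each of the three dispositions to a small profile of behavior parameters. The enum matches the dispositions named above, but the profile fields and values are hypothetical illustrations, not VITA's actual implementation.

```csharp
using System;
using System.Collections.Generic;

// The three interviewer dispositions described above.
enum Disposition { Neutral, Angry, Soft }

// Hypothetical parameters a disposition could drive: gesture intensity,
// speech pitch scaling, and how often the agent interrupts the user.
record DispositionProfile(float GestureIntensity, float SpeechPitch, float InterruptChance);

static class DispositionTable
{
    static readonly Dictionary<Disposition, DispositionProfile> Profiles = new()
    {
        { Disposition.Neutral, new(0.5f, 1.0f, 0.05f) },
        { Disposition.Angry,   new(0.9f, 0.9f, 0.30f) },
        { Disposition.Soft,    new(0.3f, 1.1f, 0.00f) },
    };

    // Look up the behavior profile for the scenario's chosen disposition.
    public static DispositionProfile For(Disposition d) => Profiles[d];
}
```

Centralizing disposition data in one table like this makes it easy to add a fourth style later without touching the animation or dialogue code that consumes the profile.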

These projects demanded every skill in my toolkit. My background in C#, C++, Unity, and VR/AR came into play, as did my experience in AI programming, rendering engines, and performance optimization. Yet technical ability alone would not have been enough. Working under Director Sharon Mozgai and supervisor Arno Hartholt, I learned that real progress comes from cross-disciplinary collaboration. Psychologists, designers, and engineers all brought perspectives that changed the way I approached problems. I began to see myself not just as a developer, but as a contributor to a larger conversation about how humans and machines learn from one another.

One highlight for me was seeing theory dissolve into practice. When a system I had written responded naturally to a user’s behavior, or when a visualization clarified a complex tactical scenario, the impact was tangible. These were not abstract achievements. They were glimpses of how immersive technology can alter the way people train for high-stakes environments, or how therapy can be delivered more accessibly and effectively.

Tech Duality

Outside the lab, I continue to see myself equally as a software engineer and a game developer. That dual identity has always been important. I love the rigor of scalable systems, the satisfaction of optimized performance, and the craft of elegant code. At the same time, I am drawn to the creative side of interactive entertainment—the design of experiences that engage, surprise, and resonate. What my volunteer work at ICT confirmed is that these two instincts do not have to live separately. The same mechanics that make a game compelling—responsiveness, immersion, and feedback—can make training more effective and therapy more empowering.

I often think about what success looks like. For me, it is not the perfection of an algorithm or the sophistication of a system for its own sake. It is the effect on the end user. If a virtual human helps a teenager with autism practice social interaction in a safe setting, that is success. If a military trainee can rehearse decision-making in a secure, immersive environment that mirrors reality, that is success. Those outcomes—not the elegance of the underlying code—are what I aspire to deliver.

Future Plans

Looking forward, my ambition is to lead in XR and AI innovation. I want to continue building systems that expand how people learn, train, and heal. That could mean therapeutic VR platforms, conversational AI for education, or advanced mission simulations. What matters most is that the technology I build endures and integrates into people’s lives in ways that matter.

Beyond the work, what I will remember most is the atmosphere of shared purpose. Being part of a lab where everyone—from directors to fellow volunteers—was driven by curiosity and mission was deeply inspiring. It expanded my imagination of what is possible. It also reaffirmed that I am on the right path: not simply learning technologies, but shaping them to have impact.

When I return to my courses, I do so with a new sense of focus. A line of code now looks less like an exercise and more like the beginning of an interaction that could matter in someone’s life. That awareness, formed in the Virtual Human Therapeutics Lab, is what I will carry forward.

In the end, my time at ICT was not a side note to my education; it was a defining experience. It confirmed that immersive technologies, when paired with AI and human-centered design, can transform how we learn and heal. It showed me the future I want to help build. And it reminded me why I chose this field in the first place: not just to code, but to create systems that change lives.
