Mark Harmon is a Software Developer at the UBC Emerging Media Lab, University of British Columbia, and is spending the summer at ICT as part of our Volunteer program, under the supervision of Dr. Gale Lucas, Director of the Technology Evaluation Lab, ICT. His research project EARSIM, a sound localization training application, has just been accepted to SIGGRAPH 2025. In this essay he explains how the research came about and what he’s working on with Dr. Lucas.
BYLINE: Mark Harmon, Software Developer, UBC Emerging Media Lab, University of British Columbia; ICT Researcher (volunteer), Technology Evaluation Lab, ICT
I’m Mark Harmon, a Software Developer at the UBC Emerging Media Lab, University of British Columbia, and I’m spending this summer as a volunteer in ICT’s Technology Evaluation Lab under the supervision of its director, Dr. Gale Lucas. I’m thrilled to share that my research project EARSIM, a sound localization training application, has just been accepted to SIGGRAPH 2025. In this essay, I want to explain how my previous research led to this achievement and describe the exciting work I’m now contributing to in Dr. Lucas’s lab.
Contributing to Cutting-Edge HCI Research at ICT
This summer, I have the privilege of volunteering in the ICT Technology Evaluation Lab, which conducts groundbreaking research in Human-Computer Interaction (HCI), Human-Robot Interaction (HRI), and Human-Building Interaction (HBI). The lab’s mission is to optimize interactions and outcomes between humans and socially intelligent technologies, including virtual agents, social robots, and other social interfaces in built environments.
Under Dr. Lucas’s guidance, I’m witnessing firsthand how the lab designs virtual and embodied technologies to interact with human users, conducting rigorous user studies with a particular focus on building relationships, rapport, and trust with machines. The research extends to systems that can persuade, negotiate, and exert social influence on humans, including fostering behavior change. On the HBI side, the lab focuses on technology for built environments, especially the interaction between smart buildings and their occupants.
What strikes me most about working here is the lab’s commitment to methodological rigor. They don’t just develop innovative prototypes and applications; they conduct user studies with exceptional attention to proper research design and analysis. Through their published studies and leadership in the field, lab members establish best practices for user study design and analysis, something that has already influenced my own research approach.
My Summer Project: Advancing AI-Driven Game Interactions
This summer, I’m contributing my technical expertise to Dr. Lucas’s Sony project, which incorporates Large Language Models (LLMs) into NPC dialogue interactions in video games. Building on previous work by Dr. Lucas and her PhD student Brian Deuksin Kwon, who developed AI agents to test LLM persuasiveness through NPC characters from Sony’s game “Days Gone,” the project is now expanding into more sophisticated territory.
The original project featured a chatbot interaction in which subjects conversed with two AI agents and decided which character to give an item to as part of a side quest. This summer’s expansion incorporates Construal Level Theory into the AI agents’ persuasive tactics. This psychological theory describes how individuals perceive and understand their world in terms of relevance and abstraction: some people favor concrete, step-by-step thinking, while others prefer big-picture, abstract perspectives. While everyone operates at different construal levels depending on context, many have a chronic construal level that serves as their default mode of perception.
The summer’s goal is ambitious: testing whether AI can effectively adapt to a player’s construal level to increase persuasiveness. My role focuses on the technical implementation rather than the psychological theory; I’m building an in-game proof of concept that lets users experience these dialogue interactions in a fully realized 3D environment. Using the Unreal Engine 5 expertise I developed through my work on EARSIM at the Emerging Media Lab, I’m creating a level that matches the story context, with 3D character models that players can approach and interact with naturally. While not a complete game, this prototype will provide the look, feel, and mechanics of a real gaming experience, offering users a more immersive and realistic interaction.
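To give a flavor of the mechanics, here is a minimal sketch, in Python rather than the project’s actual Unreal Engine code, of how an NPC dialogue turn might be conditioned on a player’s construal level. Everything here is illustrative: `call_llm` is a placeholder for whatever chat-completion backend the project uses, and the function names and style prompts are my own assumptions, not the real system.

```python
# Hypothetical sketch: one LLM-driven NPC dialogue turn whose persuasive
# style is matched to the player's (estimated) construal level.
from typing import Dict, List

def call_llm(messages: List[Dict[str, str]]) -> str:
    """Placeholder for whatever chat-completion backend the project uses."""
    raise NotImplementedError

# Illustrative prompts, not the study's actual manipulations.
CONSTRUAL_STYLE = {
    "low": ("Persuade with concrete, step-by-step details: specific actions, "
            "immediate consequences, and 'how' framing."),
    "high": ("Persuade with abstract, big-picture appeals: values, long-term "
             "goals, and 'why' framing."),
}

def npc_reply(npc_persona: str, construal_level: str,
              history: List[Dict[str, str]], player_utterance: str) -> str:
    """One dialogue turn for an NPC in a side quest, persuading at the
    player's construal level ("low" or "high")."""
    system_prompt = (f"You are {npc_persona}, an NPC in a narrative side "
                     "quest. " + CONSTRUAL_STYLE[construal_level])
    messages = ([{"role": "system", "content": system_prompt}]
                + history
                + [{"role": "user", "content": player_utterance}])
    reply = call_llm(messages)
    # Keep the running transcript so later turns stay in character.
    history.append({"role": "user", "content": player_utterance})
    history.append({"role": "assistant", "content": reply})
    return reply
```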
SIGGRAPH 2025 Recognition: My EARSIM Research Achievement
I’m incredibly excited that my paper, “Evaluating the Effectiveness of Configurable Virtual Reality System for Multi-sensory Spatial Audio Training” (co-authored with Sinnie Choi, Eric Tang, Delsther James Edralin, and Julien Roy), has been accepted to SIGGRAPH 2025. This research addresses a critical challenge: people with hearing deficits struggle with sound localization despite using cochlear implants or hearing aids.
The Problem I Set Out to Solve
EARSIM emerged from a recognition of the gap between laboratory training and practical application. Traditional training methods rely on controlled environments like anechoic chambers, but these don’t reflect the complexity of real-world listening situations. At the UBC Emerging Media Lab, I observed how virtual reality could create controlled yet complex environments that better simulate real-world conditions, moving beyond the limitations of existing clinical settings.
My team designed the system’s architecture to center on configurability. Built in Unreal Engine 5.4, EARSIM adapts to individual hearing profiles and performance levels through dynamic multi-sensory cue combinations. Unlike existing rehabilitation protocols that follow fixed sequences and vary only single parameters, our system manipulates multiple environmental factors simultaneously, creating training scenarios precisely calibrated to individual needs.
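As a rough illustration of what configurability means in practice, the sketch below shows the kind of multi-parameter scenario description and performance-driven adaptation this design implies. It is hypothetical Python, not the actual Unreal Engine 5.4 implementation; every field name, threshold, and step size is an assumption made for illustration.

```python
# Hypothetical sketch of a per-session EARSIM-style configuration: several
# environmental cue parameters varied together, seeded from an individual
# hearing profile and adapted to recent performance.
from dataclasses import dataclass

@dataclass
class CueConfig:
    environment: str          # e.g. "Snowy" or "Forest"
    visual_cues: bool         # show visual hints at the sound source
    ambient_noise_db: float   # level of competing background sound
    reverb_strength: float    # 0.0 (dry) to 1.0 (highly reverberant)
    source_distance_m: float  # distance from listener to sound source

@dataclass
class HearingProfile:
    left_threshold_db: float  # illustrative per-ear hearing thresholds
    right_threshold_db: float

def initial_config(profile: HearingProfile) -> CueConfig:
    """Seed a session from the individual's profile: less competing noise
    for higher (worse) thresholds. Purely illustrative calibration."""
    worse_ear = max(profile.left_threshold_db, profile.right_threshold_db)
    return CueConfig("Forest", True, max(0.0, 30.0 - worse_ear), 0.2, 2.0)

def next_config(accuracy: float, current: CueConfig) -> CueConfig:
    """Adapt several cue parameters at once: raise complexity when the
    trainee is accurate, ease off when they struggle."""
    step = 3.0 if accuracy > 0.8 else (-3.0 if accuracy < 0.5 else 0.0)
    return CueConfig(
        environment=current.environment,
        visual_cues=accuracy < 0.5,  # restore visual support when struggling
        ambient_noise_db=max(0.0, current.ambient_noise_db + step),
        reverb_strength=min(1.0, max(0.0,
                                     current.reverb_strength + step / 30.0)),
        source_distance_m=current.source_distance_m,
    )
```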
Rigorous Evaluation and Promising Results
My collaboration with Dr. Gale Lucas at ICT’s Technology Evaluation Lab has provided the methodological framework necessary to validate these concepts empirically. Dr. Lucas’s expertise in evaluation methodologies has shaped my understanding of the complex task of measuring therapeutic efficacy in virtual environments, demonstrating how technical innovation requires rigorous evaluation to translate into meaningful clinical applications.
My pilot study involved twenty-one participants completing three sessions, with sensory cues added progressively so that each session was more difficult than the last. Median localization accuracy decreased as cue complexity increased, confirming that the dynamic system can modulate task difficulty predictably. A Friedman test revealed significant session effects, particularly between the Snowy and Forest environments, indicating that environmental context influences spatial audio processing in measurable ways.
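For readers who want to see the shape of this kind of repeated-measures analysis, a minimal Python sketch follows. The data are fabricated placeholders rather than the study’s results, and the Bonferroni-style correction is shown only as an example; the paper documents the actual procedure.

```python
# Sketch of a Friedman test across three repeated sessions per participant,
# with pairwise Wilcoxon post-hoc tests. The data below are FABRICATED
# placeholders, not the pilot study's measurements.
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

rng = np.random.default_rng(0)
# Localization error (degrees) for 21 participants x 3 sessions.
session1 = rng.normal(12, 3, 21)
session2 = rng.normal(15, 3, 21)
session3 = rng.normal(19, 3, 21)

stat, p = friedmanchisquare(session1, session2, session3)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")

# Post-hoc pairwise comparisons; Bonferroni correction for 3 comparisons.
pairs = [("S1 vs S2", session1, session2),
         ("S1 vs S3", session1, session3),
         ("S2 vs S3", session2, session3)]
for name, a, b in pairs:
    w, pw = wilcoxon(a, b)
    print(f"{name}: W = {w:.1f}, corrected p = {min(1.0, pw * 3):.4f}")
```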
These findings validate EARSIM as a configurable platform, though I recognize the work represents early-stage research. The sample size limits generalizability, and I tested only participants with normal hearing. While statistical significance was present after correction, the pilot study’s scope requires cautious interpretation.
SIGGRAPH as a Platform for Impact
Being accepted to SIGGRAPH provides me with a prestigious platform to present this work to the computer graphics and interactive techniques community. The conference’s focus on technical innovation and practical application aligns perfectly with EARSIM’s development trajectory, and I’m looking forward to facilitating discussions with researchers working on related problems in immersive technologies and human-computer interaction.
Future Directions and Broader Impact
My research agenda extends in several promising directions. I plan to isolate individual cue parameters to map their specific contributions to localization difficulty, while longitudinal studies will track performance changes over extended training periods. Most importantly, I want to test the system with hearing-impaired populations to establish clinical efficacy compared to existing rehabilitation methods.
This work contributes to broader developments in medical gaming and therapeutic technologies. As VR hardware becomes more accessible, opportunities expand for deploying evidence-based interventions outside traditional clinical settings. I believe EARSIM represents an innovative approach to making specialized rehabilitation services more widely available.
My research benefits from a collaborative structure spanning multiple institutions: UBC’s Emerging Media Lab, the School of Audiology & Speech Sciences, BC Children’s Hospital Research Institute, and ICT. Each partner contributes specific expertise, from clinical insights to technical implementation, addressing the interdisciplinary nature of developing effective therapeutic technologies.
My SIGGRAPH acceptance confirms that virtual reality applications can serve purposes beyond entertainment or visualization, demonstrating how immersive technologies can address specific clinical challenges through systematic manipulation of environmental parameters. I hope my work contributes valuable insights into how configurable virtual environments can support personalized therapeutic interventions.
As I continue my volunteer work with Dr. Lucas this summer while preparing for my SIGGRAPH presentation, I’m grateful for the opportunity to contribute to ICT’s innovative research while advancing my own understanding of rigorous evaluation methodologies. This experience has already enhanced my research approach and opened new possibilities for future collaborations between academic institutions and research laboratories like ICT.