A University in Every Garage

Published: April 3, 2024
Category: Essays | News

by David Nelson, Director, Mixed Reality Research and Development

David Nelson, Director of ICT’s Mixed Reality Lab (MxR), focuses on Adaptive User Interfaces, UX Design, AR Visualizations, and innovative training for the Army’s SHARP Academy and others. His career includes directing and producing for top studios and securing a Hugo Award for Television Documentary. This is his backstory on joining ICT; how the MxR Lab pioneered bringing VR to the masses; and what the Lab is up to now.

In 2010, I was between movie projects when a friend invited me to join them at a talk on “immersive experiences” by Professor Mark Bolas at USC’s School of Cinematic Arts. In the end, they couldn’t go, but, intrigued by the topic, I decided to attend anyway – a choice that led to a whole new chapter in my life and career.

Professor Mark Bolas, an energetic man with a kind but serious face under a tousle of salt-and-pepper hair, spoke about human perception, “joyful interfaces”, and the nature of reality. He foretold a near future where we would spend significantly more time in the virtual worlds we were just starting to visit through our phones. He divulged that the way we’d travel to these worlds was through devices we’d wear on our heads for most of the day.

I stuck around after the lecture to meet this curious man and we hit it off instantly, talking about everything from Plato and the Kabbalah to stand-up comedy and classic gangster movies. He invited me to visit his “virtual reality lab” in Playa Vista. I agreed – though I was completely unsure what a “virtual reality lab” was.

DOWN THE (VR) RABBIT HOLE

A week later I found myself in what seemed to be a clandestine warehouse on a cul-de-sac on the outskirts of Playa Vista. There were some visitors from Walt Disney Imagineering trying out a prototype, a virtual reconstruction of Disneyland itself.

Mark asked me if I wanted to “go in” and a student brought me a device that looked like something out of a steampunk sci-fi movie. This was the Wide5 head-mounted display (HMD), a groundbreaking piece of technology that Mark built, along with his partner Ian McDowell, at their company, Fakespace Labs, with funding from the Office of Naval Research.

With the straps tightened and the cable connected to the workstation nearby, Mark asked me if I was ready. I replied, “It’s the happiest place on Earth, right?”  

With that, the light in front of my eyes flickered and I found myself standing inside the entrance to Disneyland. I turned around and saw the gate behind me; I walked forward and the world around me responded as if I were moving through the park. The graphics were somewhat simple – accurate geometry without high-fidelity textures – but still, I was there. This, I learned, is “presence.”

This is when things got interesting. Mark asked me if I wanted to visit another area of the park. I said yes and he told me to “hold on.” Suddenly I started to grow, like Alice in Wonderland, larger and larger until I was a giant looking down on the park from a bird’s-eye view. I took a step forward and must have traveled a hundred acres. Then, as if I had taken a bite from the other side of the mushroom, I shrank down to my normal size and found myself standing on a street in Frontierland, on the opposite side of the park.

That was when I realized this was a new medium, one with affordances unlike those of anything that came before it. I had to be part of this. So, at the end of the visit I asked Mark what I needed to do to join his lab and start working on projects like this. He said he didn’t have a project manager and needed one.

I was concerned that he might see me as “only a creative,” so I assured him that I wasn’t solely a director; I produced films as well. I knew how to manage people, budgets and schedules, and could see projects through to completion. He appreciated that, but also informed me that ICT places a high value on creativity and storytelling – that this had been part of the institute’s DNA since its inception.

INSIDE THE LAB

Within a month I’d started working in the lab, managing projects, sitting in on sponsor meetings, hiring new staff, and brainstorming ideas and concepts. Mark was, and still is, a mentor, a shaman and a friend. He is a true iconoclast, an original thinker who was never afraid to tell a sponsor that what they thought they wanted wasn’t necessarily what they needed. He managed to do this in a way that didn’t upset them, but rather engaged and inspired them. He is an idea factory. I lined a room at the lab from floor to ceiling with whiteboard panels to keep him from writing on the walls. This was a creatively fertile period. We renamed our team “MxR” – the Mixed Reality Lab – to embrace the spectrum of immersive technologies we set out to explore.

At that time, we had two main research thrusts: “Sharing Space” and “Stretching Space”. 

Dr. David Krum, a computer scientist and researcher at the lab, co-led the Sharing Space effort, focused on interactions with virtual humans – primarily audio cues and proxemics and their impact on immersion and presence. Dr. Evan Suma, initially a postdoc, then Assistant Professor, conducted groundbreaking work on natural locomotion in virtual environments under the “Stretching Space” effort, culminating in the release of the open-source “Redirected Walking Toolkit.” Soon, I began leading an effort called “Emerging Concepts,” exploring the language of storytelling within the burgeoning medium of virtual reality, collaborating with students from USC’s School of Cinematic Arts and the Iovine and Young Academy. Most of this early research was conducted using the Wide5 HMD.

The Wide5 was a remarkable HMD, built between 2004 and 2007. It used small LCD panels four years before the iPhone; it was a precursor to foveated rendering, creating an extra, lower-resolution image for the periphery; and it provided better images by allowing lens aberrations and correcting the resulting distortion – not through mathematical models, but by trusting what felt right. Its wide field of view (FOV), as the name suggests, is still unparalleled: over 150 degrees horizontal and about 130 degrees vertical.

In contrast, nearly twenty years later the majority of HMDs have an FOV of no more than 100 degrees. This innovative hardware, working in cooperation with a PhaseSpace optical motion-tracking system, enabled the Mixed Reality Lab to achieve some semblance of the canonical “holodeck” – the ability to move naturally throughout a large immersive volume and be transported anywhere you could imagine.

One caveat was that the HMD cost close to $50k to build, and the tracking system cost twice that.

A NEW MISSION

When Dr. Jim Blake, the US Army’s Program Executive Officer for Simulation, Training, and Instrumentation, visited the Mixed Reality Lab in early 2011, he recognized these capabilities as “the holy grail” of VR, but warned us that unless we could figure out how to exponentially reduce the cost, immersive technology would die on the vine. He tasked Mark and the lab with the mission of “disrupting the supply chain.”

Shortly thereafter, Mark stayed late at the Lab one night, working in the small machine shop. When we arrived in the morning, he presented two iPhones mounted to a wooden board, with a pair of LEEP optics lenses – similar to those found in the Wide5 – fixed over both phones. The dev team worked with Perry Hoberman, a professor at USC’s School of Cinematic Arts who specialized in stereoscopy, and quickly deployed a Unity game engine environment synced across both phones. The scene was ‘tracked’ using the accelerometer in the phones, so when you peered through the lenses you felt like you were inside the environment, looking around.

This was compelling, and the work was presented as a poster at IEEE VR in 2011, laying the groundwork for low-cost VR. The inherent problem was that the lenses used in this ‘prototype’ were custom-made – not inexpensive, and not feasible at scale. The smartphones, however, were key: they were ubiquitous and only getting better, with faster processors and higher screen resolutions. This experimental “Smartphone HMD” design highlighted the potential of a single-panel display with built-in tracking, and hinted at a future of mobile VR.
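The core geometric trick behind any single-panel stereo viewer is simple: render the scene twice, once per eye, with the two virtual cameras offset by the interpupillary distance along the head’s local axis, the head orientation coming from the phone’s motion sensors. The sketch below is illustrative only – the function names, angles, and IPD value are my assumptions, not the lab’s actual code:

```python
import numpy as np

IPD = 0.064  # interpupillary distance in meters (a typical adult average; an assumption)

def head_orientation(yaw, pitch):
    """Build a head-orientation rotation matrix from yaw/pitch angles
    (radians). A phone would derive these from its accelerometer/gyro."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    yaw_m = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    pitch_m = np.array([[1.0, 0.0, 0.0], [0.0, cp, -sp], [0.0, sp, cp]])
    return yaw_m @ pitch_m

def eye_positions(head_pos, orientation):
    """Offset each eye camera half the IPD along the head's local x-axis.
    Each eye's view is then rendered to its half of the single panel."""
    right_axis = orientation @ np.array([1.0, 0.0, 0.0])
    left = head_pos - right_axis * (IPD / 2.0)
    right = head_pos + right_axis * (IPD / 2.0)
    return left, right

# Head at standing height, looking straight ahead:
l_eye, r_eye = eye_positions(np.array([0.0, 1.6, 0.0]), head_orientation(0.0, 0.0))
# The two camera positions are separated by exactly the IPD.
```

In a real engine like Unity, each eye camera would draw into a viewport covering half of the phone’s screen – the “two images on one screen” idea described above.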

OPPORTUNITY MEETS PREPARATION

Right around this time we hired a young student called Palmer Luckey from Long Beach as a temp employee to serve as a Lab Assistant. Mark was impressed by this young VR enthusiast’s knowledge of the history of VR hardware. Showing him around the lab, Mark opened up a plastic tub and was surprised when the young man accurately identified some ancient VR artifacts that were made when he was likely only a toddler. 

Anyone who visited the lab at 5318 McConnell will remember the wall-to-wall shelving units, stacked high with endless plastic storage containers of items collected by Mark over his storied career; a sight somewhat reminiscent of the final scene in Indiana Jones. I wouldn’t be surprised if we threw away the actual Ark of the Covenant in one of our many ‘purges’.

Mark thought that this new Lab Assistant would be a great candidate to go through the many tubs and sort out useful hardware components from junk. Lucky him. 

FOV2GO – IEEE VR 2012

Mark told us he wanted to “give VR away for free” at the following IEEE VR conference – an outrageous goal, considering the price point of HMDs back then. The idea was that if we could put the virtual content on a single phone, two images on one screen, we could put the phone in a small foam-core housing and people could have a VR experience using only the phones in their pockets!

The team got started on these hardware and software designs. In our ongoing quest toward low-cost VR we found some very inexpensive plastic lenses we could use in our foam-core viewers. Although they would not provide the minimum field of view that our research showed was required for a truly immersive experience, the fact that we could buy them in bulk for practically pennies would enable us to fulfill Mark’s idea of truly low-cost VR.

We spent hours cutting foam-core templates, gluing in lenses, and neatly packing them up. We arrived at IEEE VR 2012 with a stack of manila envelopes, each containing an “FOV2GO” viewer. They were handed out on the opening day to attendees, who snapped them together, downloaded our simple Unity3D library onto their phones, placed the phones in the viewers, and were able to have a true VR experience.

ICT’s MxR Lab won the IEEE VR Best Demo Award that year and, as a result, introduced mobile VR to research labs worldwide, laying the groundwork for our suite of open source VR designs to come.  

IT’S ALIVE! FRANKEN-VIEWERS AND FIELD OF VIEW SWEET SPOT

We had already been putting together what we called ‘Franken-viewers’ – pairing wide-field-of-view lenses with LCD displays, along with PhaseSpace optical trackers. MxR team members Thai Phan and Evan Suma coded up optical remapping software, while Brad Newman and Thai Phan helped integrate it with the lab’s Unity-based toolkit.

These makeshift HMDs (the “Scuba HMD” and the “Antler HMD”) were funded by and used for research projects at ICT, including Skip Rizzo’s MedVR projects, and on student-developed experiences like Nonny de la Peña’s “Hunger,” Juli Griffo’s “Shayd,” and the early games of students Nate Burba and James Iliff, who went on to found the VR game company Survios.

We were still using high-end lenses, but these prototype designs proved that a PC-driven, wide-field-of-view HMD could be created at consumer price points.

Searching high and low, Mark discovered an inexpensive 7X aspheric lens and used an old trick he learned at Fakespace Labs: repurposing aspheric magnifiers placed backwards, coupled with remapping software. This provided the minimum 90-degree field of view needed to create a truly immersive experience. After many prototypes, we created a wider-field-of-view foam-core viewer, the FOV2GO-D, and the 3D-printed Socket Viewer – designs that changed VR forever.

We released the open source hardware and software designs at the Maker Faire in 2012.

A SWIFT KICKSTARTER 

In June 2012, our lab assistant, Palmer Luckey, left ICT. He launched a Kickstarter campaign to build his own low-cost, high-end VR display, looking to turn his passion for bringing VR to the masses into reality. Less than two years later his company was acquired by Facebook for $2 billion.

We felt that we had accomplished the mission set forth by Dr. Blake to disrupt the supply chain of VR hardware. The new wave of VR was now underway.  

Inevitably, there was press coverage that spoke of a kid from Long Beach who invented VR in his parents’ garage. It was a good story, and as Mark Twain is said to have put it, “Never let the truth get in the way of a good story.”

I don’t reference this quote in a cynical way, but rather with a nod to what it really took to get the research and development out of the laboratory and into people’s homes. In my opinion, that consisted of the passion and bravado of youth, the energy and conviction of an evangelist, a good story, the capital to follow, and perhaps less glamorously, the decades of research, development and knowledge preceding it.  

There are numerous examples of the pivotal role universities play in fostering innovation and technological advancements. The research and development conducted in academic settings often serve as the bedrock for technologies that transform industries and society at large. 

We often don’t hear the backstories: Stanford University’s involvement in graphical user interfaces and voice-activated assistants, MIT’s work on global positioning systems, G. Samuel Hurst’s resistive touchscreens at the University of Kentucky, Douglas Engelbart building the first “mouse” at the Stanford Research Institute, and countless others.

I believe USC has its rightful place in the history of virtual reality, and the Mixed Reality Lab its role in the proliferation of low-cost head-mounted displays. The reality is, great things can happen in garages – especially when they have a connection to pioneering universities.

THE FUTURE IS HERE, NOW WHAT? MxR 2024 AND BEYOND

So where are we now, in 2024? 

It’s been almost 15 years since I joined ICT, and I am now the Director of the MxR Lab. Our projects leverage my background in user experience design, content creation and the power of storytelling, and are supported by a talented, interdisciplinary team* with expertise in Computer Science, Human Factors, Systems Engineering, Screenwriting and Military Domain knowledge.

Our basic and applied research evaluates use cases that exploit immersive technologies; future user interfaces; and narrative-scenario-based learning, seeking to inform requirements that will improve learning, training and operations for sponsors including the US Army, US Navy, Military Sealift Command, the Army SHARP Academy and ARL-West.

We are currently designing a physics-based, mixed-reality nearshore Ship Simulator in collaboration with the Engineer Research and Development Center’s (ERDC) Coastal and Hydraulics Laboratory, experimenting with the commercial off-the-shelf head-mounted displays now available to us (Varjo XR-3, Quest Pro). We are also using the Apple Vision Pro to prototype augmented reality interactions for training and education. Six degrees of freedom and full-body motion tracking are available now at a fraction of what they cost when I started. We are experimenting with generative AI scenes and scenarios; advances in this technology are happening daily.

The future is here, and it is most certainly mixed.

*MxR Lab: David Cobbins, Creative Producer; Dr. Deniz Marti, Postdoctoral Research Associate; Rhys Yahata, Lead Software Developer; Allison Aptaker, Project/Lab Manager; Spencer Lin, Master’s Student, Viterbi School of Engineering.

BIO

David Nelson, Director of ICT’s Mixed Reality Lab (MxR), is engaged in research efforts on Adaptive User Interfaces, User Experience Design, AR Visualizations, Imagineering the Naval workplace and battlespace of the future, and innovative work training Sexual Assault Response Coordinators for the Army SHARP Academy. He is an award-winning Director and Producer, who has worked for 20th Century Fox, Disney, HBO and Sony Pictures Entertainment, winning a Hugo Award for Excellence in Television Documentary for “Positively Naked” (2005).