When We 3D Scanned President Obama

Published: June 26, 2024
Category: Essays | News
Paul Debevec and ICT VGL crew 3D scanning President Obama at the White House

By Dr. Paul Debevec, Chief Research Officer, Eyeline Studios – powered by Netflix; Research Professor, USC Viterbi School of Engineering; Associate Director of Graphics Research, ICT 

Dr. Paul Debevec is a research professor at the University of Southern California and the associate director of graphics research at USC’s Institute for Creative Technologies. Debevec’s Ph.D. thesis (UC Berkeley, 1996) presented Façade, an image-based modeling and rendering system for creating photoreal architectural models from photographs. Using Façade he led the creation of virtual cinematography of the Berkeley campus for his 1997 film The Campanile Movie, whose techniques were used to create virtual backgrounds in The Matrix. Subsequently, Debevec pioneered high dynamic range image-based lighting techniques in his films Rendering with Natural Light (1998), Fiat Lux (1999), and The Parthenon (2004); he also led the design of HDR Shop, the first high dynamic range image editing program. At USC ICT, Debevec has led the development of a series of Light Stage devices for capturing and simulating how objects and people reflect light, used to create photoreal digital actors in films such as Spider-Man 2, Superman Returns, The Curious Case of Benjamin Button, and Avatar, as well as 3D display devices for telepresence and teleconferencing. He received ACM SIGGRAPH’s first Significant New Researcher Award in 2001, co-authored the 2005 book High Dynamic Range Imaging from Morgan Kaufmann, and chaired the SIGGRAPH 2007 Computer Animation Festival. He serves as Vice President of ACM SIGGRAPH and is a member of the Visual Effects Society, the Academy of Motion Picture Arts and Sciences, and the Academy’s Science and Technology Council.

On June 9th, 2014, I arrived at the State Dining Room of The White House with Paul Graham, Xueming Yu, Graham Fyffe, and the Smithsonian Digitization Team to record a Three-Dimensional Portrait of our 44th President.

The year before, I’d spoken at the launch of The Smithsonian’s 3D Digitization Program about how we’d developed a high-resolution 3D facial scanning system based on polarized gradient illumination, which had been used in movies like Avatar, The Avengers, and Gravity.  One of their 3D models was the famous life mask of President Abraham Lincoln, then the only 3D record of an American President taken while in office.  In early 2014, the Smithsonian asked my lab at the USC Institute for Creative Technologies if we could bring a Light Stage system to Washington, DC to scan a “VIP” over the summer.  Imagining what this might mean, I said we’d be happy to help.

We didn’t actually have a light stage that could travel, but we’d built a mobile gantry for a new facial scanning idea we called “FlashMob”.  I thought that, with some refinement, we could make it work, but it quickly became clear it would be hard to match the quality of Light Stage X.  The Smithsonian asked us to send them example results by the end of May, and I realized we’d have to design something much more customized for the job.

The solution was to remove the cameras and flashes from the FlashMob system, reinforce its aluminum gantry, and move every camera and LED light from Light Stage X that we could onto the mobile gantry.  With a few iterations, I came up with a design with exactly fifty light stage light sources, arranged almost like the stars of the American Flag.  Our lab was still performing commercial scans with Light Stage X during this time, which is why some of the datasets for Furious 7 are missing a few lighting directions toward the back.  We mounted all eight Canon 1DX DSLR cameras and an additional six Canon Rebel cameras on the rig, which could open up to fit down a narrow hallway or fold up into an airline shipping container.

We modified our polarized spherical photometric stereo algorithms to work with the new arrangement of lights and cameras, and we tested the system on three stand-ins to make sure we would be able to scan the President.  I was the stand-in for height, matching President Obama’s published height of 6’2”.  Antionne Scott from the finance office was our stand-in for skin tone.  And Paul Graham, my PhD student from the US Air Force Academy, was our stand-in for ear anatomy, having short-cut hair and the requisite wingspan.
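For readers curious about what those algorithms actually compute, here is a deliberately simplified sketch in Python, not our production pipeline.  Under spherical gradient illumination, the ratio of each gradient-lit photograph to a full-on lit photograph encodes one component of the surface normal at every pixel; the real system also separates specular and diffuse reflection using polarizers and complementary gradient patterns, which this sketch omits.  The function name and array shapes here are illustrative assumptions.

    import numpy as np

    def normals_from_gradients(F, Gx, Gy, Gz, eps=1e-6):
        # F: image lit by all the lights at full brightness, shape (H, W).
        # Gx, Gy, Gz: images lit by patterns whose intensity ramps linearly
        # along each world axis.  All are single-channel float arrays.
        #
        # Under a linear gradient, the ratio G/F encodes that axis's normal
        # component, remapped from [-1, 1] to [0, 1]; invert the remapping.
        nx = 2.0 * Gx / (F + eps) - 1.0
        ny = 2.0 * Gy / (F + eps) - 1.0
        nz = 2.0 * Gz / (F + eps) - 1.0
        n = np.stack([nx, ny, nz], axis=-1)            # (H, W, 3)
        length = np.linalg.norm(n, axis=-1, keepdims=True)
        return n / np.maximum(length, eps)             # unit normals per pixel

Per-pixel normal maps recovered this way are what give the scans their pore-level detail once combined with base geometry from multi-view stereo.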

We shipped the system to the Smithsonian’s offices near The Mall in DC and tested it on a few of their staff.  The Smithsonian’s digitization leader Gunter Waibel brought a leather office chair from their executive conference room to be the seat for the scanning subject.  The project was highly secretive, so the glass walls of the room where we would process the data were papered up and taped over.  The Smithsonian arranged for one of their staff who had served in the George W. Bush White House to brief us on protocol.  Of course, we’d wear suits, and “Mr. President” and “Sir” would be appropriate ways to address our subject.

Our equipment needed to be delivered in a white, unmarked truck, which was thoroughly inspected on the way in.  The White House doesn’t have a loading dock, so the truck backed right up to the North Portico.  I actually wasn’t in town for the delivery, as I was serving as a groomsman at the wedding of my college roommate Tim Hawkins on June 7th in Albuquerque, with high hopes that my flight back to Washington National Airport would arrive on time that night.

On June 8th, we set up and tested the system at The White House.  It was waiting for us in The State Dining Room, in front of the enormous fireplace and under the portrait of Abraham Lincoln.  We unfolded the aluminum gantry and wired up the custom lights and cameras.  With permission from the White House staff, I added two GoPro cameras to the rig to document the scanning process.  We ate our packed lunches in the adjoining Butler’s Pantry, having been told, in a line reminiscent of Dr. Strangelove, “You can’t eat in here, this is the State Dining Room.”  The official White House videographer stopped by and interviewed us about the scanning that would happen the next day.

I’d heard that Pete Souza, the White House photographer, would be there the next day to photograph the scanning, and was hoping I’d have a chance to meet him beforehand.  There was a particular angle from which President Obama would be silhouetted against the lights in a nice symmetrical way that I wanted to show him.  We’d found a setting for the incandescent lights in the room that was dim enough not to affect our scan but bright enough that the Secret Service was comfortable with the general visibility of the goings-on.  But it would be hard to see the portrait of Lincoln in the dim light with all of our light stage lights facing into the camera.  The Smithsonian crew had an extra LED ring light, which I attached to the back of our scanning rig to illuminate the portrait.

With the system set up, we adjourned to Starbucks to discuss.  Everything was ready, but I admitted feeling nervous.  Some of our equipment consisted of hand-soldered circuit boards that had been flown in just a few days before.  There was custom software on a MacBook Air to trigger all of the cameras in rapid succession with the ten lighting patterns we’d need to record the high-resolution detail.  At the beginning of a scan, we’d set all eight DSLR cameras into a mirror pre-lock state, which would produce a satisfying “clunk”.  That sound meant the scan sequence was very likely to work.  But sometimes the cameras wouldn’t go “clunk”, and we’d need to reset the system and figure out what piece of hardware or software wasn’t responding.  I didn’t like the idea of having to explain we were having a system hiccup and asking for a few minutes to debug, which we probably wouldn’t get.
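For a sense of what that trigger software had to orchestrate, here is a minimal sketch of the capture sequence in Python.  The hardware interfaces are hypothetical stand-ins, not the actual software on the MacBook Air; only the overall flow, a mirror pre-lock followed by ten lighting patterns fired in rapid succession, follows the description above.

    import time

    NUM_PATTERNS = 10        # lighting patterns per scan, as described above
    MIRROR_SETTLE_SEC = 0.5  # assumed settling time after the mirrors lock up

    class CameraRig:
        # Hypothetical stand-in for the code driving the eight DSLRs.
        def prelock_mirrors(self):
            # All eight mirrors flip up in unison -- the reassuring "clunk".
            print("clunk")
            return True
        def trigger_all(self):
            print("exposure")

    class LightController:
        # Hypothetical stand-in for the custom LED driver boards.
        def set_pattern(self, index):
            print("lighting pattern", index)

    def run_scan(rig, lights):
        if not rig.prelock_mirrors():
            # No clunk: reset the system and find what isn't responding.
            raise RuntimeError("mirror pre-lock failed")
        time.sleep(MIRROR_SETTLE_SEC)
        for i in range(NUM_PATTERNS):  # ten flashes, about a second in all
            lights.set_pattern(i)
            rig.trigger_all()

    run_scan(CameraRig(), LightController())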

That night my main goal was to try to get some sleep.  I met with my PhD student Paul Graham, also staying at the Washington DC Holiday Inn, who as a Captain in the US Air Force knew far better than me how to properly shine my shoes for the following day.  I said good night and stopped by the hotel bar to read the GoPro Hero 3+ camera manual from cover to cover to know the optimal settings for the following day.

On June 9th, we took the Metro to the White House to arrive in time for the 3D scan.  Paul Graham wore his Air Force dress uniform which elevated the look of our operation.  We checked the final framing of the cameras with me in the chair, and got ready for the President’s arrival at noon.  I remembered you can faint if you lock your knees when standing at attention for too long, and so I gently bent them every so often.

President Obama arrived in the room with a few of his staff at about 12:30pm.  Gunter Waibel showed him 3D prints of the Lincoln life masks to get him into the spirit of the day.  President Obama in person was impressive.  He seemed taller than I expected – ironic, since I was the model for his height – with a sturdier torso, and very charismatic.  I realized the United States was very fortunate to be represented by him.  I silently practiced “It’s a pleasure to meet you, Mr. President”, which was never needed; he walked over and said “OK let’s meet everybody, what’s your name?” and the long-anticipated handshakes were complete.

“OK, what are all these gizmos?” he asked, turning his attention toward our mobile light stage.  It was my job to explain where to sit and what to do, and he had clearly been briefed and was ready.  He sat in the chair, saw himself in the live view screen from the center camera in front of him, and readied his expression for history.  To my relief, the height was perfect, and there would be no need to awkwardly adjust the height of the chair.  But he sat straighter than I did, and his face was too far forward for the side cameras.  As discussed with the White House staff, I told him we needed to make a small adjustment, and grabbed the back of the chair to roll him backwards an inch, as aware as one could be that this was the leader of the free world I was scooting around.  In turn, he took the chance to adjust his posture, and once again he was too far forward for the side cameras.  I said that things were fine but that I just needed to make one more adjustment.  “It’s hard to get my ears in the frame!”, he joked.

I told him our first pose would be a nice Presidential pose to be used for the life-sized 3D print the Smithsonian would make for the National Portrait Gallery.  He made the pose and it was time to scan: Graham Fyffe, standing behind the rig, shut off the live preview to the monitors, and Paul Graham entered the command to set the mirror pre-lock.  I listened for the “clunk” of all the camera mirrors flipping up in unison, and it clunked.  In my relief I caught a glimpse of Pete Souza, telephoto camera in hand, standing right where he needed to be to get the shot.  I stepped out of the way and signaled to Paul Graham to start the scan.  Ten flashes and a second later, it was done.

We did one more “presidential” pose as a backup, and then, as we’d planned, I asked President Obama if he’d like to record a scan of him smiling.  Well within his repertoire, we got a good one.  I then asked, as discussed, if there were any other expressions he’d like us to capture.  He replied, “Well, I think that just about covers it.”

Pete Souza got the photo I’d hoped he would, perfectly framed and exposed.  The White House later used it to advertise Obama’s 2015 State of the Union Address and it appeared in Souza’s coffee table book Obama: An Intimate Portrait.  The portrait of Lincoln is nicely visible.

After our scan, President Obama was ushered to a wooden chair in the middle of the room, where our collaborators Adam Metallo and Vince Rossi used an Artec Eva hand-held structured light scanner to capture the shape of his shoulders and the back of his head, to be combined with our scan of his face into a complete bust.  Less than ten minutes after he entered the room, the scanning was done.

We were told the President might stick around to chat and ask questions, and he did.  “So, tell me a bit about the technology here,” he asked.  I’d been so focused on scanning that I hadn’t prepared much for a Presidential chat, and I led with the most accessible explanation of our polarized gradient facial scanning process I could muster.  He listened intently, clearly hoping to understand enough to give an insightful response.  I mentioned that one of his recent dinner guests, Angelina Jolie, had been to our light stage at USC ICT to be scanned for her digital double in Maleficent, which he seemed to find interesting.  And I remembered to thank him for the government funding which made much of our research at USC ICT possible.

We did our jobs well enough that The White House gathered us together for a group photo, with Gunter and me on either side of The President.  Just fifteen minutes had elapsed, and the President was off to the next business of the day.  After a minute I had a sudden rush of elation that everything had gone so well.  We took a few minutes to document the scanning setup, and I held the two GoPros together as an improvised stereo rig to shoot an early version of VR180.  I enthusiastically called Xueming, on standby down the street, to let him know that everything he’d built had performed perfectly.  We got excited enough that our White House liaison asked us to dial it down a bit.

It was still sunny when we loaded the equipment back onto the white truck.  When it drove away, we took a final group photo and walked across Pennsylvania Avenue to Lafayette Square, where Gunter suggested that we should start a tradition of performing a 3D Presidential Portrait for every new administration.  We imagined what it would be like to scan Hillary Clinton a few years hence.

Graham Fyffe took the lead in processing the 3D scan that evening in the Smithsonian’s offices, achieving the highest-resolution scan we’ve recorded before or since.  The framing and focus were perfect, save for one tiny bit of ear which indeed didn’t quite make it into the frame but was covered well enough in the other views.  The Smithsonian crew merged our 100-micron resolution facial scan with their Artec Eva data and sent the complete model for life-sized 3D printing by 3D Systems in Langhorne, PA.  Just nine days after the scan, the life-sized bust was presented to President Obama at the White House Maker Faire on June 18th; it is now in the collection of the National Portrait Gallery.

It was a long wait before we could say much about the project.  The White House decided to keep the scanning effort under wraps until after the November midterm elections.  Then on December 2nd, the White House released their video, “The President, in 3D” on YouTube and it showed everything we hoped: the scanning process, our interviews, and the fully-rendered 3D model we’d produced the night of the scan.  Scores of media outlets across the political spectrum covered the work.

In 2023, the ACM SIGGRAPH conference celebrated its 50th anniversary and invited my lab to bring one of the light stages to demo for their history exhibits.  I asked the Academy Museum of Motion Pictures if they would loan us Light Stage 3 from our SIGGRAPH 2002 paper, which we could show as a forerunner to today’s virtual production LED stages.  Unfortunately, they needed loan requests to be made a year in advance.  Then we had the idea of bringing the Presidential Light Stage, conveniently on display in the lobby of USC ICT, to SIGGRAPH.  ICT agreed, and we brought it to life on the SIGGRAPH show floor for a week of demonstrations, alongside a nearly-complete collection of other light stage light sources and a miniature model of the very first Light Stage built by Gene Kozicki.

For the exhibit signage, I wanted to include a large print of Pete Souza’s photo of our 3D scanning system in operation with President Obama.  But the best resolution I had was a 1K JPG I’d saved from the White House website.  I’d never before had the courage to ask Mr. Souza for the high-res original, but this was the time.  He replied the next day, “TIFF or JPG?  Which color space?”
