TIME’s Emmy Award-winning division TIME Studios has launched a groundbreaking immersive project, The March, which brings the 1963 March on Washington for Jobs and Freedom and Martin Luther King Jr.’s iconic “I Have a Dream” speech to virtual reality for the first time. The project transformed intensive research into the event, along with TIME’s own materials, into a 10-minute-30-second location-based experience staged within a 10 x 15 ft. footprint: a virtual slice of civil rights history.
Co-created by Emmy winner Mia Tramz, TIME’s Editorial Director of Enterprise & Immersive Experiences, and industry-leading immersive storyteller Alton Glass of GRX Immersive Labs, with Viola Davis as executive producer and narrator, The March features the most realistic recreation of a human performance in VR to date, achieved through advanced VR, AI, film-production and machine-learning techniques.
“Through thousands of hours of research, we have endeavored to be true to the history of that August day. But we at TIME also see the project as a call to each of us for all that is yet to be done in the unfinished fight for equality, including in our own work,” said TIME Editor-in-Chief & CEO Edward Felsenthal. “Our hope is that it will not only change the way we see history, but also help awaken in all of us an understanding of the power of our own voice to have a positive impact on the world.”
“The March is a very ambitious project that was over three years in the making,” said Tramz. “We, along with our production partners, are proud to bring this immersive, educational exhibit to visitors, who will be able to experience this iconic moment in history first-hand.”
TIME Studios partnered with the Emmy Award–winning V.A.L.I.S.studio, Verizon Media’s Academy Award and Emmy-nominated immersive media studio RYOT, Academy Award-winning visual effects and immersive studio Digital Domain and award-winning JuVee Productions for the project.
Digital Domain was charged with creating a highly accurate 3D likeness of the iconic historical figure Dr. King. The studio used its Masquerade system, which combines performance capture, machine learning and proprietary software, to create a base model, which its artists then refined in Maya and Houdini.
The model was then placed into a historically accurate recreation of the National Mall built in Unreal Engine. Real-time tools were also developed to run in the background, ensuring viewers could see the most realistic form of Dr. King rendered on a single GPU, without having to wear heavy gear. Wireless headsets give attendees freedom of movement, making the experience more accessible.
Digital Domain began by casting a performer whose size and build matched Dr. King’s in August 1963, in order to recreate his body movement. The King estate suggested its official orator, Stephon Ferguson, who has been publicly re-enacting the “I Have a Dream” speech for the past 10 years; he matched these characteristics and could provide a realistic reference for Dr. King’s gestures during the speech.
Ferguson’s performance was recorded at Digital Domain using Masquerade, a motion-capture system designed to track minute details in a subject’s body and face. Facial scans of a second subject, combined with machine learning, were also used to build pore-level detail into the base model. From these bridge scans, the team was able to compare eye position, skull shape, cheekbone and nose-ridge position, skin tone and blood flow against the current build of Dr. King. These elements were then handed to DD artists, who took the model (now about 50% complete) and began adding Dr. King’s distinctive visual traits.
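The bridge-scan comparison described above amounts to scoring how closely the in-progress model's facial measurements match reference values. A minimal sketch of that kind of check, with entirely illustrative landmark names, values and weights (Digital Domain's actual pipeline and metrics are proprietary and not documented here):

```python
# Hypothetical likeness check: weighted deviation between measurements
# taken on the in-progress model and reference values estimated from
# period photographs. All numbers and weights are made up for illustration.

reference = {  # reference measurements (arbitrary units)
    "eye_spacing": 6.3,
    "skull_width": 15.2,
    "cheekbone_height": 9.8,
    "nose_ridge_length": 5.1,
}
weights = {"eye_spacing": 2.0, "skull_width": 1.0,
           "cheekbone_height": 1.5, "nose_ridge_length": 1.5}

def likeness_error(model: dict) -> float:
    """Weighted mean absolute deviation from the reference measurements."""
    total = sum(weights[k] * abs(model[k] - reference[k]) for k in reference)
    return total / sum(weights.values())

current_build = {"eye_spacing": 6.1, "skull_width": 15.2,
                 "cheekbone_height": 10.0, "nose_ridge_length": 5.3}
print(f"likeness error: {likeness_error(current_build):.3f}")
```

A single scalar like this lets artists track whether each revision moves the build toward or away from the reference library.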
Artists assembled a large reference library of images and video from the speech and week of the march to aid in this process. Since our bodies and faces change month to month, this was the only way to accurately portray Dr. King on that particular day. In addition to different angles, the team also sourced as many color photos as possible, since unlike a lot of old footage which is predominantly in black and white, the VR experience would be in full color. The result is a historically accurate recreation that matches his features and mannerisms in real-time 3D.
At Dimensional Imaging (Di4D), Digital Domain tracked how Ferguson’s face and eyes moved while forming certain words. To help, he recited viseme exercises: nonsense sentences designed to push the mouth around and activate all the facial muscles. After the Di4D session, Ferguson returned to DD’s performance-capture stage, where a marker set specific to DD, comprising roughly 190 dots, was drawn on his face, and he was fitted with a head-mounted camera (HMC). With the HMC running at 60 fps, Ferguson gave the same performance he had given at Di4D. By feeding both the HMC performance and the Di4D performance into software written by DD’s machine-learning (ML) team, the team was able to correlate 2D video data to 3D model data quickly and with a high degree of accuracy.
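The core idea in that last step is a learned mapping from 2D marker positions per HMC frame to 3D face-model parameters, trained on frames where both captures exist. DD's actual software is proprietary; the sketch below substitutes a simple ridge regression on synthetic data to show the shape of the problem:

```python
import numpy as np

# Hypothetical sketch: learn a linear map from 2D dot positions per HMC
# frame (~190 dots -> 380 coordinates) to 3D face-model parameters, using
# frames captured both ways as training pairs. A toy stand-in for DD's
# proprietary ML pipeline, not the actual system; all data is synthetic.

rng = np.random.default_rng(0)
n_frames, n_2d, n_3d = 500, 380, 50  # toy sizes

X = rng.normal(size=(n_frames, n_2d))          # 2D marker data (HMC)
true_map = rng.normal(size=(n_2d, n_3d))       # unknown ground-truth map
Y = X @ true_map + 0.01 * rng.normal(size=(n_frames, n_3d))  # 3D data (Di4D)

# Ridge regression, closed form: W = (X^T X + lam*I)^-1 X^T Y
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(n_2d), X.T @ Y)

# New HMC frames can now be converted to 3D parameters directly.
Y_pred = X @ W
err = np.abs(Y_pred - Y).mean()
print(f"mean abs error: {err:.4f}")
```

Once trained, the map turns every 60 fps HMC frame into 3D model parameters without a second Di4D session, which is what makes the correlation fast.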
Ferguson was then outfitted with a motion-capture suit and asked to recite the full “I Have a Dream” speech, all 16 minutes of it. Working from video reference, Ferguson performed the speech both physically and vocally. Using this facial and body capture, DD synchronized and mapped his facial performance onto Dr. King’s digital likeness, while the animation team vetted and adjusted the performance to match. The ML software gave the animators a strong starting point; they spent most of their time adding human nuance to the digital Dr. King. The FX team then used the body and face performance to layer secondary animation onto the cloth and face, adding procedural motion to the final build.
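Procedural secondary animation of the kind the FX team layered on is commonly driven by spring-damper dynamics: a cloth point lags behind the captured body motion, overshoots, and settles. A minimal sketch of that idea, with illustrative constants rather than any production values:

```python
# Hypothetical sketch of procedural secondary motion: a damped spring
# makes a follower point (e.g. a cloth vertex) trail the captured body
# motion with follow-through. Constants and the driving curve are
# illustrative only.

def simulate_secondary(drive, stiffness=80.0, damping=12.0, dt=1.0 / 60.0):
    """Return positions of a spring-damper follower tracking `drive`."""
    pos, vel = drive[0], 0.0
    out = []
    for target in drive:
        accel = stiffness * (target - pos) - damping * vel
        vel += accel * dt          # semi-implicit Euler integration
        pos += vel * dt
        out.append(pos)
    return out

# Driving signal: the body snaps from 0 to 1 (e.g. an arm gesture),
# sampled at 60 fps to match the capture rate.
drive = [0.0] * 30 + [1.0] * 90
follow = simulate_secondary(drive)
```

Because it is purely a function of the driving motion, this kind of layer runs procedurally at playback time, which fits the real-time single-GPU constraint described earlier.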
CAA orchestrated the project partnership on behalf of clients RYOT, TIME and Viola Davis, and also serves as strategic advisor. In addition to Davis and Julius Tennon of JuVee, exec producers include Tramz and Orefice of TIME Studios, Peter Martin of V.A.L.I.S.studio, Jake Sally of RYOT, Guru Gowrappan of Verizon Media and John Canning of Digital Domain. Ari Palitz of V.A.L.I.S.studio is lead producer. Dr. Karcheik Sims-Alvarado is the historical advisor.
The creation of The March has led to a special TIME project, digital destination and print issue on the abiding meaning of the March on Washington and the state of equality in America today, with reporting and reflections by writers, leaders and activists. The cover of this special issue, realized by artist Hank Willis Thomas, features a stunning and historically precise 3D rendering of Dr. King from The March.
The March will debut as an experiential exhibit on February 28 at the DuSable Museum of African American History in Chicago — the first independent African American history museum in the country. The exhibit will run until November 2020.
Learn more at time.com/the-march.