***This article originally appeared in the August ‘19 issue of Animation Magazine (No. 292)***
One of the more surreal moments experienced by animation supervisor Andy Jones over the two-plus years he spent heading up the character animation work on Jon Favreau’s new version of The Lion King occurred following a screening of clips from the movie this summer. Jones ran into the film’s cinematographer, Caleb Deschanel, ASC, who introduced him to his companion — famed actor Warren Beatty.
“Caleb introduced me and said, ‘This is Andy Jones, the animation supervisor on the show, and he was responsible for the performances you saw,’” Jones recalls. “Warren shook my hand and said he had never been that involved watching an animated film, that the performances were so compelling, and that he felt more for these characters than he had ever felt in an animated movie. That was such a huge compliment for our animation team.”
It won’t be the last compliment for the animation group at MPC, London. The project is a faithful retelling of the story from the 1994 cel-animated classic (directed by Roger Allers and Rob Minkoff) — a coming-of-age tale of a young lion forced to fight for his royal birthright after he is betrayed, deep in the heart of Africa. The conceit of the project was to make the entire Lion King world and every character in it stunningly photoreal. But, of course, none of it is real. In fact, every shot in the movie is key-frame animated. The achievement was built on the backbone of virtual production and animation techniques pioneered largely by the same group of filmmakers for 2016’s The Jungle Book, also directed by Favreau.
Moving to the Next Level
“This movie does not have a single live-action component in any shots,” explains Adam Valdez, the movie’s VFX supervisor. “For MPC, this is the first time that we have created every shot of an entire movie, every pixel, from beginning to end. However, every technique we used was more in line with how we would work on live-action movies. You might call it a new kind of movie in that sense, a kind of hybrid style.”
Favreau, Deschanel, and co-VFX supervisor Rob Legato, ASC, headed the lengthy “virtual shoot” for the movie at a facility in Playa Vista, California, where the virtual production process took place on a specially configured stage. In their shooting area, called the OptiTrack volume after the OptiTrack Active Tracking motion-capture system used to record real-world camera movement, they first “scouted” their locations wearing off-the-shelf HTC Vive VR goggles. Then they chose angles, lights, lenses, camera positions and movement while interacting with sophisticated previs animation built by MPC and rendered in near-real time by the Unity game engine. Real camera operators then “shot” sequences using real camera interfaces, including pan-and-tilt wheels, dollies, cranes and more, while OptiTrack sensors recorded data for every aspect of their movement. Later, MPC animators applied that data while constructing final shots in post, organically wedding Deschanel’s creative camera choices with some of the world’s most sophisticated animation.
But MPC’s two-phase animation work—creating and evolving previs material for the production to use during virtual production, and then transforming the results into stunningly realistic animation—was, in essence, the straw that stirred the entire drink. It was a process that relied on up to 110 animators at times, mainly in London but also employing MPC’s Bangalore and Los Angeles crews, over the course of 2017 (previs) and 2018 (final animation). The look of everything was based on a two-week reference trip key department heads made to Kenya in March of 2017 to scout locations and make a plan for recording, scanning, photographing, and filming reams of detailed reference material, from rocks to mounds of dirt, hills, trees, watering holes, mountains, and of course, Kenya’s teeming wildlife.
Previs of the Pride
The movie’s set supervisor, Audrey Ferrara, suggests “previs for this movie was really something unique. At the time we started this process, I don’t think anyone else in the world was using this methodology. Basically, we had to create a VR environment into which we could bring the filmmakers—a virtual world they could shoot in. It was almost like creating a videogame and the game was ‘shoot your movie.’ So we had to build tools for cameras, cranes, lenses, all the stuff [traditional] filmmakers would be comfortable with. Therefore, there was a need for something a bit more elaborate than traditional previs. In a sense, it was almost version zero of the movie.”
Andy Jones explains that “a typical previs would have a team of previs animators; we would divvy up storyboards and say, you do that shot and you do this shot, everybody taking on a different shot, each one three or four seconds long. And then the next shot is a fresh start, depending on where you are cutting, and your continuity, at that stage, is less important. But here, it was more of a master-scene previs situation, which means it worked as a oner, a single camera covering an entire scene for a minute. Or it worked as all sorts of coverage close-ups, but every shot was continuous, so it felt like there was perfect continuity along the way, because the animation is doing the same thing every time we play it back and shoot it from a different angle or try a different camera move. This approach made it ‘feel’ a little more live action. Caleb was able to witness the previs animation in the VR space with goggles on and then decide what type of camera coverage he wanted.”
Along the way, crucial to this process, MPC created a workflow tool which facilitated the virtual production pipeline by allowing the company to export and import models and animation assets from Maya at MPC London to the Unity game engine on the stage at Playa Vista, and back again. It also tracked all changes that were made to sets in VR space, so that they could be recreated in Maya, and tracked camera data and takes.
Realistic Animals in Motion
On the question of photorealism, Jones emphasizes that the company “took lots of leaps forward in fur rendering on this show. We did a lot of work on fur shading to make it react with light in a more realistic manner. And the amount of fur we were rendering, the total hairs per character and the fineness of the fur, we pushed it much closer to what happens in real life.”
Indeed, in addition to extensive reliance on Maya (animation and character deformations), Nuke (compositing), Houdini (fur dynamics), Katana (lighting and lookdev), RenderMan (rendering) and other popular tools, the project also required MPC to write new software to address a wide range of issues. Among those was a radical improvement of the company’s muscle/flesh simulation technology, built upon MPC’s core C++ libraries for math/geometry, which the company calls “Muggins.”
Likewise, for fur, hair and feathers, the company used MPC’s in-house fur tool, Furtility, which MPC character supervisor Ben Jones refers to as “a procedural grooming system where multiple different operations are layered up to achieve the final look of complex fur.”
The key obstacle in selling realism for the animation team was the fact that the animals sing and talk in the movie — the one thing real animals obviously can’t do. Jones points out that “clearly, these animals are not supposed to talk or sing. But we learned from Jungle Book how far to push or not push it, and to show a lot of restraint, to make lip sync work by making sure the mouths never moved in any way a real animal’s mouth could not. For us, the mantra was, try to keep the movement of the lips and mouth more like real animals, and don’t try to push phonemes so much that they are doing weird funnelers or rolling their lips around in ways real animals can’t.”
One of the biggest challenges was the portrayal of Zazu, the red-billed hornbill bird voiced by actor John Oliver. By their nature, bird facial and beak movements are not typically conducive to emotive performances, Jones emphasizes.
“But Jon made it clear we could not do anything with Zazu that a real bird’s beak couldn’t do,” Jones adds. “We’ve all seen parrots talk, and you can see their beaks don’t do anything funny. They kind of form sounds with their throats. So we took that approach of using the beak more like a jaw, but with most of the sounds coming out of the throat. So we built sophisticated throat controls for Zazu, while having him move like a real bird, moving his head back-and-forth, quite frenetic, with fast starts and stops to the motion.”
One of the subtlest challenges involved the film’s villain, Scar, who evolves during the film. He starts out as something of an outcast, and later adopts a more regal bearing.
“With Scar, we did a lot with posture,” Jones says. “We gave him a bit of an arc, where in the beginning, he is depressed about his situation, and wishes he was king with more power. So, he holds his head lower and has a kind of saunter. By the middle of the film, once he has taken power, he carries his head higher, walking with more pride. And then [at the climax], when the power is taken away from him, he kind of goes back to the Scar we met in the beginning.”
Out of Africa
On this film, Audrey Ferrara was double-tasked: first working closely with production designer James Chinlund to visualize and build the entire virtual world of The Lion King for use during production, and then building the final version of that world in post-production.
“I was brought in really early in the process to work with the art department and production designer specifically to put in place all the rules of those environments, collect artwork from the production designer — building a green library appropriate for different locations in the movie — and put in place an organization and a plan to shoot references [in Kenya] in order to create all the CG libraries, and then basically take all of this into post-production,” she says.
Ferrara emphasizes that the reference trip to Kenya was crucial to her work, because “it allowed me to put in place a reference shoot itinerary and then have an MPC team go back and shoot all of those elements, ranging from trees to rocks, rivers, and mountains. We collected a huge range of material, shooting panoramas, HDRI [high dynamic range still photos], doing some photogrammetry, going around objects or rocks or trees in order to photograph every angle so we could produce scans of those elements. We also did a texture shoot for leaves and trees.”
One of the challenges she highlights is the fact that many locations in the film are iconic, easily recognizable from the original animated film. “You can’t really change the design too much because, otherwise, the audience won’t respond to it, since they have a profound memory of it,” she explains. “That was the case with Pride Rock, certainly. James Chinlund chose one specific place in Kenya for that and we had to modify it slightly to make it [reminiscent of the location from the original film]. And in other cases, like the Elephant Graveyard, James used pieces of Kenya environments and other references to create something similar to what we saw over there, but not 100 percent similar, so they would work for this story. And then, in other cases, we have places and elements that directly replicate references from Kenya. For instance, James really liked a place nicknamed Challenge Beach, a little bend of river. He used that location to create the watering hole scene, where the ‘I Just Can’t Wait to Be King’ musical number happens.”
The mystical Cloud Forest is another example. That location was directly inspired by a spot on the side of Mt. Kenya where the environment is very similar to what the story calls for. In fact, Ferrara adds, using helicopters and state-of-the-art scanning tools, filmmakers scanned the shape of Mt. Kenya and also Mt. Kilimanjaro, which is seen in the background in certain scenes.
Mufasa’s Spirit and Other VFX Highlights
Adam Valdez’s role was, in his words, “to kind of bracket the whole thing, with one foot in the [MPC] facility and one foot on the production side. So I helped set up a lot of the virtual production workflow in the beginning, then I went to Africa with everyone and then spent a year in Los Angeles helping to create virtual production shots before the post-production work started in London.”
But along the way, he also had to take charge of “the high complexity and naturalism” that numerous visual effects sequences required, particularly the final battle while a giant fire rages; the rampaging of a herd of wildebeest through a canyon; and the mystical appearance of Mufasa in the clouds to pass the torch to his son, in a direct homage to a key sequence in the original film.
“That moment when Mufasa comes to visit Simba from beyond the grave, he is sort of a spirit form up in the clouds,” Valdez explains. “That is a pivotal moment from the original film. So it took us months to try out lots of ideas and figure out exactly the right balance between fantastical imagery and realism. I had to routinely drill into cracking those kinds of little nuts.”
He adds that MPC’s painstaking work to improve its physics simulation tools was crucial for scenes like the wildebeest rampage.
“It’s an entire world built in three-dimensional space, which is very complicated,” he says. “But the good thing about it is that you can get fairly true to how the animals stampede, and the effects that surround them when that is happening — dust flying off them, or when they crash to the ground, and so on. Our simulations let us know how the physics should work. So the simulations we used to create fire, water, dust and other things all have a good, solid foundation, and are in a completely legitimate space. That, in turn, makes it easier for us to light them and ultimately composite them and craft the final look.”
For instance, Valdez says that to build “a good foundation” for the fire sequence, MPC conducted “a variety of little test shoots to understand things like heat ripple on the horizon. Our guys shot real heat ripples over grids, and then built systems for creating optical heat ripples that were very natural, based on real-world references.”
Disney’s The Lion King begins its theatrical run on July 19 and is expected to break records at the box office.
Michael Goldman is the author of The Art and Making of The Lion King (Disney Editions, $50), which is available in bookstores and on amazon.com this month.