***This article originally appeared in the August ’22 issue of Animation Magazine (No. 322)***
SIGGRAPH is often considered a popular barometer of progress in the field of computer-generated imagery, and the 2022 gathering in Vancouver, B.C. continues that tradition. Technical papers and presentations chronicle the latest breakthroughs by researchers, production companies and product developers. But the hottest ticket is always the Electronic Theater — the juried “Best Of” show that honors the year’s achievements. “ET” has been a qualifying showcase for Oscar consideration since 1999, and this year received 325 entries from around the world.
Overseeing the show is Darin Grant, Chief Technology Officer at Animal Logic, who flags several trends worth watching. “The award winners this year — two of which come from students — are experimenting with new ways of visualizing animation,” says Grant. “Some purposely chose non-photorealistic styles to help underscore the mood and tone of their content. Back in the day, they couldn’t achieve photorealism. That’s not the case anymore.”
He cites the example of the breakdown of The Bad Guys by DreamWorks Animation. “They did standard rendering, and then they did all this additional work on top of that to make it look non-photorealistic. It’s inspiring to see visuals that are now pushing the boundaries of the medium.”
Grant himself has contributed to that growth. Before joining Animal Logic four years ago, he built a multi-decade career that included tenures at Digital Domain, DreamWorks Animation and Method Studios. And in addition to his SIGGRAPH responsibilities, he’s involved in the Academy Software Foundation and the Academy’s Sci-Tech Council, as well as the Visual Effects Society.
So, Grant brings an experienced eye to a survey of the SIGGRAPH landscape. Catching his attention this year are developments in machine learning, deep fakes, procedural tools, open-source software and the growing use of virtual production techniques like LED walls.
Given the current buzz about self-driving cars, the use of machine learning as a strategy for generating synthetic environments will be a topic at SIGGRAPH. It’s an appealing vision for making animated films and movies featuring live actors in virtual environments. But Grant cautions, “Machine learning is dependent on having a huge sample set from the world to create a model that can run effectively. To train machine learning models, you need tons of consistent data.”
Grant cites the five feature films in the Electronic Theater this year — The Batman, Encanto, The Bad Guys, Dune and Everything Everywhere All at Once. They show vastly different worlds, and when environments differ that widely, he says, it’s difficult to apply machine learning. “If something needs to run in a metaverse, that’s 24/7 content — it’s not 90 minutes of curated content telling a story.” He adds, “Sometimes, creating a machine learning model for a particular rig is more work than creating the rig itself! That’s why machine learning gets applied to things that are continuous.”
But strategies are emerging to benefit other production processes. As Grant explains, “There’s a software company in Australia called Kognat which applies machine learning to rotoscoping.” Stay tuned.

Deep fakes will likely get attention at SIGGRAPH as well, especially given the expectation that a digitally de-aged Luke Skywalker may appear in Disney’s The Book of Boba Fett. When the digital doppelgangers of famous actors survive scrutiny at 4K resolution, deep fakes will represent more than lower-res sleight of hand.
As Grant remarks, “We’re seeing a trend where people can do fully CG characters, and VFX studios are embracing deep fakes in their workflows and processes. At the Visual Effects Bakeoff last year, the creators of the VES nominee Free Guy talked about how at the last minute they tried to do a deep fake with a character and it worked … and how they should have started with it!”
That’s what’s exciting about these developments: the opportunity to enhance the artistic workflow, Grant asserts. “You still must do the animation of the character. But you can now add an extra level of realism that used to be prohibitive in terms of time and money.” The film business adage of ‘Time, Quality, Cost: Pick Two’ still holds true.
A trend that’s worth watching is the use of virtual production techniques like LED walls to bring synthetic backgrounds onto live-action sets. This approach was highlighted in ILM’s SIGGRAPH presentation of The Mandalorian last year, and Grant says that the current ET features a breakdown of LED effects in Warner Bros.’ The Batman, with VFX supervisor Dan Lemmon (an Oscar winner for The Jungle Book) presenting in-camera effects shots.
Grant explains, “LEDs are projected light, not the reflected light used in rear projection. The actors sit in an environment and are actually being lit by it.”
This harkens back to the obvious rear projection used in Hitchcock’s classic films. Grant observes, “LED walls are the replacements for that. They speed up the process.” He notes that LED walls were used for the creation of the exteriors rolling past the train windows in the most recent Murder on the Orient Express. And the use of large LED walls has led to massive upgrades of the stages where Marvel shoots hybrid movies like The Avengers.
“Here’s the crazy thing,” Grant remarks. “People are now trying to do phased recordings at a normal rate of 23.98 — but they’re alternating frames on the LED wall. They can record the in-camera VFX on one stream and an LED greenscreen background on the other, so they can do some compositing work later. It’s imperceptible to the eye, but it’s putting a greenscreen back there so that you can still capture an actor live against greenscreen while you’re capturing them lit by the effects. It’s a different type of filmmaking.”
As a veteran technologist, Grant keenly watches software trends at SIGGRAPH. He’s long been involved with tool development, including Digital Domain’s popular compositing tool Nuke. “I managed the first generation of Nuke to be released,” he recalls. While Nuke became commoditized, Grant says, “Companies still use proprietary tools. At Animal Logic we use an internal renderer called Glimpse. DreamWorks and Framestore have their own renderers. And Pixar developed RenderMan to establish dominance, not to make a profit.”
A trend Grant is now watching is the proliferation of open-source tools. “It may not make sense for companies to keep some of their components proprietary — whether it’s a file format or a library,” he points out. “ASWF is developing an open-source playback tool to rule them all, based on contributions from Autodesk, DNEG and Sony, so that none of us has to invest internal proprietary effort in our own playback tool. When you’re talking about a differentiator, my playback tool is not differentiating my studio. There are more and more toolsets in the open-source community, which help lift all boats.”
Proprietary efforts are more likely to remain focused on areas like procedural software, such as Animal Logic’s fabric simulation tool Weave, which was used to produce convincing cloth for the Peter Rabbit films. “Proceduralism is about finding ways to have machines do things that are hard to do by hand.”
The goal is not to waste artists’ time doing something that could be done by computer. “Independent films like Everything Everywhere All at Once could not have been made without the broad commoditization of effects,” Grant believes. “They created 500 VFX shots, and 90% of them were done by five artists.”
He’s heartened by the trend that more indie films may be spotlighted at SIGGRAPH. That’s notable, given that this is not Grant’s first rodeo. He previously oversaw the Electronic Theater back in 2003. “Nineteen years later, I’m doing the same job!” he says with a smile.