The VFX of ‘Here’: Time Travel Enabled by Machine Learning

For more than four decades, audiences have come to expect stunning visual effects in Robert Zemeckis movies. The man who pioneered visual effects and animation in such movies as Who Framed Roger Rabbit, Back to the Future, Death Becomes Her, Beowulf, The Polar Express and Flight is back again this year with another incredible achievement: an adaptation of Richard McGuire’s graphic novel Here, which follows the history of a plot of land in America from prehistoric to contemporary times using a single camera position and lens.

Complicating matters is that the eight different storylines do not happen chronologically but constantly transition between each other within the same frame. “The language of the time transitions evolved throughout the editorial process,” explains Kevin Baillie, the film’s production VFX supervisor, who also collaborated with Zemeckis on Flight, The Witches, The Walk and Welcome to Marwen. “Bob had shot a lot of options to keep doors open so he could arrive at the pacing and order that landed with the audience. We transition through a lot of different timelines using these graphic panels; they were extremely hard to complete logistically, because each scene is dependent on every scene that goes into it being finished. Some of those transitions had 22 or 23 different scenes feeding into them.”

Here [c/o TriStar Pictures]
Age-Reversing Machines: New AI technologies helped the filmmakers move Tom Hanks and Robin Wright’s characters through several decades in their lives.

LED Wall Opens Windows

Many of the tasks proved far more challenging than anticipated at the outset. “What seems like you just build a neighborhood outside of the window is actually 80 different stages,” recalls Baillie. “Fences are changing, trees are in different seasons and growing, and the road is at various states of repair, because we wanted the neighborhood to be a character that works with the mood of the room. We couldn’t prepare all of those backgrounds ahead of the shoot. It had to be live, so there was an LED wall out the window of the room and the environment was built in Unreal Engine; that way if [cinematographer] Don Burgess or Bob were inspired to make it later in the afternoon, then we could respond to that on the LED wall and Don could tweak his practical lights to match.”

Kevin Baillie [ph provided by subject]

The LED wall was especially beneficial for a scene where the front door is opened and snow blows inside. “One of the reasons that the special effects team was able to nail it is because we could see on the LED wall through the window how fierce the storm was. We could match that by dialing the amount of snow that was coming in through the door to support that in a way that everything visually made sense. That is very difficult to do if you just have a blue screen instead of the window, which leads to a lot of guesswork.”

Leveraging the expertise of the on-set crew was one of the key benefits of the LED wall. “When you’re doing the visual effects in camera, all of a sudden everyone on the set, with all of these decades of expertise, gets a chance to weigh in and make it as good as it can be. If we do it all in postproduction, then it’s just the visual effects team trying their best.”

Here - Tom Hanks de-age variants
Digital de-aging progression of ‘Here’ star Tom Hanks

Since the movie features younger versions of Tom Hanks, Robin Wright, Kelly Reilly and Paul Bettany’s characters, machine learning played a big role in the de-aging process. “I’ve spent over 20 years of my career learning how to do the digital human face with a traditional CGI method, and it’s painstaking,” explains Baillie. “We look at human faces every day, so any missed detail sets off an uncomfortable response. With the machine-learning approach, literally Take 1 out of the neural networks is out of the uncanny valley. It’s wild because the neural networks pick up on the nuances. If you feed it too much data of other people, even though they look similar, you will get what we call ‘identity creep.’ We were cognizant of involving the actors in that whole process of building their likeness at the younger ages. It was amazing to see how emotional it was for some of them.”

A preview of the de-aging results was available in real time on set. “In one monitor, we would have the actors in their current age, and in the other monitor, we would see them in their 20s. Bob would direct a scene with Tom Hanks and yell, ‘Cut.’ Then, Tom would be able to come back behind the monitors and see himself acting at 25 and go, ‘I’m overacting my youth.’ That was an example of bringing visual effects technology to set and presenting it as [a] tool not only for the director and myself but for the actors, costumes and hair and makeup.”

Here - Tom Hanks de-aging pre-production test

Here VFX before and after

The production relied upon prosthetic makeup for the elderly versions of the characters, except for one shot. “The problem with the machine-learning approach is that Robin has never been old before, so we have no data to train on,” says Baillie.

“What the team at Metaphysic did was to use other AI-based tools to take the images with the prosthetics and create synthesized new versions of those that had a more natural skin translucency, less bumpy, and fixed the areas where the prosthetic was coming loose. It was like an assist that leveraged the brilliance of the prosthetics team to take it over the edge from a skin-quality perspective.”

Here [c/o TriStar Pictures]

Much deliberation went into the camera position and lens for Here. “If you think about the challenge of choosing a camera position and lens for this movie, it’s wild because Day 1 of shooting, Take 1, the first scene that you shoot, you have made every camera choice for the entire movie,” says Baillie. “The same for the construction perspective. Where the fireplace is located affects where the couch is in a totally unrelated scene. Everything had a knock-on effect on everything else because of the fact that we had this unmoving perspective.”

One of the constant characters in all the storylines is a single hummingbird. “The hummingbird embodies the Indigenous American who dies and is a reminder that the spirit of all of us lingers through history even when we’re not there anymore,” explains Baillie. “The hummingbird was a fun character to do, because technically we had to figure out where it was going to be in a shot, especially if a character’s eyeline had to follow it. We tried some fancy things, but at the end of the day, we ended up using a ping-pong ball on the end of a stick to get the correct pacing. That gave our animators a baseline to start from.”

Here [c/o TriStar Pictures]

Spreading Its Wings

The bird was treated as a hero character in the project. “Hummingbirds have this iridescent sheen to a lot of their feathers, so if you cheat with the feathers it doesn’t look right,” says the VFX supervisor. “The hummingbird was painstakingly done, from how it breathes and how the audience can notice its neck move.”

There are 185 to 200 shots in total, some lasting four minutes, with Dimension Studio looking after the LED wall, DNEG acting as the main vendor, Metaphysic handling the de-aging, and additional support provided by Crafty Apes, Luma Pictures and Wise Reflection.

“The hopeful note that I want to end on is that I know people are trepidatious about some of these AI tools, and Here is a great example of a film that wouldn’t have been able to be made without them,” concludes Baillie. “These tools enable a lot of creative achievements.”

TriStar Pictures released Robert Zemeckis’ Here in theaters in November.
