Movies about space travel have always provided visual effects artists with opportunities to push the medium in exciting new directions. The new Netflix feature The Midnight Sky, which is directed by and stars George Clooney, continues this time-honored tradition.
The sci-fi drama is based on the book Good Morning, Midnight by Lily Brooks-Dalton, which centers on a scientist and a young girl who embark on a journey across the Arctic to communicate to a returning space expedition that Earth has been destroyed by an ecological disaster. Emmy Award-nominated visual effects supervisor Matt Kasmir (Catch-22) and Oscar winner Chris Lawrence (Christopher Robin) helped bring Clooney’s project to realistic life.
The emotional void felt by the terminally ill protagonist Augustine (Clooney) echoes the fate of the planet. As Kasmir points out, “We only ended up doing approximately 700 shots for the movie. It’s quite an intelligent drama that is emotionally captivating: It’s the opposite of our usual visual effects spectacle, which is a lot of things crashing around the place and aliens trying to eat everyone!”
The team at VFX house Framestore looked after the film’s signature spacewalk, the bloodletting and face replacements, and collaborated with production designer Jim Bissell (The Rocketeer) on conceptualizing and executing the spacecraft Aether.
“I’d previously worked with Framestore and Chris Lawrence, who has done such great work on Gravity and The Martian,” Kasmir notes. “It was a natural fit. Framestore London did the spacewalk, which was the single biggest effects sequence, and Framestore Montreal did the Arctic environment and a few additional space shots including the ‘Sick Earth.’” ILM was responsible for Augustine and Iris (Caoilinn Springall) attempting to escape from a sinking pod, and One of Us handled the various holograms and the proposed new planetary home for humanity, K23.
A Weightless World
One of the key challenges for the team was how to realistically depict the weightless moments in outer space, especially since actress Felicity Jones’ pregnancy during the shoot limited her use of wirework.
“Shooting in zero-G is always difficult because it’s impossible to achieve it here on Earth without going into orbit,” says Lawrence. “There are various well-known tricks that we use, like wirework and digital doubles. You have to plan for digital doubles early because they take a long time to do.”
Critical to the process was the Anyma facial capture system, which allowed the actors to perform in and out of costume. These sessions were directed by Clooney, with cast members reading their lines with one another. “Our initial data, which did arrive before we started shooting, was encouraging,” says Lawrence. “We went in with a previz and a plan to include long swinging wires and rails. As we got closer to it, we realized how many challenges we were presenting to physical production. We were able to say, ‘We’re comfortable about the faces in this shot. We can probably get away with a digital double.’”
“Felicity is incredible on wires; to be fairly heavily pregnant and to be able to do that is one of nature’s miracles as far as I’m concerned,” remarks Lawrence. “But we did have this contingency that we could do a digital face replacement.”
To that end, complementary data was provided by Clear Angle Studios utilizing the company’s latest scan technology, while the team at Framestore hand tracked the eyes. Everything had to be precise down to the millimeter to ensure that the proper emotion was translated and believably conveyed.
“Hopefully, Netflix will let us put out the CG render versus the captured plate,” says Lawrence. “You will have to stare at them to figure out which one is which. It’s like an inverted way of filmmaking to an editor where you’re looking at something out of the context of the lighting. You’re previously looking at your select in flat lighting as a square picture in picture, and you’re now seeing it from a different angle with the dramatic shot lighting. It takes on a different cinematic meaning. To his credit, Stephen Mirrione [The Revenant] tried to tune into what he originally liked about the shot. That was a great collaboration. It was him pushing us to not let any of the visual effects get in the way of the storytelling, which was a big part of the overall brief.”
Another important collaborator in the visual arena was cinematographer Martin Ruhe (Control), who allowed the visual effects team to adjust the StageCraft lighting to better integrate the live-action footage with the plate photography being projected on the LED walls. “I collaborated with Martin and Julian White, our chief lighting technician, on this Rosco system which is a gel that diffuses light,” explains Kasmir. “Behind it we had hundreds of SkyPanels that could be controlled as a low pixel ratio monitor. We could control intensity and it got us out of so many holes. One of them was creating the heat and light of re-entry.”
The team worked closely to emulate the visual language devised by Ruhe in the CG shots. “Normally, we are used to the hyper contrast blown-out super dark of space whereas this is quite muted,” he adds. “It was a beautiful and simple aesthetic. Martin used a lot of detuned lenses which meant that the edges were often soft. Even on fully digital shots in space we would just doff our hats toward these lenses and render in a lot of chromatic aberrations to try to keep the language of our film going.”
After the spacewalk, an injured Maya (Tiffany Boone) returns to the Aether only to find herself and her crewmates surrounded by floating blood. “Even though one of our characters is dying she is actually hypnotized by the beauty of her blood in zero-G around her,” remarks Kasmir. “Everyone stops for a beat. It’s very graceful.”
Each blob of blood was art directed as if it were a character in its own right. “George talked about it being a ballet moment,” states Lawrence. “It’s quite apt. We’ve seen those references of water and other substances floating on the space station but we’ve never seen blood in that way.”
The International Space Station served as the design foundation for the Aether. “When Jim started coming forward for the design of this ship, it was reassuring because he was on the same page as the people at SpaceX, NASA and JPL [Jet Propulsion Lab] who are doing this for real. What Jim was talking about was using present-day technology for the core of the ship and a more futuristic 3D printed design language for the habitation quarters. It is the best kind of science fiction where the truth is extrapolated to tell us where we might be going.”
The Midnight Sky is currently streaming on Netflix worldwide.