***This article originally appeared in the Sept./Oct. ’21 issue of Animation Magazine (No. 313)***
Acclaimed filmmaker Neill Blomkamp (District 9) goes beyond his usual realm of science fiction with the late summer release Demonic. The well-received horror film centers on a young woman (Carly Pope) who enters the mind of her comatose mother (Nathalie Boltt) and unleashes a malevolent creature.
After making a series of experimental shorts for his production company Oats Studios, Blomkamp decided to tackle the low-budget project after the pandemic shutdown delayed other big-studio productions. Another motivating factor was the desire to utilize an emerging technology.
“On a purely visual effects level, I was obsessed with volumetric capture as an idea for the last three years,” explains Blomkamp. “Oats Studios is well-suited to experiment with vol cap because we don’t have to answer to anyone. Under that framework, I started speaking to Metastage in Los Angeles, which is backed by Microsoft. But when COVID-19 happened, the cast and I couldn’t cross the border, so I was not able to use them.”
Disguised as Prototype Tech
The director began conversations with Volumetric Camera Systems in Vancouver, discussing the size of the capture volume and the number of cameras he would need. “I knew that the visual resolution would be glitchy,” he explains. “The best way to use vol cap would be to incorporate it into the story as a prototype technology; therefore, the audience would accept the way that it looked and it would give me the opportunity to experiment with it like I always wanted to do. Once we decided to do that, then the next thing was to drop volume capture data into Unity. We would be in a real-time environment and could use virtual cameras at any point as well as change the lighting.”
Different camera styles were adopted for the real world and dreamscape scenes. “The fictitious technology lets a person in a coma or a quadriplegic patient venture into a virtual world in a Unity-like real-time environment and get out of their body,” explains Blomkamp. “We have traditional photography when the main character goes to this hospital and interacts with her mom, who is in this virtual space. The virtual space can be as wildly different from the live-action photography as we want.”
“I wanted the on-set photography of Demonic to have no handheld quality whatsoever,” he continues. “It would have a sense of control that would lead to slow tension and dread. To counterbalance, I wanted to use handheld photography in the virtual space.”
The team used cameras built around the Sony Exmor IMX377 sensor to record video sequences in 4K, along with a modular rig that allowed the crew to shrink or expand the camera array depending on what was needed for a particular shot. “At maximum, we had 239 of these 4K cameras, and at minimum we had 179 cameras in the smaller rig configuration,” Blomkamp elaborates. “The cameras were recording in 4K, and we used a machine learning neural network in order to enhance the details of our captured images through a process called ‘super-resolution imaging’ (SR). So, although the original videos were recorded in 4K, our upsampled image ended up at 6K, giving us a bit more detail and latitude to work with in post-production.”
One advantage was that real time offers no restrictions in lighting. “You can grab the sun and move it,” says the director. “All of the shadows change and the light bounces differently. That messes with your head if you come from scanline rendering like I do. The other benefit is the way you could watch it in VR. We can actually give people executable files that they double click on and enter the world of Demonic simulations.” Jakub Lesniak served as the production’s lighting lead.
The film features three virtual simulations and each one takes place in a real location: Carly Pope’s character’s childhood home, a 100-year-old sanatorium in British Columbia (where the scary third act finale takes place) and a synthetic house that is the result of her mother’s memories. “All three of those locations were scanned,” says Blomkamp. “We did waypoints where a drone flies in a grid and shoots billions of aerial shots. My brother Mike Blomkamp and VFX supervisor Chris Harvey both had Canon EOS 5D Mark III cameras and took billions of photos of the locations.”
All the information was then given to UPP in the Czech Republic, the company that took care of the visual effects for the movie. “They extrapolated those three locations from these banks of photos,” says the director. “The texture detail was embedded into the RGB data from the photos. We had to screw up the geometry to make it match with the level of the vol cap.” What was supposed to be two months of work for UPP became five months, with most of the time spent on R&D.
According to UPP owner Viktor Muller, the VFX studio delivered about 260 shots for the film. “There are traditional visual effects like set extensions, driving plates and monitor displays,” he notes. “We also did the demon at the end, but the most demanding parts of the movie were the virtual reality simulations.”
An array of 260 cameras captured footage at 24fps for the project. “What is great about volumetric capture is it gives you amazingly realistic movements even in the tiny details,” notes Muller. “You feel like you’re in front of a real-life person.”
Making things quite challenging was the lack of artificial intelligence that could handle the vol cap data. “It was done model by model, which was difficult for us because you don’t have a graphics card that has such a big memory to handle all of these models,” says Muller. “Unity provided us with a solution in the form of a volumetric point cloud player. Point clouds require much less memory. We were able to load those 10 to 15 seconds of vol cap footage into the graphics card of the computer and play it in real time. Because we used the point clouds instead of the polygons, we had to reduce the quality of the final resolution.”
While the virtual camera can be moved around anywhere, certain restrictions applied. “Once you do the volumetric capturing you can’t modify the asset. You’re getting what you’re getting,” explains Muller. “Rendering required several different passes in Unity. One render was for the environment and all of the rooms. The other was done for vol cap and shadows. It was tricky.”
Of course, a good possession movie always needs a terrifying monster. “The creature was designed by this amazing concept artist named Eve Ventrue,” remarks Blomkamp. “I instantly loved it. We got Werner Pretorius’ company Amazing Ape to build a physical on-set monster that was occupied by a tall suit performer, like the Predator. The creature existed in the real world, and we could shoot it with our live-action cameras, which we did. But the part that people in visual effects will think is hilarious is I took a physical 1980s-style Nightmare on Elm Street suit and put it in vol cap. It’s this incredibly weird, totally reversed situation of a physical prosthetic suit becoming a CG asset! It’s in a darkly lit CG hallway with blinking lights and existing in 21st century real-time game technology, which is so cool.”
Overall, shooting and processing the vol cap was the most difficult aspect of making Demonic. “That’s because it’s new technology,” says the director. “In the first simulation I bring the audience down to the level of Carly experiencing her childhood home. Because it’s a hospital providing this virtual escapism to their coma patients, it means that the doctors have a God’s eye view of what is essentially a Unity game level. We could delete layers of the house in order to see her. It has this insane video game look that I haven’t seen before in cinema. That’s probably what I’m most proud of, because it’s unusual!”
Demonic was released theatrically by IFC Midnight in August and is now available on video on demand.