Wacom’s Cintiq 22
Wacom keeps makin’ things, and I just keep lovin’ ’em. The latest Cintiq 22 is the “prosumer” counterpart to the Cintiq Pro, and the big brother of the smaller version I wrote about a few months ago.
The new tablet is big and robust, so much so that it comes with its own adjustable VESA stand that props the screen up from around 30 degrees to nearly 90, straight up and down. Amazingly, even when it’s vertical, the tablet is really stable, requiring quite a bit of effort to tip it over.
The reason I say “prosumer” is that it’s for artists who don’t need the high-precision color or 4K resolution of the Cintiq Pro. Nor do they require the Pro version’s optical bonding: a thin chemical layer between the LCD screen and the drawing surface that bonds them tightly together, reducing the perceived gap, or parallax, between the tip of the pen and what you are drawing.
No, the Cintiq 22 is more of a gateway tablet for people ready to invest around $1,200, but not quite ready to take the plunge on a $2,500 Pro version.
But even if your image isn’t as sharp and your colors not as accurate, you still get all the responsiveness of the pressure-sensitive Wacom Pro Pen 2. You still get the anti-glare glass, which comes with a bit of grit that delivers the satisfying feel of drawing on paper (or at least a close analogue). And you get the pleasure of drawing directly on the surface, rather than drawing on a standard tablet while watching your monitor, which is a completely different feeling.
I do miss the additional control buttons on the older Cintiqs, or the remote on the newer Pro models. Without them, one hand has to stay on the keyboard while the other draws. It’s only an inconvenience, but I’m so used to quickly modifying the function of the brush or accessing HUD menus without breaking my grip or my flow.
At the risk of being a broken record: I love my Wacoms, and I love that the company is working on tiering the products to allow access at various price points. Being an artist isn’t always the most profitable career. Having access to high-end tools that allow us to pursue that career is always a plus.
Epic’s Unreal Engine 4.23
I am so geeked about virtual production. Full admission: I haven’t been focused on the game development side of things, so I’m a bit behind on the Unreal/Unity/Steam curve. But now that Unreal Engine and the hardware that supports it have pushed the realism to the point where we can develop storytelling in real time, I’m more and more gung ho to dive into this Unreal world.
Recently, Epic Games officially released Unreal 4.23 with a bunch of new features. I’ll mainly focus on the filmmaking side of things, but they all tie together. One thing that strikes me is that a number of these features are still in beta. Pushing features out to be tested by the user base isn’t necessarily new or innovative, but the casual way Epic releases them turns what used to be unheard of into something that is, and should be, absolutely normal.
Chaos Destruction was revealed as a demo at GDC, and it is now in beta for this Unreal release. Much like systems in the VFX industry, you can take static objects, fracture them with varying algorithms like Voronoi, radial and planar cuts, and then have forces affect them, or different parts of them, to destroy them in epic fashion. The simulation can be cached and activated through user events. These can then feed Niagara, Unreal Engine’s particle system (which also got a boost in this release), to generate interactive smoke and dust.
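For the curious, the core idea behind a Voronoi fracture is simple enough to sketch in a few lines of Python (this is purely my own illustration of the concept, not Chaos code): every cell of the object is assigned to its nearest seed point, and each group of cells becomes one fragment.

```python
import random

def voronoi_fracture(cells, seeds):
    """Assign each cell of a grid to its nearest seed point --
    the naive idea behind a Voronoi fracture pattern."""
    def nearest(cell):
        x, y = cell
        return min(range(len(seeds)),
                   key=lambda i: (seeds[i][0] - x) ** 2 + (seeds[i][1] - y) ** 2)
    fragments = {}
    for cell in cells:
        fragments.setdefault(nearest(cell), []).append(cell)
    return fragments

# Fracture a 10x10 "slab" into four fragments.
random.seed(1)
grid = [(x, y) for x in range(10) for y in range(10)]
seeds = [(random.uniform(0, 9), random.uniform(0, 9)) for _ in range(4)]
pieces = voronoi_fracture(grid, seeds)
```

Scatter more seeds and you get more, smaller fragments; bias where the seeds land and you control where the object shatters.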
Ray tracing has been improved and optimized, and not just for RTX acceleration. It’s more stable, more geometry and material types are supported, and it handles reflection bounce limits more intelligently. Virtual texturing allows for huge textures by streaming them in as tiles based on which pixels are visible. There are better utilities for analyzing the engine’s performance. And there is support for Microsoft’s HoloLens (it was mainly Unity when I was working on the project years ago).
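To give a sense of what virtual texturing is doing under the hood, here is a minimal Python sketch (my own illustration, not Unreal’s implementation): map the UV coordinates of visible pixels to the texture tiles they touch, and only those tiles need to be resident in memory, rather than the entire texture.

```python
def needed_tiles(visible_uvs, texture_size=16384, tile_size=128):
    """Map visible UV coordinates (0..1) to the texture tiles they touch.
    Only these tiles need to be resident in memory, not the full texture."""
    last = texture_size // tile_size - 1  # highest valid tile index
    tiles = set()
    for u, v in visible_uvs:
        tx = min(int(u * texture_size) // tile_size, last)
        ty = min(int(v * texture_size) // tile_size, last)
        tiles.add((tx, ty))
    return tiles

# Three visible pixels land in just two distinct 128x128 tiles,
# instead of paging in a whole 16K texture.
uvs = [(0.0, 0.0), (0.001, 0.002), (0.5, 0.5)]
resident = needed_tiles(uvs)
```

A real system also picks a mip level per pixel and evicts stale tiles, but the bookkeeping above is the heart of it.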
All of the above, for me, just drives the workflow for the Virtual Production Pipeline tools. We’ve all been seeing the example of the dude on the motorcycle in front of Unreal-driven LED screens with a CG canyon behind him. All driven in real time and captured in-camera. That example is the top of the line, with lots of big brains — and money — behind it. But even if you don’t have all that supporting equipment, the production tools are still available to you to start making animations, or pre-planning your next shoot. With VR Scouting, you or your DP can plan shots in VR, dynamically, as your production designer plans out the sets. And you can plan realistic lighting with Cine Tracer (which I will review next time).
I’m really just scratching the surface on the features in Unreal Engine 4.23. If you are a filmmaker, and you haven’t started looking into these tools as a way to optimize your shoots and make them more efficient and less costly — then you are going to fall behind. And you will fall behind fast. Unreal Engine is free, so there are no excuses!
Setellite
Part of incorporating visual effects into film is knowing what was happening on set when the live action was shot. What was the focal length of the lens? What was the camera doing during the shot? Where were the lights? What lights were they? Was anything changed between takes? There is so much information to keep track of. So much so that the Visual Effects Society has been working on standardizing the data, which has been put into a FileMaker Pro database that can be found at www.camerareports.org. But to use it effectively, you need FileMaker Pro. And to make it efficient on set, you can put FileMaker Go on your iPad and work light and agile, inputting the data as you try to stay out of the way of the film crew.
Setellite is an app that streamlines the process. Working with the Visual Effects Society, VFX supervisors and on-set data wranglers, the development team has put together an easy-to-use interface and a UX designed to work the way those on-set teams work. The data syncs to the cloud when a connection is available, and you can also work offline, with syncing resuming once you reconnect. This is perfect when shooting on location, where internet is dicey or non-existent.
The workflow begins with preparing the project with pertinent data: the key crew, shoot dates and the gear (camera bodies, available lenses, lens filters). The internal library is already jam-packed with presets, but you can add custom gear as well.
Then you move into the real work. Using the slate interface, you coordinate with the camera crew and the script supervisor to align the slate numbers and takes to the production slate. Activate the cameras used, and start entering the data: focal length, height, focus distance, tilt, Dutch angle, filters, etc. Use the camera on the iPad to take some reference shots. Tag additional set media if applicable: balls, charts, HDRIs, survey data, LIDAR, etc.
And then the iPad becomes a slate in itself that can be used alongside the production slate. On the clap, the Setellite slate quickly flashes through all the data. In real time it reads as just flashes, but in editorial each frame holds a different set of information (the specific duration can be customized).
All of the collected data can be exported to various formats and conforms to the VES Camera Report Interchange as a CSV spreadsheet, or you can export to PDFs. There are future plans to support Shotgun (and I assume others) as well, so set data will be easily ingested into your production tracker.
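As a rough illustration of what that kind of export looks like, here is a Python sketch that writes per-take records out as CSV. The field names are hypothetical stand-ins of my own, not the actual VES Camera Report Interchange schema.

```python
import csv
import io

# Hypothetical per-take records -- these field names are illustrative,
# not the actual VES Camera Report Interchange schema.
takes = [
    {"slate": "23A", "take": 1, "camera": "A", "focal_length_mm": 35,
     "height_m": 1.6, "tilt_deg": -5, "filters": "ND0.6"},
    {"slate": "23A", "take": 2, "camera": "A", "focal_length_mm": 50,
     "height_m": 1.6, "tilt_deg": 0, "filters": ""},
]

# Write a header row plus one row per take (StringIO stands in for a file).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(takes[0].keys()))
writer.writeheader()
writer.writerows(takes)
report = buf.getvalue()
```

The point is less the code than the shape of the data: one flat row per take, which is exactly what a production tracker like Shotgun can ingest.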
Setellite is available as a standalone license, limited to two devices (about $100). There is also a subscription model with additional features like project collaboration, backups and web readouts (around $17/month for a single user, $28/month for a team of two, and $11/month for each additional user). And if you think that’s expensive, you haven’t had to deal with effects shots that have no data, or worse, the wrong data.
Price: $17/month (single user); $28/month (team of two); $11/month (each additional user). Annual: $209 (single user); $340 (team of two)
Todd Sheridan Perry is a VFX supervisor and digital artist who has worked on many acclaimed features such as Black Panther, The Lord of the Rings, Speed Racer and Avengers: Age of Ultron. You can reach him at firstname.lastname@example.org.