October 2015 Tech Review


V-Ray Cubic VR
Chaos Group V-Ray 3.2

When I asked the guys at Chaos, “How many new features have been implemented in V-Ray 3.2?” their answer was, “Hundreds.” So, I said, “For the review, let’s boil that down to something maybe … less than that.” And here’s what we came up with.

At SIGGRAPH 2015, the Big Thing was Virtual Reality and Augmented Reality and Virtual Augmented Reality. Unsurprisingly, one of V-Ray's highest-profile new features is camera support for VR rendering, specifically targeted (at the moment) to the Oculus Rift and Samsung Gear VR. Basically, you can render stereo 360-degree views in either a cubemap or spherical format, which fit simply and seamlessly into the new technology. I remain unconvinced that the technology will take off as a "film" experience, but I have been convinced of its importance in tons of other fields — education, medical, industrial, mechanical, real estate — and Chaos seems to have taken the hint.
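For the curious, the spherical (equirectangular) format mentioned above boils down to a simple mapping: every pixel of the image corresponds to a direction on the sphere around the camera. V-Ray's internals aren't public, so this is just a sketch of the standard math, with a Y-up convention assumed:

```python
import math

def equirect_to_dir(u, v):
    """Map normalized equirectangular image coordinates (u, v in [0, 1])
    to a unit ray direction, as a spherical 360 camera would.
    u sweeps longitude (the full 360 degrees), v sweeps latitude
    (pole to pole). Y is up in this sketch."""
    lon = (u - 0.5) * 2.0 * math.pi   # -pi .. +pi
    lat = (0.5 - v) * math.pi         # +pi/2 (straight up) .. -pi/2
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)

# The center of the image looks straight ahead along +Z:
print(equirect_to_dir(0.5, 0.5))  # (0.0, 0.0, 1.0)
```

A stereo render does this twice, offsetting the ray origin left and right of center per eye; a cubemap render swaps this mapping for six 90-degree pinhole views.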

More advances are filed under "RealTime" — rendering on the GPU. Depending on your specific video card, your results may vary. But, using the Progressive render to get quick feedback for lighting and shading, V-Ray RT already threw its calculations at the video card rather than the CPU. Now 3.2 has added QMC sampling (the stuff that deals with noise), displacement, composite maps, texture baking, UDIM support, et al. I think they just plan to keep throwing stuff at the GPU to speed things up until all of our display cards explode. But, it means faster feedback, which means we go home earlier in the day — or that the director feels he can tweak things a few more times.

For me, the next big, important feature is Volume Grids — the containers that hold all the FX-y stuff like smoke and fire and explosions. In the not-too-distant past, you may have seen smoke rendered in Houdini's Mantra, water in RenderMan, robots in Arnold and environments in V-Ray. That smoke is a nightmare if your robot, water and buildings happen to be inside it, which is very likely.

See, the smoke had to be rendered with a holdout matte of the other stuff. Depending on how each renderer deals with motion blur, antialiasing, etc., those mattes would never fit properly. (Just ask John Knoll about Pacific Rim.) But now, V-Ray's Volume Grid can import the standard fluid formats — OpenVDB, Field3D and Chaos' own PhoenixFD — so you can render your smoke, with holdouts, in V-Ray. And you benefit from all the other stuff you get from rendering objects in volume, like GI bounce, shadows, all sorts of neat bonuses.

These are the standout features. I’m not even including code optimization and everything that makes the whole thing run faster. Definitely a worthwhile upgrade, especially if you are in the VR space or are doing lighting for effects work.

Bitmap 2

Allegorithmic Bitmap2Material

It’s a common theme in my reviews that sometimes the most innocuous tools save the most time. We get so caught up in our Mayas, Maxs and Houdinis that when something flies in under the radar, we catch the blip and then say, “Huh, of course this is a great idea!”

This is the case for Allegorithmic’s Bitmap2Material application. The guys at Allegorithmic have been getting the most buzz from their Designer and Painter tools — the only way they could have been adopted faster is if someone told Angelina Jolie and Brad Pitt about them.  Everyone seems to be jumping on the bandwagon as physically based shaders are becoming more ubiquitous. But this isn’t about those products. It’s about their little cousin.

Bitmap2Material is a simple, ingenious bit of software that takes a photograph, performs voodoo, and extracts from that photo the maps you need to drive PBR shaders. Normals, height, metallic (i.e., reflectivity), diffuse, ambient occlusion, etc. are all made available for shader development. Caveat: how well it works depends on the quality of the photo.
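Allegorithmic's actual extraction algorithms are proprietary voodoo, but one piece of it — deriving a normal map from a height field — is straightforward to sketch. This is a toy illustration of the core idea (slope becomes normal, via central differences), not B2M's method:

```python
import math

def height_to_normal(height, strength=1.0):
    """Toy normal-map extraction from a 2D height field (a list of
    rows of floats in 0..1), using central differences with edge
    clamping. Real tools like Bitmap2Material do far more (lighting
    removal, frequency separation), but the core slope-to-normal
    step looks like this."""
    h, w = len(height), len(height[0])
    normals = []
    for y in range(h):
        row = []
        for x in range(w):
            # Slope in x and y via central differences, clamped at edges
            dx = height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]
            dy = height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]
            nx, ny, nz = -dx * strength, -dy * strength, 1.0
            length = math.sqrt(nx * nx + ny * ny + nz * nz)
            row.append((nx / length, ny / length, nz / length))
        normals.append(row)
    return normals

# A flat height field yields straight-up normals (0, 0, 1);
# a ramp rising in +x tilts the normals back toward -x.
flat = height_to_normal([[0.5] * 4 for _ in range(4)])
```

In a real pipeline the resulting vectors would be remapped from [-1, 1] to [0, 255] and written out as the familiar blue-purple normal map image.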

On top of that, you can tile your texture in a multitude of different ways to avoid the telltale patterns that evolve from tiled images. And, you can add in additional grime and dirt, which contribute to the organic structure and breakup of the tiling.

If you are familiar with Painter and Designer, then you'll have no trouble getting around B2M. The same 3D display with HDR IBL (although I can't seem to change the HDR image … I'll have to check in about that) and the same 2D map displays are in them all. You get real-time feedback for the material as you make adjustments to the sliders. And all the nomenclature between the applications is consistent.

All this is in the less expensive Indie license. The Pro license brings the magic directly into the Maya, Max, Modo, Unity, Unreal, etc. shader trees. So instead of bringing the images into the standalone B2M and exporting the maps to be brought back into your favorite 3D package, you bring the image into the 3D software, apply it to the B2M node, and you get dynamically generated map extractions inside your shader. Pretty handy.

And remember, the whole Substance suite is available not as a subscription, but more as a rent-to-own plan: You actually own the software at the end of the year. If you want to upgrade, you go through the whole cycle again. But, it gives artists access to the tools without a large upfront cash payment.

Ricoh Theta

Ricoh Theta & HDR 360 Bracket Pro

Obtaining HDR images can be a real pain. For production, you generally need a special mount like a Nodal Ninja with a remote control to bracket the exposure on your DSLR, which needs a lens with a near 180-degree field of view. Then you take those images and stitch them together and flatten the exposures to a single HDR in your preferred software flavor … It's nuts.

So, Ricoh came out with the Theta HDR camera. It's about the size of an Apple TV remote and has two small wide-angle lenses facing opposite one another. You press the button and get an automatic 360-degree selfie (if that's what you like to do). Or you can download the app to your smartphone and control the camera through its built-in wireless, and then transfer the photos to your phone for easy social media-ing.

However, you can take things one step further and download (at least for iOS devices) HDR 360 Bracket Pro. At $59, it's a comparatively hefty price for a phone app, but you're a professional, right? What Instagrammer is taking bracketed selfies?

With HDR 360, you can auto bracket up to nine exposures (I'm not going to say stops because the math is probably not one to one), and control the ISO of the camera from the app. The front and back are automatically stitched into the full 360 view, ready to be moved either to your phone wirelessly or to your computer through USB. Next, you take those images and smash 'em together in Photoshop or Nuke or HDR Studio or what-have-you, and you have a quick and dirty HDR for your game or VFX lighting needs.
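If you've never set up a bracket by hand, the arithmetic behind it is simple: each EV step doubles or halves the exposure time. The app's actual step size isn't documented here (as noted, it's probably not one full stop per frame), so one stop per frame is purely an assumption for illustration:

```python
def bracket_shutter_speeds(base_shutter, frames=9, step_ev=1.0):
    """Shutter speeds (in seconds) for an exposure bracket centered
    on base_shutter. Each EV step doubles or halves exposure time,
    so frames=9 at step_ev=1.0 spans -4 EV .. +4 EV.
    NOTE: one full stop per frame is an assumed step size for
    illustration; the HDR 360 app's real increments may differ."""
    half = (frames - 1) // 2
    return [base_shutter * 2.0 ** (ev * step_ev)
            for ev in range(-half, half + 1)]

speeds = bracket_shutter_speeds(1 / 60, frames=9)
# darkest frame: 1/960 s; middle: 1/60 s; brightest: 16/60 s
```

Those nine frames are what the merge step (in Photoshop, Nuke, etc.) weights together into a single high-dynamic-range image.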

I don't think it's going to be used on the next Marvel feature … The limitations of the hardware prevent the broader exposure range of a DSLR, and it's limited to JPEGs — those 8 bits just wouldn't cut it for high-end production work. But, at $300 (or $359 with HDR 360), you are getting most of the way there for a fraction of the cost, and a fraction of the setup time. A perfect stocking stuffer for that special Lighting TD in your life.
