Conductor Technologies launched its Conductor cloud rendering platform for VFX, VR and animation studios. The first enterprise-grade rendering service of its kind, Conductor can offload regular workloads and burst renders in whatever configuration works best for a particular pipeline or project. Fine-tuned through an extensive beta development cycle, it has already enabled productions to scale to more than 36K simultaneous cores and render over 30 million core hours for big-time flicks like Deadpool, Star Trek Beyond, Transformers: The Last Knight, Pirates of the Caribbean: Dead Men Tell No Tales and The Walk. Studios that have used Conductor include Vrtul, Riot Games, Atomic Fiction and Magnopus.
Conductor is a cloud-agnostic platform, initially backed by Google Cloud Platform. Support for Microsoft Azure is now in private preview. The platform currently supports Autodesk Maya and Arnold, Foundry’s NUKE, Cara VR, KATANA and Ocula, Chaos Group’s V-Ray, Golaem, Ephere’s Ornatrix, and Miarmy. New partnerships with Isotropix for Clarisse and Pixar for RenderMan are being announced at SIGGRAPH, with more to come. Stop by the Microsoft Azure booth #923 for live demos or learn more online.
Epic Games revealed advancements in real-time CG production workflows. Datasmith, a workflow toolkit that simplifies moving data into Unreal Engine for architectural and design visualizations (private beta in August; unrealengine.com/beta), was previewed publicly for the first time at the Unreal Engine User Group. Using data from Italian architects Lissoni and designers at Harley-Davidson Motorcycles, Epic’s Chris Murray showed how Datasmith’s 3ds Max plugin and CAD importers let users import files that retain visual fidelity to the source. This saves hours in data transfer and preparation time, taking the user most of the way to a fully interactive, photoreal, real-time visualization experience.
Epic also demonstrated an Alembic-based pipeline for creating high-quality animated content in real time. This workflow integrates quality animated mesh transformations with skeletal deformations in the engine, in real time. Developed for the Fortnite cinematic trailer “A Hard Day’s Night,” the tools are available in Unreal Engine v4.17, due early August. More on Epic at SIGGRAPH here.
Google Spotlight Stories is showcasing amazing animated VR projects at the show. In the new VR Theater, catch work-in-progress screenings of Son of Jaguar from Jorge Gutierrez and Sonaria by Scot Stafford and Chromosphere. On Wednesday at 10:45 a.m. in South Hall K, the directors will be joined by Kevin Dart (director, Sonaria), Theresa Latzko (Technical Art Lead, Chromosphere), Cassidy Curtis (Technical Art Lead, GSS), with Max Planck and Chris Horne from Oculus Story Studio’s Dear Angelica for the “Behind the Headset” Production Session. Gutierrez will also feature in the VR Theater Director Q&A with Eugene Chung (Arden’s Wake), Eric Darnell (Rainbow Crow) and Tyler Hurd (Chocolate) that day from 2:50-3:30 p.m. in Theater 411, and hold a Production Gallery Meet & Greet at 4:30 p.m. in the Concourse Foyer, Level 1.
OptiTrack has delivered the “missing links” for arcade VR with two key advancements on display at SIGGRAPH, which mark a huge leap forward in quality of experience and usability for single and multi-site deployments of out-of-home VR.
Designed to deliver accurately moving avatars for each of the participants in a multiplayer VR game, the latest in Full-Body Motion Tracking delivers a low-latency, real-time stream of every player’s position, orientation and skeletal pose across the entire playing area. Participants can now see other players through their VR HMDs, significantly enhancing multi-user experiences. OptiTrack Active “pucks” are attached to the hands and feet of each participant, delivering real-time animation for each player present in the experience. The pucks are small (3.75″ x 3.75″) and lightweight (3 ounces), powered by a rechargeable battery and designed for the rigors of VR arcades.
Significantly reducing day-to-day operational complexity, as well as the staffing expertise required to run even the largest VR arcades, OptiTrack’s Self Calibrating Tracking Systems remove the need for the “wand wave” that has been a daily component of motion capture and tracking systems for over 30 years. No calibration maintenance is required following initial installation, and there is no longer a deterioration of the calibration over time. The system simply delivers the accuracy OptiTrack is known for, 100% of the time. Stop by booth 731 or visit OptiTrack online to learn more.
Reallusion revealed real-time face and body motion capture for its iClone 7, following the tech’s award-winning debut. Created in partnership with industry leaders, the dedicated tools turn a single PC into a streamlined (and affordable) rig for character creation, animation and live mocap.
Faceware RT for iClone is a fully independent face mocap tool that pairs highly accurate markerless capture from a regular PC webcam with iClone 7 to record facial motion capture in real time. This gives indies and studios of all levels access to tools geared for ease of use and performance results for any project, previz or production. (Available from Reallusion September 2017.)
To fully animate the facial capture detail, iClone 7 updated the Character Creator 3D Character Generation System to Faceware standards, with up to 60 facial morphs. Features include: compatibility with iClone, Character Creator and DAZ Genesis characters; custom character import via FBX; simultaneous two-face capture; feature-based facial strength filters; live face capture and imported image-sequence face masks for selected capture; optional mouth blend with audio lipsync visemes; refinement and face-key timeline editing; and character morph animation export via FBX.
In partnership with Xsens, iClone 7 now works with Xsens mocap suits and MVN motion capture software. The forthcoming Xsens Mocap Plugin for iClone will join the currently available plugin for Noitom Perception Neuron, adding to the live motion capture options for indies and studios.
Check out the new toolsets at booth #1219 or see more online.