Jawahar Deep Bhatti has worked on landmark projects including Vikings, Orphan Black, Wynonna Earp and Penny Dreadful. These shows, as well as Game of Thrones and Cosmos, have broken new ground in quality, thanks to technological advances and budget boosts in television. Having also worked on the recent feature films Resident Evil: The Final Chapter and Colossal, he has seen these changes up close.
And now, with platforms such as Netflix backing TV exclusives, production houses receive budgets comparable to feature films, and the platforms demand the same output resolution, in some cases higher. In fact, much television is now delivered at 4K, a resolution that holds up even on an IMAX screen.
Evolving for TV
Bhatti recalls, “Back in 2003, it was so different. I worked on a show for Discovery Channel called Blueprint for Disaster, recounting engineering catastrophes around the world. It was very low-budget, and the phrase ‘It should be good enough for TV’ in TV and VFX circles at the time was its justification. Basically, it’s 720 pixels x 486 lines in the NTSC digital frame (the colour encoding used with television signals, which comprises 29.97 frames of video per second), which is just really low resolution.”
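To put “really low resolution” in perspective, a quick pixel count contrasts the NTSC digital frame Bhatti describes with a standard UHD 4K frame (the 3840 x 2160 figure is the common UHD spec, not a number from the interview):

```python
# Compare pixels per frame: NTSC digital video vs UHD "4K".
ntsc = 720 * 486        # NTSC digital: 720 pixels x 486 active lines
uhd_4k = 3840 * 2160    # UHD 4K frame (standard consumer spec)

print(ntsc)                      # 349920 pixels per frame
print(uhd_4k)                    # 8294400 pixels per frame
print(round(uhd_4k / ntsc, 1))   # 23.7 -- roughly 24x the pixel count
```

Every one of those extra pixels has to be modelled, lit and rendered, which is why the jump from broadcast SD to 4K delivery changed the economics of TV VFX.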
Intelligent Creatures, his studio in Toronto, makes the most of the latest VFX developments. A particularly useful piece of equipment on Orphan Black has been Pacific Motion Control’s Technodolly, a programmable telescoping camera crane that can repeat a series of movements the camera operator makes. The silent, minutely accurate hardware allows crews to create camera moves of unlimited length and complexity.
Considering the number of clones audiences have to keep up with in Orphan Black, the VFX team has plenty on its hands. Digitally compositing lead actress Tatiana Maslany’s limbs and face onto body doubles requires high-performance rendering systems such as Redshift, which supports biased rendering techniques for fast, noise-free output and helps artists troubleshoot issues such as splotches and flickering during animations. There are also hands-on methods Bhatti and his team take on: in one scene, a character’s head explodes. To recreate that digitally, they blew up a watermelon in their lunchroom against a neutral backdrop.
“It uses interactive progressive rendering (IPR), where essentially we create something three-dimensional in the software and are able to see the output right away as we drop in shaders and lights where we wish. That allows us to show something immediately to our VFX producers and CG supervisors, who can advise adjustments live.” And because time means money in this industry, the faster the better.
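The core idea behind progressive rendering can be sketched in a few lines. This is a toy illustration, not how Maya or Redshift are implemented: a Monte Carlo estimate is refined in passes, and every pass yields a viewable intermediate result, which is what lets artists judge the image right away instead of waiting for the final frame. The `brightness` function here is a hypothetical stand-in for evaluating shaders and lights at a sample point.

```python
import random

def brightness(x, y):
    # Hypothetical stand-in for shading a sample point in the scene.
    return x * y

def progressive_estimate(passes, samples_per_pass, seed=0):
    """Refine a brightness estimate pass by pass, keeping each preview."""
    rng = random.Random(seed)
    total, count = 0.0, 0
    previews = []
    for _ in range(passes):
        for _ in range(samples_per_pass):
            total += brightness(rng.random(), rng.random())
            count += 1
        previews.append(total / count)  # a usable image after every pass
    return previews

previews = progressive_estimate(passes=5, samples_per_pass=1000)
# Each entry is available immediately; later passes converge toward 0.25,
# the true average of x*y over the unit square.
```

The same coarse-to-fine loop is why changing a light mid-render can show its effect within seconds: the renderer simply restarts the passes and the first rough preview appears almost instantly.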
Bhatti largely favours Autodesk’s Maya for its real-time rendering capabilities: “In the past three or four years, I’ve been using that for modelling, textures and lighting, along with The Foundry’s Nuke for compositing and SideFX’s Houdini. But mostly Maya, because that software pipes through to every other department.”
With more software being released, it’s natural that Bhatti adds it to his skillset. But his love for Photoshop and its ease of use has been pivotal in his work: “Software like Maxon’s BodyPaint and, more recently, The Foundry’s Mari are the choice programmes for painting directly onto 3D models. I was fortunate enough to work with John Knoll, one of the two makers of Photoshop before it was sold to Adobe. I still love Photoshop, though; you can patch a lot of things quicker than in any of the newer software. It’s like a new car: if you buy a 2017 SUV with 500 features, there’s a higher chance of something going wrong, and if that happens, you’ve got to take it to a mechanic who knows how to fix it.”
The coming years will surely bring further advances in VFX for television. With artists less bound by budgets and time, we can expect more visually stunning shows, and way more binge-watching.
Check out Bhatti’s website for more insight.