Digital wizardry in the dream factory

Hours of labour and special digital tools go into creating those breathtaking visual effects for films

March 10, 2013 09:36 am | Updated 04:03 pm IST - Bangalore

Realistic: In the Oscar-winning Life of Pi, while live action captured the boy and the boat, the tiger was pasted into the scene by the visual effects team of Rhythm & Hues. Photo: AP

The success of Life of Pi depended, to a great extent, on the performance that Ang Lee, who won his second Best Director Oscar for the film, drew from his second protagonist: Richard Parker, the Bengal tiger. The director did not, of course, tame a ferocious creature to get remarkable acting; he assembled the right team to bring the tiger to life on screen, in 3D.

About 80 per cent of the sequences involving the tiger were computer generated by Rhythm & Hues, a visual effects company.

In fact, today, more often than not, breathtaking scenes in Hollywood films are computer generated. So, how are these realistic computer graphics created?

Digital artist’s palette

It took hundreds of hours of labour involving about 600 artists from the Rhythm & Hues team to put together the visuals of the Bengal tiger and other scintillating graphics sequences for Life of Pi.

Unlike movies such as Shrek or Kung Fu Panda, Lee's Academy Award-winning film is not strictly animated; it uses a great deal of computer graphics over a base of live-action footage. For instance, in the scenes involving the boy and the tiger on the boat, live action captured the boy and the boat, but the tiger was pasted into the scene by the visual effects team.

At the first level, computer graphics is much like sketching a comic strip. The initial designs are sketched by the artists, not on paper, but on digital slates. Samsung’s Series 7 Slate or tablets such as the iPad with a digital pen are used by digital artists to draw the first form.

The touchscreens capture the artist's strokes via digital pen and store them in digital formats, which are further enhanced with tools such as Adobe's After Effects or processed using specialised programs.

Proprietary tools such as Maya and free software such as Blender can also be used to create computer graphics entirely within the software. Such approaches are used when an entire landscape or a character is to be created independently, as in most animated films.

The wonder of 3D does not simply lie in the dark glasses one wears while watching the movie. Making a 3D film involves its own complexities.

For the 3D effect

When we look closely at an object, each eye sees a slightly different view (verify this by looking at an object with only one eye at a time). This difference, called binocular disparity, gives humans 3D, or stereoscopic, vision.

So, the trick in making 3D movies is to capture scenes using two different lenses that are separated by a small distance, much like the human eyes, or by replicating a similar effect using computer generated images.

The 3D glasses isolate the two captures and give a slightly offset visual feed to each of our eyes, making our brains believe we are seeing in 3D.
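As a rough sketch, the horizontal offset between the two views, known as disparity, can be estimated with a simple pinhole-camera formula; the focal length, baseline and depths below are illustrative assumptions, not figures from the film:

```python
# Toy model of stereoscopic disparity. For a point at depth Z, two
# cameras separated by baseline B with focal length f (all in the same
# units) see it shifted horizontally by d = f * B / Z. Nearer points
# shift more, which the brain reads as depth.

def disparity(focal_length_mm, baseline_mm, depth_mm):
    """Horizontal offset on the sensor between left and right views."""
    return focal_length_mm * baseline_mm / depth_mm

# A human-like baseline of ~65 mm and a 35 mm lens (assumed values):
near = disparity(35, 65, 1_000)    # object 1 m away
far = disparity(35, 65, 10_000)    # object 10 m away
print(near, far)                   # the nearer object shifts ten times more
```

The same relationship works in reverse: stereo software estimates depth from measured disparity.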

Getting the science right

The use of sophisticated software can make the creation seem real, but if Richard Parker did not appear soaking wet after jumping into the water, the digital tiger would remain unconvincing, no matter how realistic it looked. Simulating physics with decent fidelity is a crucial aspect of making special effects come alive.

Rhythm & Hues sought help from researchers from the Applied and Computational Mathematics Department at the California Institute of Technology to mathematically model the response of the tiger. This complex mathematical model was applied to the digital tiger, which could then roar, pounce, jump and swim as a real tiger would.
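The model built for the film is proprietary and far more elaborate, but the basic idea of physics-based animation can be shown with a toy damped-spring simulation, a common building block for secondary motion such as jiggling flesh or fur; all parameters below are illustrative assumptions:

```python
# Semi-implicit Euler integration of a damped spring: each step updates
# velocity from the net force, then position from the new velocity.

def simulate_spring(x0, steps=1000, dt=0.001, k=50.0, c=4.0, m=1.0):
    """Return the displacement of a damped spring released from x0
    after `steps` time steps of size `dt`."""
    x, v = x0, 0.0
    for _ in range(steps):
        force = -k * x - c * v      # Hooke's law plus viscous damping
        v += (force / m) * dt
        x += v * dt
    return x

# Released from a displacement of 1.0, the motion decays toward rest:
print(simulate_spring(1.0))        # a small residual oscillation
```

Repeated at every frame for thousands of coupled points, this kind of integration is what lets a digital body respond believably to a pounce or a splash.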

Putting it together

Once the graphic content and the physics model are ready, binding the art to the behavioural model produces the scene. This step is commonly termed 'rendering'.

This step requires a humongous amount of computation. Life of Pi was shot at 48 frames per second, implying that we would be watching 48 high-resolution images every second. Rendering each of these frames for a 127-minute movie requires a lot of computing power; a single high-end computer, or even a server, is inadequate.
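A quick back-of-the-envelope calculation with the figures above shows the scale; the per-frame render time is an illustrative assumption, not a figure from the studio:

```python
# Frame count for rendering, using the article's numbers.
fps = 48
minutes = 127
frames = fps * minutes * 60      # frames for one eye
stereo_frames = frames * 2       # 3D needs a left and a right view

# If each frame took, say, two core-hours to render (assumed):
core_hours = stereo_frames * 2
print(frames, stereo_frames, core_hours)
```

Even under these modest assumptions the total runs into hundreds of thousands of frames, which is why studios turn to large grids of machines.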

Movie rendering today uses grids of computers and, increasingly, cloud computing services. DreamWorks, for instance, used cloud servers powered by Intel Xeon processors to render Kung Fu Panda, which required 55 million hours of rendering time, executed in parallel across multiple CPU cores on the cloud.
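Using the article's 55-million-hour figure, a rough estimate shows how spreading the work across many cores shrinks the wall-clock time; the core counts here are illustrative assumptions:

```python
# Total compute stays the same; wall-clock time falls roughly linearly
# with core count (ignoring scheduling and I/O overhead).
total_render_hours = 55_000_000

for cores in (1_000, 10_000, 50_000):
    wall_clock_days = total_render_hours / cores / 24
    print(f"{cores:>6} cores -> {wall_clock_days:,.0f} days")
```

At a thousand cores the job would still take years of wall-clock time, which makes clear why render farms and cloud fleets run tens of thousands of cores.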

With so much work going into making a film, it may well be the technology, rather than the actors' performances, that gives you that unforgettable movie experience.
