Posted on March 25, 2011

Animation company aims for 3D without nausea and headaches

Globe and Mail

Friday, Mar. 25, 2011 10:21PM EDT

by LIAM LACEY

A spot-bellied little animated bluebird flutters against a garden backdrop of plants and mulch. His ancestors may have been the helpmates of Cinderella in Disney’s 1950 movie, but there are some big differences. He’s starring in Lovebirds, a short movie being shot with new techniques to push forward the boundaries of three-dimensional film.

Lovebirds, a mix of animation and live action by Toronto company Starz Animation, is the showcase production of the Toronto-based 3D Film Innovation Consortium, a York University initiative that has brought academic researchers and filmmakers together to explore the burgeoning world of 3D filmmaking and achieve better results.

The movie, which unites new research into visual perception with the practical aspects of 3D filmmaking, is part of an attempt to boost the local film economy and improve the 3D viewing experience – with less nausea, eye strain and headaches.

The computer-generated animation portions were created by Starz (which did the 3D animation for the recent Disney feature Gnomeo and Juliet). The live-action set was shot by York University professor Ali Kazimi using a LiDAR device (light detection and ranging, or laser radar) to create a 3D map of the set. That map was integrated in the software with the animated images to ensure accurate placement of the birds against the backdrop and to study depth perception.

Though the current Hollywood wave of 3D production is a few years old, it was really in 2009, with the success of such films as Coraline, Up and especially Avatar, that it truly took off. Rob Burton, Starz’s vice-president of technology, decided to jump in.

Initially, Lovebirds was intended as a 2D short, developed as an in-house experiment. When York’s 3D Flic program began looking for industry partners, Mr. Burton saw a learning opportunity.

“What appealed to me was that they were working on the kind of information we needed. What we started to understand is that you’re fooling your vision system into believing there’s depth when there isn’t, and it becomes very challenging,” says Mr. Burton.

“In real life, our eyes converge and focus at the same time. Stereoscopic film goes against our natural instincts, which, if you’re not careful, can make for a very uncomfortable experience. For the first month we were working with 3D, I came home every day with nausea.”

The technology of simulating depth perception through two proximate overlapping images (which is how humans see) is simple enough, but creating a realistic sense of visual movement is much harder. Our eyes don’t zoom in and out like a camera, and the distance between the cameras (known as interaxial or interocular distance) has to be constantly adjusted. The movies are typically shot with cumbersome rigs, in which two cameras are placed perpendicular to each other, one shooting straight ahead and the other capturing the image from a mirror, with each camera capable of being tilted to create converging sight lines.
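To see why the camera spacing matters so much, here is a minimal Python sketch (a textbook approximation, not anything from the Starz or York pipeline) of how an object’s on-screen parallax depends on the interaxial separation, the lens focal length and the convergence distance:

```python
# Toy model of stereo parallax (an illustration, not production code).
# Assumes a parallel camera pair whose images are shifted so that objects
# at the convergence distance land at zero parallax (the screen plane).
def screen_parallax(interaxial_mm, focal_mm, convergence_mm, object_depth_mm):
    """Approximate horizontal parallax on the sensor, in millimetres.

    Positive values appear behind the screen plane, negative in front of it.
    """
    return interaxial_mm * focal_mm * (1.0 / convergence_mm - 1.0 / object_depth_mm)

# Halving the interaxial distance halves the depth effect, which is one
# reason the spacing has to be readjusted from shot to shot.
print(screen_parallax(65, 35, 2000, 5000))  # roughly human-eye spacing
print(screen_parallax(30, 35, 2000, 5000))  # narrower spacing, flatter depth
```

In this simplified model the parallax scales in direct proportion to the interaxial distance, so shrinking the camera spacing (as a rig operator might for close-up or miniature work) flattens the apparent depth of the whole scene.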

There are also aesthetic decisions. With Lovebirds, Prof. Kazimi and the team gradually adjust the perspective from a human point of view to one that’s much closer to what a bird-sized human would see.

Prof. Kazimi, whose background is in documentary filmmaking, is cautious about the kind of sweeping generalizations that are currently being thrown around about 3D film language, but he believes it heralds fundamental changes in film storytelling, especially in slowing down the pace of films.

“There’s a lot more visual information for the viewer to absorb and you need to provide the time,” he says.

In contrast to the trend toward shallow depth of field, quick cuts and natural lighting used in films such as The Bourne Identity, 3D favours the use of light as a tool for providing depth cues, reviving lost or forgotten arts. Prof. Kazimi suggests filmmakers may have to go back to the model of Gregg Toland, the famous cinematographer on Citizen Kane, and the use of chiaroscuro in the Renaissance paintings of Caravaggio.

His York colleague, psychologist Laurie Wilcox, is studying how people see 3D, including issues of ghosting, image disparity and motion that can make the experience unsatisfying. Simple things such as screen size and even where you sit in the theatre make a big difference. By sitting in the middle, or toward the back, the viewer can enjoy the most comfortable experience. Seats on the aisles, she suggests, “should probably be discounted.”

Complicating the 3D experience is the issue of “vection,” the illusion of self-motion that can occur while watching 3D. For some, it may create motion sickness. For others, that sense of getting lost in the screen is what 3D is all about. Prof. Wilcox recalls that feeling when she saw the scene of a passing jellyfish in the 1994 IMAX film, Into the Deep. “Every once in a while you get one of those transcendent experiences in a 3D film. That one was my favourite so far.”

Lovebirds will get its world premiere at the Toronto International Stereoscopic 3D Conference, June 11-14 at the TIFF Bell Lightbox in Toronto.

USING MILITARY PRECISION

A LiDAR (Light Detection and Ranging) scanner, used on the Lovebirds set, is often deployed in geology, archaeology, atmospheric physics and military operations. The technology is used to map a 3D surface with laser beams, analogous to the flash bulb of a 2D camera.

As the laser beam is emitted, a counter is triggered until the reflected beam returns; half the round-trip time of flight, multiplied by the speed of light, gives the distance to each point. The data are used to create a 3D model of the set, to be integrated with the animated images.
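For a rough sense of the arithmetic, here is a toy Python sketch (an illustration, not the scanner’s actual software) that turns a pulse’s round-trip time and the beam’s direction into a 3D point:

```python
# Toy LiDAR arithmetic (an illustration, not real scanner firmware).
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def time_of_flight_to_distance(round_trip_seconds):
    """Distance to the surface: half the round trip, at the speed of light."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

def to_point(distance_m, azimuth_rad, elevation_rad):
    """Convert a range and the beam's direction angles into x, y, z coordinates."""
    x = distance_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance_m * math.sin(elevation_rad)
    return (x, y, z)

# A pulse that returns after about 33 nanoseconds has hit a surface
# roughly five metres away; sweeping the beam builds up the point cloud.
d = time_of_flight_to_distance(33e-9)
print(round(d, 2), to_point(d, math.radians(30), math.radians(10)))
```

Repeating this for millions of beam directions produces the point cloud that was imported into Starz’s animation software.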

A small cardboard cut-out of the bluebird was placed on the table-top set at the Cinespace Film Studios. The 3D LiDAR map was then imported into the computers to be integrated with the animated image of the bird. “We needed to make sure our bird was standing on the actual ground, rather than walking on air,” explained Starz’s Rob Burton.

There were some glitches. Between mapping the set and shooting the film image, the mulch settled, leaving a space between the bird’s feet and the ground in some frames. Computer-generated mulch had to be created to keep the bluebird from walking on air.

Sources: York University and Starz Animation