Friday, July 19, 2013

Monster mayhem: Pacific Rim

It’s not hard to see why the prospect of ILM creating giant monsters versus giant robots for Guillermo del Toro’s Pacific Rim was something many filmgoers were hungry to see. Perhaps aware of that anticipation, and knowing the challenges that lay ahead in realizing the enormous ‘Kaiju’ versus ‘Jaegers’ fights on screen, the studio retooled many elements of its effects pipeline for the show, including simulation, lighting and rendering.
Up close on Striker Eureka, a Jaeger.
“The battle scenes that we tackled are all quite complex in just the recipe that is required to make a shot,” notes senior ILM visual effects supervisor, and now chief creative officer, John Knoll, “so when we’re in the ocean, there’s the robot and the monster and it’s raining, with rain striking the creature, being flung off the creature and cascades of water. The character’s disturbing the ocean, the foam, the sub-surface bubbles and mist and there’s a cocktail of many, many items that all have to be balanced just right to get a shot done.”
We explore ILM’s work for Pacific Rim. The studio was aided in its efforts by Ghost FX, Hybride, Rodeo FX and Base FX. Plus we highlight practical and model work by Legacy Effects and 32TEN Studios, the main title and main on end title design by Imaginary Forces, and the stereo conversion by Stereo D. In addition, Mirada produced the film’s two-minute prologue made up of documentary-style footage and VFX.
Above: watch a featurette on the VFX work.

Giant freakin’ robots and killer Kaiju

Twenty-five-storey-tall robots brought with them, of course, several challenges. Animation, scale, weight, interaction, rendering – ILM had to solve all of these in order to convince audiences that the Jaegers could move in the dynamic way that they do in battle. “One of the bigger challenges with these battles is that accurate physics would dictate that everything would move slower,” says Knoll, adding that del Toro did not want the action to feel too slow. “He wanted to hypothesize that the mode of force behind these robots was strong enough that they could push through all that air drag, and that it’s OK to push it through a little faster.”
“There’s also then figuring out the scale issues,” adds ILM animation supervisor Hal Hickel, “figuring out how fast can we move. We didn’t want everything to feel like it was underwater or in super slow motion – typically you go very slow trying to sell that scale, but then all the fights are happening this fast. We couldn’t have that. We had to figure out how fast to go and still have it feel big.”
Gipsy Danger rescues a fishing boat early in the film.
“And then on top of all that was how to invest in a sense of it being a machine,” continues Hickel. “Because we didn’t want to use motion capture – we talked about it at the beginning of the project and motion capture’s a great tool for certain things, but for this project, particularly with the Jaegers, we just really wanted to make sure it wasn’t completely fluid or organic, that it felt robotic.”
Hickel had time to complete a few test animations for del Toro to establish the look of the colossal robots. “I just had an early version of Gipsy walking down the street with some attitude,” he says, “partly to play with attitude but also play with speed and what are cool looking shots to help sell size. So I had her walking over camera and then with a camera tracking perpendicular to the line of action and tracking along with buildings in the foreground.”
Gipsy Danger in the streets of Hong Kong.
The animators worked in Maya and not only had to take note of the hugely complex moving parts of the Jaegers, but also had to be mindful of the ocean they might be standing waist-deep in, or the buildings they were ‘crunching’. Interestingly, their work would occasionally be completely masked by a huge splash of whitewater during a fight – something that only became apparent once simulations had been completed. Another issue was getting the characters to read at night. “We had this great idea of using the helicopters circling around all the time, so you see the shafts of light cutting through the whitewater and silhouetting the characters,” says Hickel.
As the battles rage in Pacific Rim, ILM also had to deal with progressive destruction on the Jaegers. “For Gipsy, we had versions A, B, C, D and each of those had a 1, 2, 3, 4,” explains co-visual effects supervisor Lindy DeQuattro. “I think we ended up with about 20 different versions of Gipsy. After each battle she would have new scuffs, dents and scratches, burns. Then she’d go in for repairs and get an arm replaced.”
Like the Jaegers, the Kaiju monsters came in several different forms. That meant ILM also had to look at a wide range of references for the creatures – from crabs to gorillas, reptiles, bats, dinosaurs and, of course, previous Kaiju and monster movies. The organic nature of the Kaiju and their incredibly large muscles meant that the studio turned to shapes for animation rather than anything automatic or built into the rigging. Flesh sims for wobble and muscle tensing were added, however. “We chose to do that,” says Hickel, “because the Kaiju were so radically different from one another – it didn’t seem economical for us to set up an overall muscle system for the show because it didn’t seem portable from one to the next. I thought, we don’t need a huge amount of muscle detail – we need large masses that do certain things at hero moments, so to me that said shapes.”
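As a rough illustration of that shapes-based approach – purely a hypothetical sketch, not ILM’s rig – a corrective blendshape can simply be dialed in from a joint angle at the hero moments where it is needed:

```python
# Hypothetical sketch of the "shapes" approach described above: instead of a
# full muscle system, a corrective blendshape is dialed in only where it is
# needed, e.g. a bicep-bulge shape driven by how far an elbow joint is bent.
# The shape name, angle range and easing are illustrative assumptions.

def smoothstep(t: float) -> float:
    """Ease-in/ease-out curve so the shape ramps on without popping."""
    t = max(0.0, min(1.0, t))
    return t * t * (3.0 - 2.0 * t)

def bulge_weight(elbow_angle_deg: float,
                 start_deg: float = 30.0,
                 full_deg: float = 110.0) -> float:
    """Return a 0..1 blendshape weight from the current elbow bend."""
    t = (elbow_angle_deg - start_deg) / (full_deg - start_deg)
    return smoothstep(t)

# At a hero-moment pose the shape is fully on; in a relaxed pose it
# contributes nothing and the base sculpt reads through.
for angle in (10.0, 45.0, 90.0, 120.0):
    print(f"elbow {angle:5.1f} deg -> bicep_bulge weight {bulge_weight(angle):.2f}")
```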
A Kaiju ready for attack.
Many of these battles occur in water, with the creatures either waist-deep or completely submerged. For fluid simulations, ILM relied on its existing proprietary toolkit, but also moved into Houdini for elements like water cascading off the characters and for particulates. “We use Pyro for some of that too, and Plume,” says DeQuattro. “We have a variety of tools that we can use depending on what we’re trying to achieve.”
To help show the detail of the Jaegers and Kaiju, ILM elected to go with Arnold as its primary renderer for the robots and monsters (V-Ray was also used to help achieve environments and RenderMan remained the main renderer for water). Solid Angle founder and CEO Marcos Fajardo helped ILM with the Arnold transition. “Pacific Rim was an incredibly complex project to work on,” he says. “The amount of geometric detail, the amount of texture detail – some of the shaders have 2,000 nodes and it was really difficult to render those shots. We spent some time with ILM on site optimizing the render, seeing what the hot spots were and improving the renderer. As a result, all of our customers will benefit from this optimization. Things like texture threading – very important for Pacific Rim. It was just crazily complex and we just had to double our efforts to optimize rendering for that movie.”

Above: watch Gipsy Danger and a Kaiju in action.
Katana was another tool ILM more fully embraced after experimenting with it on Mission: Impossible – Ghost Protocol, where Arnold had also been used to some degree. “The greatest thing about Katana,” says DeQuattro, “is that we could really light multiple shots from one scene file, which is not something we were previously doing. It made it easy for say a sequence supervisor to really light just about every shot in the sequence. We had some scenes where the sequence supervisor lit all the shots out of one scene file.”

Destroying cities

If giant robots and monsters were not enough, then recreating cities – and destroying them – became additional challenges. For Hong Kong, which bears the brunt of a fight mid-town and in its docks area, ILM scouted the city and shot moving footage and stills. “We selected a lot of streets that were the basis for where the fight was going to travel along,” explains DeQuattro. “And we had to widen the street because the creatures are so big they don’t actually fit between the high rises!”
Sydney is one of the cities attacked in Pacific Rim.
There were then two approaches to modeling CG buildings and architecture. If the buildings were not going to be damaged, ILM called on its digital matte group. But if destruction work was required – as, say, when Gipsy Danger and a Kaiju rampage through Hong Kong – the studio turned to its ‘DMS’ pipeline: model, paint, simulation, destruction.
For the first time, ILM relied on Houdini for destruction effects rather than its traditional internal tools. “In Houdini you want to approach things as procedurally as possible,” says DeQuattro, “so you make definitions of how a pane of glass will break, how a piece of metal will break, how a piece of cement will break, and then name things appropriately so they will be dealt with accordingly inside of Houdini. We had to think about things from the beginning as we were modeling them. How do we want this to break? What material is this made out of? And make sure it was put together correctly.”
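The naming idea DeQuattro describes can be sketched in a few lines of Python – the material tags and fracture settings below are invented for illustration, not ILM’s actual setup:

```python
# A minimal illustration of the naming convention idea: if every piece of a
# building model carries a material tag in its name, a procedural destruction
# setup can route it to the right fracture recipe automatically. The material
# names and parameters here are hypothetical placeholders.

FRACTURE_RECIPES = {
    "glass":    {"solver": "shatter",   "pieces": 400, "edge_detail": "sharp"},
    "concrete": {"solver": "voronoi",   "pieces": 120, "edge_detail": "chipped"},
    "metal":    {"solver": "bend_tear", "pieces": 12,  "edge_detail": "torn"},
}

def recipe_for(piece_name: str) -> dict:
    """Pick fracture settings from the material prefix baked into the name."""
    material = piece_name.split("_", 1)[0]
    if material not in FRACTURE_RECIPES:
        raise ValueError(f"unknown material tag on piece: {piece_name!r}")
    return FRACTURE_RECIPES[material]

for piece in ("glass_curtainwall_014", "concrete_floor_03", "metal_ibeam_27"):
    print(piece, "->", recipe_for(piece))
```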
A Jaeger appears in the rubble in a flashback sequence.
Deep compositing in Nuke, another technology adopted by ILM on a larger scale for this show, became crucial in working with the Jaeger/Kaiju fights and the resulting destruction. “It helps solve a problem that’s always existed with volumetrics and motion blur holdouts,” notes Knoll. “I’m really familiar, from previous shows, with having to do two renders – a hold-out render and a non-hold-out render. You always end up with funny edges around your object that you have to split in the non-hold-out version. There’s a lot of manual work – time is money. The deep compositing gets you past a lot of that. It’s not free. You pay the price in disk space.”
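A toy example helps show why deep data sidesteps holdout renders: if every pixel carries depth-sorted samples, separately rendered elements can simply be merged and flattened in depth order. This sketch is conceptual only, not ILM’s or Nuke’s implementation:

```python
# A toy illustration of deep compositing: each pixel stores a list of
# (depth, color, alpha) samples, so two elements rendered independently can
# be merged by sorting their samples in depth and compositing front to back,
# with no holdout render required.

def deep_merge(samples_a, samples_b):
    """Merge two deep pixels; each sample is (depth, (r, g, b), alpha)."""
    return sorted(samples_a + samples_b, key=lambda s: s[0])

def flatten(samples):
    """Front-to-back 'over' of the merged samples into a flat pixel."""
    out_rgb, out_a = [0.0, 0.0, 0.0], 0.0
    for _, rgb, alpha in samples:
        weight = (1.0 - out_a) * alpha
        out_rgb = [c + weight * s for c, s in zip(out_rgb, rgb)]
        out_a += weight
    return out_rgb, out_a

# A mist element and a robot element rendered separately; made-up sample values.
mist  = [(5.0, (0.8, 0.8, 0.9), 0.3), (20.0, (0.7, 0.7, 0.8), 0.4)]
robot = [(12.0, (0.2, 0.3, 0.5), 1.0)]
print(flatten(deep_merge(mist, robot)))
```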

Re-creating locations

While many sequences wound up being completely computer generated, a major effort involved the collection of on-set camera data, photo reference and other reference material for the live-action sequences requiring visual effects, and to inform the CG creations. The ILM team oversaw LIDAR set scans, HDRI capture and detailed photo modeling of props and set pieces. “We’d do two different levels of this,” says DeQuattro, “a high angle and a low angle, and then we’ll go around in a circle and take say eight photos from different angles. It allows us to do a rough reconstruction of a model that we can use for layout purposes.”
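The capture pattern DeQuattro describes – two heights, eight views around the subject – amounts to two rings of camera positions; the radius and heights in this small sketch are placeholders, not measured values:

```python
# A small sketch of the capture pattern described above: two camera heights
# ("high angle and low angle") and eight positions around the prop. The
# radius and heights are placeholder numbers for illustration only.
import math

def capture_positions(radius=3.0, heights=(0.8, 2.2), views=8):
    """Yield (x, y, z) camera positions on two rings around the subject."""
    for z in heights:
        for i in range(views):
            theta = 2.0 * math.pi * i / views
            yield (radius * math.cos(theta), radius * math.sin(theta), z)

for pos in capture_positions():
    print("camera at %.2f, %.2f, %.2f" % pos)
```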
Inside the Shatterdome.
“We had these huge, large-scale soundstages full of greenscreen and the characters, and then had to turn it into the Shatterdome,” says co-visual effects supervisor Eddie Pasquarello, who notes the space was akin to where they might build the space shuttle – but this time for multiple 25-storey-tall Jaegers.
Production built partial sets on the greenscreen Toronto stage, which were then extended by the visual effects team at Base FX in China using ILM assets. Oftentimes, artists would also replace the existing floor to deal with reflections and make it look more soaked. For shots where actors would have to be looking at or interacting with a computer-generated effect, ILM opted for a relatively low-tech solution. “We would take a laser pointer and say ‘Look here!’,” says DeQuattro. “We knew approximately how tall the Jaegers were, so with some quick calculations on set we could figure out the right angle.”
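That on-set calculation is simple trigonometry. In this rough worked example the Jaeger height (about 25 storeys) and the distances are assumptions for illustration only:

```python
# Rough version of the on-set eyeline math: knowing roughly how tall a Jaeger
# is and how far away it is meant to be standing, the look-up angle for the
# laser pointer follows from basic trigonometry. Heights and distances here
# are illustrative assumptions, not production figures.
import math

JAEGER_HEIGHT_M = 25 * 3.0   # ~25 storeys at roughly 3 m per storey (assumed)
EYE_HEIGHT_M = 1.7           # actor's eye level (assumed)

def lookup_angle_deg(distance_m: float,
                     target_height_m: float = JAEGER_HEIGHT_M) -> float:
    """Angle above horizontal for an actor looking at the Jaeger's 'head'."""
    return math.degrees(math.atan2(target_height_m - EYE_HEIGHT_M, distance_m))

for d in (20.0, 50.0, 100.0):
    print(f"{d:5.0f} m away -> look up about {lookup_angle_deg(d):4.1f} degrees")
```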
Battles occur near Hong Kong harbor.
However, occasionally an augmented reality tool on an iPad was also employed to help nail down shots of wide environments, such as the mammoth Shatterdome. “We used rendered panoramas of what the virtual environment looks like from the center of the stage where we were,” says Knoll. “Then we had a gyro-enabled panorama viewer that we would just have on my phone or on my iPad.”
ILM also took the camera data from the on-set RED EPICs, along with the preferred lenses and shooting style of del Toro and DOP Guillermo Navarro, and would often replicate these in its CG camera moves. “Guillermo del Toro has certain camera moves he loves, such as a PIJU – a push in and jib up – which is his favorite camera move,” relays DeQuattro. “He is a big fan of indicating the camera operator – he really likes the sense that you’re trying to find the subject, you’re changing focus during the shot, that you’re doing something that indicates there’s a man behind the camera. So we do that with the digital camera so they have the same feel.”
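A hypothetical sketch of such a move in CG terms – a push-in combined with a jib-up, plus a little low-frequency drift to suggest the operator – might look like the following; all numbers are invented:

```python
# Hypothetical sketch of a "PIJU"-style CG camera path: the camera pushes in
# toward the subject while jibbing up, with a faint wobble to suggest a human
# operator, as described above. The values are made up for illustration.
import math

def piju_camera(t, start=(0.0, 1.5, 20.0), push=8.0, jib=3.0):
    """Camera position at normalized shot time t in [0, 1]."""
    ease = t * t * (3.0 - 2.0 * t)         # smooth push/jib timing
    wobble = 0.03 * math.sin(t * 13.0)     # faint operator drift
    x = start[0] + wobble
    y = start[1] + jib * ease              # jib up
    z = start[2] - push * ease             # push in toward subject
    return (x, y, z)

for frame in range(0, 49, 12):
    print(frame, piju_camera(frame / 48.0))
```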

In this featurette, see how the pilot sequences were filmed.

How to operate a Jaeger

Inside the Jaeger heads – known as Conn-Pods – two pilots control the robot via ‘drifting’, in which their minds are locked in a neural bridge. They can then operate the Jaeger by physically performing the required actions while connected to each other and the robot itself via mechanical ‘stilts’. Production filmed the pilot actors on a gimbal-operated set that was partially fitted out with interiors, cables and other details, and that also allowed interactive water, sparks and movement to be filmed in live action. Legacy Effects was responsible for the practical Conn-Pod machinery of pull levers, springs and metal parts. The studio also fashioned the pilot suits and helmets. Extra moving machinery and holograms would be added via visual effects.
Inside a Conn-Pod.
Rodeo FX, in cooperation with ILM, handled the Conn-Pod interiors that combined the practical work with CG additions. Also in collaboration with ILM, Hybride created the holographic projections for the Jaeger cockpits and hand disk and arm grid holograms around the pilot suits. For the film, Rodeo also delivered an alien brain – the studio’s first organic creature – as well as matte painting and rain sim work for environments of Hong Kong including the helipad and Shatterdome. And Hybride crafted holographic projections inside control rooms (Loccents) and on various monitors throughout the film.
Above: see how Legacy Effects built the Conn-Pod mechanics for use on set.

Practically speaking

32TEN Studios, working with ILM, contributed several practical effects too. For one sequence in which the fist of a Jaeger comes through an office building, 32TEN dressed an area with office cubicles at quarter scale, shooting with RED EPICs on a 3D rig.

Miniature cubicle.
Soccer stadium seating.
From left: model supervisor Nick Dabo, director Guillermo del Toro and ILM VFX supervisor John Knoll.

The studio also produced shots of soccer stadium seats being ripped apart when a Jaeger crashes into a stadium. For this, quarter-scale seats were blown apart using air cannons. 32TEN’s work also extended to the creation of dust clouds, breaking glass and water effects used by ILM to comp into the film’s massive destruction scenes.

Battles in stereo

Stereo D converted over 1900 shots in Pacific Rim to 3D, oftentimes relying on multiple elements from ILM to make up the shots. ILM also completed several sequences in full stereo itself. John Knoll notes that the style of the film allows for some natural stereo moments to occur, such as “debris from an impact or as the characters are fighting each other,” but that the style of the battles with ‘human type’ coverage meant that foreground objects were often needed to show dynamic range. However, he says, when the characters are in the ocean, the artists could engineer some of the water coming off them towards the camera to make better stereo moments.
Foreground elements and water splashes were useful stereo components.

An epic end

Pacific Rim concludes with a rousing main-on-end titles sequence created by Imaginary Forces, featuring macro shots of Jaegers and Kaiju in action as well as iconography from the film. Creative director Miguel Lee oversaw the work of a small team that included designer Ryan Summers. We talked to Summers about the main-on-ends and the main titles, also produced by Imaginary Forces.
Above: watch the main-on-end titles by Imaginary Forces.
“The main principle was to focus on the Jaegers and Kaiju in a way that allowed the audience to take a breath and marvel at them in a way we couldn’t during the breathtaking, action-packed fights,” says Summers. “After 2 plus hours, the audience should be exhausted and need a breather.”
IF realized several style frames, ranging from schematics to anime-inspired designs to full particle-based procedural solutions. “In my opinion, the genius behind Miguel’s approach is that he always designs to the brief in terms of limitations on time or budget, but always finds a way to reach beyond those limitations creatively,” adds Summers.
CINEMA 4D was the tool of choice for the main-on-ends, with 30 title cards output at 2K over a production time of six weeks (by two full-time artists, an intern and a Flame artist). “We carved out a couple of days at the very beginning trying to optimize one of the scenes we used for our style frames,” says Summers. “We made a rule that no frame could take more than 5 minutes on our render farm in order to make sure we could deliver the flat version on time and save ourselves a couple of weeks to figure out how to deliver a stereoscopic job. We hadn’t done a stereo delivery with our little team up until now.”
“Thankfully we were able to build out some render presets that let us get really close to that rule, as well as a super fast and dirty preset that let us test our ‘lighting’,” adds Summers. This was almost entirely based on reflections, as Summers explains. “We ended up turning all of C4D’s whiz-bang features off: no GI or AO, no motion blur or depth of field. We aimed to address as much as possible in post, at both the After Effects stage and with some of the great tools Flame has to offer. We took the rare tactic of folding our in-house Flame artist into the conversation from the pitch stage, and the returns were tenfold. We ended up rendering with slightly lower anti-aliasing settings than I’m normally comfortable with and sent out tons of object buffer passes to drive frame-averaging and various blurs in the Flame to save time on our renders. A well-prepped Flame artist can work small miracles!”
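As a back-of-envelope check on that 5-minute rule – with frames per card and farm size assumed purely for illustration – the flat delivery stays comfortably inside the schedule:

```python
# Back-of-envelope version of the render budgeting described above. Only the
# 5-minute-per-frame cap and the 30-card count come from the text; frames per
# card and farm size are assumptions for illustration.

MINUTES_PER_FRAME = 5          # the team's self-imposed cap
CARDS = 30                     # title cards delivered at 2K
FRAMES_PER_CARD = 120          # assumed ~5 seconds per card at 24 fps
FARM_SLOTS = 20                # assumed concurrent render slots

total_frames = CARDS * FRAMES_PER_CARD
total_minutes = total_frames * MINUTES_PER_FRAME
wall_clock_hours = total_minutes / FARM_SLOTS / 60.0

print(f"{total_frames} frames, {total_minutes / 60.0:.0f} render-hours total")
print(f"~{wall_clock_hours:.1f} hours of wall-clock time on {FARM_SLOTS} slots")
print(f"stereo roughly doubles that to ~{2 * wall_clock_hours:.1f} hours")
```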
A key frame from the main-on-ends.
To create the robots and monsters, Imaginary Forces received models direct from ILM to work with in C4D. “We started with their previs models, heavily leveraging C4D’s Motion Camera and Camera Morph tags to rapidly previs out ideas,” says Summers. “Each artist shot tons of ‘coverage’ for each Kaiju and Jaeger, handing off a bunch of QTs to our in-house editor. Tons of fun was had matching the appropriate title card to the perfect mecha or monster. We were particularly proud to get Hal Hickel and John Knoll’s credits in our San Francisco scene.”
For stereo, IF continued to make tweaks right through the C4D – After Effects – Flame workflow. “We made a ton of late-night runs between our interlaced stereo monitor in our Flame suite and our anaglyph setups in the C4D viewport,” recalls Summers. “There were many nights where we were running around with three pairs of glasses stacked on our foreheads! We got lucky that we hammered out a solid game plan early on: all Kaiju/Jaegers would take place in aquarium space, and all of our onscreen titles would live slightly past the zero parallax plane into audience space. We ended up making changes to many of the flat shots for stereo, as some of our heavy macro frames were difficult to read with such intense DoF.”
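One common textbook approximation – not necessarily IF’s actual camera setup – maps that depth plan to screen parallax: for off-axis stereo cameras, a point farther than the convergence distance lands behind the screen (the ‘aquarium’), while anything nearer pops into audience space. The interaxial, convergence distance and depths below are illustrative:

```python
# A common approximation for off-axis stereo cameras: with interaxial
# separation i and convergence distance c, a point at depth z has screen
# parallax roughly p = i * (1 - c / z). Points beyond c sit behind the screen
# ("aquarium space"); points nearer than c come into audience space.

def screen_parallax(z, interaxial=0.06, convergence=4.0):
    """Approximate parallax (in scene units) for a point at depth z."""
    return interaxial * (1.0 - convergence / z)

for label, depth in (("title card (audience space)", 3.5),
                     ("zero-parallax plane",          4.0),
                     ("Kaiju in aquarium space",      12.0)):
    p = screen_parallax(depth)
    side = "in front of" if p < 0 else ("on" if abs(p) < 1e-9 else "behind")
    print(f"{label:28s} z={depth:5.1f} -> parallax {p:+.3f} ({side} the screen)")
```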
Pacific Rim’s ‘war room’ main titles – designed and animated by Miguel Lee – turned out to be a late decision by the director after he had seen IF’s pitches for the main-on-ends. “After he saw the frames that eventually ended up as the main-on-end titles – which he lovingly referred to as the ‘Sexy Robots’ look, by the way – he started flipping through the rest of the book really quickly,” outlines Summers. “All the way up until he saw the war room images.”
Above: watch the main titles.
Working with Guillermo del Toro was probably one of the most exciting parts of the project for the IF team. “He knew precisely what he wanted, but always looked to the room for new ideas,” says Summers. “On our final delivery day for our 2D delivery, he stopped by while we were checking out colors in the booth. After watching the titles through for the first time, he actually asked if he could have permission to make notes. I’ll never forget that. His two notes were spot on, and we happily made the changes for the better. The guy has razor sharp eyes.”

All images and clips copyright 2013 Warner Bros. Pictures.
