Disney rendered its new animated film on a 55,000-core supercomputer

Disney's upcoming animated film Big Hero 6, about a boy and his soft robot (and a gang of super-powered friends), is perhaps the largest big-budget mash-up you'll ever see. Every aspect of the film's production represents a virtual collision of worlds. The story, something co-director Don Hall calls "one of the more obscure titles in the Marvel universe," has been completely re-imagined for parent company Disney. Then, there's the city of San Fransokyo it's set in -- an obvious marriage of two of the most tech-centric cities in the world. And, of course, there's the real-world technology that not only takes center stage as the basis for characters in the film, but also powered the onscreen visuals. It's undoubtedly a herculean effort from Walt Disney Animation Studios, and one that's likely to go unnoticed by audiences.

"We've said it many, many times. We made the movie on a beta renderer," says Hank Driskill, technical supervisor for Big Hero 6. "It was very much in progress." Driskill is referring to Hyperion, the software Disney created from the ground up to handle the film's impressive lighting. It's just one of about three dozen tools the studio used to bring the robotics-friendly world of San Fransokyo to life. Some, like the program Tonic originally created for Rapunzel's hair in Tangled, are merely improved versions of software built for previous efforts, or "shows" as Disney calls them. Hyperion, however, represents the studio's greatest and riskiest commitment to R&D in animation technology thus far. And its feasibility wasn't always a sure thing, something Disney's Chief Technology Officer Andy Hendrickson underscores when he says, "It's the analog to building a car while you're driving it."

"We've said it many, many times. We made the movie on a beta renderer," says Hank Driskill, technical supervisor for Big Hero 6.


For that reason, Hendrickson instructed his team to embark on two development paths for Big Hero 6: the experimental Hyperion and a Plan B that hinged on a commodity renderer. It took a team of about 10 people over two years to build Hyperion, during which time Driskill says resources were being spread thin: "We were running with a backup plan until around June of last year ... [and] we realized we were spending too much energy keeping the backup plan viable. It was detracting in manpower ... from pursuing the new idea as fully as we could. So we just said, 'We're gonna go for it.' And we turned off the backup plan."

Hyperion, as the global-illumination simulator is known, isn't the kind of technology that would excite the average moviegoer. As Hendrickson explains, it handles incredibly complex calculations to account for how "light gets from its source to the camera as it's bouncing and picking up colors and illuminating other things." The software let animators skip the incredibly time-consuming manual work of approximating indirect lighting a single bounce at a time; instead, Hyperion simulates 10 to 20 bounces on its own. It's responsible for environmental effects -- stuff most audiences might take for granted, like when they see Baymax, the soft, vinyl robot featured in the film, illuminated from behind. That seemingly mundane lighting trick is no small feat; it required the use of a 55,000-core supercomputer spread across four geographic locations.
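
To make the idea concrete, here is a minimal sketch -- not Hyperion's code, which Disney hasn't published -- of why the bounce count matters in a global-illumination renderer: a surface's brightness is its own emission plus a scaled estimate of the light arriving from a randomly sampled bounce, applied recursively. The toy "scene" and all names below are invented purely for illustration.

```python
import random

# Toy "scene": a handful of diffuse surfaces plus one emitter. Each entry is
# (albedo, emission). None of this is Hyperion's actual data model -- it only
# illustrates the idea of multi-bounce light transport.
SURFACES = {
    "lamp":       (0.0, 5.0),   # pure emitter
    "white_wall": (0.8, 0.0),
    "red_wall":   (0.6, 0.0),
    "floor":      (0.5, 0.0),
}

def trace(surface, max_bounces, rng=random):
    """Estimate the light leaving `surface` by following up to `max_bounces`
    random bounces -- the core idea behind a path-traced global-illumination
    renderer."""
    albedo, emission = SURFACES[surface]
    if max_bounces == 0 or albedo == 0.0:
        return emission
    # Pick a random surface the bounce might hit next (a stand-in for real
    # ray-scene intersection) and scale its light by this surface's albedo.
    next_surface = rng.choice(list(SURFACES))
    return emission + albedo * trace(next_surface, max_bounces - 1, rng)

def render(surface, bounces, samples=20000):
    """Average many random paths; more bounces lets more indirect light through."""
    return sum(trace(surface, bounces) for _ in range(samples)) / samples

if __name__ == "__main__":
    for bounces in (1, 5, 20):
        print(f"{bounces:>2} bounces -> brightness {render('floor', bounces):.3f}")
```

Run it and the "floor" gets noticeably brighter as the bounce budget climbs from one toward 20 -- exactly the kind of indirect light animators previously had to fake by hand.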

Disney Animation CTO Andy Hendrickson demonstrates Hyperion's real-world lighting simulation.

"This movie's so complex that humans couldn't actually handle the complexity. We have to come up with automated systems," says Hendrickson. To manage that cluster and the 400,000-plus computations it processes per day (roughly about 1.1 million computational hours), his team created software called Coda, which treats the four render farms like a single supercomputer. If one or more of those thousands of jobs fails, Coda alerts the appropriate staffers via an iPhone app.

To put the enormity of this computational effort into perspective, Hendrickson says that Hyperion "could render Tangled from scratch every 10 days."

If that doesn't drive the power of Disney's proprietary renderer home, then consider this: San Fransokyo contains around 83,000 buildings, 260,000 trees, 215,000 streetlights and 100,000 vehicles (plus thousands of crowd extras generated by a tool called Denizen). What's more, all of the detail you see in the city is based on assessor data for lots and street layouts from the real San Francisco. As Visual Effects Supervisor Kyle Odermatt explains, animating a city that lively and massive simply would not have been possible with previous technology. "You couldn't zoom all the way out [for a] wide shot down to just a single street level the way we're able to," he says.
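
As a rough illustration of how lot records can drive procedural placement -- the actual fields and pipeline Disney used aren't public, so the CSV columns and height rules below are assumptions -- a few lines are enough to turn assessor-style data into per-parcel building stand-ins:

```python
import csv, io, random

# Invented assessor-style records: parcel id, lot area and zoning.
SAMPLE_ASSESSOR_CSV = """parcel_id,lot_area_sqft,zoning
001-001,2500,residential
001-002,12000,commercial
002-010,48000,industrial
"""

HEIGHT_RANGES = {            # storey counts per zone -- illustrative only
    "residential": (2, 5),
    "commercial": (5, 40),
    "industrial": (1, 3),
}

def buildings_from_lots(csv_text):
    """Yield one procedurally sized building per assessor lot."""
    for row in csv.DictReader(io.StringIO(csv_text)):
        low, high = HEIGHT_RANGES[row["zoning"]]
        yield {
            "parcel": row["parcel_id"],
            "footprint_sqft": int(row["lot_area_sqft"]),
            "storeys": random.randint(low, high),
        }

if __name__ == "__main__":
    for building in buildings_from_lots(SAMPLE_ASSESSOR_CSV):
        print(building)
```

Scale that idea up to 83,000 lots and the appeal is obvious: the city's layout comes for free from real-world data, and artists spend their time on the details that matter.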

"This movie's so complex that humans couldn't actually handle the complexity. We have to come up with automated systems," says Hendrickson.

Beyond the supercomputer cluster and software tools devised to make the movie, Big Hero 6 leans heavily on cutting-edge technology for its visual majesty in one other way: its characters. Both Baymax, the aforementioned lovable robot sidekick, and the microbots, swarm-like mini-drones controlled by telepathy, are steeped in some very real scientific research. That decision to ground the world of Big Hero 6 in near-future technologies led Hall and co-director Chris Williams on research trips to MIT, Harvard and Carnegie Mellon in the US and even to Tokyo University in Japan.

A soft robotic arm developed by researchers at Carnegie Mellon University.

"You know, we try to look at, like, five to 10 years down the road at what was coming ... It seems counterintuitive because in animation you can do anything, but it still has to be grounded in a believable world," says Hall.

Indeed, there's even a moment where supergenius lead character Hiro Hamada uses a 3D printer in his garage to create an outfit for Baymax. In discussing the scene, Roy Conli, the film's producer, credits the "maker movement that's going on right now." He adds, "These kids are makers. So it's a little bit the celebration of the nerd."

It was during a visit to Carnegie Mellon that Hall came across researcher Chris Atkeson, who'd been working in the field of inflatable soft robotics: robots intended for the health care industry. Hall says Atkeson pleaded with him to "make a movie where the robot is not the villain." But Atkeson didn't have to do much convincing -- Hall's vision for Baymax meshed nicely with his research. He'd wanted a robot audiences hadn't seen on screen before. Hall continues, "The minute I saw this [research], I knew that we had our huggable robot. I knew that we had found Baymax."

The team also drew inspiration for Baymax from existing compassionate-care tech out of Japan. "They're a little ahead of the curve," Hall says. "I mean, [health care robots] are actually in practice in some of the hospitals in Japan. They're not vinyl; they're not Baymax. They're plastic robotics."

The high-tech city of San Fransokyo represents a mash-up of eastern and western culture.

Robotics research out of Carnegie Mellon also provided the basis for the unwitting pawns of the film: the Lego-like, mind-controlled microbots. Of course, the version we see in the film is a much more fantastical approach to the simple, water-walking bots Hall's team glimpsed during their visit. That, coupled with a heavy dose of inspiration from swarm-drone tech, led to the insect-like creepiness of the microbots in the final film.

By design, the electromagnetic microbots move as if part of a chain: Each individual "link" travels from front to back to propel the swarm forward in a circuit-board-like pattern. On average, the visual effects team says there are about 20 million microbots onscreen in a given shot, and that level of complexity is where Hyperion once again comes crucially into play. Originally, however, the team didn't think its full vision of the microbots would even be possible to render.
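
Disney hasn't detailed how the microbot swarms were rigged, so the toy update below is only an illustration of that chain idea: the rearmost link is recycled into a fresh slot ahead of the lead, so the formation inches toward a target while, relative to the swarm, every link drifts from front to back.

```python
# Illustrative only -- not Disney's rig. positions[0] is the front microbot;
# each element is an (x, y) slot in the chain.

def step_chain(positions, target, spacing=1.0):
    """Advance the chain one link toward `target` by recycling the rear link."""
    front_x, front_y = positions[0]
    tx, ty = target
    dx, dy = tx - front_x, ty - front_y
    dist = (dx * dx + dy * dy) ** 0.5 or 1.0
    # New slot one spacing ahead of the current front, toward the target...
    new_front = (front_x + spacing * dx / dist, front_y + spacing * dy / dist)
    # ...filled by the link pulled up from the back of the chain.
    return [new_front] + positions[:-1]

if __name__ == "__main__":
    chain = [(float(-i), 0.0) for i in range(5)]   # five links in a row
    for _ in range(3):
        chain = step_chain(chain, target=(10.0, 0.0))
        print([(round(x, 2), round(y, 2)) for x, y in chain])
```

Multiply that by roughly 20 million links per shot and it's clear why the effect had to wait for a renderer like Hyperion.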

"We thought the technology would never actually be able to handle it happening in all of the shots," explains Head of Effects Michael Kaschalk. "And to do that from shot to shot, that takes artists' work to just be able to create the [lighting] cheat. But as Hyperion developed, and we actually built the system, we found that it was handling all of this data just fine. So we actually built the real thing."

Hiro scans Baymax to create 3D-printed armor.

Though tech innovation clearly plays an important role in development at Walt Disney Animation Studios, it's not the sole guiding force for each film and, for that matter, neither is the story. The studio's process is entirely collaborative. "We are looking for input from everybody that works here for storytelling ... there's no doubt that those ideas can rise up from anywhere to become a big piece or small piece of the story," says Odermatt. There's no single source of motivation other than a love of research and functional design -- key concepts imparted by Chief Creative Officer John Lasseter.

"The movie does celebrate science and technology in a way that we haven't really done before."

In a way, Big Hero 6 is a love letter to technology. It's a fantasy film that gives audiences a knowing wink toward the robot-assisted near-future, as if to say, "This is exactly where you're headed. And it's coming soon." Big Hero 6 also represents a perfect storm for Disney: The subject matter (makers and robotics) and setting (hyper-tech San Fransokyo) dovetailed with the economic feasibility of cutting-edge computational hardware (that massive render farm) and the development of advanced animation techniques (Hyperion). It's a film for, by and from lovers of technology.

That Big Hero 6 has a technological heart and soul is not lost on Hall: "The movie does celebrate science and technology in a way that we haven't really done before."

[Image credit: Walt Disney Animation; Carnegie Mellon University (soft robotic arm)]