Into the Spider-Verse

In the past decade, Sony Pictures Animation has produced the Hotel Transylvania trilogy, a Smurfs movie, and the infamous Emoji Movie, which holds an 8% approval rating on the review aggregator Rotten Tomatoes. In a sharp turn away from this uneven track record, the same studio has released an animated Spider-Man movie featuring not one but seven different iterations of the friendly neighborhood hero. Beloved by critics and general audiences alike, Spider-Man: Into the Spider-Verse has already secured a Golden Globe and is a front-runner for the Best Animated Feature Oscar. So what, in terms of technology, is the secret to its success?

“We had this mandate to basically challenge how animated movies are made and what they can be, from top to bottom,” says Rodney Rothman, one of the film’s three directors.

The movie’s creators decided to reject the traditional rules of CGI animation and drew inspiration from ‘60s comic books, adopting a limited color palette and a printing technique called halftoning, in which shade and light are represented through patterns of colored dots.
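To make the idea concrete, here is a minimal sketch of halftoning in Python with NumPy. The cell size and the simple darkness-to-dot-radius rule are illustrative choices, not the filmmakers' actual tooling:

```python
import numpy as np

def halftone(gray, cell=8):
    """Convert a grayscale image (values in 0..1) into a dot pattern.

    Each cell x cell block becomes a single black dot whose radius
    grows with the darkness of that block, the way classic comic-book
    printing approximated shading with ink dots.
    """
    h, w = gray.shape
    out = np.ones_like(gray)  # start from a blank white page
    # distance of each pixel in a cell from the cell's center
    yy, xx = np.mgrid[0:cell, 0:cell] - (cell - 1) / 2.0
    dist = np.sqrt(xx ** 2 + yy ** 2)
    for y in range(0, h - cell + 1, cell):
        for x in range(0, w - cell + 1, cell):
            darkness = 1.0 - gray[y:y + cell, x:x + cell].mean()
            radius = darkness * cell / 2.0  # darker block -> bigger dot
            out[y:y + cell, x:x + cell][dist <= radius] = 0.0  # ink the dot
    return out

# A horizontal white-to-black gradient: dots grow toward the dark side.
img = np.tile(np.linspace(1.0, 0.0, 64), (64, 1))
dots = halftone(img)
```

Running this on the gradient produces small or absent dots on the light side and large, nearly cell-filling dots on the dark side, which is exactly the effect visible in the film's shading.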

Spider-Man: Into the Spider-Verse. Credit: youtube.com

Most CG-animated movies feature smooth movements and a camera that imitates the behavior of real film cameras. In Spider-Verse, movement is conveyed through squiggly lines and smears of color, and in some scenes the animation team reduced the number of drawings per second, holding each pose across multiple frames of the 24 fps timeline, to imitate the look of stop-motion animation.
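The frame-holding trick is simple enough to sketch. The snippet below (the function name and pose labels are illustrative, not from the film's pipeline) shows how holding each drawing for two frames, often called animating "on twos", yields 12 distinct drawings per second on a 24 fps timeline:

```python
def hold_frames(frames, hold=2):
    """Hold each drawing for `hold` consecutive frames of the final
    timeline. With hold=2 on a 24 fps timeline, only 12 distinct
    drawings appear per second, giving CG motion a stepped,
    stop-motion feel instead of perfectly smooth interpolation.
    """
    timeline = []
    for frame in frames:
        timeline.extend([frame] * hold)
    return timeline

poses = ["pose_a", "pose_b", "pose_c"]
timeline = hold_frames(poses)
# each pose now occupies two consecutive frames of the timeline
```

In the film this rate is even varied per character within a single shot, so a hero in control moves smoothly while a struggling one visibly "steps".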

However, the biggest feature distinguishing the new Spider-Man movie from its animated peers is its characters' facial animation.

“When we looked at what made comic books so interesting, it was how the illustrators used lines on faces for the extra emotion,” says visual effects supervisor Danny Dimian.

Specially developed software was used to apply hand-drawn effects to the characters' 3D-modeled faces. In effect, Sony Pictures Animation and Sony Pictures Imageworks have created an innovative visual language and method.

Spider-Man: Into the Spider-Verse. Credit: youtube.com

Revolutions don’t come cheap. Because of the new technology, production took much longer than usual: most animated projects need about a week of work to produce four seconds of animation; Into the Spider-Verse spent that long on a single second of screen time.

Was it worth it? That’s still up for debate. From a business standpoint, the film has underperformed at the global box office, but after its Golden Globe win, Sony submitted a patent application for the unique rendering and compositing technologies used in the film, as well as for a machine-learning component that streamlines the animation process by predicting the position of lines in the next frame.

While no verdict has yet been reached on the patent, the Sony Pictures Animation team’s hard work has already placed them in the ranks of the pros at Pixar and Disney.

The Incredibles 2

As Brad Bird, director of both Incredibles movies, recalls, in the late ‘90s Steve Jobs, then head of Pixar, was reluctant to put the story of a superhero family into the pipeline.

The Incredibles. Credit: hollywoodreporter.com

“At the time, Steve was worried that people’s associations with the combination of superheroes and animation were all negative,” Bird says. “People had it in their mind that it would be cheap and it would be crummy because at the time maybe 99.9 percent of the superhero animation was done very quickly and on really low budgets for Saturday morning television.”

But Bird was able to convince Jobs of the project’s potential to change people’s minds, in large part thanks to Pixar’s RenderMan rendering software. At the time, RenderMan’s Reyes algorithm produced an image by dicing the scene’s geometry into sub-pixel “micropolygons”, shading them according to their materials and lighting, and sampling the result into the frame; ray tracing and global illumination arrived in later versions of the engine. That’s also why Pixar began its path with a movie about living toys: in 1995, when Toy Story was released, the engine was at its best when rendering wood and plastic. Water and other more complex materials remained problematic, something still evident in the studio’s second feature film, A Bug’s Life.
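The classic Reyes loop, as described in the 1987 paper by Cook, Carpenter and Catmull, can be caricatured in a few lines of Python. In this toy version, "patches" are flat-colored axis-aligned squares; the splitting threshold, the data layout and all the names are purely illustrative and bear no relation to RenderMan's real API:

```python
def reyes(patches, width, height, max_size=4):
    """Toy Reyes loop: cull, split until small, then dice + shade + sample.

    Each patch is (x, y, size, color). Real Reyes dices patches into
    grids of sub-pixel micropolygons and runs shaders per vertex;
    here a "micropolygon" is just one flat-shaded pixel.
    """
    image = [[0.0] * width for _ in range(height)]
    work = list(patches)
    while work:
        x, y, size, color = work.pop()
        if x >= width or y >= height:
            continue                              # cull off-screen geometry
        if size > max_size:                       # too big to shade at once:
            half = size // 2                      # split into four children
            for dx in (0, half):
                for dy in (0, half):
                    work.append((x + dx, y + dy, half, color))
        else:                                     # dice, shade, sample
            for py in range(y, min(y + size, height)):
                for px in range(x, min(x + size, width)):
                    image[py][px] = color
    return image

# One large gray patch covering the whole 16x16 frame.
img = reyes([(0, 0, 16, 0.5)], 16, 16)
```

The bound-split-dice structure is what made Reyes so fast on smooth surfaces like plastic and wood: each piece of geometry is touched once, in screen-space-sized chunks, without tracing rays between objects.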

With time, Pixar’s capabilities grew, RenderMan received upgrades, and CGI animation became increasingly realistic.

RenderMan is commonly used by other studios as well, including the legendary VFX studio Industrial Light & Magic (Star Wars), Walt Disney Animation Studios (Frozen), and Blur Studio (David Fincher’s The Girl with the Dragon Tattoo and numerous CGI commercials). In Russia, the Petersburg Animation Studio used Pixar’s technology to create the Smeshariki (marketed abroad as Kikoriki/GoGoRiki) television series and feature films.

RenderMan. Credit: twitter.com

In 2015, Pixar released a free version of the RenderMan engine for non-commercial use, available for download on Pixar’s website.

Isle of Dogs

Maintaining 20-year-old software and mimicking the visual style of comics are, of course, daring tasks. But neither is more challenging than stop-motion animation, which demands enormous amounts of manual work and precise attention to the smallest details.

Some 2,000 puppets and 700 faces were custom-made for Wes Anderson’s animated film Isle of Dogs. The meticulous director, known for his distinct visual style, reunited with the team behind his first full-length animated movie, Fantastic Mr. Fox.

The characters were built in several stages: first, a plasticine sculpture was made and used to cast a mould. Then a metal “skeleton” was built from the mould, allowing the animators to pose the puppets’ limbs. Finally, the skeletons were covered with silicone, painted, and detailed. One member of the animation team hand-drew 22,000 freckles on a main character’s interchangeable faces.

Making of Isle of Dogs. Credit: awn.com

The most laborious part of the production, however, involved fur and other fine details: millions of natural hairs had to be attached to the dog characters and to the human characters’ heads.

It took the animators eight months to create the minute-long sushi-making scene, shot from a first-person perspective. According to the film’s head puppet maker Andy Gent, the scene “broke quite a few people” because of its arduous detail. One challenge, for instance, was depicting a glove being pulled on in stop-motion: several gloves in various states of expansion were made and swapped out at precise moments to create the impression of one smooth motion.

Isle of Dogs consists of thousands upon thousands of such details and feats of animation. Just like the creators of Into the Spider-Verse, The Incredibles 2 and others, its creators gave life to the characters onscreen and redefined the standards of animation. It’s hard to tell which team will take home the coveted golden statuette, but all of them deserve appreciation and respect for their work.

The 91st Academy Awards ceremony will be held on February 24, 2019.