Calling All Autobots

A behind-the-scenes look at an interactive WebGL experience

6 min read · Jun 29, 2017

To promote The Last Knight, the long-awaited sequel in the Transformers saga, we were tasked with developing an interactive website centered around Bumblebee.

With this article, we want to share how the project came to fruition, the challenges we faced, the technology we used to develop it, and the creative aspects that made it worthwhile.

Concept

Paramount partnered with Spotify to create this experience — so, naturally, we knew music was going to be an integral part of the interactive site.

Bumblebee quickly became the obvious choice. As one of the best-known Transformers, he is beloved for his loyal character and nimble way of moving. That, coupled with the fact that his voice box has been replaced with a radio, made him the ideal character for this experience.

Framed as an interactive personality test in WebGL, the website matched fans to one of the film's protagonists based on their answers.

The idea was to have Bumblebee as the backbone of the experience. Every time a fan picked an answer to a question, Bumblebee would react by dancing along to a track that played for around seven seconds.

A core production component was building the logic behind the experience. As users answered questions in sequence, a complex decision tree decoded their personality match. At the end of the experience, a hyper-personalized, Transformers-themed Spotify playlist was uniquely assigned to each user, hand in hand with their character match.
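
To give a flavor of how such a tree can work, here is a purely illustrative sketch (the questions, branches and character assignments are invented, and the production tree was considerably larger): internal nodes hold a question, each answer selects a branch, and leaves hold the character match.

    // Illustrative decision tree: internal nodes are questions, leaves are matches.
    // All names and branches here are invented; the real tree was far larger.
    const tree = {
      text: 'Pick a road trip soundtrack',
      answers: [
        {
          label: 'Classic rock',
          next: {
            text: 'City streets or open highway?',
            answers: [
              { label: 'City streets', next: { match: 'Hound' } },
              { label: 'Open highway', next: { match: 'Optimus Prime' } },
            ],
          },
        },
        { label: 'Electronic', next: { match: 'Bumblebee' } },
      ],
    };

    // Walk the tree with the user's picks (an array of answer indices).
    function decode(node, picks) {
      for (const pick of picks) {
        if (node.match) break;          // reached a leaf early
        node = node.answers[pick].next; // descend along the chosen branch
      }
      return node.match;
    }

    decode(tree, [0, 1]); // -> 'Optimus Prime'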

Modeling

We had no access to the latest Bumblebee model, so one of the initial challenges was determining how to stay true to the character without compromising any of his distinguishing characteristics.

To do that, we worked from an older version and remodeled the parts that had changed this year. Luckily for us, while Bumblebee does change with every movie, the updates are incremental, which left us with only a limited amount of remodeling work.

Additionally, poly count wasn't much of a concern. We had performance-tested up to 400k vertices in WebGL, and after our optimizations we knew that the 65k vertices our model ended up with would not be a problem.

Texturing (Quixel + PBR)

Beauty, Metalness, Gloss, Normal, AO & Albedo Passes

Because PBR is the safest bet nowadays, we decided to use a dedicated texturing suite, in this case Quixel, to texture Bumblebee. Its relative ease of use meant we could texture Bumblebee quite intuitively and let the software generate the Albedo, Gloss, Roughness and Normal maps for us. It runs off AO, Object Space Normal, Curvature and Material ID maps (for the most part), all of which were straightforward to render out of Maya.

The software is intuitive and efficient at determining the dimensions, cavities and edges of an object. This is handy when procedurally texturing an asset such as Bumblebee.

In the saga, he's portrayed as a new car with lots of battle damage, most of which occurs along the edges. With Quixel, we were able to quickly iterate on different amounts of damage and provide our Art Director with several options to refine.

Once the general look was determined, the texture team refined the various materials (metals, rubbers, car paints, reflectors, etc.) that make up Bumblebee and saved them as material presets. These were then quickly distributed across the model following the Material ID scheme established earlier. Painting emblems, edge wear, dirt and grease onto specific pieces proved effortless, as we were able to paint them manually within the suite.

Finally, the background consisted of a single sphere with a 360° matte painting created in Photoshop.
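
For those curious, a rough sketch of that kind of setup in PlayCanvas follows (an approximation under assumed names, not our exact scene code): the matte painting is applied as an unlit emissive texture to a large sphere whose front faces are culled, so the camera sees it from the inside.

    // Approximate skysphere setup. 'backgroundTexture' is a placeholder for the
    // 360° matte painting loaded as a pc.Texture; 'app' is the pc.Application.
    const sky = new pc.Entity('background');
    sky.addComponent('model', { type: 'sphere' });
    sky.setLocalScale(500, 500, 500); // big enough to enclose the whole scene

    const material = new pc.StandardMaterial();
    material.emissiveMap = backgroundTexture; // unlit: the painting supplies all color
    material.useLighting = false;
    material.cull = pc.CULLFACE_FRONT;        // draw the inside faces of the sphere
    material.update();

    sky.model.material = material;
    app.root.addChild(sky);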

Rigging + Animation

Due to the short timeline, we decided to use motion-capture data to drive Bumblebee's animations. For that reason, we kept the rig simple, settling on an FK rig with controls for facial expressions and other ancillary bits of animation.

Early animation testing

All the animation was handled in Maya. As is typical with motion-capture data, there was a small degree of cleanup to do, but repositioning Bumblebee relative to our camera was simple enough to manage with Maya's timeline editor.

The user triggers up to eight different dance animations based on their selections, which means Bumblebee has to blend from idle states into dance states, then back to idle. This blending was handled in PlayCanvas.

PlayCanvas

When deciding which WebGL framework to use, we considered both Three.js and PlayCanvas. We knew the visual fidelity of this project was the most important aspect we had to get right, so PlayCanvas ended up being the best choice because:

  • FBX to WebGL pipeline. It was a very straightforward process to import the 3D models and animations the artists were creating in Maya.
  • PlayCanvas online editor. It allowed the artists to tweak and iterate on the lighting, cameras and post-process effects directly.
  • Ready-to-use shaders. We had access to documented PBR and post-process shaders, as well as a way to create or extend them.

Animation in PlayCanvas

One roadblock we bumped into when working with the animation component in PlayCanvas was the blending system; it is very limited, and ultimately we were not satisfied with the result.

Animation blending

Basically, the problem was that a transition from one state to another would pause the initial animation, interpolate to the initial pose of the second one, and then resume playing.

To solve this problem, we looked at how other engines (mainly Unity and Unreal Engine) handle animation blending, and set out to write our own solution.
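
The core idea borrowed from those engines is to keep both clips playing during the transition and cross-fade their influence over a short window. Below is a stripped-down, engine-agnostic sketch of that idea (class names, data shapes and the nlerp helper are ours for illustration, not the code we shipped):

    // Engine-agnostic cross-fade sketch. A 'clip' is anything with a duration and
    // a sample(time) method returning { jointName: { pos: [x,y,z], rot: [x,y,z,w] } }.
    // Unlike a pause-and-interpolate transition, both clips keep advancing while
    // their blend weights are faded, so the outgoing dance never freezes.

    function nlerp(a, b, t) {
      // Normalized quaternion lerp; flip sign so we take the short way around.
      const dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2] + a[3] * b[3];
      const s = dot < 0 ? -1 : 1;
      const out = a.map((v, i) => v * (1 - t) + b[i] * s * t);
      const len = Math.hypot(out[0], out[1], out[2], out[3]);
      return out.map(v => v / len);
    }

    class CrossFade {
      constructor(fromClip, toClip, duration) {
        this.from = { clip: fromClip, time: 0 };
        this.to = { clip: toClip, time: 0 };
        this.duration = duration;
        this.elapsed = 0;
      }

      update(dt) {
        // Advance BOTH clips: this is the key difference from the stock behavior.
        this.from.time = (this.from.time + dt) % this.from.clip.duration;
        this.to.time = (this.to.time + dt) % this.to.clip.duration;
        this.elapsed = Math.min(this.elapsed + dt, this.duration);

        const w = this.elapsed / this.duration; // 0 = all "from", 1 = all "to"
        const a = this.from.clip.sample(this.from.time);
        const b = this.to.clip.sample(this.to.time);

        const pose = {};
        for (const joint of Object.keys(a)) {
          pose[joint] = {
            pos: a[joint].pos.map((v, i) => v * (1 - w) + b[joint].pos[i] * w),
            rot: nlerp(a[joint].rot, b[joint].rot, w),
          };
        }
        return pose; // apply to the skeleton each frame
      }

      get done() {
        return this.elapsed >= this.duration;
      }
    }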

UI

For the front end, we had used React and Redux extensively on previous projects, so we knew that as long as we could get React and PlayCanvas integrated and talking to each other, we would be in a good spot.

After the exploration phase, the very first thing we set out to do was build a small prototype that combined the artists' 3D assets in PlayCanvas with a React layer on top. Once we achieved that, the rest of the development cycle was straightforward.
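
To sketch what that integration can look like (the event names, payload shapes and the component itself are invented for illustration, not our production code): the PlayCanvas application owns the 3D scene, and since pc.Application is an event emitter, the React overlay can talk to it without either side knowing the other's internals.

    // Sketch of the React <-> PlayCanvas bridge; event names ('quiz:question',
    // 'quiz:answer') and payload shapes are invented for illustration.
    import React from 'react';

    class QuestionOverlay extends React.Component {
      constructor(props) {
        super(props);
        this.state = { question: null };
        this.onQuestion = question => this.setState({ question });
      }

      componentDidMount() {
        // PlayCanvas -> React: the scene announces the next question.
        this.props.app.on('quiz:question', this.onQuestion);
      }

      componentWillUnmount() {
        this.props.app.off('quiz:question', this.onQuestion);
      }

      answer(choice) {
        // React -> PlayCanvas: trigger Bumblebee's dance for this answer.
        this.props.app.fire('quiz:answer', choice);
      }

      render() {
        const { question } = this.state;
        if (!question) return null;
        return (
          <div className="overlay">
            <h2>{question.text}</h2>
            {question.choices.map(choice => (
              <button key={choice.id} onClick={() => this.answer(choice)}>
                {choice.label}
              </button>
            ))}
          </div>
        );
      }
    }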

This component was built with spritesheets and dynamic animations, such as the progress meter and the moment it first appears on screen, as well as a pre-baked animation just before the results page.

For our tech nerd friends: the spritesheet, the masking, and its opacity were all controlled by a single custom shader.
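
What follows is a simplified sketch in that spirit rather than the exact shader we shipped: a GLSL fragment program in a PlayCanvas-style JavaScript string that offsets UVs into the current spritesheet cell, multiplies in a mask texture, and fades via an opacity uniform. All uniform names are assumptions.

    // Simplified spritesheet shader sketch (uniform names are assumptions):
    // pick the current spritesheet cell, mask it, fade it.
    const spriteFrag = `
      precision mediump float;

      uniform sampler2D uSpriteSheet; // frames packed in a grid
      uniform sampler2D uMask;        // grayscale mask for the reveal
      uniform vec2  uGrid;            // columns, rows in the sheet
      uniform float uFrame;           // current frame index
      uniform float uOpacity;         // overall fade

      varying vec2 vUv0;

      void main() {
        // Offset the quad's UVs into the current cell of the spritesheet.
        float col = mod(uFrame, uGrid.x);
        float row = floor(uFrame / uGrid.x);
        vec2 cell = 1.0 / uGrid;
        // Frames are indexed from the top row, while UVs start at the bottom.
        vec2 uv = (vUv0 + vec2(col, uGrid.y - 1.0 - row)) * cell;

        vec4 color = texture2D(uSpriteSheet, uv);
        float mask = texture2D(uMask, vUv0).r;
        gl_FragColor = vec4(color.rgb, color.a * mask * uOpacity);
      }
    `;

Uniforms like these can then be driven from script each frame (in PlayCanvas, via material.setParameter) to advance the animation and fade the component in and out.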

It’s a wrap

The opportunity to incorporate WebGL into another project was a treat. We’re constantly exploring new avenues and discovering digital innovations that make web development a progressive technological art. More importantly, we recognize there is always room for improvement, which is why we took this project as a great learning experience.

Explore the site and unleash your ride here: http://www.transformersmovie.com/CallingAllAutobots/


Written by Thinkingbox

Thinkingbox is an experience agency shaping the future of brands through craft and curiosity.