With Unreal Engine, Nickelodeon brings Teenage Mutant Ninja Turtles to life at Comic-Con 2018

Jul 31, 2018 at 12:00 pm by Press Release

Nickelodeon debuted the "Rise of the Teenage Mutant Ninja Turtles VR Interview Experience" at San Diego Comic-Con 2018, where users could step inside the world of the Teenage Mutant Ninja Turtles to conduct a live interview with the show's cast in virtual reality.

This one-of-a-kind interview let users hold a conversation with Mikey or Donnie, voiced live on-site by series voice actors Brandon Mychal Smith and Josh Brener, respectively.

Unreal Engine spoke with Chris Young, Senior VP of the Entertainment Lab at Nickelodeon, to get the inside scoop.

What inspired you to use VR as the medium for the TMNT Comic-Con press junket?

The idea originated from a meeting we had at NAB in Las Vegas with Epic Games, Adobe, and NewTek. We had put together a pipeline to stream Adobe Character Animator into UE4 using NewTek's NDI technology, and we were looking to do a live cartoon in a game engine; VR was the obvious way to get up close and experience it.

How did you devise a live two-way conversation in VR with Turtles Mikey (Michelangelo) and Donnie (Donatello)?

We had done a lot of exploration around real-time puppetry along with full body and facial performance capture streaming into Unreal, and when the opportunity to do something with the Turtles came about, it seemed like an innovative approach to allow journalists to speak directly to our characters in a real-time experience.

Was mo-cap integrated into the live experience?

Since the Turtles in the reimagined Rise of the Teenage Mutant Ninja Turtles series have a 2D design style, we relied on Adobe's Character Animator tool to handle the animation, which allowed us to create on-model, art-directed character rigs that used actual poses and cycles from the show. We also have a large motion capture volume where we are working on several projects that use more traditional performance capture, with a pipeline for streaming into UE4.


Please describe the workflow (hardware and software, headset, capture) from creation of the initial CG/environment assets, through to the final produced interviews.

The virtual reality experience was developed in Unreal Engine. Building it in a game engine made real-time interactions and conversations possible. The experience streams in the Mikey and Donnie puppets using NewTek's NDI technology, which enables low-latency streaming of high-quality video over a shared network. The puppets were created in, and are driven live by, Adobe Character Animator. MIDI keyboards with mapped animation cycles trigger the various poses Mikey and Donnie can strike, and these keyboards are played live during the interviews. NDI streams the puppets out of Adobe Character Animator to the other machines on the network, which use the live animation for the experience and for compositing.
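The MIDI-triggering idea above can be sketched in a few lines. This is a hypothetical standalone illustration, not Nickelodeon's pipeline: the note numbers and cycle names are invented, and in the real setup Adobe Character Animator's own trigger mapping handles this.

```python
# Hypothetical mapping of MIDI note numbers to animation cycles.
# In the actual pipeline, Adobe Character Animator owns this mapping.
POSE_TRIGGERS = {
    60: "idle_cycle",   # C4: neutral talking pose
    62: "wave_cycle",   # D4: greeting wave
    64: "laugh_cycle",  # E4: laugh reaction
    65: "point_cycle",  # F4: pointing gesture
}

def handle_midi_message(msg_type: str, note: int, current_pose: str) -> str:
    """Return the pose to play after a MIDI message.

    A note-on for a mapped key switches to that cycle; unmapped
    notes and note-off messages leave the current pose unchanged.
    """
    if msg_type == "note_on" and note in POSE_TRIGGERS:
        return POSE_TRIGGERS[note]
    return current_pose

# A puppeteer pressing D4 switches the character to the wave cycle.
pose = handle_midi_message("note_on", 62, "idle_cycle")
```

The appeal of this scheme for live puppetry is that each key press snaps to a pre-approved, on-model cycle, so a performer can improvise in conversation without ever breaking the show's art direction.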

[Image: Mikey and Donnie in Nickelodeon's Rise of the Teenage Mutant Ninja Turtles VR Interview Experience.]

As the user stands in front of a green screen, they are composited into the New York City rooftop scene, with a Nickelodeon character avatar head replacing their own. To maintain performance for the VR viewer while producing high-quality output, compositing was handled by a networked spectator version of the experience. The spectator version separates foreground and background layers, which are picked up over NDI, surfaced in Resolume, and composited in real time with the live-action footage. The final composition is then sent out of Resolume via NDI.
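The layering described here (background layer, then the keyed live-action subject, then the foreground layer) is the standard back-to-front "over" composite. A minimal per-pixel sketch, not taken from the production setup:

```python
def over(fg, bg):
    """Composite one RGBA pixel over another with the standard
    'over' operator. Pixels are (r, g, b, a) tuples in 0..1,
    straight (non-premultiplied) alpha."""
    fr, fgc, fb, fa = fg
    br, bgc, bb, ba = bg
    out_a = fa + ba * (1.0 - fa)
    if out_a == 0:
        return (0.0, 0.0, 0.0, 0.0)
    out_rgb = tuple(
        (fc * fa + bc * ba * (1.0 - fa)) / out_a
        for fc, bc in zip((fr, fgc, fb), (br, bgc, bb))
    )
    return out_rgb + (out_a,)

def composite_stack(layers):
    """Composite a list of pixel layers from back to front, e.g.
    [rooftop background, keyed live-action subject, foreground]."""
    result = layers[0]
    for layer in layers[1:]:
        result = over(layer, result)
    return result

# An opaque red foreground pixel fully covers a green background pixel.
pixel = composite_stack([(0.0, 1.0, 0.0, 1.0), (1.0, 0.0, 0.0, 1.0)])
```

Tools like Resolume apply this same operator per pixel on the GPU; splitting the render into separately streamed foreground and background layers is what lets the heavy compositing happen off the VR machine.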

The final composition of the live-action footage, the POV of the VR player, and close-up and wide-angle shots of the two Turtles are all recorded with a TriCaster. The TriCaster allows for real-time editing as the interview is occurring, and it lets us hand off the ISO records and the final program edit to journalists on a thumb drive as soon as the interview wraps.

We used a combination of 2D planes for background elements and CG meshes in the foreground to get correct perspective for the VR viewer. To replicate the 2D look in a 3D space, we adopted a workflow of taking deformed meshes from Maya into UE4 until we felt we had captured the correct pushed, warped perspective of a stylized hand-drawn environment. Back in Maya, we projected flat planes onto the background geometry from the VR player's POV and painted textures on those flat UVs using the actual Photoshop brushes from the show. This perfectly captured the hand-drawn, painted style of the backgrounds.
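The POV projection step boils down to casting a ray from the viewer's eye through each point of the background geometry and finding where it lands on a flat plane. A toy version of that math, with an invented function name and a plane fixed at constant z for simplicity:

```python
def project_to_plane(point, eye, plane_z):
    """Project a 3D point onto the plane z = plane_z along the ray
    from a fixed eye position (the viewer's POV). This is the core
    of baking 3D geometry down to flat painted planes that look
    correct from one viewpoint.

    Hypothetical helper for illustration; real DCC tools do this
    via a camera-based planar projection.
    """
    ex, ey, ez = eye
    px, py, pz = point
    if pz == ez:
        raise ValueError("point lies in the eye plane; ray never hits")
    t = (plane_z - ez) / (pz - ez)
    return (ex + t * (px - ex), ey + t * (py - ey), plane_z)

# A point twice as far away as the plane lands at half its lateral
# offset when viewed from the origin.
p = project_to_plane((2.0, 2.0, 10.0), (0.0, 0.0, 0.0), 5.0)
```

The trade-off is that the illusion only holds near the projection viewpoint, which is exactly why it suits a seated or lightly moving VR interview setup: the player stays close to the POV the planes were painted for.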

Does this experience use similar technology to the Nickelodeon experience created for Imax VR centers? How is this Comic-Con experience an extension of that technology?

We were able to merge a lot of our code for how we handle VR pawns and VoIP along with some of our CG avatar assets we created for SlimeZone VR into the Turtles experience.

Why did you choose to build this experience in Unreal Engine?

We've had a lot of success rapidly going from prototypes to fully functional builds in UE4. The knowledge we've acquired for authoring VR in UE4 mixed with the ability to mock-up ideas in Blueprints allows us to iterate quickly and gives us freedom to focus on creating cool things.


Which version of UE did you use, and was there frequent use of any favorite UE features?

We used 4.19. Since our development tends to be experimental and cycles are short, we are pretty aggressive about updating to the latest version of the engine. We relied heavily on NewTek's NDI plugin with alpha support for streaming video onto UE4 materials.

Did the animators from the series collaborate with the creators of the VR interview activation for Comic-Con?

We worked closely with the creators and the animation team to bring the look of the show and the animation style into the VR experience. A lot of attention was spent on getting the 2D look to translate to VR.

What was the biggest challenge in pulling this off?

The live aspect required a lot of pre-planning: keeping audio, video, and game sources in sync meant a network of machines all sending and receiving data to one another, and the whole rig had to be trucked from our studio in Burbank and rebuilt on-site at Comic-Con.

How are the production considerations different when designing a VR experience that requires live updates on the fly?

This was a hybrid of animation, game development, live-action performers, puppeteers, and virtual cinematography techniques, on location in a live-broadcast setting. It was everything rolled into one, and like nothing we had ever done before!
