Reallusion’s Charles Chen, John Martin Talk AI & the BMW Omniverse Collaboration

Oct 09, 2021 at 05:00 am by Warlord720


When BMW set out to simulate its factory floor for layout, testing, and efficiency, it ran into one need outside its area of expertise: creating real-time characters for use in the simulation. These needed to be more than casual game NPCs. They needed real-life qualities while remaining as light on resources as possible to simulate floor traffic around production layouts.  

For this, Reallusion provided workforce characters, drawing on its long experience with characters and animation. Combined with its extensive knowledge of real-time environments, the company was a good fit to supply natural-looking, naturally moving characters for the project.  

This project was highlighted in several media outlets, so I was thrilled to have the opportunity to sit down with Charles Chen, founding partner of Reallusion, and John Martin, Vice President of Marketing at Reallusion, to dig deeper into it. Finnie Lu, also of Reallusion, was instrumental in setting up the meeting and was in attendance as well.  

As Charles, John, and I have known each other for almost 20 years, the discussion covered a lot of ground besides the project. It veered off into the upcoming iClone 8 many times, and while I can’t speak to everything that was discussed about the new version, I can say it will differ significantly in many areas while retaining the look and feel of version 7.  

Reallusion is known for powerful, feature-packed updates, but the iClone 8 update will be in a league of its own. Charles’s enthusiasm as he explained some of what is coming down the line was infectious. He knows Reallusion can never stop improving and adding features, and that its products must remain under continual development.  

So, let's move on to the first part of the questions. 


(Below is a paraphrase of the conversation.) 

MD McCallum: How did this collaboration get started? What kicked off Reallusion’s involvement? 

Charles Chen: NVIDIA’s RTX rendering makes real-time shading look so good. We started to think that someday iClone would need RTX rendering, but it is too slow to integrate as iClone’s default renderer. We wanted to find quicker ways to work with NVIDIA’s new render technology, so we started to engage with them and with their USD format, which NVIDIA hopes will become an industry standard across 3D applications. The USD format will allow us to overcome some limitations of the FBX format. 

Omniverse is an open platform, and NVIDIA wants developers to join, which we see as a chance to integrate with them. We need to make sure all our shaders are compatible with their RTX (MDL) shaders. In turn, we can give NVIDIA a simplified path for bringing in real-time, animated characters. That’s the start of the technology integration. 

Most people don’t see the strong AI under the hood of Omniverse. NVIDIA is very strong in AI and has large teams working on everything AI. Their vision of digital twins is the replication of everything in the real world, from manufacturing equipment to workers, as close to the real world as possible. 

BMW is a perfect case showing that 3D isn’t just used in the entertainment world but is being embraced by many industries.  

John Martin: This is all powered by their AI system, and they can see how certain areas will work and how people interact with each component and process. 

Charles Chen: They had been using robots instead of people. We used Headshot and our system to produce simplified characters with reduced polygon counts so they can simulate up to 200 workers in the factory. The characters cover different ethnicities and backgrounds from different countries, with masks and different types of dress. 

BMW Project – Character Creator 3

They have their own AI-driven character animation system, a sophisticated system that tells each character where to go and what to do at what time. We are working with this tool to make our characters their character standard, from the skeleton to motion retargeting, and to define compatible bone structures. 

John Martin: We captured all the motions with the Xsens inertial mocap system and performed a series of movements that they were able to train their AI with. 

Charles Chen: Once they received a motion, they worked with it to make sure it was machine-controllable for work in the factory. This will allow them to see how a new assembly line will work. Building an assembly line takes a long time and comes at huge cost, so they can use this to optimize everything anytime they change things. They can also use the realistic characters to demo the project in terms of visual communication and workflow traffic. 

BMW Project

 

MD McCallum: What is Reallusion’s role in the collaboration? 

Charles Chen: We see the opportunity to become their character system, since they don’t have one of their own. We have started to work with them on several internal research projects to make our characters more tightly integrated into their pipeline.  

The big thing is how AI will eventually change the landscape of animation. In the past, animation was all about skill, timeline editing, curves, keys, and everything else. Nowadays it’s about using natural behavior to drive animation, Audio2Face being one example. We hope this kind of technology can be integrated into iClone. Maybe not immediately; we may need to send data to Omniverse and back to iClone, with all the animation eventually baked into iClone as facial clips on the timeline. This is coming, depending on NVIDIA’s release schedule. 

 

MD McCallum: What are Reallusion’s strengths? 

Charles Chen: Generating characters, motion blending, mocap, character assets, motion, and photo-to-3D-head conversion are just a few of our strengths.  

[End Part 1]


 

As you can see from some of Charles’s answers, there is plenty to be excited about, and this is only the first half of the questions that Charles and John graciously took the time to address. The fact that Reallusion is actively pursuing RTX render technology will make a lot of users happy and may entice new users from fields outside our entertainment industry.  

BMW Project

As Charles explained during our conversation, he feels a big part of the future of animation will be AI-driven rather than keyframed. This process relies on AI-trained characters instead of keyframes, which will open up a more professional type of animation to all kinds of animators. In a recent conversation with a programmer friend, I noted that motion capture is real whereas anything keyframed is contrived, and he agreed. It is not an uncommon sentiment among animators. 

I think it was in a Corridor Crew YouTube video that one of the animators said everything is right, everything is natural, until keyframing comes into play. The discussion was about how hard it would be to recreate the Boston Dynamics dancing robots video with CGI. This further illustrates a desire within the industry to use natural movement rather than keyframed motion. 

Tight integration with Omniverse will give Reallusion and its user base a powerful rendering tool to go along with AI-driven animation and other Omniverse features. It is a win-win situation, as it frees Reallusion’s in-house resources to focus on other areas of improvement. 

In the next installment, we’ll go through the rest of the questions and take a quick look at what the future holds for iClone and Character Creator in general. 


M.D. McCallum, aka WarLord, is an international award-winning commercial graphics artist, 3D animator, published author, project director, and webmaster with a freelance career that spans over 20 years.  Now retired, M.D. is currently working part-time on writing and select character development projects. You can learn more about MD on his website.  

 

 

 
