During the past days I've continued to explore Unreal Engine 4, while I also work on my regular stuff. I've continued work on Amelia (the protagonist of Just Let Me Go, one of my micro-company's projects) and her hair, but that's for another series of articles.
Among the other Unreal tests I've done, one was bringing morph-based facial animation into UE4. If you have read my articles covering Autodesk products, you can guess which tool I specifically wanted to test for the facial animation: obviously, Face Robot.
To keep things quick, I decided to just open one of the sample files that Softimage includes with its installation. It contains an animation around 4,000 frames long, but I decided to use only the first 1,000 frames.
When I was working on one of my first games, Enola, I also used Face Robot for the facial animation, but that time I used a bone-based facial rig (image below). It worked, but joining that facial rig with a body rig was a big pain, so for Just Let Me Go I decided a morph-based workflow would be better (I had already tested it with Unity and it worked just fine).
RELATED: Unity user explores Unreal Engine 4: Part 1
RELATED: Unity user explores Unreal Engine 4: Part 2
RELATED: Unity user explores Unreal Engine 4: Part 3
RELATED: Unity user explores Unreal Engine 4: Part 4
RELATED: Unity user explores Unreal Engine 4: Part 5
RELATED: Unity user explores Unreal Engine 4: Part 6
RELATED: Unity user explores Unreal Engine 4: Part 7
RELATED: Unity user explores Unreal Engine 4: Part 8
RELATED: Unity user explores Unreal Engine 4: Part 9
RELATED: Unity user explores Unreal Engine 4: Part 10
In Softimage, I exported the animation using the “Linked Shapes” export method. What this method does is create a lot of blend shapes from the Face Robot animated face to replicate the movement of the original face. When I say “a lot” I really mean “a lot”: as you can see below, the resulting Maya file contains 300 blend shapes.
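Conceptually, each of those 300 shapes is just a set of per-vertex offsets layered on top of the base mesh. A minimal sketch of how linear blend shapes evaluate (the mesh data and shape names here are made up for illustration, not taken from the export):

```python
# Linear blend-shape evaluation: final vertex positions are the base mesh
# plus the weighted sum of each shape's per-vertex deltas.
# All data below is illustrative only.

def blend(base, shapes, weights):
    """base: list of (x, y, z); shapes: {name: list of (dx, dy, dz)};
    weights: {name: weight in 0..1}."""
    result = [list(v) for v in base]
    for name, deltas in shapes.items():
        w = weights.get(name, 0.0)
        if w == 0.0:
            continue  # inactive shape contributes nothing
        for i, (dx, dy, dz) in enumerate(deltas):
            result[i][0] += w * dx
            result[i][1] += w * dy
            result[i][2] += w * dz
    return [tuple(v) for v in result]

base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
shapes = {"smile": [(0.0, 0.1, 0.0), (0.0, 0.2, 0.0)]}

# Weight 0.5 moves every vertex halfway toward the "smile" shape.
print(blend(base, shapes, {"smile": 0.5}))
```

With hundreds of shapes like these keyed over time, the weighted sum per frame is what reproduces the original Face Robot performance.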
I could have spent some time refining the lattices created for the animation export, but since this was just a test, I only wanted to check whether I could actually export the face. I went through the entire export process (creating the placeholder facial rig, creating the morphs, transferring the animation, and exporting the XSI file), and then imported it into Maya. I have to use my old version of the Autodesk EC suite for all this, because new versions of Maya cannot import XSI files (XSI import is performed by the Crosswalk plug-in included with Softimage). That's why I still keep my old Maya version installed even though I'm using newer versions.
In Maya, all the morphs are driven by the plug-in, so I used Edit > Keys > Bake Simulation to bake the animation directly onto the face's blendShape node. After that, I saved the file, opened it in my current version of Maya, and exported to FBX so I could bring it into Unreal Engine 4.
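The bake step is essentially sampling the plug-in-driven weights at every frame and storing the samples as ordinary keyframes, so the result no longer depends on the driving plug-in. A conceptual sketch of that idea (this is not Maya code; the driver function is a made-up stand-in):

```python
# Conceptual sketch of "Bake Simulation": evaluate a driven attribute on
# every frame and record the sampled value as an explicit key.
import math

def driven_weight(frame):
    # Stand-in for a plug-in-driven morph weight (illustrative only),
    # oscillating between 0 and 1.
    return 0.5 + 0.5 * math.sin(frame * 0.1)

def bake(attr_fn, start, end):
    """Sample attr_fn on every integer frame; return {frame: keyed value}."""
    return {f: attr_fn(f) for f in range(start, end + 1)}

keys = bake(driven_weight, 1, 1000)
print(len(keys))  # one explicit key per frame
```

Once every frame has its own key, the blendShape node plays back identically with or without the original plug-in, which is why the baked file survives the trip to a newer Maya and then to FBX.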
The file imports into UE4 just fine (I only needed to enable the Import Morph Targets checkbox in the FBX import options). The import creates the model asset and the animation asset. The first thing I did was open the animation asset to confirm the animation was actually in there, and I was glad to see it was.
Then, it was just a matter of applying the animation to the model.
In the video below you can see the model with the animation applied. Something I really like is how the animation includes small twitches and micro movements, which add a lot of realism (even though I didn't refine the export enough to actually improve the animation detail). Now you can see why I talk so much about how Autodesk should bring back Face Robot.
My next step is to test the entire workflow with Amelia, including a full face + body animation. In the meantime, I'm also learning more about Blueprints (using another Pluralsight course) and a little bit of artificial intelligence.
Get Unreal Engine: https://www.unrealengine.com/en-US/get-now