Digital human creation with Reallusion: Animating in iClone 2

Nov 04, 2020 at 02:00 am by nemirc

This time we will take a closer look at animation in iClone. You can produce animations using keyframes, puppeteering, and motion capture. Since I don't own any motion capture device, I will focus on keyframes (I will leave puppeteering for later).

You may know I usually animate with HumanIK in Maya, since it provides a very natural-moving humanoid rig. iClone’s humanoid rigs behave much the same way: when you move a body part, the entire body reacts accordingly. If you rotate the upper chest, for example, the lower chest and even the hips will react to that rotation. You can use the animations from the library, but you can also create your own from scratch. iClone uses auto-key animation, so keys are created on all body parts whenever you move the character. However, if you want to offset body parts, you can manually move their keys along the timeline. This is useful when animating follow-through, to give the motion a more natural feel (for example, if your character turns to the side, you can delay the head so it feels like different body parts hit the pose at different times).
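The follow-through idea above can be sketched in a few lines of code. This is a conceptual illustration only, not the iClone scripting API: the body-part names, frame numbers, and pose labels are all assumptions made up for the example.

```python
# Conceptual sketch of follow-through by offsetting keyframes.
# Not iClone's API -- names and frame numbers are illustrative only.

def offset_keys(keyframes, offset):
    """Shift every (frame, pose) keyframe later by `offset` frames."""
    return [(frame + offset, pose) for frame, pose in keyframes]

# Auto-key places keys on every body part at the same frames, e.g.:
chest_keys = [(0, "rest"), (12, "turned")]
head_keys = [(0, "rest"), (12, "turned")]

# Delaying the head a few frames makes it hit the pose after the chest,
# so the turn reads as follow-through rather than a rigid, simultaneous move.
delayed_head = offset_keys(head_keys, 3)
# delayed_head == [(3, "rest"), (15, "turned")]
```

In iClone you do this by dragging the keys in the timeline rather than in code, but the effect is the same: each body part reaches the pose on a slightly different frame.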

The animation timeline can display the different tracks for your animation, but some of them are hidden by default. For example, the facial expressions track is hidden, but you can show it. Just like when animating the body, a keyframe is created whenever you change the facial pose. The face is animated as a whole, though, so you can’t display keyframes for the brows or the lips separately. Coming from Maya, I am used to animating the different morphs separately, but I don’t think a global keyframe is an issue here because, in iClone, you are animating facial poses, not morphs. If anything, it makes things more streamlined and easier to track.

iClone’s library includes hand gestures as well. These can be a quick way to make your character switch from one gesture to another. Gestures work like motions, as they are added to your timeline as tracks, and you can add keyframes to modify the overall gesture.

When you finish the animation (body and face), you can save it to your library for reuse. The iClone timeline has start and end markers, so you can tell iClone where the saved animation should start, which is useful if you just want to save part of your animation.
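The start and end markers effectively let you save just a slice of the timeline. A minimal sketch of that idea, assuming a simple list of (frame, pose) keyframes (again, this is not iClone's API; the pose labels and frame numbers are invented for illustration):

```python
# Conceptual sketch of saving only the marked range of an animation,
# as iClone's start/end markers do. Not iClone's API -- illustrative only.

def clip_range(keyframes, start, end):
    """Keep keyframes inside [start, end] and rebase them to frame 0."""
    return [(frame - start, pose)
            for frame, pose in keyframes
            if start <= frame <= end]

walk = [(0, "contact"), (10, "passing"), (20, "contact"), (30, "passing")]
saved = clip_range(walk, start=10, end=20)
# saved == [(0, "passing"), (10, "contact")]
```

Everything outside the markers is discarded, and the saved clip starts at frame 0, which is what you want when reusing it from the library later.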

As you know from my previous article, when you export animation from iClone into UE4, it includes the facial animation. You can also use Character Creator to batch-export animations into UE4: open your character in Character Creator, set the options to export motion only, and add all your iClone motion files to the export list. All the exported animations can then be imported into UE4 just fine, but they will only include body animation (no facial animation). The only way to get the facial animation is to export from iClone, using the 3DXchange pipeline.

I will continue exploring iClone animation, including facial animation, in upcoming articles.

Visit Reallusion’s website:
