As a follow-up to the previous article, I decided to cover more presentations and talks available at SIGGRAPH 2020.
Every year, Maxon gives us an update on what they have been working on, and this year was no exception. They focused on updates to Red Giant, Redshift and Cinema 4D. On the Red Giant side, they have updated several After Effects plugins, including their glow and chromatic aberration effects. Chromatic aberration has gained traction in recent years, and what I liked about Red Giant's approach is how much control they give you over the effect: you can change its center point and make the color separation more gradual, or even blurred, producing really nice results.
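To make those controls concrete, here is a minimal sketch of a radial chromatic aberration effect with a configurable center and a separation that grows gradually with distance from it. This is my own toy illustration with NumPy, not Red Giant's actual algorithm; the function name and parameters are assumptions for demonstration.

```python
import numpy as np

def chromatic_aberration(img, center=(0.5, 0.5), strength=0.01):
    """Toy radial chromatic aberration (not Red Giant's implementation).

    img:      float image of shape (H, W, 3)
    center:   normalized (x, y) center of the effect
    strength: scales the channel offset with distance from the center,
              so the color separation grows gradually toward the edges
    """
    h, w, _ = img.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float64)
    cx, cy = center[0] * w, center[1] * h
    dx, dy = xs - cx, ys - cy                      # radial offset per pixel
    out = img.copy()
    # Sample red and blue with opposite radial shifts; green stays put.
    for ch, sign in ((0, 1.0), (2, -1.0)):
        sx = np.clip(xs + sign * strength * dx, 0, w - 1).astype(int)
        sy = np.clip(ys + sign * strength * dy, 0, h - 1).astype(int)
        out[..., ch] = img[sy, sx, ch]
    return out
```

Because the offset is proportional to the distance from the center, the fringing is zero at the center point and strongest at the edges, which is the "gradual" behavior described above.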
They also presented Redshift RT, showing off the new renderer in Maya. In recent years, real-time rendering has made its way out of game engines and into media production, and Redshift RT is a real-time version of the renderer (hence the name), allowing faster iteration when you change lighting, camera position or even materials. Another interesting bit was how they are reworking the Cinema 4D core and implementing a node-based workflow for different tasks, from scene creation to material creation. They stressed that nodes will be optional, not forced: you will still be able to follow your usual workflow, but a node-based option may help bring other types of users into Cinema 4D.
I also attended a talk from Laika, where they explained how they collaborated with Intel on AI-assisted rotoscoping for cleaning up the stop-motion models in their feature films. The talk is very technical and goes into great detail on how the software saves them a lot of time when removing seams from their models in finished shots.
Another interesting talk was from Pixar, where they explained the method they use to bake their rigs while keeping the same level of quality. There are various reasons to do this, including keeping their rigs compatible with newer versions of software applications, and exporting them to third-party applications (for example, a game engine).
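The core idea of baking is simple: evaluate the live rig, with all its expressions, constraints and solvers, at every frame, and keep only the resulting numbers. Here is a minimal sketch of that idea, not Pixar's actual method; `bake_rig`, `rig_eval` and the bounce expression are made up for illustration.

```python
import math

def bake_rig(evaluate, joints, frame_range):
    """Sample a live rig into plain per-joint keyframes.

    `evaluate(joint, frame)` stands in for whatever procedural logic
    the rig runs (expressions, constraints, solvers). The baked dict
    holds only numbers, so it no longer depends on that logic: it can
    survive software upgrades and be exported to other applications
    (for example, a game engine).
    """
    baked = {joint: [] for joint in joints}
    for frame in frame_range:
        for joint in joints:
            baked[joint].append(evaluate(joint, frame))
    return baked

# A toy "rig": a procedural bounce expression driving one channel.
def rig_eval(joint, frame):
    return abs(math.sin(frame * 0.25))   # e.g. translate-Y over time

curves = bake_rig(rig_eval, ["hips"], range(0, 24))
```

The quality question the talk addresses is making sure these sampled curves reproduce the live rig's output exactly at (and acceptably between) the sampled frames.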
Speaking of games, CD Projekt Red presented the facial lip-sync and animation tool they are using for Cyberpunk 2077. The tool, named JALI, takes the recorded voice acting and a transcription (with meta tags), then synthesizes a facial animation using data it has learned from previous training material, producing a believable result. It can animate the mouth, jaw, eyes (including pupils), brows and neck. Since Cyberpunk 2077 will release globally and will support various languages, the system is able to produce animations for each of them.
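To give a sense of one small piece of a pipeline like this, here is a sketch of mapping time-aligned phonemes (which a forced aligner can produce from audio plus transcript) to mouth-shape keyframes. JALI itself uses a learned model and animates many more channels; the phoneme table, viseme names and function here are entirely made up for illustration.

```python
# Hypothetical phoneme-to-viseme table (ARPAbet-style symbols);
# a real system would cover the full phoneme set per language.
PHONEME_TO_VISEME = {
    "AA": "open", "IY": "wide", "UW": "round",
    "M": "closed", "B": "closed", "P": "closed",
    "F": "teeth_lip", "V": "teeth_lip",
}

def visemes_from_alignment(alignment):
    """Turn a phoneme alignment into viseme keyframes.

    alignment: list of (phoneme, start_sec, end_sec) triples, e.g.
    from a forced aligner. Consecutive identical shapes are merged
    so the mouth holds the pose instead of re-keying each phoneme.
    """
    keys = []
    for phoneme, start, end in alignment:
        shape = PHONEME_TO_VISEME.get(phoneme, "neutral")
        if keys and keys[-1][0] == shape:
            keys[-1] = (shape, keys[-1][1], end)   # extend previous key
        else:
            keys.append((shape, start, end))
    return keys
```

Because the mapping is driven by the phoneme stream rather than the written language, the same machinery extends to any language you can align, which is one plausible reason a system like JALI can cover all of the game's localized voice tracks.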
While some game engines already include lip-sync tools, they are very rudimentary compared to what CDPR is doing, and it's very cool to see the amount of work some developers put into their creations. When you consider that every NPC in the game will have this same system driving its facial animation, you can imagine the level of detail and realism Cyberpunk 2077 is going to offer.
There are still a lot of interesting things to see, so stay tuned for more SIGGRAPH 2020 coverage in the next article.