On Tuesday, SIGGRAPH 2021’s Real-Time Live! kicked off at 4:30 pm PDT. The show opened with a SIGGRAPH first: shaders rendered live onscreen. Following the pre-show (pictured above) were seven live demonstrations of the past year’s greatest projects using interactive techniques. From Japan to Europe to the United States, people from around the world tuned in to experience what’s next for computer graphics — and they were not disappointed.
This year’s Real-Time Live! featured new techniques in virtual reality, digital avatars, education, and animation. But one theme stood out above the rest: bringing people together virtually, something we have all become familiar with over the past year and a half. Many of the demonstrations focused on creating seamless, more interactive ways to bring products, shows, and people together on a virtual, global scale.
Here we will share a brief overview of each of the SIGGRAPH 2021 Real-Time Live! demonstrations. Interested in learning more? Watch the show on our YouTube channel (coming soon).
Project LiViCi: Real-time Immersive Circus Performance
Winner: Audience Choice
Live from the mocap stage, viewers were in for a show. This demonstration — from Shocap Entertainment and The 7 Fingers — brought a circus performance to the virtual world. The performers’ movements were translated onto virtual characters seamlessly and immediately. This technology brings new meaning to performance and artistry in virtual production, as viewers experienced the acrobat and character performing side by side. As one audience member reacted (below), it was “the greatest show,” winning the team the Audience Choice award from this year’s virtual viewers.
I am AI: AI-driven Digital Avatar Made Easy
Winner: Best in Show
Not ready for your next virtual interview? No worries! This new NVIDIA technology creates a virtual avatar from a single photo. It starts with speech: from your voice alone, the system generates facial movement to create a lifelike, speaking avatar for video conferencing. But that’s not all! The system also allows avatars to be driven by text-to-speech technology, and they can even sing. This demonstration showed how far digital avatars can go. Who knows? You may be using an avatar generator for your next Zoom call.
Shading Rig: Dynamic Art-directable Stylised Shading for 3D Characters
Toon shading can look harsh and unnatural at times, but not anymore. This technology out of Victoria University of Wellington allows you to edit toon shading to provide seamless 360-degree shading for characters. With smooth transitions and the ability to adjust the shading manually, you now have a way to present characters with more artistic intention. It’s easy, too! Edits are applied automatically when the character is reloaded into your game application. Voilà: a more art-directed way to tackle toon shading.
Coretet: Virtual Reality Musical Instruments for the 21st Century
This virtual reality experience offers musicians a flexible instrument to play in a virtual environment. Musicians can now share a performance space while being countries away from each other. The technology, from Rensselaer Polytechnic Institute, uses traditional, familiar physical gestures to determine which notes you are playing and how you are holding your instrument. Choose from a cello, violin, viola, double bass, or the experimental orb, and start making music!
Technical Art Behind the Animated Short “Windup”
In this demonstration from Unity, a member of the creative team for “Windup” walked through the process of procedural environment orientation in real time on the animated short, which was part of the 2020 Computer Animation Festival Electronic Theater. The presenter was easily able to control lights, shadows, and animations as the scene on display progressed. With this tool, you have major control over the environment in a scene. Need to build a root on a wall? Customize the size, thickness, twists, and more … in real time!
“I haven’t been able to teach the way I want to … until now,” shared Ken Perlin at the start of this demonstration from New York University’s Future Reality Lab. Showcasing the classroom of the future, Perlin went over the Future Reality Lab’s completely immersive teaching platform, designed for teachers and students to learn in a whole new way, whether the subject is computer graphics or anything else. The system was created using volumetric animation. Educators can learn more about the tool from Ken directly as part of the on-demand SIGGRAPH 2021 Course, “Inventing the Future.”
Normalized Avatar Digitization for Communication in VR
Virtual reality is at it again. This technology, from Pinscreen and UC Berkeley, is based on state-of-the-art avatar digitization. From a single picture, avatars can be generated and placed into an immersive virtual reality space among other users. The goal is to enable social interactions from remote locations, and avatars are generated from whatever picture you choose … the rest is up to you!