AR motion capture character content utilises real-time ray tracing and real-time facial animation, live-to-air
Pixotope, provider of live photo-realistic virtual production systems, today announced that Riot Games used Pixotope (AR graphics), together with Cubic Motion (real-time facial animation), Animatrik (motion capture) and Stype (camera tracking), to deliver the first live broadcast featuring both real-time ray tracing and real-time facial animation, at the ‘League of Legends’ Regional Finals on Sunday 8th September in Shanghai.
“At Pixotope our main objective is always to deliver the client’s vision. Great technology gets us a long way, but an even more important piece is the team that brings it together. For this project, Riot Games brought in several outstanding partners who all worked tirelessly to deliver on the client’s creative ambition and give the audience something extra special.
We are truly grateful to our friends at Riot and our project partners for the opportunity to once again push the envelope for live media productions.”
Halvor Vislie, CEO, Pixotope
Ray tracing consumes huge quantities of rendering power, which is why to date it has only been used for non-real-time visual effects in film and television. With the release of Nvidia’s RTX series of graphics cards, real-time ray tracing has become possible, but it is not guaranteed.
The challenge in the live broadcast TV world is harnessing the power of ray tracing whilst also maintaining standard video frame rates. Pixotope’s unique native Unreal™-based architecture and single-pass render pipeline provide a very low rendering overhead, enabling ray-tracing processing whilst maintaining video playback frame rates.
It is this architecture that enabled the Pixotope team to deliver the world’s first live ray-traced broadcast.
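To put “maintaining standard video frame rates” in concrete terms, the sketch below computes the per-frame render budget at some common broadcast frame rates. The rates listed are generic industry examples, not figures from this production, and the arithmetic is illustrative rather than a Pixotope benchmark.

```python
# Illustrative only: per-frame render budgets at common broadcast frame rates.
# All rendering (including any ray-traced effects) must fit inside this budget
# for the output to stay in sync with the video signal.
BROADCAST_RATES_FPS = [25, 29.97, 50, 59.94]  # generic examples, not from the source

def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at a given frame rate."""
    return 1000.0 / fps

for fps in BROADCAST_RATES_FPS:
    print(f"{fps:>6} fps -> {frame_budget_ms(fps):.2f} ms per frame")
# e.g. 50 fps leaves only 20.00 ms to render each frame
```

At 50 fps there are only 20 ms per frame, which is why a low-overhead, single-pass pipeline matters when adding a cost as heavy as ray tracing.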