Pixotope Brings the Baltimore Ravens to Life

Written by Pixotope | Jan 28, 2021 1:48:00 PM

 

The Challenge

Virtual event production company The Famous Group turned to Pixotope to help create an extraordinary augmented reality feature at the Baltimore Ravens’ M&T Bank Stadium. Pixotope’s mixed reality technology was central to producing a giant-sized raven that swooped into the stadium, landed, and responded to live in-game moments.

“The goal with Augmented/Mixed Reality for this project was to produce a dynamic, in-stadium fan experience for our client, the Baltimore Ravens, and bring their Raven mascot to life, creating a true ‘WOW’ moment. We wanted to give people a glimpse into what is possible with mixed reality production.”

Jon Slusser, Partner and Owner, The Famous Group

 

The Background

Pixotope initially provided technical guidance as the digital raven was designed in CG. Three days before the event, a specialist team from Pixotope arrived at the stadium to integrate the various content elements and prepare them for rehearsals, adaptation, and eventually delivery of the final live production. Apart from providing the core technology to execute the mixed reality visuals, Pixotope also acted as a liaison between the various departments, which included camera, live program vision mixing, and in-stadium display systems.

“First and foremost, Pixotope took time to understand the project and help strategize with us as we communicated back to the client. Their expertise in the space was a big value-add, and when it came time to develop the various animations, Pixotope helped us with refining the look and feel of the character.”

Jon Slusser, Partner and Owner, The Famous Group

 


The Pixotope Solution

The team installed the Pixotope mixed reality platform as the central hub of the complex production, which would later layer the real-time augmented content over the live-action shots. Pixotope created a virtual studio environment, which, just like a real studio or set, contained lights, a camera, and objects to be photographed (or “rendered”). In this case, the virtual scene in Pixotope contained the animated raven, a laser-scanned model of the stadium, and a camera and lights matching those in the real-world stadium. Additional video hardware was provided by Quince Imaging.

An important task for Pixotope was to accurately sample the stadium lights so that when the digital raven was incorporated into the virtual set, it would be lit exactly as it would have been had it really existed in the stadium itself.

“We initially used 360-degree high-dynamic-range photography to measure both the location and the relative brightness of each of the light sources. Accurate replication of the lighting is essential to ensure the augmented elements look real when added to the live background. In fact, we had almost 30 real-time light sources within Pixotope’s virtual environment, including bounced green light to mimic the effect of reflected light from the green turf of the sports field.”

Frank Daniel Vedvik, Senior Product Specialist, Pixotope 
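The mapping from a bright spot in a 360-degree HDR panorama to a light direction in the virtual scene can be sketched as follows. This is a minimal Python illustration under the assumption of an equirectangular panorama layout; the function name is hypothetical and this is not Pixotope's actual pipeline.

```python
import math

def equirect_to_direction(u, v):
    """Map normalized equirectangular panorama coordinates (u, v in [0, 1])
    to a unit direction vector (x, y, z), with y pointing up. A bright spot
    found at (u, v) in the HDR image can then be placed as a light source
    shining from that direction, with its intensity read from the HDR value.
    """
    lon = (u - 0.5) * 2.0 * math.pi   # azimuth: -pi..pi around the vertical axis
    lat = (0.5 - v) * math.pi         # elevation: +pi/2 at the zenith (v = 0)
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))
```

Each of the roughly 30 light sources would get a direction (and brightness) sampled this way before being recreated in the virtual environment.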

 

How It Worked

The laser-scanned model of the stadium, which was also imported into Pixotope, was used as a “shadow catcher”. This is a CG model that accurately replicates the shape and form of a real-world scene but is not itself rendered into the final image. Instead, only the shadows that fall upon the model of the stadium are rendered, and these are then composited over the live-action shots. As a result, the augmented raven cast accurate, realistic shadows over the live shots of the stadium.
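In compositing terms, the shadow-catcher idea reduces to: darken the live plate by the rendered shadow pass, then layer the CG element over it with its alpha. A minimal single-channel sketch in Python; the function name and data layout are illustrative, not Pixotope's API.

```python
def composite_with_shadows(bg, shadow, fg, fg_alpha):
    """Per-pixel composite over a live plate.

    bg       -- live camera pixels (floats in 0..1)
    shadow   -- shadow-pass density cast onto the invisible stadium model
    fg       -- rendered CG element (the raven)
    fg_alpha -- the CG element's alpha matte
    """
    out = []
    for b, s, f, a in zip(bg, shadow, fg, fg_alpha):
        shadowed = b * (1.0 - s)                  # shadow only darkens the plate
        out.append(f * a + shadowed * (1.0 - a))  # standard "over" on top
    return out
```

The stadium model itself never appears in the output; only the darkening it receives does.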

“One key difference between real-time (live) use of computer animation and the more traditional use within a post-production environment, is that there are many more variables to prepare for with live events. Improvements and adaptations to the augmented elements occur right up until the last second and, therefore, the duration of shots cannot be precisely known ahead of time.”

Øystein Larsen, Chief Creative Officer, Pixotope

 

Normally, CG animated objects are pre-keyframed to run their designated animation and then stop. But this does not work for a live scenario. For example, a brief for the raven to fly into the stadium and land on the goal post, squawk, and then take off again would require the timing of those segments to be pre-set into the animation. However, in a live scenario, it is not known how long the shot will be because the duration will depend upon unfolding game-play events.

To allow for this scenario, Pixotope relied on the team’s full access to the underlying Unreal Engine. This enabled the use of game logic to transition on demand between different states and animations of the CG raven model. The raven could, for example, be instructed to loop a specific section of its animation while it waited on the goalposts for the director’s cue. At that point, the game engine could be triggered via Pixotope to blend into a different animation, such as making the raven take off and fly away. This blending process essentially creates “live” animation, moving each part of the raven model from its last position to the position set out in the next animation segment over a short period of time.
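The loop-until-cue, then-blend logic can be sketched in a few lines of Python, under the simplifying assumption that a pose is just a list of joint values (the real system works on full skeletal animation inside Unreal Engine, and these function names are hypothetical):

```python
def looped_pose(clip, frame):
    """Loop a clip's poses indefinitely while waiting for the director's cue."""
    return clip[frame % len(clip)]

def blend(pose_a, pose_b, t):
    """Linearly interpolate every joint value from pose_a toward pose_b (t in 0..1)."""
    return [a + (b - a) * t for a, b in zip(pose_a, pose_b)]

def transition(last_pose, next_clip, blend_frames):
    """Generate the 'live' in-between poses that carry the model from wherever
    it currently is into the first pose of the next animation segment."""
    target = next_clip[0]
    return [blend(last_pose, target, i / blend_frames)
            for i in range(1, blend_frames + 1)]
```

Because the transition starts from the model's current pose rather than a fixed keyframe, the cue can arrive at any moment without a visible pop.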

To ensure that the raven could be positioned anywhere within the stadium and be able to properly react to the lighting in any zone, Pixotope adjusted the “shaders” used to render the digital raven model. Shaders are sub-programs that describe how a given surface of a CG model reacts to light.
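As a minimal illustration of what such a sub-program computes, here is a simple Lambert (diffuse) response in Python. This is a standard textbook lighting model, not Pixotope's actual shader.

```python
def lambert_diffuse(normal, light_dir, light_color, albedo):
    """Diffuse shading at one surface point: the surface gets brighter the
    more directly it faces the light. Vectors are unit-length 3-tuples;
    colors are (r, g, b) floats in 0..1."""
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return [a * c * n_dot_l for a, c in zip(albedo, light_color)]
```

A production shader layers many such terms (specular response, ambient bounce, and so on) and exposes parameters that can be tuned per lighting zone, which is what the adjustments described above amount to.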

“The modifications ensured maximum flexibility to match the raven to the high contrast lighting changes between different stadium areas, while at the same time also ensuring that the digital raven could be easily rendered within the Pixotope system in real-time at 59.94 frames per second.”

Frank Daniel Vedvik, Senior Product Specialist, Pixotope

Another requirement was to have the raven perching between the goalposts. Adding augmented content over a background shot might seem simple enough, but that only allows the content to sit in front of the live scene. Due to the camera’s viewing angle, the goalpost nearest the camera had to appear in front of the raven.

In normal post-production, this would be achieved with a key (such as a green screen chroma key) or with a matte (rotoscoping). Neither is possible in a live scenario, though. Keying cannot be relied upon because there is no way to predict what colors will appear around the object to be keyed. Rotoscoping is another post-production tool that cannot be used here, because the framing of the target object could not be known ahead of time: it was shot from a freely moving camera with a variable zoom lens.

To overcome these challenges, Pixotope’s solution was to build the goalposts as a 3D object within Pixotope, accurately matching the size and position of the real posts.

“The goal-posts model was used to create a “hold out mask”, which created a hole in the alpha channel of the raven so that when it was added to the background, the part that would cover the foreground post was effectively erased.”

Frank Daniel Vedvik, Senior Product Specialist, Pixotope

This gave the audience the illusion that the raven was sitting perfectly between the goalposts.
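The hold-out trick can be sketched in a few lines: multiply the raven's alpha by the inverse of the rendered goalpost mask, so that the post region of the live shot shows through the composite. A single-channel Python sketch with illustrative names:

```python
def apply_holdout(fg_alpha, holdout_mask):
    """Punch the goalpost hold-out out of the CG element's alpha: wherever
    the mask is 1, the element is erased and the live post shows through."""
    return [a * (1.0 - m) for a, m in zip(fg_alpha, holdout_mask)]

def over(fg, fg_alpha, bg):
    """Composite the CG element over the live background using its alpha."""
    return [f * a + b * (1.0 - a) for f, a, b in zip(fg, fg_alpha, bg)]
```

Because the virtual posts track the real ones via the camera tracking data, the hole in the alpha channel always lines up with the real post in the frame.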

As every visual effects artist will agree, it is the fine details added to CG that make it look realistic. Real-world shots tend to have blurs, flares, and noise due to optical limitations in lenses and cameras. CG is naturally devoid of such imperfections and can therefore appear unrealistically sharp and vivid. However, adding organic, natural effects such as flares and blurs is processor intensive, which is why they are used sparingly in computer games, where real-time frame rate matters more than realism. To address this, the Pixotope virtual production platform comes with its own set of highly efficient custom post-processing effects which, combined with a bespoke single-pass renderer, enable very realistic images to be created in real time.

One such post-processing effect used on the Baltimore Ravens project was a custom “light-wrap” effect. Light wrapping happens when a foreground object travels in front of a bright light source. In the real world, the bright background light seems to eat away at the edge of a foreground object, with its light spilling over in front. Think of the image of a person standing in front of the sun, with the sun just peeping out from one side. This makes a large flare become visible, which covers part of the person even though the sun is behind them. The Pixotope light wrap feature simulates this phenomenon, while still impressively maintaining real-time performance, even at 60 frames per second.
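One common way to approximate light wrap is to blur the background and bleed it into the foreground's semi-transparent edge pixels. A simplified 1-D Python sketch follows; the edge weight 4·a·(1−a), which peaks where the matte is half-transparent, is a generic approximation assumed here, not Pixotope's implementation.

```python
def box_blur(xs):
    """3-tap box blur with edge clamping, standing in for a wide blur of
    the bright background."""
    n = len(xs)
    return [(xs[max(i - 1, 0)] + xs[i] + xs[min(i + 1, n - 1)]) / 3.0
            for i in range(n)]

def light_wrap(fg, fg_alpha, bg, wrap_amount=0.5):
    """Add blurred background light onto the foreground near its matte edge.
    The weight 4*a*(1-a) is 1 at semi-transparent pixels (a = 0.5) and 0 in
    fully opaque or fully transparent regions, so only edges are affected."""
    blurred = box_blur(bg)
    out = []
    for f, a, b in zip(fg, fg_alpha, blurred):
        edge = 4.0 * a * (1.0 - a)
        out.append(f + b * edge * wrap_amount)
    return out
```

Keeping the blur cheap (here, a tiny box filter) is what makes such an effect viable at a 60 fps budget.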

Once the virtual environment within Pixotope was set up, the resulting images were layered over the background camera shot. To achieve this, Pixotope had to replicate the real-world camera’s position, viewing direction, and lens focal length, ensuring that the virtual raven was “filmed” from exactly the right angle. sType was used to provide the camera tracking information to Pixotope.


It is imperative that the quality of the background shot be preserved when compositing augmented reality on top.

Pixotope takes an ingenious approach here. The background shot is present in the virtual environment as a light source affecting the CG objects, and those objects are then rendered over a direct feed of the background shot in a process exclusive to Pixotope. This guarantees that when Pixotope augments material onto a camera feed, the original image qualities are left perfectly intact.

Once all the technical aspects had been set up, rehearsals could begin. Since Pixotope works in real time, adjustments and improvements to most aspects of the production can (and do) occur right up until the last minute. In the case of the Baltimore Ravens project, this agility allowed the creatives and show directors to try out alternatives in pursuit of the perfect augmented experience.

 

Success

The final execution worked flawlessly: the giant raven swooped into the stadium on cue, exhilarating the attending audience. Positive social media reaction to the event was a testament to how much the mixed reality additions enhanced the audience’s experience and how realistic the effect looked.

“The overwhelmingly positive results were felt in the stadium, on social media, and with traditional media outlets. We received over 11 million views of the Raven in flight on various social media platforms, all referring back to the Ravens’ original social media posts. We also saw a huge spike with traditional media outlets like ESPN, Bleacher Report, and Sports Illustrated.” CBS Sports reported that the mixed reality segments “had fans (and anyone who saw the video on social media) in awe.”

Jon Slusser, Partner and Owner, The Famous Group

 

“Mixed reality content has the power to grab the attention and drive engagement of a wider audience, by providing extra dimensions to the viewer’s experience and creating cut-through, stand-out moments that are so shareable on social media. We are very proud of our work with The Famous Group to bring the Baltimore Raven to life.”

Marcus B. Brodersen, CEO, Pixotope

 

Discover our sports event solutions

Pixotope supports mixed reality at live events for ambitious sports productions.

Read more about Pixotope Sports Event Solutions.