Pixotope announces Version 1.3 software release
Pixotope | 22 July 2020 | 3 min read

Pixotope unveils mixed reality advances in Pixotope 1.3

Version 1.3 offers new object tracking, powerful lighting integration, enhanced color management, and more.


Pixotope Technologies, creator of the live photo-realistic virtual production system Pixotope, has unveiled its latest Version 1.3 software, featuring a wide range of advances that promise to improve how virtual environments interact with real-world elements.

Pixotope enables the production of mixed-reality (MR) content by bringing together physical components such as presenters, actors, props, and free-moving cameras with virtually created assets such as scenes, graphics, animated characters, or any other computer-generated elements.

Pixotope forms the central production hub when creating mixed-reality content for broadcast and live events, with Version 1.3 offering new object tracking, powerful lighting integration, enhanced color management, and more.

A key advance in Pixotope Version 1.3 is the ability to integrate and use data from real-time object tracking systems. Pixotope can take the position of tracking locators moving in the real-world environment and attach them to digitally created objects, so that those objects follow the tracked motion.

This, in turn, enables presenters to freely pick up and rotate graphics or any other virtually generated asset, opening limitless creative possibilities. From showing a 3D model in the palm of their hand to controlling any aspect of a virtual scene with their own physical movement, presenters and actors become free to interact with the virtual world around them.
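
To picture the idea at a conceptual level, the sketch below shows how a tracked locator's transform might be copied onto a virtual object every frame. It is a minimal Python illustration only: `Transform`, `VirtualObject`, and `follow_tracked_locator` are hypothetical stand-ins, not Pixotope's actual API.

```python
# Illustrative sketch only: Pixotope's object-tracking API is not shown here,
# so VirtualObject and follow_tracked_locator are hypothetical stand-ins.
from dataclasses import dataclass

@dataclass
class Transform:
    position: tuple   # (x, y, z) in studio space, metres
    rotation: tuple   # (pitch, yaw, roll) in degrees

class VirtualObject:
    """Hypothetical handle to a digitally created asset in the virtual scene."""
    def __init__(self, name: str):
        self.name = name
        self.transform = Transform((0.0, 0.0, 0.0), (0.0, 0.0, 0.0))

def follow_tracked_locator(virtual_obj: VirtualObject,
                           locator_transform: Transform,
                           offset=(0.0, 0.0, 0.0)):
    """Attach a virtual object to a real-world tracking locator by copying its
    transform each frame, optionally applying a fixed positional offset."""
    x, y, z = locator_transform.position
    ox, oy, oz = offset
    virtual_obj.transform = Transform((x + ox, y + oy, z + oz),
                                      locator_transform.rotation)

# Per-frame update: the tracking system supplies a fresh pose every frame,
# and the virtual object simply mirrors it.
hologram = VirtualObject("handheld_3d_model")
latest_pose = Transform((1.2, 0.9, 1.5), (0.0, 45.0, 0.0))   # sample tracked pose
follow_tracked_locator(hologram, latest_pose, offset=(0.0, 0.05, 0.0))
```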

Another benefit of Object Tracking is that presenters themselves can be tracked, so that Pixotope detects where in the scene they are. A challenge with conventional virtual studios is that presenters must be mindful of where they stand and when, because they cannot walk in front of graphics that have been composited over the frame. When a presenter's position and orientation are available through Pixotope's Object Tracking interface, however, they are free to walk in front of, behind, or even through virtual objects, because Pixotope recognizes where the presenter is relative to other generated items within three-dimensional space.
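
A much-simplified way to picture this is a depth comparison between the tracked presenter and a virtual object, as in the hypothetical Python sketch below. Real compositing works per pixel rather than per object, so this is only a conceptual illustration and the names are invented.

```python
# Illustrative sketch only: a simplified per-object depth test that decides
# whether the tracked presenter should appear in front of a virtual object.
import math

def distance(a, b):
    """Euclidean distance between two (x, y, z) points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def presenter_occludes(camera_pos, presenter_pos, object_pos) -> bool:
    """True if the presenter is closer to the camera than the virtual object,
    i.e. the real presenter should be composited in front of it."""
    return distance(camera_pos, presenter_pos) < distance(camera_pos, object_pos)

camera = (0.0, 1.7, 0.0)
presenter = (0.5, 1.7, 3.0)
virtual_sign = (0.5, 1.7, 5.0)
print(presenter_occludes(camera, presenter, virtual_sign))  # True: presenter is in front
```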

Also new in Pixotope Version 1.3 is the ability to control physical lights using DMX512 over the Art-Net distribution protocol. This enables Pixotope to synchronize and control any DMX-controllable feature of physical studio lights from the digital lights used to illuminate virtual scenes. Lights can then be driven either via pre-set animations or via the new Slider widget available for user-created Pixotope control panels. Such panels can be accessed via a web browser on any authorized device and operated either by a technician or a presenter.
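
For context, DMX512 over Art-Net is an open protocol that carries DMX channel values in UDP packets on port 6454. The sketch below builds and sends a single ArtDMX packet following the published packet layout; the node IP address, universe, and channel values are placeholders, and this is a generic example rather than Pixotope's own implementation.

```python
# Illustrative sketch only: a minimal ArtDMX (DMX512-over-Art-Net) sender based
# on the public Art-Net packet layout. IP address and channel values are placeholders.
import socket

ARTNET_PORT = 6454  # standard Art-Net UDP port

def build_artdmx(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build a single ArtDMX packet carrying up to 512 DMX channel values."""
    data = channels.ljust(2, b"\x00")            # payload must be at least 2 bytes...
    if len(data) % 2:                            # ...and an even number of bytes
        data += b"\x00"
    packet = bytearray()
    packet += b"Art-Net\x00"                     # 8-byte protocol ID
    packet += (0x5000).to_bytes(2, "little")     # OpCode: ArtDMX (low byte first)
    packet += (14).to_bytes(2, "big")            # protocol version 14
    packet += bytes([sequence & 0xFF, 0])        # Sequence, Physical
    packet += bytes([universe & 0xFF,            # SubUni (low 8 bits of port-address)
                     (universe >> 8) & 0x7F])    # Net (high 7 bits)
    packet += len(data).to_bytes(2, "big")       # length of DMX data
    packet += data
    return bytes(packet)

# Example: set the first three channels (e.g. dimmer plus two colour channels)
# to full, half, and zero on universe 0 of a node at a placeholder address.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
dmx = bytes([255, 128, 0])
sock.sendto(build_artdmx(universe=0, channels=dmx), ("192.168.1.50", ARTNET_PORT))
```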

Pixotope Version 1.3 also further improves the results of chroma keying (such as for green-screen studios) with new features that help extract greater detail, like fine strands of hair and shadows, as well as new algorithms that process key edges to sub-pixel accuracy, improve color picking, and automate the reduction of background screen color spill.
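
As a rough illustration of what keying and spill suppression involve, the Python sketch below computes a crude green-screen matte and clamps green spill. Production keyers such as Pixotope's use far more sophisticated, sub-pixel edge processing; the matte gain used here is an arbitrary placeholder.

```python
# Illustrative sketch only: a very simple green-screen key with spill suppression,
# shown to convey the general idea, not a production keying algorithm.
import numpy as np

def simple_green_key(rgb: np.ndarray):
    """rgb: float image in [0, 1] with shape (H, W, 3).
    Returns (alpha, despilled_rgb)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Matte: how much greener a pixel is than its other channels.
    green_excess = g - np.maximum(r, b)
    alpha = np.clip(1.0 - green_excess * 3.0, 0.0, 1.0)   # 3.0 is an arbitrary gain
    # Despill: clamp green so it never exceeds the average of red and blue,
    # removing the green cast reflected onto foreground edges.
    despilled = rgb.copy()
    despilled[..., 1] = np.minimum(g, (r + b) * 0.5)
    return alpha, despilled

# Tiny 1x2 test image: a pure green background pixel and a skin-tone foreground pixel.
img = np.array([[[0.1, 0.9, 0.1], [0.8, 0.6, 0.5]]], dtype=np.float32)
alpha, fg = simple_green_key(img)
print(alpha)   # background pixel -> 0.0, foreground pixel -> 1.0
```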

Color Management has been extended to the Pixotope Editor Viewport to ensure that artists working in any practical color space, including HDR, can have complete confidence in the color fidelity of the images they are creating.
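
As a point of reference, a color-managed viewport ultimately applies a display transform when mapping linear, scene-referred values to an output display. The sketch below shows the standard sRGB encoding function as one such transform; it is a generic example, not Pixotope's color pipeline.

```python
# Illustrative sketch only: the standard piecewise sRGB encoding function, one
# example of the kind of display transform a color-managed viewport applies.
import numpy as np

def linear_to_srgb(linear: np.ndarray) -> np.ndarray:
    """Encode linear-light values in [0, 1] with the sRGB transfer function."""
    linear = np.clip(linear, 0.0, 1.0)
    low = linear * 12.92                               # linear segment near black
    high = 1.055 * np.power(linear, 1.0 / 2.4) - 0.055  # gamma segment
    return np.where(linear <= 0.0031308, low, high)

print(linear_to_srgb(np.array([0.0, 0.18, 1.0])))  # mid-grey (0.18 linear) -> ~0.46
```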

Pixotope is natively integrated with the Unreal Engine, and in Pixotope 1.3 all the latest features of Unreal Engine 4.24 are available to users. Benefits include layer-based terrain workflows for the creation of adaptable landscapes, dynamic, physically accurate skies that can be linked to actual time of day, improved rendering of character hair and fur, and increased efficiency for global illumination that helps create photo-real imagery.

Pixotope Chief Creative Officer Øystein Larsen explained: “The success of a mixed reality scene depends upon the relationship and interactivity between real and virtual components. Part of this success depends on technical accuracies, such as matching lighting, replicating freely moving cameras, and having seamless keying. But there is also an emotional aspect that flows from enabling presenters and actors to freely express themselves through unrestricted movement and interacting with virtual objects as they would real objects. Version 1.3 of Pixotope provides large gains in both these areas.”

Pixotope CEO Marcus Blom Brodersen added: “The advances within Pixotope Version 1.3 deliver another step-change for producers of mixed-reality content. The extremely high-quality images Pixotope produces, together with the creative and physical freedoms it allows for those both in front of and behind the camera, enable our customers to make ever-more exciting and attention-grabbing productions.”