Pixotope

The revolution in virtual production systems

Pixotope is an open, software-based solution for the rapid creation of virtual studios, augmented reality (AR), extended reality (XR) and on-air graphics.

Photorealistic rendering

Pixotope uses Epic Games’ Unreal Engine to produce photorealistic renderings in real time. The combination of Unreal Engine and Pixotope lets designers quickly create virtual sets, virtual environments and augmented content featuring terrain and foliage, particle systems (rain, smoke, fire, hair, cloth, explosions, etc.) and simulated cameras with realistic lens properties (lens distortion, depth of field, chromatic aberration).
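As an illustration of this kind of lens modelling (not a description of Pixotope’s own implementation), the short Python sketch below applies a Brown-Conrady style radial distortion to normalized image coordinates; the coefficients k1 and k2 are hypothetical example values.

    import numpy as np

    def apply_radial_distortion(points, k1=-0.15, k2=0.05):
        """Brown-Conrady style radial distortion on normalized image
        coordinates (origin at the optical centre). k1 and k2 are
        hypothetical example coefficients."""
        points = np.asarray(points, dtype=float)
        r2 = np.sum(points ** 2, axis=1, keepdims=True)   # squared radius per point
        factor = 1.0 + k1 * r2 + k2 * r2 ** 2             # radial scaling term
        return points * factor

    # Example: distort the corners of a normalized 2:1 image plane
    corners = np.array([[-1.0, -0.5], [1.0, -0.5], [-1.0, 0.5], [1.0, 0.5]])
    print(apply_radial_distortion(corners))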

Quick design and implementation

Pixotope is designed for on-air use, allowing rapid design and delivery of virtual, augmented or mixed reality content. It lets users configure, create and control any type of virtual production from a single user interface.

WYSIWYG editor

Whether you choose to create content with Pixotope’s automatic generation tool or build your own content from scratch with its specialized WYSIWYG editing tools, Pixotope harnesses the real-time power of Unreal Engine to create anything from a single-camera project to a multi-camera live production in a virtual studio, all in cinematic quality.

Low-cost subscription

An extremely attractive subscription model ensures that Pixotope can be easily deployed within an organization and tailored to the specific requirements of each project.

Main features

  • Real-time photorealistic CGI characters and environments using the Unreal Engine renderer
  • 32-bit HDR linear color processing and rendering environment
  • Support for broadcast industry standard video formats, including UHD / 4K
  • Easy-to-use internal chroma keyer (a keying and compositing sketch follows this list)
  • Internal GPU-based video processing system for color correction, mask adjustments and image effects
  • Non-destructive internal compositing that ensures the video is not affected by the graphics system’s anti-aliasing
  • External compositing pipeline with separate outputs for garbage matte, reflections, shadows and alpha matte for AR
  • Support for all standard products and protocols for real-time camera tracking
  • Easy-to-use, user-customizable control panels that can run in any web browser
  • 8 fully configurable video inputs and outputs (with AJA Corvid 88)
  • Genlock/synchronization to tri-level or black burst reference inputs, to an SDI video signal, or to the internal clock
  • Broadcast-focused 3D character generator with animation tools
  • Seamless WYSIWYG workflow from editor to live broadcast, with live SDI and monitoring within the editor
  • Low-latency, high-precision external API server for data integration and remote control
  • Multi-camera support
  • Automatic generation of virtual studios using Studio Creator
  • Runs on commodity hardware, including professional and gaming-class GPUs
  • Timecode support via LTC (a small timecode conversion example follows this list)
  • Support for industry-standard content generation workflows using FBX, Alembic, OpenEXR, etc.
  • User-controllable video delay and delay detection capability
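
To make the internal keying and compositing items above more concrete, here is a minimal Python/NumPy sketch of a simple difference-based green-screen matte and a standard alpha “over” composite. It only illustrates the general technique; it is not a description of Pixotope’s GPU-based keyer or compositor, and the strength parameter is a hypothetical tuning value.

    import numpy as np

    def green_screen_matte(frame, strength=1.5):
        """Simple difference keyer: alpha drops where green dominates red
        and blue. frame is float RGB in [0, 1]; strength is a hypothetical
        tuning parameter."""
        r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
        spill = g - np.maximum(r, b)                  # how strongly the pixel leans green
        return 1.0 - np.clip(strength * spill, 0.0, 1.0)

    def composite_over(fg, alpha, bg):
        """Standard 'over' composite of the keyed foreground onto a background."""
        a = alpha[..., None]
        return fg * a + bg * (1.0 - a)

    # Example with random frames standing in for the camera feed and the virtual set
    camera = np.random.rand(270, 480, 3).astype(np.float32)
    virtual_set = np.random.rand(270, 480, 3).astype(np.float32)
    output = composite_over(camera, green_screen_matte(camera), virtual_set)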
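
As a small worked example for the LTC timecode item above, the sketch below converts a non-drop-frame HH:MM:SS:FF timecode string into an absolute frame count; the 25 fps rate is only an assumed example value, and drop-frame rates would need additional handling.

    def timecode_to_frames(tc: str, fps: int = 25) -> int:
        """Convert a non-drop-frame 'HH:MM:SS:FF' timecode to a frame count.
        The 25 fps default is an assumed example value."""
        hours, minutes, seconds, frames = (int(part) for part in tc.split(":"))
        return ((hours * 60 + minutes) * 60 + seconds) * fps + frames

    # Example: one hour of programme at 25 fps
    print(timecode_to_frames("01:00:00:00"))   # 90000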