Eurosport, live from their office
Made at NEP The Netherlands.
Virtual Set Development:
Will Nunes
Roel Bartstra
Thomas Kole
Natalia Khokhlova
Project Management:
Danny Koelewijn
Though it doesn’t look like it, Eurosport NL did their live coverage of the 2018 Winter Olympics from their coffee corner. We used motion tracking and the Unreal Engine (Zero Density’s Reality, to be specific) to transform a greenbox into a fully featured TV studio.


The Set
In essence, running an augmented reality show using Unreal Engine works as follows:
You make a virtual set, keep track of where the camera is in the room, render a frame from the exact same point of view, key the video feed and composite it over the virtual image.
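That whole chain, sketched in plain C++. The engine-specific steps are stubbed out, and the function names are illustrative rather than Reality’s actual API:

    #include <cstdio>

    // Illustrative stubs: the real feeds, tracking, and rendering are
    // handled by Reality / Unreal and are not shown here.
    struct Pose  { float position[3] {}; float rotation[3] {}; };
    struct Frame { /* pixel data omitted */ };

    Pose  readCameraTracker()                      { return {}; }    // where is the camera?
    Frame renderVirtualSet(const Pose&)            { return {}; }    // same point of view
    Frame grabVideoFeed()                          { return {}; }    // physical camera image
    Frame chromaKey(const Frame& video)            { return video; } // green becomes alpha
    Frame composite(const Frame&, const Frame& bg) { return bg; }    // talent over the set
    void  broadcast(const Frame&)                  { std::puts("frame out"); }

    // One broadcast frame: at 50fps this whole chain runs every 20 ms.
    void tick()
    {
        Pose  pose       = readCameraTracker();
        Frame background = renderVirtualSet(pose);
        Frame talent     = chromaKey(grabVideoFeed());
        broadcast(composite(talent, background));
    }

    int main() { tick(); }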
We used Unreal’s UI system (UMG) for motion graphics, which allowed us to work easily with text and animations. Zero Density’s Reality comes with software that lets you build control applications that send text, images and events to Unreal. We could then use this data to populate the UMG widgets during a live broadcast.
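As an illustration, a hypothetical UMG widget on the Unreal side could expose a function like this for the control application’s events to call. The class, function, and animation names here are made up; Reality’s actual control plumbing is its own product:

    // Hypothetical widget -- Reality's actual control plumbing differs,
    // but the idea is the same: external data lands in widget properties.
    #pragma once

    #include "Blueprint/UserWidget.h"
    #include "Components/TextBlock.h"
    #include "Animation/WidgetAnimation.h"
    #include "LowerThirdWidget.generated.h"

    UCLASS()
    class ULowerThirdWidget : public UUserWidget
    {
        GENERATED_BODY()

    public:
        // Called from the control application's event during the show.
        UFUNCTION(BlueprintCallable, Category = "Graphics")
        void SetLowerThird(const FText& Name, const FText& Role)
        {
            if (NameText) NameText->SetText(Name);
            if (RoleText) RoleText->SetText(Role);
            if (SlideIn)  PlayAnimation(SlideIn); // authored in the UMG editor
        }

    protected:
        // Bound to widgets of the same name in the UMG designer.
        UPROPERTY(meta = (BindWidget)) UTextBlock* NameText = nullptr;
        UPROPERTY(meta = (BindWidget)) UTextBlock* RoleText = nullptr;
        UPROPERTY(Transient, meta = (BindWidgetAnim)) UWidgetAnimation* SlideIn = nullptr;
    };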
The idea of turning the set into a lookout over the Olympic village started out as just that: an idea. When Eurosport sent their crew to PyeongChang, they were able to snap not one but two awesome panoramas with a prominent view of the Alpensia Ski Jumping Centre: one by day, one in the evening. Using some shader math we projected these panoramas onto the background, allowing for cool sweeping camera movements with a lot of depth.
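We won’t reproduce the actual material here, but assuming equirectangular panoramas, the core of that shader math boils down to mapping a view direction to a UV in the photo. In the show this lives in an Unreal material; in plain C++ it looks like:

    #include <cmath>

    // One common approach, assuming an equirectangular panorama: map the
    // world-space view direction of each background pixel to a UV in the photo.
    struct UV { float u, v; };

    UV panoramaUV(float dx, float dy, float dz)
    {
        const float pi  = 3.14159265358979f;
        // Normalize the direction first.
        const float len = std::sqrt(dx * dx + dy * dy + dz * dz);
        dx /= len; dy /= len; dz /= len;

        UV uv;
        uv.u = 0.5f + std::atan2(dx, dz) / (2.0f * pi); // yaw   -> horizontal
        uv.v = 0.5f - std::asin(dy) / pi;               // pitch -> vertical
        return uv;
    }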
Tracking & Keying
The greenbox (sometimes referred to as a “limbo”) was built by an external party. It’s made of wood, painted a very matte green, and its corners are rounded to ensure optimal keying. We key the feed in Reality and composite it into the virtual environment.
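Reality’s keyer does the heavy lifting for us, but conceptually a chroma key is simple. A minimal sketch, not Reality’s implementation:

    #include <algorithm>

    // Simplified chroma key: the more a pixel's green exceeds its other
    // channels, the more transparent it becomes. Real keyers (Reality's
    // included) add spill suppression, edge softness, and garbage mattes.
    struct RGBA { float r, g, b, a; };

    RGBA keyGreen(RGBA px, float threshold = 0.1f, float softness = 0.2f)
    {
        const float greenness = px.g - std::max(px.r, px.b);
        // 1 = fully opaque (talent), 0 = fully transparent (green wall).
        px.a = 1.0f - std::clamp((greenness - threshold) / softness, 0.0f, 1.0f);
        // Crude spill suppression: pull excess green out of keyed pixels.
        px.g = std::min(px.g, std::max(px.r, px.b) + greenness * px.a);
        return px;
    }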
Tracking is handled by Mo-Sys with their “StarTracker”: an infrared camera that mounts to a broadcast camera and looks upwards. The ceiling of the greenbox is scattered with little retro-reflective stickers. The StarTracker shines infrared light up so that the stickers show up for its camera; it recognizes the patch of “stars” it sees and works out where the camera is from that. This pose, together with zoom and focus data from the lens, is sent into our engines.
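Mo-Sys’s actual wire protocol isn’t reproduced here, but conceptually each tracking packet carries a camera pose plus lens data, and applying it to the virtual camera looks something like this (hypothetical names, placeholder zoom curve):

    #include <cstdint>

    // Hypothetical shape of what arrives per frame: a pose plus lens data,
    // timestamped to the video frame it belongs to.
    struct TrackingPacket
    {
        uint32_t frameNumber;     // genlocked to the video feed
        float    position[3];     // camera position in studio space (meters)
        float    pan, tilt, roll; // orientation (degrees)
        float    zoom;            // lens zoom  -> drives virtual field of view
        float    focus;           // lens focus -> can drive virtual depth of field
    };

    struct VirtualCamera { float position[3]; float pan, tilt, roll; float fovDegrees; };

    // Placeholder curve; real systems use a per-lens calibration table.
    float zoomToFov(float zoom) { return 60.0f - zoom * 40.0f; }

    // Move the virtual camera to exactly where the physical one is.
    void applyTracking(VirtualCamera& cam, const TrackingPacket& p)
    {
        for (int i = 0; i < 3; ++i) cam.position[i] = p.position[i];
        cam.pan  = p.pan;  cam.tilt = p.tilt;  cam.roll = p.roll;
        cam.fovDegrees = zoomToFov(p.zoom);
    }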


This is the full setup. It looks rather involved, but during a live broadcast it’s surprisingly hands-off. Populating the ceiling with the reflector stickers is a manual process, best enjoyed with some colleagues and lightsabres.
TV vs Games
Framerate is incredibly important for live AR productions. The show runs at 50fps, so for an hour-long show you’re producing 180,000 frames, each with a hard 20 ms budget, and none of them can drop! You have to take full ownership of the project’s performance; winging it is not an option. Of course, the systems we run on are very powerful, but the visual fidelity required is also very high.
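To make that budget concrete, a trivial standalone watchdog (not production code) shows what “every frame, 20 ms, for a whole hour” means:

    #include <chrono>
    #include <cstdio>

    // At 50fps the whole frame -- game thread, render thread, compositing --
    // must finish inside 20 ms, every time, for the full length of the show.
    int main()
    {
        using clock = std::chrono::steady_clock;
        const double budgetMs = 1000.0 / 50.0; // 20 ms

        auto last = clock::now();
        for (long frame = 0; frame < 50L * 60L * 60L; ++frame) // one hour: 180,000 frames
        {
            // ... render, key, composite ...
            auto now = clock::now();
            double ms = std::chrono::duration<double, std::milli>(now - last).count();
            if (ms > budgetMs)
                std::printf("frame %ld dropped: %.2f ms\n", frame, ms);
            last = now;
        }
    }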
The same goes for asset quality: we often misalign, smudge up, or otherwise add imperfections to our virtual sets so they appear more life-like.
An unexpected source of trouble in AR productions is field of view. Cameras tend to get up real close, and your assets need to hold up under the trickiest of angles.
The AR part of the show was a critical one, but the full production is much bigger. It’s always a great experience to install the project on location. The significance of your contribution is quickly overshadowed as the production takes form: the greenbox is built, the cameras are set up, the set is lit, the wiring is done. Then it happens again when the crew shows up: the director, presenters, camera operators, journalists and editors, graphics operators, shading operators, image technicians, and many more. It’s awesome when it all comes together like that, forming a chain where your contribution is one of many vital links.
LINKS
Zero Density – Reality, a customized version of Unreal Engine for broadcasting
Mo-Sys – camera tracking