Lost in Time: The Future of Immersive Media


Lost in Time is a mash-up of live action filmed on a green screen with audience participation and real-time graphics, spanning TV, gaming, mobile, and e-commerce.

"We are pioneering a new category of digital entertainment called interactive mixed reality (IMR)," explains Bård Anders Kasin, co-founder of The Future Group (TFG), the tech masterminds and co-producer of the show with FremantleMedia. "The real and virtual worlds merge to produce a new environment where physical and digital co-exist."

"All the big TV evolutions have come about from technology," says Dug James, SVP of development production and part of the global entertainment development team for FremantleMedia. "The advent of mini-DV and portable camcorders enabled the first reality shows. Nonlinear editing and logging of rushes enabled on location rapid turnaround formats like Big Brother and text voting pushed interactivity with the Idol and Got Talent formats. Fremantle are always looking for those forward-looking tech changes that deliver new ways of entertainment or storytelling."

Nolan Bushnell, the founder of games developer Atari Corp. and a consultant to the project, claims that the fusion of gaming with TV can "bring a standard construct for new kinds of entertainment."


Lost in Time host in the Jurassic period.

Lost in Time follows three contestants as they are transported into different eras, including the Roaring Twenties, the Wild West, the Space Age, the Ice Age, the Medieval Age, and the Jurassic period, where they compete in a series of challenges against the clock with the aim of winning a jackpot prize. Viewers can also participate in the same virtual world via a mobile app, playing against the show's contestants as well as other players from across the country in real time.

While the TV show can be enjoyed without playing the second-screen game, the format targets family audiences for broadcasters concerned about shedding core viewers to OTT services and mobile.

TFG has spent two and a half years and amassed $42 million in funding to develop the core technology, which is based on Epic Games' Unreal Engine.

The nut it has managed to crack is synchronising computer animation with live-action photography so that people can interact with virtual objects, or physical objects can interact with the virtual world, in real time.

It's a breakthrough that puts the company three years ahead of anyone else, according to co-founder Jens Petter Høili. "Various companies might have developed different aspects of the technology, but no one has put it all together in the way we have," he says.

The technique is becoming increasingly popular in visual effects filmmaking on films like Avatar and The Jungle Book, where a director can direct actors against a green screen while viewing virtual backgrounds in real time. TFG takes this a stage further: its system is fully live-capable. The virtual worlds are created in advance, then rendered live and mixed with live-action photography.

"A games engine wants to render in as few milliseconds as possible whereas broadcast cameras records at anywhere from 25 to 50 frames a second," explains Bård Anders. "To make this work for broadcast we had to work a way of getting frame rates from virtual and physical cameras to precisely match, otherwise this would not work."

Working with Canadian broadcast kit vendor and virtual set specialist Ross Video, TFG devised a means of binding the Unreal game engine with linear timecode.

To do this the team were granted access to the Unreal source code and reengineered it so that the rendered virtual image is genlocked with studio footage captured with SMPTE timecode.
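
In broad terms, the effect is that every rendered frame and every camera frame carries a timecode stamp, and the two are only composited when the stamps agree. The sketch below illustrates that pairing in isolation; the struct, frame rate, and function names are assumptions rather than TFG's or Ross' real interfaces.

```cpp
// Illustrative sketch only: pairing rendered frames with camera frames by
// SMPTE timecode. The struct, frame rate, and function are assumptions,
// not TFG's or Ross' actual interfaces.
#include <cstdint>
#include <cstdio>

struct Timecode {
    int hours, minutes, seconds, frames;  // HH:MM:SS:FF
};

// Convert a timecode to an absolute frame number at a given frame rate.
int64_t toFrameNumber(const Timecode& tc, int fps) {
    return ((static_cast<int64_t>(tc.hours) * 60 + tc.minutes) * 60 + tc.seconds)
               * fps + tc.frames;
}

int main() {
    const int fps = 25;                        // assumed studio frame rate
    Timecode cameraFrame   {10, 30, 12, 4};    // stamped by the studio camera
    Timecode renderedFrame {10, 30, 12, 4};    // stamped on the CG frame

    // Only when the two stamps resolve to the same frame number is the CG
    // frame keyed over the camera frame; otherwise one side is buffered.
    if (toFrameNumber(cameraFrame, fps) == toFrameNumber(renderedFrame, fps)) {
        std::puts("frames aligned: composite");
    } else {
        std::puts("frames misaligned: buffer and wait");
    }
}
```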

To achieve a pinpoint-accurate chroma key, a set of HD system cameras has been customised to output 4:4:4 RGB rather than the 4:2:2 YCbCr subsampled images of conventional broadcast.
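
The point of full-resolution colour is that the matte can be computed per pixel from unsubsampled RGB. The toy keyer below is purely illustrative, with invented thresholds that bear no relation to the production setup.

```cpp
// Toy chroma-key sketch to show why full 4:4:4 RGB helps: the matte is
// computed per pixel from unsubsampled colour. Threshold values here are
// invented for illustration, not production keyer settings.
#include <algorithm>
#include <cstdint>
#include <cstdio>

struct RGB { uint8_t r, g, b; };

// Returns matte alpha in [0,1]: 0 = pure green screen, 1 = foreground.
float keyAlpha(const RGB& p) {
    // How strongly green dominates the red and blue channels.
    float dominance = static_cast<float>(p.g) - std::max(p.r, p.b);
    // Map dominance onto a soft ramp between two invented thresholds.
    float alpha = 1.0f - (dominance - 16.0f) / (64.0f - 16.0f);
    return std::clamp(alpha, 0.0f, 1.0f);
}

int main() {
    RGB screenPixel {30, 200, 40};    // green-screen pixel: alpha near 0
    RGB skinPixel   {200, 160, 140};  // foreground pixel:   alpha near 1
    std::printf("screen alpha %.2f, foreground alpha %.2f\n",
                keyAlpha(screenPixel), keyAlpha(skinPixel));
}
```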

Contestants and physical objects are tracked by 150 IR sensors positioned behind the green screen. This arrangement also enables motion capture in real time. Demos of this have included contestants mocked up as storm troopers.

"In a movie you'd put markers on an actor and remove them in post," says Bård Anders. "We don't have that luxury so we needed a whole new way of linking the IR signals with the camera tracking."

Even the speed and racking of the robotic cameras have been tinkered with. Such systems are typically designed for slow-moving tracking shots and gentle zooms in newsroom virtual sets, not for filming people running or jumping around.

The cameras are loaded with Ross' UX VCC software, which provides a bridge between the tracking output of the robotic and manual camera systems and the Unreal Engine.

Accommodation also had to be made for any change in physical depth of field from focusing or zooming, which naturally alters the picture's bokeh (the visual quality of the out-of-focus areas of a photographic image). To do that, profiles of each individual lens are fed to the UX VCC, which replicates the distortion inside the virtual camera model in real time.
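
The underlying idea is a per-lens lookup: measured zoom and focus positions map to field-of-view and distortion values that the virtual camera copies every frame. The sketch below is a rough illustration with invented profile numbers, not the UX VCC's actual data model.

```cpp
// Rough illustration of lens profiling: measured zoom positions map to
// field-of-view and distortion values that the virtual camera copies each
// frame. The profile numbers and all names are invented for this sketch.
#include <cstdio>
#include <map>

struct LensSample {
    double fieldOfViewDeg;  // effective field of view at this zoom setting
    double distortionK1;    // simple radial distortion coefficient
};

// Pick the profiled sample at or just above the given normalised zoom
// position (a real system would interpolate between calibration points).
LensSample lookup(const std::map<double, LensSample>& profile, double zoom) {
    auto it = profile.lower_bound(zoom);
    if (it == profile.end()) return std::prev(it)->second;
    return it->second;
}

int main() {
    // Hypothetical profile measured for one physical lens.
    std::map<double, LensSample> profile {
        {0.0, {62.0, -0.02}},   // wide end
        {0.5, {34.0, -0.01}},
        {1.0, {12.0,  0.00}},   // long end
    };

    double zoomFromEncoder = 0.45;              // reported by the lens encoder
    LensSample s = lookup(profile, zoomFromEncoder);

    // These values would drive the virtual camera so CG and live action
    // distort identically.
    std::printf("apply FOV %.1f deg, k1 %.3f to the virtual camera\n",
                s.fieldOfViewDeg, s.distortionK1);
}
```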

"If a physical prop in the studio and a virtual prop are not aligned even by a fractional amount then the whole chain pulls apart," says Bård Anders. "The background optics of each lens which distort when you change focus need to be exactly matched in the games engine."

Production takes place in a 500-square-metre (5,381-square-foot) studio on the outskirts of Oslo. The setup includes a Technocrane, an automated ceiling camera, several SolidTrack markerless tracking systems, and Steadicam units. A military-grade simulation platform is used for flying and driving game elements.

The idea is that broadcasters could either use this as a hub and fly in to shoot their version of the show or establish their own green screen base equipped with a package of TFG gear. Further production hubs in the U.S. and Asia are planned.

TFG will offer a package of pre-built virtual environments as well as a library of 3D assets for content creators to build their own worlds.

This allows a broadcaster to tailor the show to its market: a Chinese version of Lost in Time might include a future Shanghai or an ancient Han Dynasty world, in contrast to a version produced for Argentina, for example.

The entire technical setup will be sold to broadcasters along with a licence to produce the format. Crucially, that required the system to operate within a standard production environment.

"We could produce over IP but Fremantle needed this to scale which means it has to be able to plug into studios throughout world," says Bård Anders. "It is also important for broadcasters to use this without needing to train people to a large extent when they operate it."

Familiar broadcast switchers and control surfaces, such as the Ross Carbonite Black and the Ross UX VCC, are integrated. Directors will have to familiarise themselves with the ability to select from a virtually infinite number of camera angles inside Unreal with which to replay highlights of a game.

As part of the current production, contestants will be recorded in 3D using photogrammetry so that animated avatars bearing their facial likeness can be inserted at certain points in the game's storyline.
