An Object Lesson in Personalized Streaming Video Experiences


“One of the substantial benefits of working this way would be to allow us to author experiences once, for all devices, and deliver the composition session data to all platforms, allowing the devices themselves to choose which raw assets they need to create the experience for themselves,” he says. Examples include a low bitrate version for mobile, a high-resolution version for desktop, and a 360° version for VR headsets.

In theory, this would allow the production team to serve potentially hundreds of different device types, regardless of connection or hardware capability, without the laborious work of rendering a separate version for each.
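
To make the idea concrete, here is a minimal TypeScript sketch of how such composition session data might work. The manifest format, field names, and selection rule are hypothetical illustrations, not the BBC's actual schema.

```typescript
// Hypothetical composition session data: one authored experience, with
// several renditions of each media object so every device can pick the
// raw assets that suit it.
interface Rendition {
  kind: "low-bitrate" | "high-resolution" | "360";
  url: string;
  bitrateKbps: number;
}

interface MediaObject {
  id: string;
  renditions: Rendition[];
}

interface CompositionSession {
  title: string;
  objects: MediaObject[];
}

// Each device resolves the same session data into the assets it needs.
function selectRendition(
  obj: MediaObject,
  device: "mobile" | "desktop" | "vr"
): Rendition {
  const preferred = {
    mobile: "low-bitrate",
    desktop: "high-resolution",
    vr: "360",
  } as const;
  return obj.renditions.find((r) => r.kind === preferred[device]) ?? obj.renditions[0];
}
```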

The BBC is adapting its hardware for object-based production, called IP Studio, to run over IP. From a production point of view, any piece of equipment, from a camera to a vision mixer or an archive, can be treated as an object. “IP Studio orchestrates the network so that real-time collections of objects work as a media production environment,” says Page. So, in the BBC’s schema, Optic will output UCMP, which sits on top of IP Studio.

OBB Goes Commercial

As a publicly funded body, the BBC is driven to find new ways of making media accessible to its licence fee-paying viewers. Larger onscreen graphics and sign-language presenters in place of regular presenters are two examples of OBB intended to improve accessibility for people with impairments.

The BBC is also part of 2-Immerse, a European Commission-funded project with Cisco, BT, German broadcast research institute IRT, ChyronHego, and others. It is developing prototype multiscreen experiences that merge broadcast and broadband content with the benefits of social media. To deliver the prototypes, 2-Immerse is building a platform based on the European middleware standard HbbTV 2.0.

OBB is likely to be commercialised first, though, in second-screen experiences. “The process of streaming what’s on the living room TV is broken,” argues Daragh Ward, CTO of Axonista. “Audiences expect to interact with it.”

The Dublin-based developer offers a content management system and a series of software templates that, it says, make it easier for producers to deploy OBB workflows rather than building them from scratch. Initially, this is based around extracting graphics from the live signal.

Axonista’s solution has been built into apps for the shopping channel QVC, where the “buy now” TV button becomes a touchscreen option on a smartphone, and The QYOU, an online curator of video clips that uses the technology to add interactivity to data about the content it publishes.

The idea could attract producers of other genres. Producers of live music shows might want to overlay interactive information about performances on the second screen. Sports fans might want to select different leaderboards or heat maps, or track positions over the live pictures. BT Sport has trialled this at the MotoGP motorcycle racing series and plans further trials next year.

Another idea is to make the scrolling ticker of news or finance channels interactive. “Instead of waiting for a headline to scroll around and read it again, you can click and jump straight to it,” says Ward. Since news is essentially a playlist of items, video content could also be rendered on demand from the news menu.
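
As a rough illustration of the idea (not Axonista's implementation), a bulletin modelled as a playlist of story objects lets a ticker headline act as a link into the programme. The data shapes below are hypothetical.

```typescript
// Hypothetical model: the bulletin is a playlist of story objects rather
// than one baked-in video, so a tapped ticker headline can jump straight
// to its story instead of waiting for it to scroll around again.
interface NewsItem {
  headline: string;
  videoUrl: string;
}

function jumpToStory(
  playlist: NewsItem[],
  headline: string,
  play: (url: string) => void
): void {
  const item = playlist.find((i) => i.headline === headline);
  if (item) {
    play(item.videoUrl); // render the selected item on demand
  }
}
```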

This type of application still leaves the lion’s share of content “baked in,” but it’s a taste of OBB’s potential. “All TV will be like this in future,” says Ward. “As TV sets gain gesture capability and force feedback control, it allows new types of interactivity to be brought into the living room.”

The audio element of OBB is more advanced. Here, each sound is treated as an object that can be added, removed, or pushed to the fore or background, whether for interactivity, to manage bandwidth or processing capacity, or for playback on lower-fidelity devices.
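
As a generic sketch of the concept, using the standard Web Audio API rather than any broadcaster's system, each sound source can be given its own gain node so it can be brought forward, pushed back, or dropped entirely.

```typescript
// Each sound is an object with its own gain, so the mix can be changed
// per viewer: push commentary forward, pull crowd noise back, or drop
// an object entirely on low-fidelity or bandwidth-constrained devices.
const ctx = new AudioContext();

interface AudioObject {
  label: string; // e.g. "commentary", "crowd", "referee-mic"
  gain: GainNode;
}

function createAudioObject(label: string, url: string): AudioObject {
  const element = new Audio(url);
  const source = ctx.createMediaElementSource(element);
  const gain = ctx.createGain();
  source.connect(gain).connect(ctx.destination);
  element.play();
  return { label, gain };
}

// Push an object to the fore or background by adjusting its gain.
function setProminence(obj: AudioObject, level: "fore" | "background" | "off"): void {
  obj.gain.gain.value = level === "fore" ? 1.0 : level === "background" ? 0.3 : 0.0;
}
```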


Dolby’s Atmos object-based audio (a version of its cinema system) is likely to be introduced to consumers as part of a pay TV operator’s 4K/UHD package. Both BT Sport and Sky, the broadcasters duelling it out with 4K live services in the U.K., have commissioned their mobile facility providers to build in Atmos recording gear. Sources at these facility providers suggest that a switch-on could happen by this time next year.

Initially, a Dolby Atmos production would allow additional user-selectable commentary from a neutral or team/fan perspective, different languages, and a referee’s mic. It would also add a more “at the stadium” feel to live events with atmospheres from the PA system and crowd.

BT’s research teams are also exploring the notion of a responsive TV UI for red-button interaction on the big screen, targeting 2020 for launch.

“Today we tend to send out something optimised for quite a small screen size, and if you have a larger screen it is then scaled up,” Brendan Hole, TV and content architect at BT, told the IBC conference.

“We are asking what happens if the broadcast stream is broken into objects so that the preferences of the user can be taken into account. You can add or remove stats in a sports broadcast, for example, or have viewer selection of specific feeds. It could automatically take account of the size and type of screen, or it could take account of the fact I have a device in my hand, so elements like stats could be delivered to mobile instead of on the main screen.”
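
A hypothetical sketch of the kind of rule Hole describes: the broadcast is split into objects, and a simple placement function decides where each one is rendered given the screens available and the viewer's preferences. All names and thresholds below are illustrative assumptions.

```typescript
// Decide where each broadcast object is rendered for a given viewing context.
interface BroadcastObject {
  id: string;          // e.g. "main-feed", "stats-overlay", "alt-camera"
  essential: boolean;  // the main picture is essential; stats are optional
}

interface ViewingContext {
  mainScreenWidthPx: number;
  hasCompanionDevice: boolean; // e.g. a phone in the viewer's hand
  wantsStats: boolean;
}

type Placement = "main-screen" | "companion" | "omit";

function placeObject(obj: BroadcastObject, viewing: ViewingContext): Placement {
  if (obj.essential) return "main-screen";
  if (!viewing.wantsStats) return "omit";
  // A companion device takes optional overlays off the main picture;
  // otherwise only large screens get the extra elements.
  if (viewing.hasCompanionDevice) return "companion";
  return viewing.mainScreenWidthPx >= 1920 ? "main-screen" : "omit";
}
```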

Others investigating OBB include Eko Studio, formerly known as Interlude’s Treehouse. It offers an online editing suite that lets users transform linear video into interactive video in which the viewer chooses the direction the story takes.

New York-based creative developer Brian Chirls has developed Seriously.js, an open source JavaScript library for complex video effects and compositing in a web browser. Unlike traditional desktop tools, Seriously.js aims to render video in real time, combining the interactivity of the web with the aesthetic power of cinema. Though it currently requires authors to write code, it is targeted at artists with beginner-level JavaScript skills, so the main limitation is creative ability and knowledge of video rather than coding skill.
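
The basic authoring pattern, based on the usage shown in the Seriously.js documentation, wires a source, an effect, and a target into a graph. The element IDs and the choice of the bundled "vignette" effect here are placeholders.

```typescript
declare const Seriously: any; // the library is loaded via a <script> tag

const seriously = new Seriously();
const source = seriously.source("#sourcevideo");  // a <video> element
const target = seriously.target("#targetcanvas"); // a <canvas> element
const effect = seriously.effect("vignette");      // one of the bundled effects

effect.source = source; // the video feeds the effect...
target.source = effect; // ...and the effect feeds the canvas
seriously.go();         // render frames in real time in the browser
```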

MIT laid the groundwork for object-based media a decade ago. It has since moved on to holographic video and display, although some of the same principles apply.

“We are exploring holographic video as a medium for interactive telepresence,” says Bove. “Holosuite is an object-based system where we used a range-finding camera like Microsoft Kinect as a webcam to figure out which pixels represent a person and which represent the room, with the ability to live stream people separately from the backgrounds, and with full motion parallax and stereoscopic rendering.”
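
A generic sketch of the segmentation step Bove describes (not Holosuite's actual code): per-pixel depth from a range-finding camera is thresholded to decide which pixels belong to the person and which to the room, so the two can be streamed as separate objects. The threshold and data layout are assumptions.

```typescript
// Build a person/room mask from a Kinect-style depth map (values in millimetres).
function segmentPerson(
  depthMm: Uint16Array,      // one depth reading per pixel
  personMaxDepthMm: number   // assumed cut-off between person and background
): Uint8Array {
  const mask = new Uint8Array(depthMm.length);
  for (let i = 0; i < depthMm.length; i++) {
    const d = depthMm[i];
    // A reading of 0 usually means "no data"; treat it as background.
    mask[i] = d > 0 && d < personMaxDepthMm ? 1 : 0; // 1 = person, 0 = room
  }
  return mask;
}
```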

For content creators, object-based techniques offer new creative editorial opportunities. The advantage of shooting in an object-based way is that media becomes easily reusable; it can be remixed to tell new stories or to build future responsive experiences without any re-engineering.

“Either we need to produce multiple different versions of the same content, which is highly expensive, or we capture an object once and work out how to render it,” says Page. “Ultimately, we need to change the production methodology. OBB as an ecosystem has barely begun.”

This article was published in the Winter 2016 European edition of Streaming Media magazine as "An Object Lesson."
