Bringing broadcast quality to OTT streaming
It became obvious over the summer with the FIFA World Cup. There were cases of people watching a game who heard their neighbours cheer a goal – before the action had even started on their screens.
How’s that for a spoiler effect?
So it seems that, now that image quality is generally good, the new El Dorado in the relentless quest to offer the best quality of experience may well be low latency: reducing the time between when an action happens and when viewers actually see it on their screens.
But how do you do that?
There is no quick and easy solution for achieving low latency. That’s because the video-delivery chain includes a number of processing stages, and each stage adds latency. You work hard to reduce latency in one stage, only to find that the next one turns out to be the bottleneck.
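To make the "each stage adds latency" point concrete, here is a minimal sketch of an end-to-end latency budget. The stage names and the numbers are hypothetical examples for illustration, not measurements from any real deployment:

```python
# Illustrative glass-to-glass latency budget for an OTT delivery chain.
# All stage names and values below are hypothetical examples.
STAGE_LATENCY_S = {
    "capture_and_production": 0.5,
    "encoding": 1.5,
    "packaging_and_origin": 2.0,
    "cdn_propagation": 0.5,
    "player_buffering": 6.0,
}

def total_latency(stages: dict) -> float:
    """End-to-end latency is the sum of every stage's contribution."""
    return sum(stages.values())

def bottleneck(stages: dict) -> str:
    """After optimising one stage, the largest remaining stage
    becomes the new bottleneck."""
    return max(stages, key=stages.get)
```

With these example numbers, the player buffer dominates: shaving a second off encoding helps, but the total barely moves until buffering is addressed too, which is why the chain has to be attacked end to end.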
So what to do?
While it’s not easy to reduce latency, it is indeed possible – and CMAF is part of the answer. But you need to approach this holistically and work at reducing latency in each part of the chain.
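A rough arithmetic sketch shows why CMAF helps. In classic segment-based delivery, a player typically buffers several complete segments before playback; with chunked CMAF, each segment is split into small chunks that can be pushed as soon as they are encoded, so buffering is measured in chunk durations instead. The durations and buffer counts below are illustrative assumptions, not fixed by any specification:

```python
# Back-of-the-envelope comparison of buffering latency.
# Segment/chunk durations and buffer depths are illustrative assumptions.

def segment_based_latency(segment_s: float, buffered_segments: int = 3) -> float:
    """Classic delivery: the player waits for whole segments,
    and commonly buffers a few of them before starting."""
    return segment_s * buffered_segments

def chunked_cmaf_latency(chunk_s: float, buffered_chunks: int = 3) -> float:
    """Chunked CMAF: chunks are transferred as soon as they are
    encoded, so the buffer is a few chunk durations deep instead."""
    return chunk_s * buffered_chunks
```

With 6-second segments the classic approach lands around 18 seconds of buffering latency, while half-second CMAF chunks bring the same depth of buffer down to about 1.5 seconds, which is how sub-5-second targets become reachable once the rest of the chain is tightened as well.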
Last year, Ben Schwarz, the owner of CTO Innovation, and Jérôme Blanc, EVP Compression Products at Anevia, co-authored an eBook about why low latency is important. Streaming live events was one big driver.
Now, part 2 of the eBook – the quick tech tour that explains how to achieve it – is out. It explains how to reduce latency to less than 5 seconds. It delves into new technology trends in OTT such as virtualisation, edge processing, and MEC, and the impact they have on latency and delay. And it is available for download HERE.
This article is Sponsored Content