
Why invest in lowering sport streaming latency in 2026


The knock against streaming for live sport has always been higher latency than broadcast, and the agony of hearing your cable-watching neighbor cheer a goal before it happens on your screen. One could argue that live-streamed sport isn’t really live in this scenario. But given the costs and challenges of lowering latency, is it really worth the investment, or should sport streamers be prioritising other engineering challenges or elements of the fan experience? MTech Sport’s Matt Stagg, DAZN’s James Pearce, BT Group’s Ian Parr, and Tata Communications’ Corey Smith discuss the current business case for lowering sport streaming latency at Streaming Media Connect 2026.

The specific conditions for worrying about latency 

Stagg asks, “What’s the business case and how do you justify an investment to lower the latency of your streams?”

DAZN has kept an eye on this issue, Pearce says. Latency isn’t bad in every scenario, despite the rise of social media and its instant notifications, he notes. But one use case proved telling: when rights were split between two broadcasters, one of them a traditional satellite operator, the OTT stream was noticeably slower. “So we did see that there was a business value in trying to reduce that latency so you could be comparable,” Pearce shares.

The challenges around low latency stem from personalisation and ad insertion, Pearce continues, “and some of the protocols struggle, I think, when you start looking at some of the low-latency protocols. So it’s definitely not an easy thing to fix, but there are certain use cases where, I think, your content is on the platform compared to another platform, then it really stands out and makes sense to do it.”

Stagg mentions that it’s not truly live sport when one viewer hears their neighbor shout at something they haven’t seen yet. He laughs that even his fellow pubgoers sometimes annoy the viewers next door.

How latency protocols affect the network 

Stagg invites Parr to contribute. Echoing Pearce, Parr adds that another annoyance is a tweet spoiling a goal that hasn’t been on TV yet. He explains, “I can understand the attraction of wanting to not be 45-plus seconds behind the live edge of a sport stream for the reasons that we just outlined. And so I can understand the customer experience benefits of that.” But network operators need “to understand how the low-latency protocols affect flow and network traffic engineering options available on the network as well, particularly when things are starting to go wrong.” 
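To see where Parr’s 45-plus seconds can come from, here is a back-of-the-envelope latency budget for conventional (non-low-latency) HLS delivery. All of the figures below are assumed, illustrative values for a typical configuration, not measurements from any panellist’s platform:

```python
# Illustrative glass-to-glass latency budget for conventional HLS.
# Every figure here is an assumption chosen to show the shape of the
# problem, not data from DAZN, BT, or Tata Communications.

SEGMENT_DURATION_S = 6       # common HLS segment length
PLAYER_BUFFER_SEGMENTS = 3   # players typically hold ~3 segments behind live
ENCODE_PACKAGE_S = 6         # a full segment must be encoded and packaged
                             # before it can even be published
CDN_ORIGIN_S = 2             # origin fetch plus CDN propagation (assumed)

player_buffer_s = SEGMENT_DURATION_S * PLAYER_BUFFER_SEGMENTS
total_latency_s = ENCODE_PACKAGE_S + CDN_ORIGIN_S + player_buffer_s

print(f"Player buffer: {player_buffer_s} s")
print(f"Approximate glass-to-glass latency: {total_latency_s} s")
```

With longer segments (say 10 s) and deeper player buffers, the same arithmetic lands well past 40 s, which is how streams end up the 45-plus seconds behind the live edge that Parr describes.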

Parr has seen evidence that UDP-based distribution protocols are less network-efficient and place greater load on the network than standard TCP-based protocols such as HLS, which are easy to manage and monitor, he says. “So there are some emerging concerns about the increasing load imposed by lower-latency streaming, but we’ve not seen enough of what I would call ultra-low-latency streaming … yet to have really informed data,” he concludes.

Figuring Out the Nuances

Smith chimes in, “It’s easier to control your latency when you’re packaging the ads for a one-to-many type of audience. When you start getting into client-side ad insertion, that’s where things get a little bit wonky too, because now all of a sudden the ad load, there’s an intrinsic delay sometimes in those apps loading. And then the return to live broadcast, sometimes you’re coming back into the opening action of whatever’s coming next.” On the client side, he says, SSL overhead, encryption, and DRM decode are factors. 

“So there’s any number of different things that don’t exist in the traditional broadcast world where my direct TV downlink is a private point-to-point connection,” he continues. “My traditional broadband television provider is a private network as well, where the internet is this open kind of playground. So I think that there’s a lot of nuances to IP streaming that we still have to figure out.” 

Smith wonders, “What does low-latency HLS really mean when it comes to the CDN side of it and being able to support it at scale? Do we all just go to SRT stream distribution, point to point, to a customer endpoint and forget the HTTPS or HTTP delivery?” He reiterates that with UDP versus TCP/IP there are trade-offs, and more R&D needs to be done.
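For context on what low-latency HLS changes at the CDN edge: it publishes sub-second “parts” of each segment ahead of the full segment and lets players block on playlist reloads rather than polling. A minimal media-playlist sketch (segment names and tag values are illustrative, not from any panellist’s deployment):

```
#EXTM3U
#EXT-X-VERSION:9
#EXT-X-TARGETDURATION:4
#EXT-X-SERVER-CONTROL:CAN-BLOCK-RELOAD=YES,PART-HOLD-BACK=1.0
#EXT-X-PART-INF:PART-TARGET=0.333
#EXT-X-MEDIA-SEQUENCE:100
#EXTINF:4.0,
segment100.mp4
#EXT-X-PART:DURATION=0.333,URI="segment101.part0.mp4",INDEPENDENT=YES
#EXT-X-PART:DURATION=0.333,URI="segment101.part1.mp4"
#EXT-X-PRELOAD-HINT:TYPE=PART,URI="segment101.part2.mp4"
```

Serving this at scale is exactly the question Smith raises: every part and blocked playlist reload is an extra object and held-open request the CDN must absorb, which is where the trade-off against a point-to-point UDP protocol like SRT comes in.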

“Until we can actually get people from not screaming at my neighbor’s house when somebody makes a goal to me watching it 30 seconds later, I think that that’s going to be a continued problem,” Smith predicts. 

When the Problem Must Be Fixed 

Smith argues that allocating money to solving this problem isn’t always immediately necessary. “The reason I say that is because you only really need to have that real-time IP streaming factor if things are simulcast on a traditional delivery network and there’s gambling involved, right? When there’s actually transactional betting or something happening in the background, that’s now a non-competitive type of created situation where now all of a sudden there’s real stakes, if you will.” Some gamblers consider the stakes life or death, he notes; when their own money is involved is when they start caring about low latency.

Join us May 12–14, 2026 for more thought leadership, actionable insights, and lively debate at Streaming Media Connect 2026! Registration is open!
