How to Solve Interactive Streaming’s Latency Challenges


What are the biggest challenges to delivering successful large-scale interactive live streams, particularly achieving the requisite low latency, and what technology solutions apply?

Tim Siglin, Founding Executive Director of Help Me Stream Research Foundation and Contributing Editor of Streaming Media, talks with Oliver Lietz, CEO of nanocosmos, and Darcy Lorincz, CEO of Motoworlds, in this clip from Streaming Media Connect 2024, about real-world interactive streaming problems, how to address them, and how to deliver high-quality user experiences from traditional streaming to the metaverse.

Siglin begins by asking Lietz, “What are some of the challenges that interactivity throws into the mix that your company has seen, and how do you go towards solutions models for that?”

Lietz stresses the importance of maintaining a stable signal for interactive use cases, even in low-resolution or low-bitrate situations. “Even if the video drops, you still want to have audio, but even audio can be a very low resolution and a very low bit rate just to keep up the monetized video stream to be able to bet or bid on these events,” he says. “And that's a bit different from broadcast environments where you want the highest possible quality on a 4K screen.” He emphasizes the general need for technical flexibility. “You need to adjust for the right technologies,” he says. “Like WebRTC, but [they are] difficult to adjust on these different network situations. That's why there's adaptive bitrate, which is an established technology for broadcast, but which also works for ultra-low latency live streaming. At least in our case, we really turn the quality data down as [much as] possible but keep a live stream running with the lowest latency possible. If the network goes back up again, you might tune it up again to a high resolution. So that's key, that no matter what happens, if you are commuting, if you're on a bad network…you need to have a constant live stream, otherwise you would lose the connection. And probably also the monetization of this distributed livestream.”
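The strategy Lietz describes, degrading video aggressively so the stream (and at minimum the audio) never drops, then stepping quality back up when the network recovers, can be sketched as a simple downswitch-first rendition picker. This is a minimal illustration only; the renditions, the 0.8 headroom factor, and the bandwidth-measurement interface are assumptions, not nanocosmos' actual API or algorithm.

```python
# Downswitch-first ABR sketch for ultra-low-latency streaming: pick the best
# rendition the measured bandwidth can sustain, and fall back to audio-only
# rather than ever dropping the session. All values are illustrative.

# (bitrate_kbps, label) renditions, highest first; the last rung is audio-only.
RENDITIONS = [(2500, "720p"), (1200, "480p"), (600, "360p"),
              (250, "240p"), (0, "audio-only")]

def pick_rendition(measured_kbps: float, headroom: float = 0.8) -> str:
    """Choose the highest rendition that fits within the measured bandwidth,
    leaving `headroom` margin; worst case is audio-only, never disconnect."""
    budget = measured_kbps * headroom
    for bitrate, label in RENDITIONS:
        if bitrate <= budget:
            return label
    return RENDITIONS[-1][1]  # audio-only floor keeps the session alive

print(pick_rendition(3500))  # healthy network: full video quality
print(pick_rendition(50))    # collapsed network: audio survives
```

When the network improves, the same function naturally returns a higher rung on the next measurement, which is the "tune it up again" behavior Lietz mentions.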

Siglin says, “That's a hundred percent correct. You want to provide variability in your encoding ladder so that if you have someone who starts high and drops to low, [they can come] back up.” He asks Lorincz, “When you're talking about interactivity in the spatial or the metaverse, you're also having to tie it to things like avatars or worlds that don't exist. What extra challenges do you end up with in that as you've approached it?”
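The drop-then-recover pattern Siglin describes is usually implemented with hysteresis: switch down a ladder immediately when bandwidth falls, but require sustained headroom before climbing back up, so quality doesn't oscillate. A hedged sketch follows; the ladder rungs, 0.8 headroom factor, and three-sample stability window are assumed values for illustration, not any vendor's defaults.

```python
# Up-switch hysteresis across an encoding ladder: downswitch instantly,
# upswitch only after several consecutive good bandwidth samples.
LADDER = [400, 800, 1600, 3000]  # kbps rungs, lowest first (illustrative)

class LadderController:
    def __init__(self, window: int = 3):
        self.level = len(LADDER) - 1   # start at the top rung
        self.good_samples = 0
        self.window = window

    def on_bandwidth(self, kbps: float) -> int:
        """Return the ladder bitrate to use after this bandwidth sample."""
        # Drop immediately to a rung that fits within 80% of bandwidth.
        while self.level > 0 and LADDER[self.level] > kbps * 0.8:
            self.level -= 1
            self.good_samples = 0
        # Climb only after `window` consecutive samples with headroom.
        nxt = self.level + 1
        if nxt < len(LADDER) and LADDER[nxt] <= kbps * 0.8:
            self.good_samples += 1
            if self.good_samples >= self.window:
                self.level = nxt
                self.good_samples = 0
        else:
            self.good_samples = 0
        return LADDER[self.level]
```

A viewer who drops from 3000 kbps to 800 kbps recovers one rung at a time, each step gated by the stability window, rather than leaping straight back to the top.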

Lorincz discusses the challenges of creating real-time, interactive experiences in the metaverse, including the need for avatars to have realistic conversations and for scenes to unfold in real-time based on user preferences. “The one that I'm finding most challenging right now is how to talk to a language model and have it talk back to me like we're having a real conversation,” he says. “You see these agents and avatars now being pretty real-time and pretty human, but if you can't have a conversation, it immediately breaks down that experience, and now you're going, ‘Okay, this is just a bad experience with an avatar.’ So you now have to actually have the avatar thinking that it's feeding back in that language model, sitting in the network somewhere. And it's got to have a conversation in real-time like we're having right now.”

He presents a specific example of an emerging, highly advanced text-to-video AI model. “They just introduced a new thing called Sora, which is going to now create on-the-fly scenes, literally video development as you go,” he says. “So you walk through these metaverses, and it's unfolding in front of you based on what you want to see, and you're having a conversation at the same time. Those are new elements that nobody has ever thought of. Now, it's actually unfolding in real-time. How does anything understand what it's looking at in terms of the scene or the frame? So there's going to be a lot of new challenges in this world.”

He notes that while gaming has laid some groundwork for these challenges, the scale and complexity of the metaverse present many new problems to solve. “Fortnite, Minecraft, Roblox…they have to figure this stuff out…we have millions of people in servers. How do you get across those servers? And there's so much in the metaverse that's about problems we've solved in other industries that are at scale that we've never even thought about before. You never thought there were going to be 10,000 people in a server, all experiencing a concert or at a virtual event. And that's happening now all the time. And they're talking to each other and to bots and avatars. So it's a fun challenge.”
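The scale problem Lorincz raises, thousands of attendees who must land on some server instance without overloading any single one, is commonly handled by hashing users to shards with capacity-aware fallback. The sketch below is a toy illustration under stated assumptions; the shard cap, hashing scheme, and linear-probing fallback are choices made for this example, not how Fortnite, Roblox, or Motoworlds actually do it.

```python
# Toy shard assignment for a large virtual event: hash each user to a
# preferred server shard, then probe forward to the next shard with spare
# capacity. Cap and scheme are illustrative assumptions.
import hashlib

SHARD_CAP = 10_000  # max concurrent users per server instance (assumed)

def assign_shard(user_id: str, shard_loads: list) -> int:
    """Return the shard index for user_id, updating shard_loads in place."""
    n = len(shard_loads)
    # Stable hash so the same user prefers the same shard across sessions.
    start = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % n
    for i in range(n):
        shard = (start + i) % n
        if shard_loads[shard] < SHARD_CAP:
            shard_loads[shard] += 1
            return shard
    raise RuntimeError("event at capacity")
```

The harder problems he alludes to, letting users on different shards see and talk to each other, sit on top of placement like this and typically require cross-server state replication.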

See videos of the full program from Streaming Media Connect February 2024 here.

We'll be back in person for Streaming Media NYC May 20-22, 2024. More details here.

Related Articles

How Globo Handles Latency Costs for Live Sports Streaming at Scale

When it comes to implementing streaming tech for large-scale, high-stakes live sports, often decisions around managing latency are driven as much by cost concerns as network conditions, audience expectations, and the like. Globo Digital Products, Platform & Adtech Manager Jonas Ribeiro reveals the latency Globo delivers on typical sports streams at scale and what factors into those numbers in this discussion with Eyevinn Technology's Magnus Svensson at Streaming Media Connect 2024.

Synchronization vs. Latency - Which Matters More in Enterprise and Sports Streaming?

Chris Packard, Global Live Operations Lead at LinkedIn, discusses the role of interactivity in enterprise streaming, what the essential elements are of a successful user experience, and striking a realistic balance between ultra-low latency and synchronization, in this discussion with nanocosmos' Oliver Lietz and Help Me Stream's Tim Siglin from Streaming Media Connect 2024.

CDN77's Juraj Kacaba Talks Low-Latency Streaming and the Edge

CDN77's Juraj Kacaba sits down with Tim Siglin to discuss low-latency streaming and the edge in this interview from Streaming Media East 2023.

Vindral CEO Daniel Alinder Talks Latency, Sync, 8K, and Vindral

Tim Siglin of Help Me Stream Research Foundation sits down with Daniel Alinder of Vindral to discuss latency, sync, 8K, and Vindral in this Streaming Media East 2023 interview.
