Live from the Carrier Edge
As streaming increasingly becomes the preferred consumption model for some of the world’s most-watched content, expectations for high quality and a great user experience are intensifying. Live sports, concerts, breaking news, and red-carpet events now account for a substantial portion of the OTT traffic crossing service providers’ networks daily. This reality is magnified by newer, more bandwidth-intensive types of content now coming to the fore, including VR/AR and volumetric video. Service providers must now ask themselves: are they prepared to manage this massive network load?
The answer should be yes. The ultimate goal for every service provider is to build a superior delivery network that flawlessly delivers live content, regardless of spikes in traffic. The reality today, however, is often far removed from that endpoint. Even service providers equipped with considerable network capacity can struggle to ensure the uncompromised scalability that ultimately delivers a better quality of experience (QoE) to their customers. Attention is now turning to how edge delivery technologies can overcome the perennial challenges of live streaming and enable service providers to better manage the traffic passing through their networks.
Taking control of network traffic
Spikes in network traffic are unavoidable, particularly around large-scale global sporting events. Take Super Bowl LVII as an example: Nielsen reported an average of over 113 million fans watching the US broadcast. FOX said its average audience made it the second most-watched non-overtime Super Bowl on record and the second most-watched program in Fox Sports history. Furthermore, its streaming audience reached an all-time-high average of 7 million. Securing the necessary capacity to deal with this streaming demand can be daunting and often involves multiple CDN partners and a potential peak traffic of 80 Tbps or more. It's one of the reasons service providers worldwide are optimizing their networks by moving towards open caching, a technology championed by Qwilt and developed by the Streaming Video Technology Alliance (SVTA).
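To see how an audience of that size translates into the 80 Tbps ballpark, a back-of-envelope capacity estimate helps. The bitrate and headroom figures below are illustrative assumptions, not numbers from the article:

```python
# Back-of-envelope estimate of aggregate egress capacity for a live event.
# Assumptions (illustrative, not from the article): ~8 Mbps average video
# bitrate and ~40% headroom for ABR ladder switches, retries, and spikes.

def required_capacity_tbps(concurrent_viewers: int,
                           avg_bitrate_mbps: float = 8.0,
                           headroom: float = 0.4) -> float:
    """Aggregate egress needed, in terabits per second."""
    steady_state_tbps = concurrent_viewers * avg_bitrate_mbps / 1_000_000
    return steady_state_tbps * (1 + headroom)

# A ~7M-viewer streaming audience, under these assumptions:
print(round(required_capacity_tbps(7_000_000), 1))  # 78.4
```

Under these assumed parameters, 7 million concurrent streams land just under 80 Tbps, which is why securing that capacity typically requires multiple CDN partners.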
Open caching tackles the major bottlenecks in internet infrastructure caused by growing consumption of digital media and other online content. By placing caches deep within service provider networks, open caching solves the capacity challenge by enabling cable, telco, and mobile network operators to cache and deliver streaming media from locations close to consumers. This dramatically shortens the delivery path for live workflows, minimizes streaming latency, and lets newly standardized protocols be deployed faster than ever before.
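The core idea of serving from in-network caches can be sketched as a request router that steers clients to a cache inside their own ISP when one exists. This is a simplified illustration, not the SVTA request-routing specification; the prefixes and hostnames are made up:

```python
# Illustrative sketch (not the SVTA spec): route a client's stream request
# to an open caching node embedded in the client's ISP network, falling
# back to a mid-mile commercial CDN edge when no embedded cache serves
# that client's address range. All prefixes and hostnames are hypothetical.

from ipaddress import ip_address, ip_network

# Hypothetical map of ISP address prefixes to in-network open caching nodes.
OPEN_CACHE_NODES = {
    ip_network("203.0.113.0/24"): "oc-node1.example-isp.net",
    ip_network("198.51.100.0/24"): "oc-node2.example-isp.net",
}
FALLBACK_CDN = "edge.example-cdn.com"

def route(client_ip: str) -> str:
    """Return the host that should serve this client's stream."""
    addr = ip_address(client_ip)
    for prefix, node in OPEN_CACHE_NODES.items():
        if addr in prefix:
            return node  # serve from deep inside the ISP network
    return FALLBACK_CDN   # no embedded cache: use the commercial CDN

print(route("203.0.113.42"))  # oc-node1.example-isp.net
print(route("192.0.2.10"))    # edge.example-cdn.com
```

The design point this sketch captures is that the routing decision moves delivery downstream of peering and exchange points whenever an in-network cache is available, which is where the latency and throughput gains come from.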
What does it take to deliver a live streaming event?
Qwilt recently surveyed 300+ content publishers, presenting them with the task of running a live streaming event for an audience of 10 million viewers. When asked about their greatest concerns for this hypothetical event, the majority of respondents (52%) pointed to their commercial CDNs – either securing enough capacity for the event or overall CDN performance. The next most significant concerns were overall latency (17%) and measuring QoE (17%).
The survey's findings reiterated the stakes content publishers face in their CDN partnerships: multi-million-dollar content rights investments put at risk by customer churn due to poor streaming QoE. Latency is always a concern with live event streaming. For instance, SSIMWAVE, which measured video quality for more than half a dozen broadcast and OTT streams during this year’s Super Bowl, found that four of the five streaming services it measured lagged 20-40 seconds behind the fastest streaming service.
What’s possible from the edge?
Delivering content from the true telco edge, as close to the end user as possible, results in better QoE through lower latency, higher throughput, and reduced time to first frame. This yields a lower rebuffering rate and a higher average bitrate compared to traditional commercial CDNs. The key is focusing on delivering quality-as-a-service that reflects the unique needs and capabilities of the provider network.
Open caching consistently delivers exceptional QoE by distributing individual live streams deep within the service provider network and well downstream of potential congestion at peering and exchange points. Unlike commercial CDN nodes that are centrally located in the mid-mile, open caching nodes are deeply embedded in the service provider network at the closest possible location to the users – potentially just a few blocks away.
Open caching-based architectures create network capacity in a more efficient, edge-computing form, so providers can scale for less and serve more streaming content at higher quality from the edge. In short, open caching offers measurable benefits to the entire ecosystem – publishers, service providers, and consumers – and is quickly becoming the new standard for content delivery quality.
[Editor's note: This is a contributed article from Qwilt. Streaming Media accepts vendor bylines based solely on their value to our readers.]