
Live Sports Streamers Want Sub-10 Second Latency
M2A Media explains how it orchestrates live streaming worldwide for DAZN, at the Vortech.by conference; Amazon talks AI and the Royal Wedding

One of the biggest SVOD sports streamers in the world is attempting to tackle end-to-end latency, but finding it a challenge.

"Customers want to get down to 10 seconds latency," declared Marina Kalkanis, CEO of M2A Media. "Of course, the aim must be zero, but even 10 seconds is a challenge."

The London-based software developer has been working with aggressive live sports aggregator DAZN for two and a half years, in that time underpinning its expansion into Japan, Canada, Italy, the U.S., Germany, and, shortly, Brazil.

"This pace of change is only possible if you take advantage of the cloud," said Kalkanis.

She was speaking at Vortech.by, an inaugural conference for coders and software engineers hosted in Stockholm by Vidispine, the API-based cloud media asset management platform, with the support of AWS.

Kalkanis said that while the lag from camera to viewing on DAZN is 30 seconds at present, "we pick up the feed from playout so it's 5-10 seconds plus the twenty on our end. There are things we can do to shave more time off, but it is tricky."
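Kalkanis's numbers imply a simple latency budget: an upstream contribution from playout plus everything on M2A's side. A minimal sketch of that arithmetic; only the roughly 30-second total and the 5-10 second playout figure come from her remarks, and the per-stage split of the remaining ~20 seconds is an illustrative assumption:

```python
# Illustrative end-to-end latency budget for a live stream, in seconds.
# Only the ~30 s total and the 5-10 s playout figure come from the
# article; the per-stage split of M2A's ~20 s is an assumption.
BUDGET_S = {
    "playout_feed": 7.5,    # 5-10 s before M2A picks up the feed
    "encode_abr": 6.0,      # transcode into the ABR ladder (assumed)
    "package": 4.0,         # HLS/DASH segmenting + DRM (assumed)
    "cdn_and_player": 10.0, # CDN caching + player buffer (assumed)
}

def total_latency(budget):
    """Sum the per-stage contributions in seconds."""
    return sum(budget.values())

def over_target(budget, target_s=10.0):
    """Seconds that must be shaved off to hit the target."""
    return max(0.0, total_latency(budget) - target_s)
```

Against the 10-second target customers want, a budget like this leaves well over 15 seconds to find across the chain, which is why Kalkanis calls even 10 seconds a challenge.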

M2A's Live Streaming service orchestrates DAZN's entire cloud video workflow, from starting origin servers and transcoders and loading transcode profiles to presenting cached CDN endpoints.

"Our solution can orchestrate any number of concurrent channels and cope with multiple CDNs pulling from multiple regions simultaneously, which in turn serve millions of users," Kalkanis said.
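The orchestration steps Kalkanis describes (start origin servers and transcoders, load a transcode profile, present CDN endpoints) amount to an ordered sequence per channel, repeated across however many channels and CDNs are live. A hypothetical sketch; none of these class or method names come from M2A's actual service:

```python
# Hypothetical per-channel orchestration sketch. The class, method,
# profile, and CDN names are illustrative, not M2A's real API.
class LiveChannel:
    def __init__(self, name, profile, cdns):
        self.name, self.profile, self.cdns = name, profile, cdns
        self.steps = []  # records the order of operations performed

    def start(self):
        # Order matters: the origin and transcoder must be running
        # before CDN endpoints are handed out to pull from them.
        self.steps.append("start_origin")
        self.steps.append("start_transcoder")
        self.steps.append(f"load_profile:{self.profile}")
        for cdn in self.cdns:  # multiple CDNs can pull concurrently
            self.steps.append(f"present_endpoint:{cdn}")
        return self.steps

channel = LiveChannel("dazn-jp-1", "abr-1080p50", ["cdnA", "cdnB"])
```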

For DAZN, the live playout signals are captured at two data centres and unicast in RTP with forward error correction to AWS Elemental Live. From the encoder, the feeds are transcoded into the ABR ladder and sent to origin servers running Apache modules for packaging into HLS and DASH with DRM.
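The ABR ladder produced by the encoder is ultimately expressed to players as a master playlist listing each rung. A minimal sketch of an HLS master playlist for a hypothetical ladder; the rungs and bitrates are illustrative, not DAZN's actual profile:

```python
# Build an HLS master playlist from an ABR ladder.
# The rungs below are made up; DAZN's real profile is not public.
LADDER = [
    # (height in pixels, bitrate in bits/s)
    (1080, 6_000_000),
    (720, 3_500_000),
    (540, 2_000_000),
    (360, 800_000),
]

def master_playlist(ladder):
    """Emit one #EXT-X-STREAM-INF entry per ladder rung (16:9 assumed)."""
    lines = ["#EXTM3U", "#EXT-X-VERSION:3"]
    for height, bps in ladder:
        width = height * 16 // 9
        lines.append(
            f"#EXT-X-STREAM-INF:BANDWIDTH={bps},"
            f"RESOLUTION={width}x{height}"
        )
        lines.append(f"{height}p.m3u8")
    return "\n".join(lines)
```

The player picks a rung based on measured throughput, which is how one origin can serve everything from living-room TVs to mobile handsets.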

The live capture-to-VOD workflow is managed in one of two ways. The first is by taking the live stream out of Elemental and passing it to S3; then, when the event has finished, going back to S3 to find the files needed to create the VOD.

"The one drawback is that content is already encoded so you are relying on fragments (files) that already exist. We can only be as accurate as those fragments so there's more room for error."
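The "room for error" follows from the fact that a VOD clipped from pre-encoded segments can only start and end on existing fragment boundaries, so each cut point can be off by up to one segment duration. A quick illustration; the 6-second segment length is an assumption:

```python
import math

SEGMENT_S = 6.0  # assumed HLS/DASH segment duration

def snap_to_fragments(start_s, end_s, seg=SEGMENT_S):
    """Clip points must land on existing fragment boundaries:
    start snaps back, end snaps forward, so no content is lost."""
    return (math.floor(start_s / seg) * seg,
            math.ceil(end_s / seg) * seg)

# An event actually running from 13.2 s to 100.5 s ...
clip_start, clip_end = snap_to_fragments(13.2, 100.5)
# ... can only be clipped from 12.0 s to 102.0 s: up to one segment
# of extra material at either end, the inaccuracy Kalkanis describes.
```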

The second path is to capture source video from playout and then transcode it in the process of creating the VOD assets. The advantage here, she explained, is frame accuracy and a higher-quality output video asset.

While the first path is faster and easier, since there is no additional transcode, the risk is a lower-quality VOD. The trade-off with the second path is the expense of using more AWS resources, offset by a reduced bitrate at the end.

"You've got to want it and you've got to have enough audience for it," she said.

Another key, for DAZN and for any live streamer, is to constantly review all aspects of the service and figure out where incremental improvements can be made. 

"If you want five nines availability you need a 24/7 operator MCR with eyes on glass," she said. "I don't know anyone who can use AI today for monitoring that works without fail."

Operator teams, she added, have to be aware at any point in time what is happening with development: "They must be hand in glove."

Quality of service and experience are ever more critical factors, as OTT sports companies start to compete for eyeballs. 

"Latency must be chased down; buffering is not tolerable; frame rates must be high, but bitrates should be as low as possible; and output must be viewable across a panoply of devices," Kalkanis said.

AI, the Cloud, and … Brexit?

The Vortech.by conference (where Vortech means fluid or dynamic ideas, according to Vidispine) also featured a session from Grabyo on ingesting linear broadcast streams and distributing to multiple platforms, a session from AWS on real-world applications of machine learning, and speakers from Arvato Systems, Valossa, Mayam, and the DPP.

Lee Atkinson, AWS principal solutions architect, media & entertainment, ran through AWS' ML and AI software stack for everything from automatic subtitle generation and multi-language translation to object and activity tagging in live sports streams using Rekognition.

One high-profile use case of Amazon's AI was with Sky News during last summer's British Royal Wedding.

Live feeds were taken from physical AWS Elemental encoders on site and fed to AWS Elemental MediaLive as a mezzanine stream for media packaging and delivery by Amazon CloudFront. In this process, metadata company GrayMeta extracted single frames of video and sent them to Rekognition to identify people (famous guests like David Beckham), with the results output back to Amazon S3 as a JSON file and passed onward to the video player built for Sky News by UI Centric.

Atkinson revealed that this process was not entirely automated. "We built in a 30-second delay in the live stream for manual moderation of the AI recognition just to confirm the AI was correct," he said. "AI (object, activity, facial) recognition operates on prediction and degrees of certainty. You want the threshold to be as high as possible. Nothing is 100% certain. Since this was a working proof of concept, Sky felt they needed human eyes to confirm. If they didn't react quick enough, the AI wasn't confirmed (published live)."
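The publishing logic Atkinson describes (accept a recognition only when the model's confidence clears a high threshold, and only when a human confirms it within the delay window) can be sketched as a simple filter. The result format loosely mirrors Rekognition-style confidence scores, but the names, scores, and threshold below are illustrative:

```python
# Illustrative filter over Rekognition-style results; the guest names,
# scores, and 95% threshold are made up for the example.
detections = [
    {"name": "David Beckham", "confidence": 99.1},
    {"name": "Unknown guest", "confidence": 72.4},
    {"name": "Elton John",    "confidence": 96.8},
]

def publishable(results, threshold=95.0, operator_ok=True):
    """Keep only high-confidence matches, and only if a human
    moderator confirmed them within the 30-second delay window."""
    if not operator_ok:  # no confirmation in time -> publish nothing
        return []
    return [r["name"] for r in results if r["confidence"] >= threshold]
```

Raising the threshold trades recall for precision: fewer guests get tagged, but fewer tags are wrong, which is exactly the balance the manual moderation step was there to police.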

Tim Burton, managing director of IP systems integrator Magenta Broadcast, had some timely Brexit analogies.

"What do Brexit and the cloud have in common?" he posed. "Until recently, [the industry] created an island where it was nigh on impossible to integrate with other people's offerings, blocking trade within workflows. We were so concerned about the sovereignty of code we forgot about sustainability. Requests meant backlog, not integration.

"And while we didn't charge export tariffs [egress of media to the cloud] the only way to trade with us was through export—which made workflow slow."

Fortunately, ingress costs and download costs have reduced and entire cloud-based workflows can be built or migrated to without regard for physical borders. If only politics were so simple.

Lee Atkinson of AWS talks about how AI and machine learning can improve production workflows and asset management.
