RealSprint's Daniel Alinder Talks Sub-Second Latency and AV1 Encoding

Tim Siglin: Welcome to IBC 2022 here in Amsterdam. I'm in the NETINT booth. And I'm here with Daniel Alinder, who has a company called RealSprint. And he's gonna tell us a little bit about what RealSprint's doing, both with NETINT and also with hardware-encoded AV1. So Daniel, first of all, tell us sort of the background of where you come into the industry.

Daniel Alinder: So RealSprint, we have a couple of products, of which one is Vindral Live CDN. We focus on playout of video on the internet with ultra-low latency, and our focus is actually in this sweet spot between the HLS-based solutions and WebRTC. We wouldn't call ourselves real time. We wouldn't call ourselves HLS, because we're not; we're in the sweet spot at one-second latency. It's actually configurable, so you can go 600-700 milliseconds. Some of our clients might want to synchronize everything with television, so they want maybe four seconds of latency.
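
As a rough illustration of the configurable latency target Alinder describes, here is a minimal Python sketch. The channel names and fields are invented for this example and are not Vindral's actual configuration API; the point is simply that each channel is pinned a fixed distance behind the live edge.

# Hypothetical per-channel latency targets (field names are illustrative only).
channel_configs = {
    "roulette-table-1": {"target_latency_ms": 700},   # sub-second for live betting
    "main-arena-feed":  {"target_latency_ms": 1000},  # the one-second sweet spot
    "tv-simulcast":     {"target_latency_ms": 4000},  # aligned with broadcast TV
}

for name, cfg in channel_configs.items():
    print(f"{name}: hold playback {cfg['target_latency_ms']} ms behind the live edge")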

Tim Siglin: So your background was in the betting industry. So is that why you picked that sweet spot?

Daniel Alinder: Yes, it is. That's where the requirements come from. A few years back, when Flash was going to die (and it did), we started building this product, and we had to find a way of making the signal ultra-low latency but still retain quality. And you really need to find the sweet spot for that, because as a player or as a bettor, you don't want the content to degrade into lower quality than it needs to, because that hurts trust in the brand. So if you're watching horses race, or you're playing at a roulette table, the trust in the product is such an important part that you need to fulfill. That's where we saw that real-time technologies are a bit too sensitive. If you need 200 milliseconds of latency, that's the way to go; that's your F1 car, built for that. But if you need sub-second, or maybe around a second, then you can configure that with us and we will maintain that latency, but you will get higher quality. That's what we've seen, so that's why our clients come to us.

Tim Siglin: So one of the things you did that you were telling me before we started the interview was you looked around at different solutions and NETINT was one of those solutions. Give me just a brief overview of why you wanted to use the NETINT solution and what it did for what you wanted to do.

Daniel Alinder: What we've ended up building is a hybrid CDN where we partially use VMs, but we also put our own metal in colocation points of presence across the world. And when you do that, obviously one very important thing is how much rack space am I gonna be taking up here? Obviously, that builds cost. And also, some of these solutions, if they grow too much, get harder to manage. And obviously, with the current situation right now, we are happy that we've gone with NETINT, because power consumption is also a very important thing, both sustainability-wise, which is something that we as an industry need to emphasize, but also the electricity bill that's gonna be coming our way. We've evaluated different options, from CPU-based encoding to GPUs and then NETINT ASIC cards, and what we found with the ASIC is that the throughput is very high and the power consumption is very low, which we do like, and the form factor as well, of course. But as I said, the cooling bill and the electricity bill are gonna be the financial metrics in the end.
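
As a back-of-the-envelope illustration of why power draw per channel dominates those financial metrics, the Python sketch below compares yearly electricity cost per live channel under assumed wattage, channel-density, and price figures. None of the numbers are RealSprint's or NETINT's measurements; they are placeholders to show the arithmetic.

# Illustrative only: assumed watts, channel counts, and electricity price.
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH_EUR = 0.30  # assumed colocation electricity price

scenarios = {
    # name: (assumed watts drawn, assumed simultaneous live channels)
    "CPU-based encoding":   (400, 8),
    "GPU-based encoding":   (300, 12),
    "ASIC transcoder card": (40, 24),
}

for name, (watts, channels) in scenarios.items():
    kwh_per_year = watts * HOURS_PER_YEAR / 1000
    cost = kwh_per_year * PRICE_PER_KWH_EUR
    print(f"{name:22s} {cost:7.0f} EUR/year total, {cost / channels:5.0f} EUR/year per channel")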

Tim Siglin: So I guess one of the things I didn't mention at the beginning of this is you're doing this with AV1. Why AV1? And did that come into play as one of the factors as you looked at an ASIC solution?

Daniel Alinder: Our CDN runs codec-agnostic, so we do many codecs, of which AV1 is the latest that we've added. For AV1 specifically, NETINT's recent Quadra card is very important, because they've brought AV1 encoding to real time, and we need the encoding to be in real time because we're pushing the stream to the end viewer. If we're pushing it to the end viewer in 800 milliseconds, we need to make sure that we get the stream to our edge servers as quickly as possible. So if the AV1 encoding wasn't real time, that would be a problem for us. But generally, since we're doing this AV1 showcase together with NETINT and Oracle, it's a way of signaling what's coming in the future.

And with more high-definition video, we're seeing the bitrates increase as well, which is also something that's gonna be on a bill somewhere. We're running a CDN, and optimizing is super important. So if we have a client in a market that has a weaker network, where we may only get something like 600 kilobits per second on the last mile because the ISPs can't deliver more than that, then obviously getting the most bang for the buck there in terms of image quality is gonna be very important. Now, AV1 isn't fully adopted by everyone yet, so it's a future thing, but we wanna be there, and that's what we've done.
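
To make both points concrete, the real-time requirement and the constrained last mile, here is a minimal sketch that drives ffmpeg with the open-source SVT-AV1 encoder (libsvtav1) at 600 kilobits per second. It stands in purely for illustration and is not NETINT's Quadra ASIC path; it assumes an ffmpeg build with libsvtav1 enabled and a placeholder clip named input.mp4. For a live workflow, the speed readout in ffmpeg's progress output has to stay at or above 1.0x, meaning the encoder keeps pace with the incoming frames.

import subprocess

# Encode a test clip to AV1 with the software SVT-AV1 encoder (illustration only;
# not the NETINT Quadra integration). Watch ffmpeg's "speed=" output: it must
# stay at or above 1.0x for the encode to be viable in a live pipeline.
cmd = [
    "ffmpeg", "-y",
    "-i", "input.mp4",      # placeholder test clip (assumption)
    "-c:v", "libsvtav1",    # open-source AV1 encoder used here for illustration
    "-preset", "10",        # faster SVT-AV1 presets trade quality for speed
    "-b:v", "600k",         # in the range discussed for constrained last-mile networks
    "-g", "30",             # short keyframe interval helps viewers join quickly
    "-an",                  # skip audio for this check
    "out.webm",
]
subprocess.run(cmd, check=True)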

Tim Siglin: Well, as a matter of fact, I just came out of another meeting where we were talking to CDNs about power consumption, and they said they're very incentivized, especially in Europe, because of the cost of energy. So to sort of recap what you said, the fact that it's an ASIC that does more transcodes at a lower power consumption gives you better viability from that standpoint, with the power.

Daniel Alinder: Any ultra-low-latency solution, whether real time or, as we call ourselves, close to real time, is going to have to prove itself in terms of cost effectiveness. There are these legacy solutions, these older paradigms, that have been optimized heavily. So anything we can do to make sure that we're cost effective matters, and the energy bill is a part of that. So it makes perfect sense.

Tim Siglin: All right, Daniel, thank you very much for spending time with us. Again, Tim Siglin, Contributing Editor with Streaming Media, and we're here at the NETINT booth at IBC 2022. Thank you for your time.
