NAB 2023: vMix's Tim Vandenberg Talks NVIDIA GPU Unlocking, SRT, and vMix 26

In many respects, vMix established itself as the indispensable solution for producers working with multiple remote talent sources during the pandemic, and it accompanied many streamers into cloud production as COVID accelerated the shift to cloud workflows. The company continues to charge ahead with innovations and new features, as evidenced by its strides in GPU unlocking through leveraging updated NVIDIA NVENC encoders.

Streaming Media contributing editor Shawn Lam of SLVLive delved into all this and more as he caught up with vMix Operations Manager Tim Vandenberg in the vMix booth at NAB 2023.

Bringing NVENC-enabled Simultaneous Streams and Records to the Masses

"Let's start off with GPU unlocking from NVIDIA," Lam asks, diving right in. "What has that allowed you to do in terms of the NVENC encoders and bringing that to the masses?" 
"The latest NVIDIA drivers have allowed for five hardware encodes, after limiting it to two for the longest time," Vandenberg says, "so that opens up a lot of people to doing things like three streams and two recordings of their production, or ISO recordings if they wanted to as well." 

"In the past," Lam replies, "I think a lot of producers started with camcorders, but as things went to PTZ, the ability to record in-camera hasn't been there unless you're on a Quadro card-supported workstation. But now, on a laptop system with a GeForce consumer card, we can ISO record a couple of cameras and stream and record. I probably had you guys pretty excited about that announcement."

"I found out about it through forums without actually seeing the NVIDIA release, because it was hidden in the actual release notes, not the main press release," Vandenberg says. "So I had to dig down and prove that it was there. And then as soon as I realised, I went through all of our machines and tried all our cards to make sure it would work. And because vMix is only controlled by whatever the driver allows, as soon as they added it, you could go from three to five in vMix and you didn't have to do anything else."
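Since vMix simply honors whatever the driver permits, the session cap is the only budget a producer divides between streams and recordings. A minimal sketch of that bookkeeping in Python (the helper function is illustrative, not part of vMix's API; the cap of five is the consumer-driver limit Vandenberg describes):

```python
# Hypothetical helper: check whether a set of outputs fits within the
# NVENC session budget. NVIDIA's consumer-driver cap (raised to five
# in the drivers Vandenberg mentions) is a hard per-system limit:
# every simultaneous stream, recording, or ISO recording that uses
# hardware encoding consumes one session.

NVENC_SESSION_CAP = 5  # per-system limit imposed by the NVIDIA driver

def plan_sessions(streams: int, recordings: int, iso_records: int = 0) -> bool:
    """Return True if the requested outputs fit under the driver cap."""
    return streams + recordings + iso_records <= NVENC_SESSION_CAP

# The example Vandenberg gives: three streams plus two recordings.
print(plan_sessions(streams=3, recordings=2))                  # fits exactly
print(plan_sessions(streams=3, recordings=2, iso_records=1))   # over budget
```

Under the previous three-session limit, even the modest three-streams-plus-two-recordings scenario above would have been impossible on a GeForce card.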

SRT and Other Supported Protocols

One of the keys to the vMix-driven cloud streaming workflows that producers have increasingly adopted in recent years is SRT (Secure Reliable Transport), though not everyone fully grasps its application or its benefits, as well as how it differs from or complements NDI. "In our own workflows," Lam says, "we're doing a lot of NDI. One of the things we haven't touched on too much yet is SRT. What is SRT? How do you support that, and where is it utilised?"

"NDI is more local," Vandenberg explains, relying on "a local network on your LAN. Whereas SRT is more of a WAN sort of situation, where you're going outside of your local network and doing remote content. So you can send an SRT--Secure Reliable Transport--stream from one point via the internet to another point somewhere else. So it's typically being used now by people that are running vMix in an AWS instance, where they're running local cameras but then mixing and switching everything via the cloud. It's all about remote point-to-point video."

One of the hallmarks of SRT--embedded in its name--is its reliability, Lam asserts, along with its ability to leverage readily available bandwidth. "And it does all this in a reliable manner too, right, just using the available internet that you have, whether it's mobile data or ethernet, without having to go up to satellites or rely on other types of transport protocols."

"That's right," Vandenberg says. "You're not completely taking out the satellite, obviously, but if you're doing a production where you have good internet access, you can set the latency on the stream to provide the best quality coming through. So it's been really good. We had support prior to 2020," he continues, but when the pandemic struck, "for people that wanted to do remote productions who couldn't be on premises due to COVID and distance protocols, it was great to have SRT as an option."
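The latency setting Vandenberg mentions is SRT's core trade-off: a larger latency buffer gives lost packets more round trips in which to be retransmitted before playout. A common rule of thumb from SRT deployment guidance is to set latency to a multiple (often around 4x) of the link's round-trip time. A hedged sketch of that calculation (the function and the 4x multiplier are illustrative starting points, not vMix's settings):

```python
# Illustrative SRT latency calculation: the latency buffer must cover
# enough round trips for retransmission to succeed before playout.
# SRT's protocol default latency is 120 ms; a multiplier of roughly
# 4x measured RTT is a common rule of thumb for lossy links.

SRT_DEFAULT_LATENCY_MS = 120  # SRT's built-in default

def recommended_latency_ms(rtt_ms: float, multiplier: float = 4.0) -> int:
    """Suggested SRT latency: the larger of the default and multiplier * RTT."""
    return max(SRT_DEFAULT_LATENCY_MS, round(rtt_ms * multiplier))

print(recommended_latency_ms(20))   # good local link: default suffices
print(recommended_latency_ms(80))   # long-haul link: needs more headroom
```

On a clean link the default is fine, which is why SRT "just works" for many remote productions; the knob matters once the path gets long or lossy.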
Lam inquires about other supported protocols, such as LIVE LAN, which was added in vMix 25, for sending video out of vMix via ethernet to locally connected projectors. "How does that work?"

"It's a little bit different," Vandenberg says, "because we have a lot of people that just want to display a local feed of their streams, and we were like, 'How do we go about this?' because NDI will need a decoder on the other end. So we thought we would try using HLS to provide a stream to just a local network. And then most smart televisions have a browser that allows you to just put that in and you can play it. The only problem with HLS is that it's about a 10-second delay to make sure that everything's smooth. But typically if you're not viewing in the same area, like if it's in a different room, it's usually fine. It's probably not perfect for everybody, but it's a solution for a lot of people."
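The roughly 10-second delay Vandenberg cites falls out of how HLS works: players typically buffer around three media segments before starting playback, so the live delay is roughly the segment duration times the buffer depth. A quick sketch of that arithmetic (the segment durations here are assumptions for illustration, not vMix's actual settings):

```python
# Illustrative arithmetic behind the HLS delay: standard players hold
# a buffer of about three segments before playing, so the live delay
# is approximately segment length x buffered segment count.

def approx_hls_delay_s(segment_s: float, buffered_segments: int = 3) -> float:
    """Rough live delay for a conventional HLS player."""
    return segment_s * buffered_segments

print(approx_hls_delay_s(4))  # 4 s segments land in the ~10 s range he cites
print(approx_hls_delay_s(2))  # shorter segments cut the delay proportionally
```

That buffer is also why the delay buys smoothness: each queued segment is time the player can spend absorbing network hiccups without stalling.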

It's important to have such "in-built solutions," Lam replies, "that aren't reliant on separate hardware decoders, because a lot of times in a live environment, the client comes and says, 'We have too many people in the room, now we have to have an overflow room. How can you help us? And by the way, the room is a thousand feet away and too far to run cable straight there.' So it's an option."
Vandenberg agrees. "And you don't want to run a stream and then have somebody load up YouTube on their phone in order to watch the stream that's in the next room and that type of thing." 

What's New in vMix 26

NAB 2023 marks the public debut of vMix 26, the solution's latest full-step upgrade. "So what else is new under the hood in vMix?" Lam asks.

"26 has a stream delay for people that want to stop cheating with their eSports, where you can set a delay on the stream so people can't be, like, stream-sniping," Vandenberg says.
He also notes that the new version expands support to 15 mix effects.

"I love the mix effects, by the way," enthuses Lam. "When you brought out the initial few, I was like, 'I need more.' So tell viewers: how does the mix effect work, and why do I love it so much?"

"It allows you to create a separate input, which can be an output," Vandenberg says. "So you can have, say, an overlay or an output that you're sending from vMix with its own transitions. You can have a submix. You can have your own separate sub-production within a production, and now you can have 15 of them for various things. So you could be sending a mix out to a big screen, but also overlaying that in vMix and using the transitions the same way."

"I use it all the time with templates," says Lam. "We'll have multiview templates with three remote panel presenters, and then a PowerPoint, and bugs and logos. And then if the mix of presenters changes, I just switch, in the mix effect, which one is assigned to that one. And it's just submixes. It gives me more tools so that I don't have so many inputs that I'm switching from--I'm just mixing the submix within input number two, for example. It's a great feature."

Vandenberg catalogs other new additions such as title templates and SRT replay, "so people can do remote replay, which is pretty handy if you're doing remote sports; and we added a bunch of vertical production stuff, so you could get high quality vertically and horizontally if you really want that."
"Vertical video makes me cringe a bit," concedes Lam, "but if there's a need for it..."
"You'd be surprised," says Vandenberg. "There are a lot of different avenues for vertical now. I know this is broadcast, but..."
"It definitely has huge usage for signage," says Lam. "There are a lot of vertical displays. It doesn't matter what we think of it. If it's needed, it's nice that you support it."
