How to Create a Live HLS Feed With HEVC
In this tutorial, you’ll learn how to create a live HLS feed with HEVC and H.264 streams using Softvelum's Nimble Streamer. Note that this feed will not be compatible with older iOS, macOS, and tvOS devices that can’t upgrade to their latest revisions, so we’ll show you how to create a legacy HLS live stream as well. If you’re unfamiliar with Nimble Streamer, you can check out my review, “Softvelum Nimble Streamer Is Flexible and Well-Featured."
Nimble can transcode H.264 or HEVC input to an HEVC or hybrid encoding ladder, but you’ll need a cloud instance with either Intel or NVIDIA HEVC co-processing. In our tests, we used a G3.4x Amazon EC2 instance that cost about $1.14/hour. As you’ll see, when producing our encoding ladder, the instance had plenty of headroom, so we could have used a cheaper machine or processed more live streams.
In writing this tutorial, I’ll assume that you can get the server up and running and that you’ll be using the WMSPanel service to drive operation. You can read all about costs and basic functionality in the aforementioned review. To speed your efforts, I’ll detail where to find all of the screens referenced in this tutorial. There are a lot of bits and pieces to configure and get up and running, and often knowing where to look is half the battle.
Regarding the incompatibility issue we mentioned above, a warning on the Softvelum site states the following:
After you enable fMP4 for your application or for the entire server, its streams will be played well only on appropriate Apple devices. All other playback devices and software like Android or PC web players will not be able to play it. We recommend creating a separate application for fMP4 delivery if you have different target platforms for your streams. This may definitely change in future but for now you need to consider this factor.
Accordingly, in this tutorial we’ll discuss how to create a fragmented MP4 stream for HEVC-capable Apple devices and a legacy stream for older Apple and non-Apple endpoints.
When producing a live event, HEVC provides valuable bandwidth savings in the first mile and the last mile, so if you can get your hands on an encoder with HEVC output, go for it. In our tests, we used Vitec’s MGW Ace HEVC Encoder, which can also output H.264 if needed. Interestingly, you can’t push HEVC using the RTMP protocol, so you’ll have to use UDP or some other protocol. Although it’s not rocket science, this requires a bit more network knowledge than RTMP.
As an example, we originally planned to use port 265 for the UDP stream, but the Vitec encoder didn’t support ports below 1200 or so, so we changed to port 26500. Configuring this wasn’t particularly challenging, but I was glad I wasn’t under the gun when I discovered the problem. If you’ll be using a new encoder for this project, or a new protocol, budget plenty of time to get issues like this resolved.
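Before committing to a port on the encoder side, it's worth confirming the server end of the pipe. As a minimal sketch (the loopback address is a placeholder for your Nimble host), you can fire a throwaway datagram at the port from any machine with Python installed:

```python
import socket

def probe_udp(host: str, port: int) -> int:
    """Send a throwaway datagram toward the UDP port Nimble listens on.

    UDP is fire-and-forget, so a successful send only proves the packet
    left this machine; check the green checkmark in Nimble's MPEGTS In
    screen to confirm the server is actually receiving data.
    Returns the number of bytes sent.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        return sock.sendto(b"probe", (host, port))

# Example: probe the port used in this tutorial (replace 127.0.0.1
# with your server's address)
sent = probe_udp("127.0.0.1", 26500)
```

This won't catch encoder-side restrictions like the Vitec port floor, but it quickly rules out firewall or routing problems between you and the server.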
Step 1: Set Up Global Options
When Apple added HEVC to HLS, the company specified that you can only use the fragmented MP4 format for the streams, not MPEG-2 transport streams. To accomplish this in Nimble, you create what the service calls an application, and set “HLS(FMP4)” as a supported protocol. You reach the screen shown in Figure 1 by clicking Nimble Streamer in the top menu and choosing “Live streams settings,” then choosing the Applications tab on the left.
Figure 1. Click the wrench to set application settings. Here we’re choosing the fMP4 application for fragmented MP4 files.
In addition to the HLS(FMP4) protocol, you should also select the “Softvelum Low Delay protocol” (SLDP), since it’s faster and easier to preview your streams via this protocol (you can read more about SLDP on the Softvelum site). You can see that we have both protocols selected on the bottom of Figure 1.
You can also see that we’ve set the chunk duration to 6 seconds and the chunk count to four, which means that the server won’t publish the playlist until it has four chunks. These settings favor playback stability and encoding efficiency over low latency, so if time-to-live latency is an issue for you, you may want to be more aggressive. For example, changing to a chunk duration of 2 seconds will reduce latency from 24 seconds (6 seconds x four chunks) to 8 seconds (2 seconds x four chunks).
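The arithmetic above is simple enough to sanity-check in a couple of lines when you're weighing different chunk settings:

```python
def playlist_latency(chunk_duration_s: int, chunk_count: int) -> int:
    """Minimum time-to-live latency in seconds: the server won't publish
    the playlist until chunk_count chunks of chunk_duration_s each exist."""
    return chunk_duration_s * chunk_count

print(playlist_latency(6, 4))  # the tutorial's settings: 24 seconds
print(playlist_latency(2, 4))  # shorter chunks: 8 seconds
```

Keep in mind this is only the playlist-publication floor; player buffering and network delay add to the latency the viewer actually sees.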
On top of Figure 1, you see we have two applications running—the fMP4 application we’re editing in the figure, and “leg” (which stands for legacy), the application we’ll use to create HLS streams for legacy devices. In that application, we selected the HLS (MPEG-TS) and SLDP protocols.
When working with Nimble, clicking the wrench always opens your configuration options, which is intuitive. Less intuitive is the question mark, which opens publishing URLs for each application in this window, and opens preview windows in most other screens.
Step 2: Set Up for Input and Output
In terms of high-level workflow within Nimble Streamer, you must first input the stream, then route it to the transcoder, which creates the actual streams and hands them off to the server for delivery. In Figure 2, which you access by clicking the “MPEGTS In” tab on the left of the Live Stream Settings window, we set incoming parameters for the UDP stream, including opening port 26500.
Figure 2. Setting and naming the UDP input and opening up port 26500
In Figure 3, accessed by clicking the “MPEGTS Out” tab in the same screen, we create two streams for the Transcoder, one for the fragmented MP4 streams and the other for the legacy streams. By specifying “h265_input”, we’re tying that stream to the HEVC input configured in Figure 2; this is how the system can input multiple streams from multiple sources and route them to different transcoding jobs.
Figure 3. Creating the outgoing stream for Transcoder
Note the green check marks on the left of all streams. If any of these show a red exclamation point, it means you have a problem. In particular, if you’re wondering whether your incoming stream is properly configured, check the UDP input field shown in Figure 2. A green checkmark doesn’t always indicate that you’re receiving a stream, but a red exclamation point absolutely means that you have a configuration problem that will prevent transcoding and streaming.
Note that with RTMP input, you won’t have to configure input and output streams—they will be created automatically as Nimble receives the incoming transmission. These In-Out steps are specific to UDP simply because each incoming transport stream may contain multiple channels.
Step 3: Create Encoding Ladders
Here’s where things get interesting. As stated at the top, the HLS streams we’re creating will not play on many PC, Android, or older Apple devices: some because they can’t play HEVC, some because they can’t play fMP4 files. While it’s possible to mix H.264 and HEVC content in the same encoding ladder, as we did below, all streams must be fMP4, which means that you’ll be excluding some potential viewers. Accordingly, we created two different HLS file groups: one targeting newer HEVC-compatible devices using HEVC- and H.264-encoded streams in fMP4 containers, the other targeting legacy devices using all H.264-encoded streams in MPEG-2 transport stream containers. Either way, you’ll use a wizard to create your encoding ladder.
You access the wizard by clicking the Transcoders menu option, which opens the screen shown in Figure 4. In Nimble Streamer-speak, a set of files to transcode is called a “scenario.” Note that this isn’t an adaptive group, but rather a group of files that you can configure into one or more adaptive groups for that application in a later step.
Figure 4. Accessing the Transcoding wizard
Click “Transcoding wizard” to create a new scenario, or click “Name” on the left to open the encoding settings for an existing scenario. The wizard itself, shown in Figure 5, is a drag-and-drop operation with configuration options for both the source and output streams. Source is on the left; by choosing the fMP4 application, we’re inputting the fMP4 output stream configured in Figure 3. Output stream parameters are on the right; here, I’m creating a 2.5Mbps HEVC stream transcoded from the 4Mbps incoming HEVC stream to serve as the second-highest-quality file in the encoding ladder. Below that are a 720p HEVC stream at 1.5Mbps, and then 480p, 360p, and 240p H.264 streams.
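Once this ladder is published, the master playlist for the fMP4 application would look roughly like the sketch below. This is illustrative, not Nimble's actual output: the resolutions and bitrates of the rungs the article doesn't specify, the stream paths, and the CODECS strings are all assumed values.

```
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1920x1080,CODECS="hvc1.2.4.L123.B0"
hevc_1080p/chunks.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1500000,RESOLUTION=1280x720,CODECS="hvc1.2.4.L120.B0"
hevc_720p/chunks.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=854x480,CODECS="avc1.4d401f"
h264_480p/chunks.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360,CODECS="avc1.4d401e"
h264_360p/chunks.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=400000,RESOLUTION=426x240,CODECS="avc1.4d4015"
h264_240p/chunks.m3u8
```

The point to notice is the hybrid mix: HEVC (hvc1) and H.264 (avc1) variants sit in the same playlist, which is exactly what excludes clients that can't handle fMP4 segments.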
Figure 5. Here we’re creating the hybrid encoding ladder. Set input detail via the box on the left; set the output configuration using the box on the right. Both are accessed by clicking wrenches that appear when you hover over the boxes.
Whether you’re creating a hybrid ladder or an H.264- or HEVC-only stream, you add the output, then choose the codec and encoder. As you can see on the right in Figure 5, we’re using the NVIDIA HEVC encoder, which is why our server needed NVIDIA hardware.
Operation here is relatively primitive. For example, the standard configuration screen doesn’t contain a field for data rate, so you have to create one manually by typing “bitrate” into the box that says “tune, profile, etc...,” which turns that box into the bitrate field and creates another open field. You would also have to manually add fields for the profile and other controls, using codec-specific settings and FFmpeg parameter names throughout. Given that most producers will need to set a data rate, and perhaps choose an H.264 profile for each stream, Softvelum could definitely make this easier.
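To make that concrete, the field/value pairs you end up typing for a single output rung look something like the sketch below. Only “bitrate” is confirmed by the steps above; the other parameter names and all values are illustrative, since the accepted options are FFmpeg parameters that vary by codec.

```
bitrate   2500k   # target data rate for this rung
profile   main    # HEVC main here; an H.264 rung might use main or high
level     4.1     # illustrative; codec-specific
```

Each pair is entered one at a time: typing a parameter name converts the open box into that field and spawns another empty box for the next parameter.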