Live Streaming from a Notebook


And the Quality Winner Is?
Overall, I'd give the quality crown to Adobe, though with low-motion talking heads, the streams were hard to tell apart, and even during high-motion sequences, the differences were minor. For example, when the going got tough, Expression Encoder 4 tended to show macroblock artifacts more noticeably than the other encoders, as you can see in Figure 8. As you'll recall, I had to re-encode the EE4 file into H.264 format to input it into Premiere, resulting in double encoding, albeit with the re-encode at a very high data rate. So I checked the original file to verify that the blocks were evident there, not injected by the second encoding.


Figure 8. With high motion sequences, EE4 tended to get blocky more quickly than other tools.
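Blockiness like that shown in Figure 8 can also be quantified rather than eyeballed. The sketch below is my own illustration, not part of the original test methodology: it estimates blockiness by comparing luma jumps that land on 8x8 block boundaries against jumps inside blocks, so a ratio well above 1.0 suggests visible macroblock edges. The function name and the list-of-lists input format are assumptions.

```python
def blockiness(frame, block=8):
    """Estimate blockiness of a grayscale frame (2-D list of 0-255 luma values).

    Compares the average luma jump across block boundaries with the average
    jump between neighboring pixels inside blocks; a ratio well above 1.0
    suggests visible macroblock edges.
    """
    h, w = len(frame), len(frame[0])
    edge, inner = [], []
    for y in range(h):                       # horizontal neighbor differences
        for x in range(1, w):
            d = abs(frame[y][x] - frame[y][x - 1])
            (edge if x % block == 0 else inner).append(d)
    for x in range(w):                       # vertical neighbor differences
        for y in range(1, h):
            d = abs(frame[y][x] - frame[y - 1][x])
            (edge if y % block == 0 else inner).append(d)
    inner_avg = sum(inner) / len(inner) or 1e-6  # avoid dividing by zero
    edge_avg = sum(edge) / len(edge)
    return edge_avg / inner_avg
```

On a frame made of flat 8x8 blocks, the boundary jumps dominate and the ratio is huge; on a smooth gradient, boundary and interior jumps match and the ratio sits near 1.0.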

In contrast, Wirecast suffered from a frequent "rookie" error that occurs when the MainConcept codec is integrated into a product. Specifically, occasional key frames end up very degraded for just that single frame, as shown in Figure 9. Interestingly, I first saw this when Sorenson integrated MainConcept into the Squeeze encoder, and again when Telestream integrated MainConcept into Episode. It's very transient (in most instances only one frame long, so it's very hard to notice), but hopefully it's something that Telestream will address soon.


Figure 9. With Wirecast, some key frames were very, very degraded.
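Single-frame glitches like the one in Figure 9 are easy to miss in averaged quality scores. Here's a hypothetical helper (the names and the 6 dB threshold are my own choices, not from the original tests) that scans per-frame PSNR values, as you might export them from a quality measurement tool, and flags frames that fall well below their neighbors:

```python
def flag_quality_dips(psnr, window=15, drop_db=6.0):
    """Flag transient quality drops, such as degraded key frames.

    psnr is a list of per-frame PSNR values in dB; a frame is flagged when
    it sits drop_db or more below the mean of its surrounding window, which
    catches one-frame glitches that a whole-clip average would hide.
    """
    flagged = []
    for i, v in enumerate(psnr):
        lo, hi = max(0, i - window), min(len(psnr), i + window + 1)
        neighbors = psnr[lo:i] + psnr[i + 1:hi]   # exclude the frame itself
        mean = sum(neighbors) / len(neighbors)
        if mean - v >= drop_db:
            flagged.append(i)
    return flagged
```

A steady 40 dB clip with one frame at 30 dB, for example, gets exactly that frame flagged, while the neighboring frames pass.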

In addition, Wirecast's video seemed slightly faded compared to the other encoders, though you would only notice this with side-by-side comparisons, which, of course, viewers never have. As a side note, though I dialed in a key frame interval of 300, Wirecast seemed to default to 90 on all three computers. That probably didn't cause any of these problems, but it's another issue that will hopefully be resolved by Telestream in future releases.
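If you suspect your encoder is ignoring the key frame interval you dialed in, you can verify the actual cadence from a frame-by-frame analysis. A minimal sketch, assuming you can export the per-frame picture types from your analysis tool (the function name and the "IPP..." input format are hypothetical):

```python
def keyframe_intervals(frame_types):
    """Return the gaps (in frames) between successive key frames.

    frame_types is a sequence like "IPPPI...", where 'I' marks a key frame,
    as exported from a frame-by-frame stream analyzer.
    """
    positions = [i for i, t in enumerate(frame_types) if t == "I"]
    return [b - a for a, b in zip(positions, positions[1:])]

# A stream with key frames every 3 frames instead of a requested 300
# would report [3, 3, 3, ...] rather than [300, 300, ...]:
gaps = keyframe_intervals("IPPIPPIPPI")
```

Comparing the reported gaps against the interval you configured makes a default like Wirecast's 90 immediately visible.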


Figure 10. Some Kulabyte frames showed less detail than frames produced by other tools.

Kulabyte got dinged because in one or two scenes, it showed slightly less detail than the other encoders, most noticeably in the scene shown in Figure 10. Certainly, as flaws go, loss of detail is much less noticeable than blockiness or the frame degradation that Wirecast exhibited.

Overall, FMLE showed very good color and resisted the types of flaws exhibited by the other tools. Note, however, that the spread between the quality produced by the various tools was much less than the spread between CPU efficiency, or even data rate consistency. In other words, the quality of all tools was certainly "good enough," so base your decision on other factors.
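Data rate consistency itself is straightforward to measure if you can get per-frame encoded sizes out of an analysis tool. A sketch with hypothetical names: it buckets frames into one-second groups and reports the coefficient of variation, where lower means the encoder held its rate more steadily.

```python
def bitrate_consistency(frame_bytes, fps=30.0):
    """Per-second bitrates (kbps) and their coefficient of variation.

    frame_bytes lists the encoded size of each frame in bytes; a low
    coefficient of variation means the encoder held its data rate steady,
    which matters when streaming through a fixed-size pipe.
    """
    per_sec = int(fps)
    kbps = []
    for i in range(0, len(frame_bytes) - per_sec + 1, per_sec):
        total = sum(frame_bytes[i:i + per_sec])   # bytes in one second
        kbps.append(total * 8 / 1000.0)           # convert to kilobits/sec
    mean = sum(kbps) / len(kbps)
    var = sum((k - mean) ** 2 for k in kbps) / len(kbps)
    return kbps, (var ** 0.5) / mean
```

Two seconds of identical 1,000-byte frames at 30 fps, for instance, yield a flat 240 kbps and a coefficient of variation of zero.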

The Envelope, Please
Table 9 summarizes the findings from above. Instead of adding up the numbers and choosing a winner, I hope the table helps you choose the tool best suited for your productions and work around its weaknesses. In addition, keep in mind that each tool has unique capabilities not discussed here, like Wirecast's production capabilities and EE4's ability to create iOS-compatible streams (in conjunction with other Microsoft server software).

                       CPU Efficiency    Data Rate Accuracy    Data Rate Consistency
Adobe FMLE
Kulabyte Xtream 2
Microsoft EE4
Telestream Wirecast
Table 9. Rankings in the various categories, with 1 the best and 4 the worst.

My goal was to objectively measure some critical performance capabilities of each tool and provide the results. I hope you find the data useful.

Software vs. Hardware
Before I sign off, there was one other issue I wanted to address, which is how the quality of a hardware streaming encoder compared to our software tools. So I compared the output quality of FMLE against a stream produced by the Digital Rapids TouchStream appliance. 


Figure 11. Comparing the output of FMLE and the Digital Rapids TouchStream appliance.

Figure 11 shows a relatively easy low-motion comparison, and though the colors are slightly different, the clarity is about even. On one or two frames in my test file, most noticeably the one shown in Figure 10, TouchStream showed a bit more detail, but you had to look very hard to see the difference. Overall, if you're choosing between a hardware encoding appliance and a software encoding tool, comparative quality shouldn't be the deciding factor.

A Note from the Author, 2/25/11

I've received some good reader questions, both directly and in the article comments, so I wanted to provide some additional configuration and other information. First, more comprehensive computer specs.

Hewlett Packard 8740w: 2.0 GHz four-core (eight with Hyper-Threading) Core i7 CPU, running 64-bit Windows 7 Professional with 8GB of RAM, NVIDIA Quadro 5000M graphics, 7200 rpm disk drive

MacBook Pro: 3.06 GHz Core 2 Duo, running Snow Leopard with 8GB of RAM, NVIDIA GeForce 9400M graphics, 7200 rpm disk drive

Hewlett Packard 8710w: 2.2 GHz Core 2 Duo, running 64-bit Windows 7 Professional with 2GB of RAM, NVIDIA Quadro FX 1600M graphics, 5400 rpm disk drive

On the Decklink card: I connected the Decklink to the 8740w using the Magma PCI Express expansion system, which provides a PCI Express chassis with dedicated power and cooling and connects to the 8740w via the ExpressCard slot. I connected to the Decklink via component video and analog audio.

Other questions:

>>It seems you used MediaInfo specifically for analyzing Wirecast frame rates. Consider using it for all the files. It can confirm the reporting accuracy in the various encoders and ensure they're all being tested with a common tool. Perhaps you did that but it isn't clear to me.

I used MediaInfo for all files for which it reported that data; I supplemented that as necessary for files that it would not.

Lots of suggestions for additional testing; let me get my life back to normal after this bear of an article, and I'll propose some follow-up articles to Eric Schumacher-Rasmussen.

