Buyers' Guide to Quality Analytics 2017


Why? Because QoS typically measures performance up to, but not including, the last mile (the connection between the ISP and the viewer), while QoE measures the actual playback experience, typically via some kind of beacon integrated with the player. If there’s no player-specific beacon, there’s no measurement of the actual experience, so it’s QoS. With a beacon, it’s QoE.

How can QoE be poor when QoS is good? There are dozens of reasons, some of which the publisher can control. For example, if last-mile performance is poor, there’s too much demand on the viewer’s connection, or the viewing device has a weak Wi-Fi connection, QoE suffers. If the computer is underpowered, the player is inefficient, keyframes aren’t aligned, or your encoding ladder is poorly designed, the player can stop or skip; these problems are entirely unrelated to network performance. And even when the problem is on the viewer’s side, the viewer doesn’t care; they’re just as likely to change the channel or cancel their subscription as they would be if the fault were the publisher’s.

Here’s the obvious point: Poor QoE doesn’t necessarily mean that network QoS is poor, and good QoS doesn’t guarantee that QoE will be good. That’s why, to ensure and maintain a good viewing experience, you need to monitor both QoS and QoE. In a nutshell, this is why (primarily) QoS vendor IneoQuest and QoE vendor Conviva formed a partnership in September 2016. According to the press release, “By combining Conviva’s consumer-side Quality of Experience (QoE) measurement with IneoQuest’s content preparation and delivery performance quality measurement, this collaboration delivers the industry’s first complete, end-to-end view of Internet video delivery, pinpointing the exact location and root cause of streaming issues.” We’ll take a quick look at QoS, then finish up with QoE.

QoS Products

As an example, Figure 3 shows IneoQuest’s line of QoS and QoE products, with the microscope designating the Inspector product and the tripods the Surveyor products. Using the distinction presented previously, the QoS line ends with the final tripod, while the separate beacon-based QoE measurement tools appear on the far right.

Figure 3. The IneoQuest line of (primarily) QoS products

Briefly, the Inspector product monitors live encoding quality and packaging integrity. It can track and alert on many of the same issues managed by the automated file-based QC tools discussed previously, including loudness control and ABR boundary point alignment. The live quality metric helps operators tune encoders and flags any significant drops in video quality.

In contrast, the Surveyor tools track network performance by monitoring whether ABR segments are delivered faster than they are played, a simple but fundamental performance indicator. Because they operate by retrieving segments from specific URLs, Surveyor tools can also report any missing files or errors in the manifest file. From a pure QoS perspective, by placing Surveyor units at various points in the delivery infrastructure, publishers can monitor where problems arise and take corrective action. If delivery performance is good at the origin server and on two of three content delivery networks (CDNs), it’s clear that the third CDN, not the origin server, is the problem. In essence, this is the delivery-side data that Conviva’s beacon-based QoE measurement tools couldn’t otherwise provide.
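To make that fundamental check concrete, here is a minimal sketch of the delivered-faster-than-played test, written in TypeScript for Node 18 or later. The segment URL and duration are hypothetical stand-ins for values a real probe would parse from the ABR manifest it is monitoring.

```typescript
// A minimal sketch of the delivered-faster-than-played check, assuming Node 18+
// (global fetch). The segment URL and duration below are hypothetical; a real
// probe would parse both from the ABR manifest.

const SEGMENT_URL = "https://cdn.example.com/hls/stream_720p/segment00042.ts"; // hypothetical
const SEGMENT_DURATION_SEC = 6; // duration declared in the manifest (assumed)

async function checkSegmentDelivery(url: string, durationSec: number): Promise<void> {
  const start = Date.now();
  const res = await fetch(url);

  // A missing segment or bad manifest entry shows up here as an HTTP error.
  if (!res.ok) {
    console.error(`Delivery error: HTTP ${res.status} for ${url}`);
    return;
  }

  const body = await res.arrayBuffer(); // download the full segment
  const downloadSec = (Date.now() - start) / 1000;
  const speedRatio = durationSec / downloadSec; // >1 means faster than real time

  if (downloadSec >= durationSec) {
    console.warn(
      `Segment arrived slower than it plays: ${downloadSec.toFixed(2)}s to fetch ${durationSec}s of video`
    );
  } else {
    console.log(
      `OK: ${body.byteLength} bytes in ${downloadSec.toFixed(2)}s (${speedRatio.toFixed(1)}x real time)`
    );
  }
}

checkSegmentDelivery(SEGMENT_URL, SEGMENT_DURATION_SEC).catch(console.error);
```

Running this check repeatedly, from probes placed at the origin and behind each CDN, is what lets a publisher localize where delivery degrades.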

Note that Tektronix offers similar functionality in a suite of products delivered primarily via various versions of its Sentry product, including Sentry Verify, Sentry ABR, and Sentry Edge and Edge II. Cisco and many other vendors also offer QoS-related products and services.

These are complex, full-featured systems, and most products cover the same basics. One key feature is whether the system can trigger a delivery infrastructure change based upon reported problems, such as shutting down one underperforming CDN in favor of another. If the system monitors quality, check how issues are reported and stored, and whether it can capture the actual low-quality video so you can watch it later. If it has a video quality rating system, check whether it can also help identify the root cause of quality problems, such as excessive motion, an inadequate data rate, or similar issues.
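As an illustration only, here is a hedged sketch, in TypeScript, of the kind of automated rule such a trigger might implement. The CdnStats shape, the thresholds, and the switchTraffic() hook are all hypothetical, not any vendor’s actual API.

```typescript
// An illustration of an automated "shut down the underperforming CDN" rule.
// The CdnStats shape, thresholds, and switchTraffic() hook are hypothetical.

interface CdnStats {
  name: string;
  rebufferingRatio: number; // fraction of viewing time spent stalled
  errorRate: number;        // fraction of failed segment requests
}

const REBUFFER_THRESHOLD = 0.02; // assumed policy: 2% stall time is too much
const ERROR_THRESHOLD = 0.01;    // assumed policy: 1% request failures is too much

function pickHealthiestCdn(stats: CdnStats[]): CdnStats | undefined {
  return stats
    .filter((s) => s.rebufferingRatio < REBUFFER_THRESHOLD && s.errorRate < ERROR_THRESHOLD)
    .sort((a, b) => a.rebufferingRatio - b.rebufferingRatio)[0];
}

function evaluate(stats: CdnStats[], activeCdn: string): void {
  const current = stats.find((s) => s.name === activeCdn);
  const best = pickHealthiestCdn(stats);

  if (current && best && best.name !== activeCdn &&
      (current.rebufferingRatio >= REBUFFER_THRESHOLD || current.errorRate >= ERROR_THRESHOLD)) {
    console.log(`Shifting delivery from ${activeCdn} to ${best.name}`);
    // switchTraffic(best.name); // hypothetical call into the delivery stack
  }
}

// Example: two CDNs are healthy, while the third (and currently active) CDN is not.
evaluate(
  [
    { name: "cdnA", rebufferingRatio: 0.004, errorRate: 0.001 },
    { name: "cdnB", rebufferingRatio: 0.006, errorRate: 0.002 },
    { name: "cdnC", rebufferingRatio: 0.035, errorRate: 0.012 },
  ],
  "cdnC"
);
```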

One key differentiator is how the feature set is made available. If you’re an operator, you probably need 24/7/365 coverage, so buying a system might be your best alternative. If you’re producing live events, or only need to monitor performance periodically, you might prefer a virtual system you can pay for on a per-use basis. Beyond this, the system must be able to monitor the formats you distribute with the features you incorporate, like captions, advertising, and DRM.

QoE Vendors

Again, by our working definition, QoE vendors are companies that actually insert beacons into players, gathering statistics like video startup time, rebuffering ratio, average bitrate, and video start failures, which are the key QoE metrics defined by the Streaming Video Alliance. Key vendors in this space are the aforementioned Conviva and IneoQuest, as well as Nice People at Work (Figure 4), and newcomer Mux.

Figure 4. QoE metrics provided by the Nice People at Work YOUBORA system

Since you’re hiring a QoE vendor to help you understand the customer experience, you should start with the metrics provided and how easy they are to access and extract through export functions and APIs. Then consider system price and the expense of integrating the beacon into your player. As an example, Mux offers a $999/month plan supporting up to 1 million monthly views that the company claims you can integrate with 5 minutes of coding. If you have a QoS system in place, determine how the two systems will integrate data to quickly identify problems and isolate their sources, and what automated actions you can set to quickly adjust your delivery infrastructure based upon incoming data.
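To give a feel for what that player-side integration involves, here is a minimal browser-side beacon sketch in TypeScript, built on standard HTML5 media events, that computes two of the metrics named above (video startup time and rebuffering ratio). The /beacon endpoint and payload shape are hypothetical, not Mux’s or any other vendor’s actual SDK.

```typescript
// A minimal browser-side beacon sketch using standard HTML5 media events.
// It computes video startup time and rebuffering ratio; the /beacon endpoint
// and payload shape are hypothetical. A production beacon would also track
// average bitrate and start failures via player-specific hooks.

const video = document.querySelector("video") as HTMLVideoElement;

const sessionStart = performance.now();
let startupTimeMs: number | null = null;
let rebufferStartedAt: number | null = null;
let totalRebufferMs = 0;

video.addEventListener("playing", () => {
  const now = performance.now();
  if (startupTimeMs === null) {
    startupTimeMs = now - sessionStart; // time to first rendered frame
  }
  if (rebufferStartedAt !== null) {
    totalRebufferMs += now - rebufferStartedAt; // a stall just ended
    rebufferStartedAt = null;
  }
});

video.addEventListener("waiting", () => {
  rebufferStartedAt = performance.now(); // playback stalled waiting for data
});

// Report once, when the viewer leaves the page.
window.addEventListener("pagehide", () => {
  const now = performance.now();
  if (rebufferStartedAt !== null) {
    totalRebufferMs += now - rebufferStartedAt; // count a stall still in progress
  }
  const sessionMs = now - sessionStart;
  const payload = {
    startupTimeMs,
    // Simplified: a real beacon would exclude time spent deliberately paused.
    rebufferingRatio: sessionMs > 0 ? totalRebufferMs / sessionMs : 0,
  };
  navigator.sendBeacon("/beacon", JSON.stringify(payload)); // hypothetical endpoint
});
```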

Beyond day-to-day operation, consider the strategic value of the data that the vendor can provide. One of the strengths of Conviva’s offering is the Experience Benchmarks service, which provides data like the comparative performance of CDNs and internet service providers in your target areas, or even of competitors. This data can help you choose your distribution partners more effectively and provide insight into achievable key performance indicators in the markets that you serve or plan to serve.

High-quality video delivered effectively is the key performance metric for all successful streaming video producers. It takes a suite of tools applied throughout the content encoding and delivery life cycle to make this happen.

This article was published in the Spring 2017 European edition of Streaming Media magazine. 
