Review: Limelight Orchestrate Performance
Video isn't the only asset that can benefit from delivery via a CDN. Limelight Orchestrate Performance shows how website acceleration can significantly improve the performance of any website, not just video-heavy sites.

I have worked in and around the CDN space for nearly 20 years now. For the vast majority of that time I have focused on streaming media delivery, though I have been aware that for many years most CDNs have increasingly focused on the "mass market" proposition of accelerating websites in general.

I felt that it was high time I learned more about how CDNs can help web publishers deliver their non-video content, so I asked my longtime friend Steve Miller-Jones, director of product management at Limelight (and former operations director of my own CDN), to give me a tour of these web-focused CDN services.

As some of you will know, I have a sideline project, cellmux.com, a simple news aggregation portal focused on video backhaul technologies. Currently the site's editor, Michael O'Rourke, has it hosted on a single server with a provider based in London. We asked Miller-Jones to show us the benefits the Limelight Orchestrate Performance product would bring were it applied to our site, so we could see for ourselves how CDNs can improve our audience's experience.

Steve covered the following key topics in an extensive WebEx session he held with us:

  • Why use website acceleration?
  • What are the bits that stick it together?
  • Tests we did on cellmux.com
  • Review of results

Why Use Website Acceleration?

"People all over the world are trying to get content from a web server," says Miller-Jones, "which is not likely to be located very close to where they are, which increases the latency in delivering content from that web server."

What Are the Bits that Stick it Together?

Between the user and the publisher's server is the internet, which can be considered a black box: publishers have little or no direct control over it in terms of the routing paths available.

To get content to the user, the content owner needs to optimize how the content flows across the first, middle, and last miles of the Internet.

Many CDNs deploy caching devices at the edges of subscriber ISPs. Publishers could also do this themselves, but it quickly becomes a very expensive exercise.

These edges are likely to be directly connected to the last-mile ISPs. Content will be cached in these caching servers as it is requested by end users. This is good for static objects that do not change often, like videos, but less effective for dynamic content like HTML. In cellmux.com's case there are many lookups to check that the content on the page isn’t out of date.
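The distinction between static and dynamic content comes down to the freshness information in standard HTTP headers. As a minimal sketch (the helper name and example headers are mine, not Limelight's), an edge cache might decide whether to serve from cache or revalidate at the origin like this:

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def is_fresh(cached_headers, now=None):
    """Decide whether a cached response can be served without a
    round trip to the origin, based on its Cache-Control max-age."""
    now = now or datetime.now(timezone.utc)
    cache_control = cached_headers.get("Cache-Control", "")
    for directive in (d.strip() for d in cache_control.split(",")):
        if directive.startswith("max-age="):
            max_age = int(directive.split("=", 1)[1])
            fetched = parsedate_to_datetime(cached_headers["Date"])
            return (now - fetched).total_seconds() < max_age
    # No freshness lifetime: revalidate at the origin,
    # which is the typical case for dynamic HTML.
    return False

# A long-lived static object (e.g. a JS file or video segment)...
static = {"Cache-Control": "public, max-age=86400",
          "Date": "Mon, 05 Nov 2018 12:00:00 GMT"}
# ...versus dynamic HTML that always forces an origin lookup.
dynamic = {"Cache-Control": "no-cache",
           "Date": "Mon, 05 Nov 2018 12:00:00 GMT"}
```

This is exactly the lookup cost the article describes for cellmux.com: every page view of the dynamic HTML triggers a check back toward the origin, while the static assets can be served straight from the edge.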

Orchestrate Performance optimizes the journey that content takes across the last mile, middle mile, and first mile of the internet after being served by the origin server. The service can provide connection pooling, connection management, and compression, all applied passively, without any particular integration work on the origin.

Limelight is "in control of the paths" across the internet, because it owns and operates its CDN as a private network.

Limelight builds a path across the middle mile, providing WAN acceleration and TCP optimization—and it is one of four major operators offering this kind of dynamic acceleration, although each CDN has a different network topology and implements the technologies in a different way.
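Limelight's WAN acceleration is proprietary, but the general flavour of socket-level TCP tuning on a long-haul middle-mile hop can be illustrated with standard options (the function name and chosen values are illustrative assumptions, not Limelight's implementation):

```python
import socket

def tune_middle_mile_socket(sock):
    """Illustrative socket options a middle-mile relay might set;
    the real optimizations in a commercial CDN go well beyond this."""
    # Disable Nagle's algorithm so small writes are not delayed.
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
    # Keep the pooled connection alive between requests.
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    # Request a larger send buffer to keep a high
    # bandwidth-delay-product path full.
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 1 << 20)
    return sock

sock = tune_middle_mile_socket(
    socket.socket(socket.AF_INET, socket.SOCK_STREAM))
```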

In the last mile Limelight optimizes the delivery of content so that the browser can display the part of the page being viewed as quickly as possible. The aim is to ensure users wait as little as possible before they can interact with the visible webpage.

In-lining is a typical technique, where common files such as JavaScript (JS) and Cascading Style Sheets (CSS) are brought inline into the HTML code during the delivery of the HTML. This may mean the first byte takes a few milliseconds longer to deliver, but inlining reduces the number of round trips the browser needs to make to get all the data needed for the page, resulting in faster page load times.
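As a rough sketch of the technique (the file names are hypothetical, and a deliberately naive regex approach stands in for a production rewriter, which would parse the HTML properly):

```python
import re

def inline_assets(html, assets):
    """Replace external CSS/JS references with their contents,
    trading a slightly larger HTML payload for fewer round trips."""
    def css(match):
        href = match.group(1)
        if href in assets:
            return "<style>%s</style>" % assets[href]
        return match.group(0)

    def js(match):
        src = match.group(1)
        if src in assets:
            return "<script>%s</script>" % assets[src]
        return match.group(0)

    html = re.sub(r'<link[^>]*href="([^"]+\.css)"[^>]*>', css, html)
    html = re.sub(r'<script[^>]*src="([^"]+\.js)"[^>]*></script>', js, html)
    return html

page = ('<head><link rel="stylesheet" href="site.css">'
        '<script src="app.js"></script></head>')
assets = {"site.css": "body{margin:0}", "app.js": "console.log('hi')"}
inlined = inline_assets(page, assets)
```

The rewritten page needs no extra requests for those two assets, which is the round-trip saving described above.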

This means the publisher's own optimizations are still reflected in the code, but the way the code is prepared for delivery results in better performance at the point of delivery.

Even security models are preserved: SSL offload terminates the user's SSL connection at the CDN edge, reducing the SSL calls to the origin, which are handled by the CDN. This also allows the CDN to protect the origin from flooding and DDoS attacks.

Let’s walk through the request flow:

  • The user makes a request.
  • The origin is identified.
  • A path to the origin from the edge serving the user is established across Limelight's middle mile, enabling WAN TCP optimization across that path.
  • A connection to the origin is made from the CDN POP closest to the origin, and connection pooling and optimizations are given some persistence so that they are not re-established for each request.
  • Header analysis, compression, and any manual overrides are taken into consideration during the request and response flow between the CDN and the origin, using the headers provided in the end-user request (with additional headers optionally added as needed, such as the user's location).
  • The content is sent back across the CDN to the edge serving the end user.
  • That edge then applies any compression suitable for the user's browser and sends the content to the browser using TCP optimizations for the last mile.
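The steps above can be condensed into a toy model of an edge handler (every name here is hypothetical, including the location header, and a stand-in function plays the part of the origin fetch):

```python
import gzip

# Hypothetical connection pool keyed by origin, giving pooled
# connections persistence across requests.
POOL = {}

def get_pooled_connection(origin):
    if origin not in POOL:
        POOL[origin] = {"origin": origin, "requests_served": 0}
    POOL[origin]["requests_served"] += 1
    return POOL[origin]

def handle_edge_request(path, user_headers, origin, fetch_from_origin):
    # Request arrives at the edge; the origin is identified.
    conn = get_pooled_connection(origin)       # pooled path to the origin
    request_headers = dict(user_headers)       # forward the user's headers...
    request_headers["X-User-Location"] = "GB"  # ...optionally enriched (assumed header name)
    body = fetch_from_origin(conn, path, request_headers)
    # Compress at the edge if the browser accepts it.
    if "gzip" in user_headers.get("Accept-Encoding", ""):
        return gzip.compress(body)
    return body

# Usage with a stand-in origin:
def fake_origin(conn, path, headers):
    return b"<html>cellmux.com</html>"

resp = handle_edge_request("/", {"Accept-Encoding": "gzip"},
                           "origin.example", fake_origin)
```

A second request to the same origin would reuse the pooled connection rather than re-establishing it, which is the persistence described in the flow above.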

Caches within the network hierarchy can aid the acceleration by storing static content like JS files within the Limelight network so that, for example, continental "junctions" within the network cache the content and regional POPs can read from those caches on a cache miss, rather than sending every request to the origin.
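That hierarchical fallback can be sketched as a two-tier cache (class and variable names are mine; the real topology is considerably richer):

```python
class TieredCache:
    """On a local miss, fall back to a parent ('junction') cache
    before going all the way to the origin."""
    def __init__(self, parent=None):
        self.store = {}
        self.parent = parent

    def get(self, key, fetch_origin):
        if key in self.store:
            return self.store[key]
        if self.parent is not None:
            value = self.parent.get(key, fetch_origin)
        else:
            value = fetch_origin(key)
        self.store[key] = value  # populate this tier on the way back
        return value

origin_hits = []
def origin(key):
    origin_hits.append(key)
    return "contents of " + key

junction = TieredCache()                 # continental "junction" tier
regional = TieredCache(parent=junction)  # regional POP

regional.get("app.js", origin)   # miss at both tiers: one origin fetch
regional.get("app.js", origin)   # regional hit: no origin fetch
TieredCache(parent=junction).get("app.js", origin)  # new POP: junction hit
```

Three requests from two POPs cost the origin only a single fetch, which is precisely the point of placing junctions between the regional edges and the origin.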
