Teradek Bond II: Hands-On Review
I chose to make the route an “out and return” loop so that we should get a “butterfly” shape in the quality graphs. I allow for a slight variation between point C on the way out and point D on the way back, because the dual carriageway on the D return leg of the route is slightly higher than the other side at point C. Other than that the route should be symmetrical, varying only in the time base with the traffic intensity.
We also have a few amusing moments where drivers think we are some form of surveillance vehicle.
So most of the velocity, trip-data, altitude, and GPS mapping is carried out by a jogging app I run on my iPhone as we drive. It’s a very simple solution and generally it works very well. For this particular run, the GPS system seems to have produced a few dirty-data segments (we suddenly moved out to the neighboring county and back in about 2 seconds!) so I have used the maps and altitude graphs from a previous test—these don’t really change much. I did, however, manage to extract a good velocity graph from the app, so I have used that below.
Then comes the question of how I analyze the quality in a meaningful way.
Because each demuxer varies there is still some variation in the testing model, but gradually I am refining the endpoint analysis to a model where I use Wireshark to record, packet by packet, the video stream as it is delivered to VLC or whatever media player the vendor recommends. We have tried recording these streams in VLC, but we often hit a problem: If the signal drops completely the VLC player session terminates, and if I use an infinitely repeating playlist to constantly poll for the stream to be restored, the archive overwrites the previous recording! I could go about fiddling with VLC to get it to increment the file names, but, since the use of VLC itself may vary if the vendor arrives with a different player requirement, I increasingly prefer to record all the packets using Wireshark, then trace out the session later, exporting the TCP stream of each section and combining them with the “cat” command-line tool. This has the effect of absolutely, unquestionably recording what we receive from the demuxers, and is, I feel, a great asset to the testing process. However, it also has the side effect of losing the underlying timeline as each stream dropout occurs. This means that when the stream drops, the time base on the resulting file is lost and we literally jump from one section of the video to the next. In an (impossible) ideal world I would like the system to insert black space while the stream is lost—since this would produce some clear dead spots in the bitrate analyzer tool graphs—but as it is, you can clearly see in the graphs when and where the drops are.
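The export-and-join step above can be sketched in a couple of shell commands. The file names here are hypothetical: each section_NN.bin stands in for one raw “Follow TCP Stream” export from Wireshark, saved in capture order, and the printf lines simply simulate those exports so the snippet runs as-is.

```shell
# Hypothetical names: each section_NN.bin represents one raw
# "Follow TCP Stream" export from Wireshark, in capture order.
# These printf lines just simulate the exports for illustration.
printf 'SECTION-ONE'   > section_01.bin
printf 'SECTION-TWO'   > section_02.bin
printf 'SECTION-THREE' > section_03.bin

# Join the sections back into a single transport stream, in order:
cat section_01.bin section_02.bin section_03.bin > joined.ts

# Sanity check: the joined file should be exactly the sum of its parts.
wc -c joined.ts
```

The same approach works whatever player the vendor mandates, since it operates on the captured packets rather than on any one player's archive format.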
This also has been causing problems with the playback of the joined archive. While VLC is broadly capable of struggling through most video—however error prone it is—many other technologies simply cannot handle errors in the same way. A great example is the initial experiment I made with uploading the Bond II sample to YouTube—it ended up either playing the audio properly but with only the first frame of the video showing, or it played the video at half frame rate. Clearly the remade MPEG-TS I had extracted from the network dump was confusing the YouTube FFMPEG transcoders.
To this end I will post the video on Dropbox for the moment, should you wish to explore the image quality and so on. Ultimately, I will have to free up the space in due course, but try the link here to download the .ts.
The good news is that all my own analysis locally was possible on the raw file, so while, in this instance, you can’t see the file without two passes through H.264 encoding (my local encode and YouTube’s), the video will give you a pretty good idea of the quality, and my analysis of the transport stability is still possible.
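A hedged sketch of that local re-encode step, assuming ffmpeg (the article does not name the tool used, and both file names are placeholders): the `-fflags +genpts` input option asks the demuxer to regenerate missing presentation timestamps, which is exactly the sort of timeline repair the cat’ed dump needs before a downstream transcoder will handle it cleanly. The snippet skips itself if ffmpeg or the input is absent.

```shell
# Assumption: ffmpeg is installed and joined.ts is the cat'ed network dump.
# -fflags +genpts regenerates missing presentation timestamps on input,
# giving the re-encode one continuous timeline to work from.
if command -v ffmpeg >/dev/null 2>&1 && [ -f joined.ts ]; then
  ffmpeg -y -fflags +genpts -i joined.ts -c:v libx264 -c:a aac upload.mp4 \
    && echo "re-encoded to upload.mp4" > reencode.log \
    || echo "ffmpeg failed on joined.ts" > reencode.log
else
  echo "skipped: ffmpeg or joined.ts not present" > reencode.log
fi
```

Note that this is a lossy generation: any quality judgement made on the re-encoded file inherits a second pass of H.264 encoding, which is why the raw cat’ed file remains the reference for analysis.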
Teradek Bond II: Data Throughput Performance
So let’s get to the data. First, let’s have a look at the demuxer’s own UI and graphic output at the end of the run (Figure 3):
Figure 3. The Sputnik demuxer's graphic output at the end of the test
It’s a nice interface, right? The orange line shows the total data throughput. It spikes well above the target 3Mbps, making up for periods where it was “under-running.”
Interestingly, the unit needs at least one port active to bring up any data at all. That is why there is a port showing as active on the right-hand side and a small blip of data around 16.00; it is what enabled me to screen-grab the session in this instance.
The key things to comment on here are that there is a clear and complete drop out in the middle of the graph, between about 15.25 and 15.27, and then a further drop at 15.30 that then recovers much more slowly. Exploring other bits of the data (and these days, to be honest, I am getting quite familiar with the pattern of data in these tests) at around 15.15 and again at 15.40 we pass point E on the course, a gully where the cellular signal drops away steeply for a few moments as we hand over from one mast to another, though not long enough to cause a complete signal drop.
For interest, the Sputnik graph can actually break out the data transfer of each modem and port if you like (the graph above shows the total throughput). When activated, it looks like Figure 4.
Figure 4. The Sputnik graph showing the data transfer for each modem and port
I have to say that is one of the most awesome looking data displays I have seen across many of these devices! It’s a great data-junkie wall graphic.
The light orange line shows the stream bitrate: while the data throughput (the original dark orange line) spikes above the target 3Mbps, the video stream only peaks at the target 3Mbps. For fans of adaptive bitrate technologies, this is a great example of it in action. The various modems each take a varying share of the load, and you can see them working in the spectrum of other colors.
Obviously, where that light orange stream-bitrate line dips, the video quality dips directly with it.
Now let’s have a look at the packet I/O graph I extracted from Wireshark (Figure 5).
Figure 5. The packet I/O graph extracted from Wireshark
In many ways this is much more readable than the other graphs, despite being much less pretty to look at! Let’s explore the data we are looking at here. The first thing to note is that the Y axis scale is packets per “tick,” and in this instance a tick is one minute. As such, the 250,000,000 counter means a quarter of a billion packets per minute, and NOT 2.5Mbps, as one might instinctively read it; that misreading would in any case not square with the 3Mbps scale on the Y axis of the Sputnik graphs…
Again we see the bit-stream slow down around 15.25 as we hit the dead spot (C), and again as we return through it (D) at around 15.30. This corroborates the Sputnik data well. Perhaps the most interesting thing is that this graph sits somewhere between Sputnik’s total data rate (dark orange) and stream bitrate (light orange) lines. My assumption is that the stream bitrate of each packet is read from the packet headers by the tool that is graphing for Sputnik.
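For anyone wanting to reproduce this kind of graph as plain numbers rather than via the Wireshark GUI, tshark (Wireshark’s command-line sibling) can emit the same I/O statistic. The capture file name is a placeholder, and the snippet skips itself if tshark or the capture is absent.

```shell
# Assumption: run.pcapng is the capture of the delivered stream.
# "-z io,stat,60" bins packet and byte counts into 60-second intervals --
# the same one-minute ticks used by the I/O graph in Figure 5.
if command -v tshark >/dev/null 2>&1 && [ -f run.pcapng ]; then
  tshark -r run.pcapng -q -z io,stat,60 | tee io_stat.txt
else
  echo "skipped: tshark or run.pcapng not present" | tee io_stat.txt
fi
```

Having the per-minute counts as text also makes it easy to line the dropout windows up against the Sputnik timestamps when corroborating the two data sets.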
So far, so good. We seem to be getting data that fits our expectations.
The final data chart I have been able to extract is from a free app called Bitrate Viewer. It explores video files and produces a graph of the data bitrate. Now, as explained above, I cannot insert ‘black space’ in the transport streams without a reencode, and I was loath to do that with the raw data analysis. So what I have done is a little copy and paste magic.
The raw Bitrate Viewer output of my ‘cat’ed file looks like Figure 6.
Figure 6. The raw Bitrate Viewer output
Since we know all the timelines are common, I think this will make more sense to you if I cut the graph at the points where I think the ‘cat’ joins fall in the above file, use the timelines of the other graphs to work out where the video starts running again, and then effectively insert some whitespace. It will make more sense to see than to explain (Figure 7).
Figure 7. Comparing the Bitrate Viewer output with the Sputnik display, with whitespace inserted to represent the cellular dead spots in the testing route.
So as you can see from my split graph there are some significant outages in the video each of around 2 minutes, and even after the second major outage the video struggles to get back to full speed for about 3 to 4 minutes.
For me this is a good test result since it validates the expected capabilities of the Bond II. It was a rainy day. The Bond II does not have an external antenna—only those of the relatively tiny USB dongles—and as such when the layer 2 radio link fails it fails fairly quickly, causing a quick drop off. After the initial dead spot pass at point C, the recovery is fairly quick. This is due to the fact that when we come out of the black spot we come around a hill almost directly onto a cell mast.
The story is a little different on the return route. As we pass point D—just the other side of the road, but a little higher in elevation since the dual carriageway is raised on the return side—the signal recovers more slowly. (The geographic reasons are easier to see in practice than on my route map, but please take my word for it!)
With this more gradual signal recovery the radio links restore more “tentatively” and the underlying IP also reconnects with an unstable and significantly lower quality signal for an extended period, only ramping up after a few minutes (15.30 to around 15.38).
I think it is important that I mention that we are not allowing for variation caused by atmospherics, and—as is far from unusual in the South of England—it was raining fairly heavily throughout this test. This is in contrast to previous benchmark tests for other vendor products, which have unusually all been conducted on clear days.
So What's the Takeaway?
So, while I could explore these tests in endless detail, I feel it is a good point now to attempt to wrap up the benchmark and make a few assertions about the Bond II.
The great thing about these tests is that, much like pricing analyses, they get more interesting as they accrue more data; I am starting to amass an amount of data that I can compare from device to device.
It is clear to me that the integral antennae of the attached USB dongles make the Bond II much more sensitive to the layer 2 radio carrier: if the signal is compromised, the antennae lose it quickly. In contrast, the integrated, optimized antennae found on the high-end systems, while putting out the same wattage of transmitted signal (according to mobile network standards), are much higher-gain receivers, and as such the signal drop-off and recovery are not so sudden as they are on the Bond II.
This means that the Bond II struggled with my dead-spot test more than any other unit I have so far tested, BUT this limitation must be weighed against the fact that it is the lightest, smallest, and cheapest cellmux on the market today.
If you expect to be using a system like this in areas of pervasive strong cellular networks such as metropolitan areas, or relatively static shoots, then the Bond II is a highly desirable piece of kit.
It is a solid and professional workhorse, with the only caveat being that, compared to other systems that are three times the Bond II’s price, it is ideally suited to non-marginal cellular network footprints.
I can see these being affordable enough to be supplied as backup units to every electronic newsgathering camera operator. After all, most camera operators have spare lenses that are bigger and more expensive than the Bond II, and having one with you is much like having a complete spare satellite truck in your bag!