
Cinegy Looks to 16K Streaming with "World's Fastest Codec"
Designed as a mezzanine codec like Avid DnxHR, Apple ProRes, and Sony XAVC, the DANIEL2 codec decodes HD at 17,000 fps and scales all the way up to 16K at 280 fps, Cinegy claims

Moore's Law has finally hit the broadcast industry: IT technologies are taking over and making 4K a breeze. At the IBC show in Amsterdam last week it was equally apparent that working in 8K for postproduction and even broadcast streaming is inevitable, with tools from cameras to editing software and HEVC encoding already primed or in development to handle it. No one, though, was envisioning a world beyond 8K—except German software developer Cinegy, which demonstrated a codec it claims can decode a Hollywood movie in a second and is built to manage 16K data rates today.

"We realised that existing codecs are pretty much useless in environments beyond 4K," explained Cinegy CEO and co-founder Jan Weigner. "You need a ridiculous box of kit to service just a single channel. Yet the industry is already talking 8K, and the bandwidth of existing devices that we can envision in 4-5 years wouldn't accept that.

"So we decided to confront this problem by designing a mezzanine codec for acquisition and production from scratch which would render this vision today with existing off-the-shelf hardware. This is the only way to play professional-quality 8K streams on commodity hardware or even a consumer laptop today."

The DANIEL2 codec can decode up to 1,100 frames per second at 8K (7680x4320, or 16x HD resolution), which translates into over 4,300 fps in 4K or 17,000 frames of full HD per second. It is specified to handle 16K at 280 fps. The compression ratio is stated as 1:10 to 1:20 when working with 8K.
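To put those figures in perspective, a back-of-the-envelope calculation shows the raw data rates involved. The sketch below assumes 10-bit 4:2:2 sampling (20 bits per pixel), which is an illustrative assumption, not a figure from Cinegy; only the resolutions and the 1:10 to 1:20 compression range come from the article.

```python
# Rough data rates for the resolutions quoted above.
# Assumption (not from the article): 10-bit 4:2:2 sampling = 20 bits/pixel.
BITS_PER_PIXEL = 20

def data_rate_gbps(width, height, fps, compression=1):
    """Video data rate in gigabits per second at a given compression ratio."""
    return width * height * BITS_PER_PIXEL * fps / compression / 1e9

raw_8k  = data_rate_gbps(7680, 4320, 60)                  # uncompressed 8K60
mezz_10 = data_rate_gbps(7680, 4320, 60, compression=10)  # quoted 1:10
mezz_20 = data_rate_gbps(7680, 4320, 60, compression=20)  # quoted 1:20

print(f"8K60 uncompressed: {raw_8k:.1f} Gbit/s")   # ~39.8 Gbit/s
print(f"8K60 at 1:10:      {mezz_10:.1f} Gbit/s")  # ~4.0 Gbit/s
print(f"8K60 at 1:20:      {mezz_20:.1f} Gbit/s")  # ~2.0 Gbit/s
```

Even at the quoted 1:20 ratio, a single 8K60 stream still runs to roughly 2 Gbit/s, which is why Weigner frames this as a mezzanine codec for production rather than a distribution codec.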

"The performance secret is that this is architected and developed from the ground up to be GPU-based," explained Weigner. "It is very conservative with GPU memory bandwidth, leaving compute resources for other tasks."

The demo at IBC showed the codec decoding multiple 8K streams along with multiple 4K streams while performing realtime compositing, colour correction, scaling, and titling, with the results displayed in realtime in 8K. The hardware platform used was an Intel quad-core i7-6700K processor and an Nvidia GTX 980 Ti or Quadro M6000 graphics card.

"A problem faced when designing 4K, 8K—or soon 16K systems—that need to handle multiple streams and that need to manipulate them in real time, is that even if you could decode the streams using the CPU—which you cannot—then you'd probably still want to use the power of the GPU for effects and filters," explained Weigner. "Now you face the bottleneck of the system bus to transfer the decoded streams into the GPU's memory.

"This is where DANIEL2 comes in," he continued. "Streams a fraction of the size of their uncompressed counterparts are read from disk or via the network and passed to the GPU to be decompressed faster than the uncompressed frames could be copied. So we use less bandwidth on the system bus, and less space or bandwidth on disk or the network. I could decode dozens of 8K streams and still have enough power left for all the video processing like chroma keying and effects. This power means I can work with dozens of 4K streams on a laptop today."
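Weigner's bus-bottleneck argument can be sketched numerically. The figures below are illustrative assumptions (roughly PCIe 3.0 x16 usable throughput and the same 10-bit 4:2:2 frame size as before), not measurements from Cinegy's demo; the point is only that the bus alone caps uncompressed 8K at a modest frame rate, while 1:10-compressed frames leave ample headroom for multiple streams.

```python
# How many 8K frames per second the host-to-GPU bus alone could carry.
# Assumptions (not vendor figures): ~12 GB/s usable PCIe 3.0 x16 bandwidth,
# 10-bit 4:2:2 frames (20 bits/pixel).
FRAME_BYTES_8K = 7680 * 4320 * 20 // 8   # ~83 MB per uncompressed frame
PCIE_BYTES_PER_S = 12e9                  # assumed usable bus throughput

def bus_fps_ceiling(frame_bytes, bus_bytes_per_s=PCIE_BYTES_PER_S):
    """Upper bound on frames/s the bus transfer alone permits."""
    return bus_bytes_per_s / frame_bytes

uncompressed = bus_fps_ceiling(FRAME_BYTES_8K)        # ~145 fps ceiling
compressed   = bus_fps_ceiling(FRAME_BYTES_8K // 10)  # 1:10 ratio, ~1,450 fps

print(f"Bus ceiling, uncompressed 8K:  {uncompressed:.0f} fps")
print(f"Bus ceiling, 1:10 compressed:  {compressed:.0f} fps")
```

Under these assumptions, uncompressed transfer would allow only a couple of 8K60 streams before saturating the bus, whereas compressed transfer plus on-GPU decode leaves room for the "dozens" of streams Weigner describes.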

DANIEL2's main use is for recording from camera sources, editing, and postproduction as well as playout. "We have had interest from camera manufacturers particularly where slow motion cameras need to capture hundreds or thousands of frames a second," said Weigner.

"We are aiming for the same space as Avid DnxHR, Apple's ProRes, or Sony XAVC," he said. "We could put this in an MXF wrapper and standardise it. We are not after the HEVC distribution codec. DANIEL2 could go all the way to playout where finally you turn the stream into a distributed channel and H.264 and HEVC can kick in."

The first-generation DANIEL codec was developed with the specific purpose of being an RGBA codec—the A standing for alpha. "The aim was to provide a better, easier way to deal with video with an alpha mask for overlays and keying," said Weigner. "This can be done with other codecs like ProRes or DnxHD, but these always consume a fixed bitrate even if there is actually not much to encode. We found people were using the DANIEL codec for other purposes such as 4K encoding and playback, as it is much lighter on the CPU than comparable codecs."

This, he said, prompted Cinegy to develop a GPU-focussed second iteration. DANIEL2 is being made available as an SDK as well as AVI and QuickTime codecs to permit integration with Adobe Premiere, After Effects, Avid Media Composer, Vizrt, and other popular applications.

"Eventually we are looking at powering this with a server the size of a cigarette box," he said.

The Munich-based developer's messaging at IBC targeted Imagine Communications. "Don't Imagine Cloud Playout—It's Real," screamed the posters.

"Two years ago Imagine had no solutions in this area at all—they had to completely rewrite everything," said Weigner. "We have been doing cloud playout for years. I can spin up a channel from AWS not in days, hours, or minutes, but in seconds."

Its Cinegy Air PRO product provides a broadcast automation front-end and a real-time video server for SD, HD, and/or 4K playout in an integrated software suite.

Weigner proceeded to demonstrate playout of a video encoded in H.264 using Nvidia hardware, launched from AWS and streamed back to the Cinegy booth in, indeed, a matter of seconds.