Translation Please

I'm Johnny Cache

8/22/2011 10:34 AM

Few tech terms can instantly generate so many groaner-twists on clichés: The cache cow. Cache in your chips. His mouth is writing checks that his body can’t cache. (OK, OK, I’ll stop.)

Nonetheless, in the language set of the content-delivery network - the CDN - “caches” are a big part of the scene.

The purpose of a cache, in a CDN, is to temporarily store the most popularly viewed content so that it’s readily available to lots of people using lots of different devices (not just the television). It’s all part of this shift toward distributed storage - gigantic servers in the center, linked to regional servers, linked to caching servers at the edge (headend or hub).
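
If it helps to picture what an edge cache actually does, here’s a minimal sketch in Python. The class, its size limit and the “least recently used” eviction rule are illustrative assumptions, not any operator’s or vendor’s actual design:

    from collections import OrderedDict

    class EdgeCache:
        """Toy edge cache: keep the most recently requested chunks, evict the rest."""
        def __init__(self, capacity=1000):
            self.capacity = capacity
            self.items = OrderedDict()          # chunk name -> chunk bytes

        def get(self, key, fetch_from_origin):
            if key in self.items:
                self.items.move_to_end(key)     # a hit keeps popular content "hot"
                return self.items[key]          # served straight from the edge
            chunk = fetch_from_origin(key)      # a miss goes deeper into the network
            self.items[key] = chunk
            if len(self.items) > self.capacity:
                self.items.popitem(last=False)  # toss the least recently used chunk
            return chunk

Real caches use fancier popularity math, but the bargain is the same: hits get served locally, misses get fetched from deeper in the network and kept around in case the next viewer wants the same thing.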

Caching is a close cousin of buffering. You’ve seen buffering dozens of times, especially when video streaming first began - it’s the little animated circle that spins around on the screen when you’re watching something over the Internet, while more bits get reloaded so playback can continue.

As cable operators continue architecting their CDNs this summer, caches matter especially for local content - stuff that can’t easily be encoded nationally, the way cable channels like HGTV, ESPN and Discovery can be. Big cities can host more than 100 local TV stations. An MSO encoding 500 national channels at a centralized facility (the “origin server,” in CDN-speak), plus 100 channels locally, may need to cache a million or more file chunks, switched out several times per hour.
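
For a rough sense of where a number like that comes from, here’s some back-of-the-envelope arithmetic. The ten-second chunk length is an assumption on my part, purely to make the math concrete:

    channels = 500 + 100        # 500 national channels encoded centrally, plus 100 local
    renditions = 8              # stream sizes per channel (more on those in a moment)
    chunk_seconds = 10          # assumed chunk length; adjust to taste
    chunks_per_hour = 3600 // chunk_seconds    # 360 chunks per stream, per hour

    print(channels * renditions * chunks_per_hour)   # 1,728,000 chunk files per hour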

An HDTV title, for instance, may get compressed and encoded into eight different stream sizes, ranging from 1 Megabit per second up to 10 Megabits per second, to be adaptively streamed to suit the different screen sizes at the end points - the more bandwidth, the bigger the chunk.
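
A quick sketch of how that adaptive piece works - note that the specific rungs of the bitrate ladder below are made up; the only given is that they run from 1 Megabit per second to 10:

    # Illustrative bitrate ladder for one HD title, in Megabits per second.
    LADDER_MBPS = [1, 1.5, 2.5, 3.5, 5, 6.5, 8, 10]   # eight stream sizes

    def pick_rendition(measured_mbps):
        """Pick the biggest rendition the viewer's measured bandwidth can sustain."""
        fits = [r for r in LADDER_MBPS if r <= measured_mbps]
        return max(fits) if fits else min(LADDER_MBPS)   # worst case, send the smallest

    print(pick_rendition(4.2))   # 3.5 - say, a tablet on decent Wi-Fi
    print(pick_rendition(0.8))   # 1 - a phone on a weak connection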

As a direct result, caches can fill up real fast, especially if you’re a service provider tasked with serving up movies and TV at scale, to millions of viewers on dozens of different screen sizes.

Part of the caching equation is figuring out what can and can’t be cached. Some stuff doesn’t lend itself well to caching, like file segments that are “stateful” - meaning they’re tied to something you’re doing, like pausing to resume in another room on a different screen.
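
In code, that sorting decision might look something like the sketch below. The request fields and the rules are hypothetical - the point is just the split between shared chunks and per-viewer state:

    def is_cacheable(request):
        """Shared video chunks can be cached; per-viewer ("stateful") responses cannot."""
        if request.get("per_viewer_state"):            # e.g. a resume point for pause-and-resume
            return False
        return request.get("type") == "video_chunk"    # same bits for every viewer

    print(is_cacheable({"type": "video_chunk"}))                                  # True
    print(is_cacheable({"type": "resume_position", "per_viewer_state": True}))    # False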

As a European cable technologist reminded me last week, all operators delivering VOD already have a CDN, or the beginnings of one. The difference: When VOD began, it was cheaper to store copies of the same title at hundreds of edge points than it was to store everything centrally, then ship it out over satellite or fiber.

These days, both storage and transport are cheap, comparatively. So why not store the hot stuff at the edge, as needed, and keep everything else deeper in the network, to stream as requested?

If you’re a cable operator planning your CDN, chances are high that a big part of the discussion is edge caching: How many servers, at what size, where, and with what local encoding to handle off-airs, encryption and ad insertion? It’s the perennial tradeoff between the economics of storage and transport.
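
One way to frame that tradeoff is the back-of-the-envelope test below: cache a title at the edge when the transport it saves outweighs the storage it consumes. Every number here is a placeholder; the shape of the comparison is the point, not the prices:

    def worth_caching(expected_views, title_gb, storage_cost_per_gb, transport_cost_per_gb):
        """Placeholder economics: compare storage spent against transport saved."""
        storage_cost = title_gb * storage_cost_per_gb
        transport_saved = expected_views * title_gb * transport_cost_per_gb
        return transport_saved > storage_cost

    # A hot title watched 500 times earns its shelf space; one watched twice may not.
    print(worth_caching(500, 4, storage_cost_per_gb=0.10, transport_cost_per_gb=0.02))  # True
    print(worth_caching(2,   4, storage_cost_per_gb=0.10, transport_cost_per_gb=0.02))  # False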

Cache out.
________________________________________
Stumped by gibberish? Visit Leslie Ellis at translation-please.com or multichannel.com/blog
