Cisco Checks on the Cloud and Confirms That It’s Getting Bigger

Ever heard of a zettabyte? I’m going to use that word a few times in this story, so it will probably help if I define it first.

You know its smaller siblings: the gigabyte, the terabyte, and maybe the petabyte and exabyte. Your average PC hard drive holds a terabyte or two, and external hard drives are now hitting six terabytes. Big companies with data centers routinely deal with data at the petabyte level. Earlier this year, Facebook said it was setting up an exabyte-scale cold-storage facility at its data center in Prineville, Ore., intended to hold photos of its members forever.

But a zettabyte is the second-to-last of the words we have to quantify data storage. If you think of a terabyte as 1,000 gigabytes, then a zettabyte is a trillion gigabytes. Beyond that is one more word, the yottabyte, which would be a quadrillion gigabytes. After that, no further terms have been agreed upon.
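To see where those numbers come from, here’s a quick back-of-the-envelope sketch (my own illustration, assuming the decimal, powers-of-1,000 definitions of these prefixes rather than the binary powers-of-1,024 ones) that converts each prefix down to gigabytes.

```python
# Decimal (SI) storage prefixes, each 1,000x the one before it.
# Assumes powers-of-1,000 units; the binary variants use powers of 1,024.
UNITS_IN_BYTES = {
    "gigabyte":  10**9,
    "terabyte":  10**12,
    "petabyte":  10**15,
    "exabyte":   10**18,
    "zettabyte": 10**21,   # a trillion gigabytes
    "yottabyte": 10**24,   # a quadrillion gigabytes
}

GIGABYTE = UNITS_IN_BYTES["gigabyte"]

for name, size_in_bytes in UNITS_IN_BYTES.items():
    print(f"1 {name} = {size_in_bytes // GIGABYTE:,} gigabytes")
```

Running it prints, among other lines, “1 zettabyte = 1,000,000,000,000 gigabytes,” which is the trillion-gigabyte figure above.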

So, here’s why I’m getting into all this: Cisco Systems today put out another one of its big trend surveys meant to blow your mind a bit and start you thinking long-term about the demands being put on your network and data center. It’s called the Global Cloud Index, and Cisco says it measures the combination of three types of data in motion: traffic between end users and data centers (people streaming video, browsing websites, and the like); traffic between data centers, as they share resources; and traffic within a single data center. (Cisco explains its definitions and methodology in agonizing detail here.)
