A year ago, when I was testing 10-Gigabit Ethernet switches at the University of Hawaii, several things became clear. The first was that those switches are genuinely fast: they can pump data at their rated speeds, sometimes across three or four parallel 10-gig channels. That’s a lot of data.
At the time I wondered what all that bandwidth might be used for. Sure, the backbone of a big ISP might need that, but that’s a limited market. And it seemed that the number of enterprises that might need multiple 10-gig pipes was even more limited. But I guess that every time I wonder whether there’s such a thing as too much bandwidth, I should just slap myself and think about something else. There’s never too much bandwidth. It’s kind of like how you can’t be too thin or too rich (not that I’d know about either of those).
But as I talk with IT managers, it’s clear that if there’s one thing they need, it’s faster access to storage, especially backup storage. Unless a company is willing to invest in truly vast backup facilities, it faces a difficult choice: spend a fortune to back everything up, or spend less and back up less often. The reason? Backup technology is still kind of slow.
Yes, it’s true that Fibre Channel runs at 2G bps these days. But when you’re backing up a large enterprise, you still need a lot of 2-gig pipes. More important, you need to be able to write to your backup media at high speed, and that’s hard to do. Right now, the best alternative is to back up your disks to more disks, but that’s expensive, and it can make off-site storage tough. Tapes are more portable, but they’re slow.
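To see why, it helps to run the numbers. Here’s a back-of-envelope sketch in Python; the data-set size and tape write speed are made-up assumptions for illustration, not benchmarks:

def hours_to_back_up(data_tb, throughput_mb_s):
    # Time in hours to move data_tb terabytes at throughput_mb_s megabytes per second
    return data_tb * 1_000_000 / throughput_mb_s / 3600

DATA_SET_TB = 10          # assumed size of a full backup
FC_LINK_MB_S = 2_000 / 8  # a 2G-bps Fibre Channel link is roughly 250 MB/sec, best case
TAPE_WRITE_MB_S = 35      # assumed native write speed of the tape drive

print(f"Over the 2-gig pipe alone: {hours_to_back_up(DATA_SET_TB, FC_LINK_MB_S):.1f} hours")
print(f"At the tape drive's pace:  {hours_to_back_up(DATA_SET_TB, TAPE_WRITE_MB_S):.1f} hours")

With those assumptions, the pipe could deliver the data in about 11 hours, but the tape drive would need more than three days to write it. The medium, not the network, sets the pace.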
As the amount of data stored in the enterprise grows, the backup problem grows with it. The data becomes more valuable on one hand, and harder to protect on the other. If backups were faster, the process would be more palatable. That’s where 10Gig comes in.
A bigger pipe will help move the data for a backup more quickly, shortening the whole process. But there are some gotchas here. For one thing, just making the pipe bigger doesn’t really solve much: it has to be bigger the whole way from the disk heads to the backup medium. There has been progress on that front, but more is needed. For example, you can now buy 10Gig server adapters. The problem is that while they do support a 10Gig data rate, their effective throughput isn’t nearly that fast.
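Put another way, the backup runs only as fast as its slowest stage. A minimal sketch of that idea, with assumed effective throughput figures for each stage (none of these are measured numbers):

# Effective throughput of each stage in MB/sec; all values are illustrative assumptions
stages = {
    "disk array read": 400,
    "10Gig server adapter (effective)": 300,  # rated ~1,250 MB/sec, far less in practice
    "backup server": 250,
    "tape drive write": 35,
}

slowest = min(stages, key=stages.get)
print(f"End-to-end rate: {stages[slowest]} MB/sec, limited by the {slowest}")

Upgrading any one link, even to 10Gig, only moves the bottleneck; the end-to-end rate is still whatever the weakest stage can sustain.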
The same is true at the other end. While there are a few storage devices that can handle the 10Gig data rate, getting the information saved to the backup medium isn’t necessarily all that fast. So where does that leave us?
Right now, it means you have to plan your really big backups for days when the company is shut down, which means your weekends are toast. It also means you still need really fast backup storage just for incremental backups. In the long run, though, it means we’re running out of options as data storage grows faster than total bandwidth. And if your backups take much longer, you could find yourself having to start the next incremental before you’ve finished the last one.
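That last scenario is easy to sketch out. Assuming a fixed nightly window, a fixed effective throughput and a made-up growth rate (all hypothetical figures), you can see roughly when the incrementals stop fitting:

WINDOW_HOURS = 8          # assumed overnight backup window
THROUGHPUT_MB_S = 100     # assumed effective end-to-end rate
CHANGED_GB_TODAY = 2_000  # assumed size of tonight's incremental
ANNUAL_GROWTH = 0.5       # assume the nightly change set grows 50 percent a year

for year in range(4):
    changed_gb = CHANGED_GB_TODAY * (1 + ANNUAL_GROWTH) ** year
    hours = changed_gb * 1_000 / THROUGHPUT_MB_S / 3600
    verdict = "fits" if hours <= WINDOW_HOURS else "overruns the window"
    print(f"Year {year}: {changed_gb:,.0f} GB takes {hours:.1f} hours -> {verdict}")

Under those assumptions, the incremental fits tonight but blows past its window within a year or two; keep extrapolating and the job eventually runs longer than the interval between backups, which is exactly the point where the next incremental starts before the last one has finished.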
Check out eWEEK.com’s Storage Center for the latest news, reviews and analysis on enterprise and business storage hardware and software.