
A recent survey conducted by AFCOM, which bills itself as the largest independent association for the data center industry, found that nearly two-thirds of respondents expect their budgets for 2009 to stay the same or increase. The remaining one-third expect their budgets to be cut by an average of about 15 percent.

What’s even more startling is that, among those expecting cuts, about half of the reductions would be confined to travel and training. Only about 38 percent of the cuts would actually affect hardware purchases.

This may surprise some folks, but the simple truth of the matter is that data centers are a difficult place to cut spending. The vast majority of the data center budget is typically consumed by labor costs because most of the tasks are still fairly manual in nature. Worse yet, there are so many distinct disciplines and products in the data center that each area has its own specialists and related tool sets. Meanwhile, relatively little has been invested in the IT automation tools that could reduce those labor costs.

Of course, we hear a whole lot of hoopla these days about virtualization. No doubt interest in virtualization is rising, but not all applications lend themselves to it equally well. Database applications and offerings such as Microsoft Exchange are not particularly virtualization-friendly when it comes to maintaining application performance. The end result is that much of the focus falls on virtualizing file servers, which, compared to the cost of other server hardware, is relatively trivial. In fact, more than a few people think it is still more cost-effective to throw dozens of $2,000 hardware servers at the problem than to manage hundreds of virtual server deployments, saving some money on hardware only to hire more people to manage all the virtual machines.
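That trade-off is ultimately arithmetic: cheap hardware plus low admin overhead versus fewer hosts plus higher per-VM management effort. The sketch below is a purely illustrative back-of-the-envelope model; every figure (server prices, salaries, workloads-per-admin ratios) is a hypothetical assumption, not data from the survey.

```python
# Hypothetical cost model for the physical-vs-virtual trade-off.
# All numbers are illustrative assumptions, not real market data.

def physical_cost(workloads, server_price=2_000, admin_salary=80_000,
                  workloads_per_admin=100):
    """Cost of running each workload on its own cheap physical server."""
    hardware = workloads * server_price
    admins = -(-workloads // workloads_per_admin)  # ceiling division
    return hardware + admins * admin_salary

def virtual_cost(workloads, host_price=10_000, vms_per_host=10,
                 admin_salary=80_000, vms_per_admin=50):
    """Cost of consolidating the same workloads as VMs on bigger hosts,
    assuming each admin can handle fewer VMs than physical boxes."""
    hosts = -(-workloads // vms_per_host)
    admins = -(-workloads // vms_per_admin)
    return hosts * host_price + admins * admin_salary

if __name__ == "__main__":
    for n in (50, 200, 500):
        print(n, physical_cost(n), virtual_cost(n))
```

The point of the model is not the specific numbers but the structure: which side wins depends entirely on the ratio of hardware savings to the extra labor that managing many VMs demands.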

Storage is another area of frustration in the data center. IT managers have been trying to create pools of storage for years, but performance considerations usually force them to align individual storage arrays with specific applications. The good news is that advances in thin provisioning, along with virtualization and clustering, are making it easier to create those pools. But tools for managing dynamic pools of storage across multiple vendor platforms are not all that plentiful. In fact, the only major company making a real run at managing storage across multiple vendor platforms appears to be Symantec.
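The core idea behind thin provisioning can be sketched in a few lines: volumes are promised a logical size up front, but physical capacity is drawn from a shared pool only as data is actually written. The class and figures below are illustrative assumptions, not any vendor's actual API.

```python
# Minimal sketch of thin provisioning: logical promises can exceed
# physical capacity, which is consumed only on write.

class ThinPool:
    def __init__(self, physical_gb):
        self.physical_gb = physical_gb   # real capacity in the pool
        self.allocated_gb = 0            # capacity actually consumed
        self.provisioned_gb = 0          # sum of promised volume sizes

    def create_volume(self, logical_gb):
        # Over-commit is allowed: promises may exceed physical capacity.
        self.provisioned_gb += logical_gb

    def write(self, gb):
        # Physical space is drawn from the pool only when data lands.
        if self.allocated_gb + gb > self.physical_gb:
            raise RuntimeError("pool exhausted: add physical capacity")
        self.allocated_gb += gb

pool = ThinPool(physical_gb=100)
pool.create_volume(80)   # app A sees an 80 GB volume
pool.create_volume(80)   # app B sees an 80 GB volume (160 GB promised)
pool.write(30)           # only 30 GB physically consumed so far
print(pool.provisioned_gb, pool.allocated_gb)  # 160 30
```

The catch, and the reason management tools matter so much here, is the over-commit: the pool must be monitored and grown before actual writes catch up with the promises.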

What all this means is that most IT shops are between a rock and a hard place when it comes to the data center. They want to reduce their costs, but they can’t really make the necessary investments in next-generation IT automation tools to make that happen. That, of course, creates a major opportunity for solution providers to deliver IT services via next-generation data center facilities. We can already see the interest in providing these types of cloud computing services just from the amount of venture capital pouring into the sector.

The challenge for any solution provider in this space is to avoid getting bogged down in the same inefficient IT processes that most people use in the data center today. That means, for instance, that instead of maintaining dedicated server, storage and networking specialists, they should take a more holistic approach to managing the data center, one that allows fewer, better-trained people using more advanced tools to manage a greater number of devices. Any solution provider that shows up with that model is going to be inherently more efficient than 90 percent of the data centers being managed by internal IT organizations today. And that, in turn, creates a winning business model around next-generation data centers.