
Sun Microsystems believes the answer to some of the problems of power, cooling and speed of IT deployment in enterprise data centers could be found in a standard shipping container.

Sun President and CEO Jonathan Schwartz and other company officials on Oct. 17 will unveil Sun’s Project Blackbox, an initiative in which all the technology that an enterprise might need for a 10,000-square-foot data center can be fitted and delivered in a shipping box that is commonly seen on the backs of tractor-trailers rolling down the highway.

The concept is aimed at businesses that have run out of room in their data centers—either because of a lack of space or because they can’t bring additional power into their facilities—but still need more compute power.

The container—20 feet long, eight feet wide and eight feet high—can hold as many as 120 Sun Fire T2000 or 240 T1000 servers, or about 250 Opteron-based “Galaxy” systems.

In addition, one storage-focused container can provide up to 2 petabytes of storage, according to Sun Chief Marketing Officer Anil Gadre. A container also can offer up to 15TB of memory.

The compact design’s floor space is about a third that of a traditional 10,000-square-foot data center. Thanks in large part to its chilled-water cooling technology, it saves up to 20 percent in power and cooling costs and can be deployed about 10 times faster, sometimes in a matter of weeks.

“Basically, it rolls up to you, you hook up your power, you hook up your water, you hook up your network and you’re ready to go,” Gadre said.

In an interview with eWEEK before the event, Greg Papadopoulos, executive vice president and chief technology officer for Sun, said Project Blackbox—which has several patents pending on its design—is the future of infrastructure design.

“This is the system, the next era in system design and system engineering for us. It’s what we do,” Papadopoulos said.

“You can ship these containers anywhere really cheaply around the world, you can put them together wherever you want them put together, and then ship them. You have them on the spot hooked up and running, and that’s a very different cycle around not only speed of deployment, but also where you need the skills.”

Sun will be showing off a container running Sun technology Oct. 17 at an event in Menlo Park, Calif. The company will begin working with early customers now, with full production scheduled for the summer of 2007.


Papadopoulos and Gadre said Sun is targeting several types of customers with Project Blackbox, including Web 2.0 businesses that are looking to rapidly build out their infrastructures, as well as companies that need to grow their technology capabilities globally.

Enterprises with high-performance computing needs, such as oil and gas exploration, also will benefit from this idea, as will people looking for a storage-centric infrastructure, they said.

The company has basic configurations for HPC, Web serving and storage, Gadre said.

He said the idea came from customers who were looking to speed up the deployment of their IT infrastructures or who needed help with power and cooling.


Sun, of Santa Clara, Calif., already offered customers its Grid Rack program, in which Sun assembled racks populated with technology ordered by the customer and delivered them to the user’s site.

The vendor was looking to transfer that capability to an entire data center setup, and the largest size that made sense was a standard shipping container, Papadopoulos said.

A key to Sun being able to put the technology into such a compact space is the ability to use water to cool the systems, Gadre said.

Water is more efficient than air, which is the method most widely used in traditional data centers, he said.

Inside the shipping containers, the systems are set up front-to-back along the wall of the container, with heat exchangers between each one, Papadopoulos said.

The warm air from one is passed through an exchanger, where it’s chilled and then used to cool the next server, he said.

“It forms this kind of perfect cyclonic flow inside the box, and it’s very quiet, it’s very efficient,” Papadopoulos said.


Charles King, an analyst with Pund-IT Research, said the concept addresses a lot of concerns that businesses have, but that Sun is going to need to answer some key questions on issues such as security before the shipping container business takes off.

“It’s an interesting idea because it addresses a lot of the challenges that people have concerning data center facility costs, in particular the real estate component,” said King, in Hayward, Calif.

“The whole cost issues around data centers have little to do with the technology and everything to do with the support and construction of the facility.”

Being able to run multiple containers together—even stacking them—would help address those issues, he said.

However, most data centers have several layers of security, and at a time when disaster recovery and compliance are key issues, having a data center that’s housed inside a shipping container might not be enough security for many enterprises, King said.

Gadre admitted that the Blackbox idea won’t be for everyone, including some who might want the highest levels of security. But in the areas of Web serving and HPC, it should find customers, he said.

The idea of integrating cooling, networking and power distribution centrally with the hardware is drawing interest from a number of OEMs.

Hewlett-Packard, of Palo Alto, Calif., is looking to do something similar on a smaller scale with its Lights Out Project, an infrastructure model that brings power and cooling closer to the compute nodes themselves.

The goal is similar: to create an environment that addresses power and cooling concerns while increasing flexibility inside the facility.

Papadopoulos said this is a trend in the industry that is going to grow in importance.

“I think there is a huge pent-up demand for somebody to figure this out,” he said.

Power and cooling have become key issues in data centers as system form factors have become more dense, particularly with the rise of blade computing.

One of the key promises of blades—being able to pack more compute power into smaller areas—is hindered by the amount of power consumed and heat generated.

Major technology consumers like Google predict that they will soon pay more to power and cool their systems than to buy the machines themselves.

IT vendors are addressing these issues in a number of ways. Chip makers like Advanced Micro Devices and Intel are producing more energy-efficient processors; OEMs are building systems with power consumption in mind; and software makers are putting power monitoring and management functions into their products.

Virtualization—the ability to run multiple operating systems and applications on single physical machines—also is an important technology.

Sun has been vocal on these issues. The company is promoting its UltraSPARC T1 “Niagara” chip, which offers up to eight processing cores while consuming about 70 watts, less power than many other processors.

The company also uses AMD’s Opteron in its x86 servers.
