How "Going Green" in the Data Center Pays Off

Developing a green strategy for a client's data center doesn't have to be an elaborate affair. And it still saves energy--and money. By Carolyn Heinze

"Going green" is a top-of-mind concern for many IT professionals--especially those responsible for overseeing energy usage in their organizations' data centers. This is largely due to the combined pressure from the Environmental Protection Agency and Congress, as well as initiatives from the U.S. Green Building Council, which advocates sustainable building design, and The Green Grid, a global consortium of technology companies dedicated to advancing energy efficiency in the data center.

One equipment category that is singled out as a potential source of lost energy is uninterruptible power supply (UPS) systems. "If a UPS system is only 88 percent efficient, that's a 12 percent energy loss for everything that is running," notes Chris Loeffler, a global applications manager for Raleigh, N.C.-based Eaton Corp., specializing in data center power solutions and services. "It is compounded by the efficiency losses of the servers, and the server power supplies."

Both the National Electrical Manufacturers Association's TP1 standard and the government-sponsored Energy Star program have promoted the production of more energy-efficient transformers for power distribution, Loeffler says. He points out that moving to higher-voltage connectors--208 or 240 volts--conserves a significant amount of energy. "If you use that type of connector with today's power supply designs, you can usually gain two to three percentage points of efficiency in the server power supply itself," says Loeffler. "Then, every server becomes two or three percentage points higher in efficiency just because you selected a different voltage for the power supply."
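A two- or three-point efficiency gain translates directly into a lower electricity bill, since a more efficient power supply draws less power from the wall for the same IT load. A rough sketch of that calculation, with an assumed load and an assumed electricity rate (neither figure is from the article):

```python
# Rough annual savings from a 2-3 percentage-point efficiency gain.
# The 10 kW load and $0.10/kWh rate are illustrative assumptions.

HOURS_PER_YEAR = 8760

def annual_savings(load_kw, eff_before, eff_after, rate_per_kwh):
    """Power drawn from the wall = load / efficiency; savings are
    the reduction in draw times hours per year times the rate."""
    draw_before = load_kw / eff_before
    draw_after = load_kw / eff_after
    return (draw_before - draw_after) * HOURS_PER_YEAR * rate_per_kwh

# Example: 10 kW of IT load, power supply efficiency rising
# from 85% to 88%, at an assumed $0.10 per kWh.
savings = annual_savings(10, 0.85, 0.88, 0.10)
print(f"Estimated annual savings: ${savings:,.0f}")  # about $351
```

Scaled across dozens of servers, or in regions with higher electricity rates, the same few percentage points add up quickly.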

Implementing a green strategy in a data center can cover everything from processors to servers, storage, and cooling. To address the power segment of the strategy, discuss the following with your clients: Is the highest available AC voltage being used to run the servers? Have energy-efficient transformer and distribution products been incorporated into the power distribution scheme? Have more energy-efficient UPS systems been installed? Each of these areas provides an opportunity to develop better systems for your clients--solutions that will save them money in the long run.

For companies with multiple locations, remote metering, monitoring, and reporting offer a way to trim electricity costs. Using software, data center managers can monitor cooling and power systems and, based on the information they gather, shift the power load from one location to another as needed. If a power load is high during peak times in California, for example, it can be shifted to a lower-cost location via wide area networks within the country or even around the world.

Companies are also looking to their channel partners for solutions involving consolidation and virtualization. "Let's get rid of three old servers and replace them with one new one, and now we have one box doing the work of three by consolidating," explains Michael Klein, president of Computer Directions Inc. in Searingtown, N.Y. "Or, with virtualization, if there is a box that [is not] used all that often, we'll [make] it a virtual machine running on another box. It doesn't physically exist."

Whether or not an SMB is a prime candidate for these systems depends on the size of the business, Klein notes. Those at the smaller end of the scale, whose data center comprises one or possibly two servers, have little need--or ability--to streamline their systems further. According to Klein, "That is still working its way down from the Fortune 500 to the SMB market."

About the Author

CAROLYN HEINZE is a regular freelance contributor to ChannelPro-SMB.