The Truth Behind Common Data Center Energy Management Myths
February 18, 2013

The increased media attention on data center energy consumption late last year was controversial, but it nevertheless sparked a necessary, industry-wide discussion on how to effectively reduce the data center's environmental impact. While the conversation encourages positive moves toward sustainable technologies to support our increasingly digital society, many false assumptions about data center energy management have also taken hold.
In this article, we’ll explore three common energy management myths and outline effective strategies for reducing data center environmental impact in cost-effective, practical ways.
Myth: Building an energy efficient data center is time-consuming, complex and expensive.
Fact: By taking advantage of existing low-cost energy efficient technologies, implementing energy management design strategies and deploying a modular data center, an energy efficient data center can actually be easier and less expensive to build than a traditional one.
There are many examples of highly advanced, energy efficient data centers that take advantage of bleeding-edge technologies to reduce energy consumption. Among them are Google's seawater-cooled data center in Hamina, Finland, and Apple's solar-powered facility in Maiden, North Carolina.
As impressive as these data centers are, they do not represent the average data center. Many of the technologies implemented are far out of reach for most facilities, both in terms of cost and practicality.
However, most data centers can readily implement the following technologies and strategies to reduce both energy consumption and costs:
– Energy Procurement: Data centers can now work with third-party sustainability service providers to source the most reliable and affordable clean energy available. These providers help data centers plan, monitor and report energy usage, and can also help control costs and manage complexities such as price volatility.
– Natural Cooling: For data centers located in colder climates, taking advantage of natural (free) cooling to reduce reliance on an expensive heating, ventilation, and air conditioning (HVAC) system is an ideal way to slash cooling costs and energy use.
– Air Containment Strategies: Mixing hot and cold air in the data center limits effective operation and overall capacity. Containment can be implemented by placing perforated tiles and supply vents in the cold aisle, installing return vents in the hot aisle, and physically separating the hot and cold aisles with a curtain or hard enclosure. Air containment strategies not only reduce energy use, but can return up to 25 percent in savings.
– Modular Data Centers: For those looking to build or expand a data center, modular and containerized data centers are often a better alternative to custom-built facilities. Not only are they faster to deploy, they are also less expensive and can offer operational, efficiency, and performance benefits. Systems arrive with factory-verified operation and are tuned for maximum energy efficiency. The numbers bear this out: a modular approach can produce capital expenditure savings of 10 to 20 percent and operating expenditure savings of 20 to 35 percent.
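To get a feel for what those percentages mean in practice, the minimal sketch below applies the low end of the quoted ranges to a hypothetical project. The $10 million build cost, $2 million annual operating cost and 10-year horizon are assumptions for illustration, not figures from any specific deployment.

```python
# Rough savings estimate for a modular build versus a traditional build.
# The baseline costs are hypothetical; the savings rates are the low end of
# the 10-20% capex and 20-35% opex ranges quoted above.

def modular_savings(capex_usd, annual_opex_usd, years,
                    capex_saving=0.10, opex_saving=0.20):
    """Return (capital saved, operating saved) over the given horizon."""
    capex_saved = capex_usd * capex_saving
    opex_saved = annual_opex_usd * opex_saving * years
    return capex_saved, opex_saved

if __name__ == "__main__":
    # Hypothetical build: $10M to construct, $2M/year to operate, 10 years.
    capex, opex = modular_savings(10_000_000, 2_000_000, years=10)
    print(f"Estimated capital savings:   ${capex:,.0f}")   # $1,000,000
    print(f"Estimated operating savings: ${opex:,.0f}")    # $4,000,000
```

Even at the conservative end of the ranges, the operating savings compound year over year and quickly exceed the one-time capital savings.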
Myth: Updating cooling is the only way to lower data center energy costs.
Fact: It’s true that cooling is a tremendous consumer of energy in the data center. However, data center managers shouldn’t look only at their HVAC systems when seeking greater energy efficiency. Higher voltage power distribution, more efficient uninterruptible power supply (UPS) products, redundancy requirements and power architecture should also be considered.
Two major factors in the data center can contribute to reducing environmental impact:
– Power Equipment Efficiency: Failing to account for the heat produced by critical data center devices such as UPSs, transformers, transfer switches and wiring is a missed opportunity to reduce energy usage. Removing the heat these devices generate maintains optimal conditions, but it consumes far more energy than necessary when the equipment itself is inefficient. Data centers should instead consider energy efficient equipment, such as high-efficiency UPSs, servers, computer room air conditioners (CRACs) and computer room air handlers (CRAHs).
– Centralized Management System: You can’t know whether your energy management methods are working without measuring them. With data center infrastructure management (DCIM) software, data centers can integrate all IT equipment and functions into one application. Data center managers can then identify opportunities for increasing efficiency by using the collected data to optimize load, operate cooling systems more effectively and manage capacity.
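One concrete metric worth tracking over time is power usage effectiveness (PUE): the ratio of total facility energy to IT equipment energy, where a value closer to 1.0 means less energy is lost to cooling, power conversion and other overhead. The sketch below shows the calculation on invented readings; it is an illustration of the metric, not output from any particular DCIM product.

```python
# Minimal power usage effectiveness (PUE) calculation.
# PUE = total facility energy / IT equipment energy.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Return the PUE for a given measurement period."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical monthly readings before and after an efficiency project.
before = pue(total_facility_kwh=900_000, it_equipment_kwh=450_000)  # 2.00
after = pue(total_facility_kwh=675_000, it_equipment_kwh=450_000)   # 1.50
print(f"PUE before: {before:.2f}, after: {after:.2f}")
```

In this hypothetical case, the IT load is unchanged, so the drop from 2.0 to 1.5 reflects energy saved purely in facility overhead such as cooling and power distribution.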
Myth: Virtualization and/or moving to the cloud means I don’t have to worry about my existing infrastructure.
Fact: Virtualization and outsourcing to the cloud have a tremendous effect on a data center’s remaining physical infrastructure. Operational and efficiency issues can result from failing to properly size and run the remaining critical equipment that connects the business to offsite data and applications.
To avoid widespread downtime caused by power failures or physical and environmental threats, consolidation and protection strategies should be implemented.
Virtualization has a large impact on power densities, so data center managers have several options to consider when it comes to high-density server power and cooling:
– Spreading the load: Placing a group of high-density servers together creates a higher peak wattage and temperature than if they were spread throughout the data center. By “spreading the load,” data center managers can power and cool the infrastructure to a lower average wattage and temperature instead (see the sketch after this list).
– Supplemental cooling: As a temporary measure when high-density servers must be grouped together, supplemental cooling can be added alongside the normal HVAC system.
– Designation of high-density and low-density areas within the data center: Another option is to consolidate high-density and low-density servers into designated areas within the data center, often done rack-by-rack, row-by-row, or in pods.
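To make the “spreading the load” trade-off concrete, the short sketch below compares the per-rack power a cooling design must handle when a fixed set of high-density servers is concentrated in a few racks versus distributed across many. The total server draw and rack counts are illustrative assumptions, not measurements.

```python
# Compare per-rack power when high-density servers are concentrated versus
# spread across the room. Cooling must be sized for the hottest rack, so a
# lower per-rack figure means less localized cooling capacity is required.

TOTAL_SERVER_POWER_W = 80_000  # assumed combined draw of the high-density servers

def power_per_rack(total_power_w: float, racks_used: int) -> float:
    """Average power per occupied rack, assuming the load is split evenly."""
    return total_power_w / racks_used

concentrated = power_per_rack(TOTAL_SERVER_POWER_W, racks_used=4)   # 20 kW/rack
spread_out = power_per_rack(TOTAL_SERVER_POWER_W, racks_used=16)    # 5 kW/rack

print(f"Concentrated: {concentrated / 1000:.1f} kW per rack")
print(f"Spread out:   {spread_out / 1000:.1f} kW per rack")
```

The total load is identical in both cases; what changes is whether any single rack ever approaches a density the room’s baseline cooling cannot handle.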
Many of the myths outlined above serve as barriers that keep data centers from becoming more energy efficient; some are also psychological barriers that keep data center managers from pursuing efficiency strategies at all. In truth, IT-enabled energy efficiency not only reduces environmental impact, it can be achieved without extravagant spending and can be factored into existing plans to reduce both capital and operational costs.
Kevin Brown is Vice President, Global Data Center Offer for Schneider Electric. In this role, he is responsible for Schneider Electric’s portfolio strategy and for driving innovation in response to emerging data center industry trends.
Kevin is an experienced professional in both the data center and HVAC industries. He has over 20 years of experience in various senior management roles, including software and hardware development, sales, and product management.
Kevin holds a BS in mechanical engineering from Cornell University.