Study: U.S. Data Center Utility Bills Approach $3 Billion
AMD-commissioned research shows a need for better cooperation between the technology industry and government to improve energy efficiency
February 17, 2007
Cost-conscious companies are paying a big price to power their data centers. According to a recent study funded by processor maker Advanced Micro Devices (AMD), U.S. businesses spend $2.7 billion a year on electricity to run and cool their servers. The same research pegs worldwide data-center utility costs at $7.2 billion.
The study found that in 2005 businesses used 45 billion kilowatt-hours of electricity to run their data-center servers, double the amount used just five years earlier.
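Taken together, the two figures allow a rough sanity check: assuming the $2.7 billion U.S. bill covers the same 45 billion kilowatt-hours, the implied average rate is about $0.06 per kilowatt-hour ($2.7 billion / 45 billion kWh = $0.06/kWh).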
Disclosing the results of the study at the LinuxWorld OpenSolutions Summit in New York this week, Randy Allen, AMD's corporate vice president of the Server and Workstation division, said the research is proof that organizations that fail to adopt energy-efficient server solutions risk limiting their own growth.
He said enough energy-conserving solutions are available today that businesses should be able to get the performance they need without consuming excessive power.
Allen encouraged the IT industry to work closely with government agencies to find practical ways to conserve energy in the data center. He suggested both groups work harder to measure consumption and to find new ways to implement energy-efficient solutions. Allen also said it is important for firms to be able to track their own data-center efficiency more accurately.