Nicholas Carr’s blog covers a new study on power consumption by IT servers. Servers are now estimated to account for 1.2% of all electricity use in the US (about the same as all the color TVs in the country), at a cost of about $2.7 billion.
A new study from the Lawrence Berkeley National Laboratory, released today, reveals that the electricity used by server computers doubled between 2000 and 2005. The report, which appears to be the most definitive assessment of data center energy consumption yet produced, underscores the growing importance of energy efficiency in effective IT management.
The study covers only servers, excluding other data center components (such as networking gear and storage) as well as personal computers. Large data centers are beginning to face the facility problems we in HPC were facing five years ago, while many of us have moved on to infrastructure problems that give non-HPC people nightmares.
[Update: more analysis worth reading at the Green Wombat.]