New research reported by The New York Times suggests that data center electricity consumption grew more slowly from 2005 to 2010 than the EPA expected it would.

We hear a lot about the huge amount of electricity data centers use. However, an article in The New York Times, based on research by Stanford University professor Jonathan Koomey, suggests that demand for electricity in data centers has been curbed. This is due in part to the global recession, along with the emergence of server virtualization technologies and more power-efficient microprocessors.

The argument isn’t that data center power consumption isn’t growing (it is), but that the rate of growth is far slower than the figures originally predicted by the EPA back in 2007.

The fact that power consumption grew more slowly than expected is a great tribute to all the energy-efficiency work that has been going on across the IT industry. What do you think about this topic? Is the idea that IT leaders’ energy-efficiency work contributed to the slower growth of data center power consumption just Hype, or do you think it is Ripe?

We would love to hear your opinions in the comments below, too. What have you done to lower your power consumption in your data centers?

[poll id="66"]

Subscribe to the Hype or Ripe blog to receive text updates! Just text Ripe to 55678.