The United States is witnessing a dramatic slowdown in the growth of energy demand from data centers, thanks to virtualization, cloud computing, and improved data center management. Ten years ago, power usage at data centers was growing at an unsustainable rate, soaring 24% from 2005 to 2010. According to a new study by the US Department of Energy's Lawrence Berkeley National Laboratory, data center energy use is expected to increase just 4% from 2014 to 2020.
Total data center electricity usage in the US, which includes powering servers, storage, networking, and the supporting infrastructure, was 70 billion kilowatt-hours (kWh) in 2014, representing 1.8% of total US electricity consumption.
If the current trend continues, data centers are expected to consume approximately 73 billion kWh in 2020, with usage becoming nearly flat over the next four years. However, demand for computation and the amount of work performed by data centers continue to rise at substantial rates.
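As a quick sanity check, the 70 and 73 billion kWh figures cited above do imply roughly the 4% growth the study reports (the variable names and rounding below are illustrative, not from the report):

```python
# Sanity check of the article's figures (values taken from the text above).
usage_2014_bkwh = 70.0   # billion kWh consumed by US data centers in 2014
usage_2020_bkwh = 73.0   # projected billion kWh in 2020

growth = (usage_2020_bkwh - usage_2014_bkwh) / usage_2014_bkwh
print(f"Projected 2014-2020 growth: {growth:.1%}")  # ~4.3%, consistent with the ~4% cited
```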
The report also notes that from 2000 to 2005, server shipments increased by 15% each year, resulting in a near doubling of the number of servers in data centers. From 2005 to 2010, annual shipment growth fell to 5%, though some of that decline was due to the global recession. Server growth is now expected to hold at about 3% per year, a pace projected to continue through 2020.
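The "near doubling" claim follows from compounding the 15% annual rate over the five-year span; a minimal sketch, assuming simple year-over-year compounding of the rates quoted above:

```python
# Compound the article's annual shipment growth rates over each period
# (assumption: growth compounds once per year).
def compound(rate, years):
    """Return the total multiplier after `years` of annual growth at `rate`."""
    return (1 + rate) ** years

print(round(compound(0.15, 5), 2))  # 2.01 -> 15%/yr over 2000-2005 nearly doubles the base
print(round(compound(0.05, 5), 2))  # 1.28 -> 5%/yr over 2005-2010
print(round(compound(0.03, 6), 2))  # 1.19 -> 3%/yr from 2014 through 2020
```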
The reduced server growth rate is a result of increased server efficiency, better utilization through virtualization, and a shift to cloud computing. This includes the concentration of workloads in so-called "hyperscale" data centers, defined as facilities of 400,000 square feet and above.
The 2016 US Department of Energy report also suggests that energy use by data centers may decline further if more work shifts to hyperscale data centers and best practices continue to gain adoption.