Electricity consumption from data centers doubled from 2000 to 2005 and rose again by 56% from 2005 to 2010. With the shift to cloud computing in the early 2010s, many feared that data center energy consumption would skyrocket. But in a study published in the journal Science, as reported by the New York Times, researchers found that data center energy consumption climbed only 6% between 2010 and 2018, despite computing output jumping sixfold.
Part of the increased efficiency comes from smaller companies abandoning their inefficient servers and switching to larger providers like Amazon and Google, which have the resources and the motivation to eke out every bit of efficiency from their data centers. Big tech companies can reduce power usage through techniques like custom chips, high-density storage, custom cooling systems, and virtualization. And if you follow Apple news, you’re probably familiar with Apple’s efforts in this area—the company says that all of its offices, retail stores, and data centers in 43 countries run entirely on renewable energy, mostly generated by Apple’s own projects. Bitcoin, however, remains an issue.