Thoughtful, detailed coverage of everything Apple for 29 years
and the TidBITS Content Network for Apple professionals

Cloud Computing More Energy Efficient Than Initially Thought

Electricity consumption from data centers doubled from 2000 to 2005 and rose again by 56% from 2005 to 2010. With the shift to cloud computing in the early 2010s, many feared that data center energy consumption would skyrocket. But in a study published in the journal Science, as reported by the New York Times, researchers found that data center energy consumption climbed only 6% between 2010 and 2018, despite computing output jumping sixfold.
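A quick back-of-the-envelope calculation (using only the figures cited above: 6% more energy for six times the computing output between 2010 and 2018) shows how large the implied efficiency gain is:

```python
# Figures from the Science study as reported above (relative units).
energy_2010, energy_2018 = 1.0, 1.0 * 1.06   # energy use rose 6%
output_2010, output_2018 = 1.0, 1.0 * 6.0    # computing output rose sixfold

# Energy efficiency = computing output per unit of energy consumed.
gain = (output_2018 / energy_2018) / (output_2010 / energy_2010)
print(round(gain, 1))  # roughly 5.7x more computing per unit of energy
```

In other words, data centers in 2018 delivered nearly six times as much computing per kilowatt-hour as they did in 2010.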

Apple's Iowa data center

Part of the increased efficiency is due to smaller companies abandoning their inefficient servers and switching to larger providers like Amazon and Google, which have the resources and motivation to eke out every bit of efficiency from their data centers. Big tech companies can reduce power usage through techniques like custom chips, high-density storage, custom cooling systems, and virtualization. And if you follow Apple news, you’re probably familiar with Apple’s efforts in this area—the company says that all of its offices, retail stores, and data centers in 43 countries run entirely on renewable energy, mostly generated from Apple’s own projects. Bitcoin’s energy consumption, however, remains a concern.



Comments About Cloud Computing More Energy Efficient Than Initially Thought

Notable Replies

  1. Both things can be true: data centers consume a lot of power and they’re also a lot more efficient now. The 6% increase was much less than expected, which is good, but it’s still an increase.

  2. An important thing to remember is that the businesses using all this cloud computing need the computing power. If they don’t lease it from a cloud provider, then they will need to provision their own private data centers.

    It’s not fair to show cloud power consumption in isolation (or even to compare it only against past consumption); you need to compare it against the estimated power consumption of the private data centers that cloud services replace.

    I suspect (based on no more than gut feelings) that a cloud provider will, in general, be more efficient than a private data center for several reasons, including:

    • Load balancing. Fewer resources (CPU cycles, bandwidth, etc.) sit idle because the equipment is shared by multiple customers and loads are distributed throughout the data center.
    • More motivation for efficiency. The ginormous data centers used by cloud providers are orders of magnitude larger than private data centers, so even small improvements in efficiency yield far greater cost savings.
    • Billing. If customers are paying based on usage (CPU cycles, transactions, etc.), they will be motivated to optimize that usage. In a private data center, equipment and space are paid for whether or not they are used, so there is less motivation to optimize usage as long as spare capacity is available.
    • Similarly, the cloud provider has a financial incentive to maximize efficiency because that means lower costs for the same amount of service (and therefore income).
    • Focus. Although big companies have IT departments full of people skilled at running data centers, smaller companies generally do not. They focus on their products and less on their IT infrastructure. Therefore, they may not be able to take advantage of opportunities to maximize efficiency and minimize costs. For a cloud provider, however, this is their entire business, so that is what they will focus on.
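    The load-balancing point above can be sketched numerically. In this hypothetical simulation (the demand traces and numbers are invented for illustration), each small company must provision private servers for its own peak demand, while a shared cloud only needs capacity for the combined peak—and individual peaks rarely coincide:

    ```python
    import random

    random.seed(1)  # fixed seed so the sketch is reproducible

    # Hypothetical hourly demand traces for 20 small companies (arbitrary units).
    demands = [[random.uniform(1, 10) for _ in range(24)] for _ in range(20)]

    # Private data centers: each company buys capacity for its own peak.
    private_capacity = sum(max(trace) for trace in demands)

    # Shared cloud: capacity only needs to cover the combined peak.
    combined = [sum(trace[hour] for trace in demands) for hour in range(24)]
    cloud_capacity = max(combined)

    print(cloud_capacity < private_capacity)  # True: consolidation needs less hardware
    ```

    Less provisioned hardware running at higher average utilization is exactly where the cloud's efficiency edge comes from.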

Join the discussion in the TidBITS Discourse forum
