The Massive Energy Bill Behind Your Data
How will we offset the rapidly increasing power consumption of data centres? Hyperscale has plucked the low-hanging fruit. What’s next?
Pop quiz: Which organ of your body uses the most energy?
It’s the brain, of course.
Your brain uses up to 20 percent of all the calories you consume each day.
Maybe that explains why we’ve so readily accepted that computers use a huge amount of energy in processing, and emit so much heat that they create a secondary power need just to cool them down.
We are becoming increasingly aware of how much energy the data centres behind our favourite social media sites, film and music streaming services, and online shopping use today, and how much they are expected to use in the future:
Last year, data centres worldwide used an estimated 200 terawatt hours (TWh) of electricity, more than the national energy consumption of some countries and approximately 1% of global electricity demand. Their thirst for electric power is predicted to rise rapidly in the coming years, particularly if computationally intensive cryptocurrency mining continues to grow.
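As a rough sanity check on that 1% figure, here’s a back-of-envelope calculation in Python (the ~20,000 TWh figure for annual global electricity demand is my assumption, a commonly cited ballpark, not a number from the sources above):

```python
# Back-of-envelope check: what share of global electricity do data centres use?
# Assumption (mine, not the article's): global demand of ~20,000 TWh per year.
data_centre_twh = 200        # estimated annual data-centre consumption (TWh)
global_demand_twh = 20_000   # assumed annual global electricity demand (TWh)

share = data_centre_twh / global_demand_twh
print(f"Data centre share of global electricity: {share:.1%}")  # -> 1.0%
```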
This could potentially make a big contribution to global carbon emissions, since only approximately 20% of the electricity used by data centres comes from renewable sources, according to Greenpeace.
However, some 20 Internet companies, including big players such as Facebook, Apple and Google, have committed to using 100% renewable energy. Some plan to build renewable power plants near their data centres, but in most cases they are buying electricity from a grid provider.
Google is now the largest corporate purchaser of renewable energy on the planet. There are some interesting implications to the fact that data centres are gobbling up the renewable energy produced in countries like Ireland, which the Guardian newspaper explored in a great article from 2017.
Pessimistic models predict that electricity use by data centres could rise to approximately 8% of the global total by 2030. More moderate estimates say that, although their requirements will increase, efficiency gains will offset them. Let’s consider those gains.
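To get a feel for what the pessimistic scenario implies, here’s a quick sketch of the compound annual growth that going from roughly 1% of global electricity today to 8% by 2030 would require (the 12-year horizon, and holding the global total roughly constant, are my simplifying assumptions):

```python
# Rough sketch: annual growth implied by data centres rising from ~1% to ~8%
# of global electricity by 2030. The 12-year horizon and roughly constant
# global total are my assumptions, not the article's.
start_share, end_share, years = 0.01, 0.08, 12

implied_cagr = (end_share / start_share) ** (1 / years) - 1
print(f"Implied compound annual growth: {implied_cagr:.1%}")  # ~18.9% per year
```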
Is there a better way to cool data centres and recycle their waste heat?
Efficiency improvements are possible. Hyperscale data centres are ultra-efficient and use computing architecture designed for scalability (up to hundreds of thousands of servers).
The power usage effectiveness (PUE) of a data centre is defined as the total energy needed to run the facility, including lights and cooling, divided by the energy used by the computing equipment itself. Traditional data centres typically have a PUE of about 2.0, while hyperscale facilities have been trimmed to about 1.2 by eliminating unnecessary energy sinks (no video monitors, no flashing lights) and building to a purely fit-for-purpose design.
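For concreteness, here’s a minimal sketch of the PUE calculation; the kWh figures are illustrative numbers of my own choosing, picked to reproduce the 2.0 and 1.2 ratios above:

```python
# PUE = total facility energy / energy used by the computing (IT) equipment.
# The figures below are illustrative, not measurements from any real facility.
def pue(total_energy_kwh: float, it_energy_kwh: float) -> float:
    """Power usage effectiveness: how much overhead (cooling, lights, etc.)
    a facility adds on top of each unit of useful compute energy."""
    return total_energy_kwh / it_energy_kwh

print(pue(total_energy_kwh=2_000, it_energy_kwh=1_000))  # traditional: 2.0
print(pue(total_energy_kwh=1_200, it_energy_kwh=1_000))  # hyperscale: 1.2
```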
Hyperscale data centres currently number around 400 and account for one fifth of the world’s data-centre electricity usage, but an International Energy Agency report estimates that within a few years they will step up the pace of replacing traditional facilities and account for almost half of that consumption.
Hyperscalers have also reduced their PUE by improving cooling. In a traditional data centre, air conditioning accounts for approximately 40% of the energy bill (and uses billions of litres of water), so eliminating the need for compression chillers and cooling towers brings environmental gains in both energy and water consumption. One way to do this is to site data centres in colder locations (hello Scandinavia and Iceland) and blow the cool outside air through them. Another approach is piped-water cooling, which allows data centres to be located anywhere, with some systems able to run on warm water.
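To see what better cooling is worth in practice, here’s a hedged back-of-envelope estimate: for a hypothetical facility with a constant 10 MW compute load (my assumption), trimming the overhead from a PUE of 2.0 down to 1.2 saves a substantial amount of energy every year:

```python
# Sketch: energy saved by cutting overhead (mostly cooling) from PUE 2.0 to 1.2.
# The 10 MW IT load is a hypothetical figure, not from the article.
HOURS_PER_YEAR = 8_760
it_load_mw = 10.0  # assumed constant IT (compute) load

def annual_total_mwh(pue: float) -> float:
    """Total annual facility energy (MWh) for a given PUE at this IT load."""
    return it_load_mw * pue * HOURS_PER_YEAR

saved = annual_total_mwh(2.0) - annual_total_mwh(1.2)
print(f"Annual saving: {saved:,.0f} MWh")  # 70,080 MWh per year
```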
At a recent summit I attended, there was an interesting discussion about immersing servers directly in a cooling liquid (immersion cooling).
Along with better cooling comes more efficient reuse of any heat the data centre produces, for example to warm neighbouring buildings. Because heat doesn’t travel well, though, this only works when data centres can be placed near the recipients of that heat. Alternative approaches are to convert the excess heat into electricity, or to use it to power cooling systems that, in turn, keep the data centre cool.
The future
Although our brains use 20% of our calorific intake, they are comparatively far more efficient than computer CPUs. Perhaps it’s time to accept the need to redefine the computer itself to cut energy use, by moving beyond silicon?
Besides quantum computing and hybrid models, there are some way-out ideas such as chemical reaction computing, DNA computing, reservoir computing and the use of slime mould (which verges on the recent series of Star Trek!). Another approach is to simulate the neurons of animal brains (neuromorphic computing), which is being trialled by IBM and Qualcomm and may be particularly successful for aspects of AI such as pattern recognition. That’s the subject of a future article, which I’m looking forward to writing soon!
Please let me know your thoughts in the responses. Are we headed for a future without silicon? Where are the possible energy savings? Should we reconsider our need for massive data centres?