Neuromorphic computing: a hardware for a greener computing sector?
By Flo Greatrix, Policy Adviser at the UCL Department of Science, Technology, Engineering and Public Policy
As the UK gears up to host the 26th UN Climate Change Conference, COP26, in Glasgow in November, we consider the environmental impacts of the computing sector, taking a close look at data centres. We also examine the opportunities for neuromorphic technologies — a new type of hardware — to reduce the carbon footprint of computing.
The unequivocal threat of climate change demands urgent political action on a global scale. Its impacts are being seen on a daily basis, most recently in the devastating bushfires in Australia. The subject is finally climbing higher up the UK’s political agenda; the UK Parliament approved a motion declaring a climate emergency in May last year, and the government has committed to a net-zero target for all greenhouse gas emissions by 2050. These targets will require strong government policies to make them a reality, a topic of Professor Jim Watson’s recent piece for UCL Policy Postings.
When we think about sources of greenhouse gas emissions, we tend to think of polluting cars, the headline-hitting aviation industry, and the energy used to heat homes and workplaces. You could be forgiven if data centres do not immediately come to mind — but perhaps they should.
Data centres are energy guzzlers
Cisco defines data centres as “a network of computing and storage resources enabling the delivery of shared software applications and data.” Put simply, they are warehouses where computing and networking equipment is concentrated for collecting, storing, processing, distributing or allowing access to vast amounts of data.
The ICT sector in its broadest sense (including mobile phone networks and televisions) already accounts for more than 2% of global carbon emissions, on a par with the aviation industry’s emissions from fuel. Data centres make up a sizeable proportion of this total. That’s because so much of daily life now uses the internet and involves information being sent to and from data centres (aka ‘the cloud’), from watching Netflix or asking Google a question to uploading a photo to Instagram. Companies like Apple, Facebook and Amazon all require huge data centres to run their services.
The electricity demand of data centres has remained roughly level over the last few years, thanks in part to massive efficiency gains as many of them have transitioned from small-scale facilities to ‘hyperscale’ centres: super-efficient data factories scaled up to hundreds of thousands of servers.
However, we are using these services more and more, and using more devices to access them. Phones and laptops aside, the number of ‘Internet of Things’ (IoT) devices connected to the internet (such as virtual assistants and smart appliances) is expected to more than triple from 7.5 billion in 2018 to 25 billion by 2025 (GSMA, 2019), and the data-centre requirements of IoT devices are not yet fully understood. While the idea is that more data is processed at ‘the edge’ (on or near the device itself, without being sent to the cloud or a data centre, which reduces latency), data may still be sent to centres for collection and reporting by companies, as the sketch below illustrates.
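To make the edge idea concrete, here is a minimal, hypothetical sketch in Python (the function name, threshold and data are invented for illustration, not drawn from any real deployment): a device processes its own raw sensor readings locally and sends only a small summary upstream.

```python
from statistics import mean

def summarise_at_edge(readings, alert_threshold=30.0):
    """Process raw sensor readings on the device itself and return a
    compact summary, instead of streaming every reading to a data centre."""
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "alerts": sum(1 for r in readings if r > alert_threshold),
    }

# One hour of per-second temperature readings stays on the device...
raw = [20.0 + (i % 60) * 0.2 for i in range(3600)]

# ...and only this small dictionary would be sent to the cloud.
print(summarise_at_edge(raw))
```

Even in this toy case, 3,600 readings collapse into a four-field summary; the open question noted above is how much data still flows to the centre once collection and reporting requirements are added.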
On top of this, IT infrastructure requirements for blockchain and cryptocurrency technologies are evolving rapidly, and the impacts of these developments on the energy consumption of data centres are not yet well understood.
Can we meet our climate targets and continue with rapid technological advancement?
Current messaging from government is that breakthroughs in technologies, from artificial intelligence to biotechnologies, have ‘the power to reshape almost every sector in every country.’ The government’s industrial strategy promised to position the UK to make the most of this transformation, branded as ‘the fourth industrial revolution’.
But can we really achieve this growth alongside ambitious climate change targets? Two obvious solutions to the data centre problem come to mind: make data centres as efficient as possible, or somehow reduce our use of technology as a nation. Neither is straightforward.
Data centres can only be as efficient as their hardware, which is reaching its limits. Current computers have separate storage (memory) and computing (processing) units and spend most of their time and energy moving data between the two. In recent decades, we have mitigated this problem by increasing the number of transistors to keep making computers smaller, faster and more power-efficient.
Modern chips already pack billions of transistors. However, fitting in more is becoming increasingly difficult, which will stop processing power from continuing to increase at the current rate.
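The cost of shuttling data is easy to see even in software. Below is an illustrative Python sketch (the array size and workloads are our own choices, not from the article): a trivial element-wise addition does almost no arithmetic, yet it still takes appreciable time, because its speed is set by how fast data moves between memory and the processor rather than by the computation itself.

```python
import time
import numpy as np

N = 20_000_000  # illustrative: ~160 MB per float64 array
a = np.random.rand(N)
b = np.random.rand(N)

# Memory-bound: one addition per element, so runtime is dominated by
# moving data between RAM and the processor, not by the arithmetic.
t0 = time.perf_counter()
c = a + b
print(f"memory-bound add:     {time.perf_counter() - t0:.3f} s")

# Compute-heavy: many operations per element fetched, so the processor
# spends proportionally more time calculating and less time waiting.
t0 = time.perf_counter()
d = np.sin(a) * np.cos(b) + np.sqrt(np.abs(a - b))
print(f"compute-heavy kernel: {time.perf_counter() - t0:.3f} s")
```

Neuromorphic hardware attacks this bottleneck by removing the separation between memory and processing altogether, rather than by making the transfer faster.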
An alternative: New computing hardware
New hardware could be part of a solution. Researchers at UCL are developing ‘neuromorphic’ computing hardware, which has the potential to drastically reduce the energy used in data processing.
What is neuromorphic computing?
Neuromorphic computers are inspired by biology: they are designed to mimic the neural systems found in the human brain. Neuromorphic chips operate in a fundamentally different way to the silicon chips found in traditional computers. In the brain, processing and memory functions are performed by neurons and synapses in a single location. While conventional computers have separate memory and processing units, neuromorphic computers will perform these tasks on one chip.
Without the need to transfer data between memory and processing units, processing time and energy use will be reduced: it is estimated that a neuromorphic computer could use up to 100,000 times less power than conventional computers, which would make data centres much more efficient. What is more, neuromorphic computing could vastly increase the amount of data processing that can be done on individual devices, and for some applications (particularly IoT and edge devices), would remove the need to send data to data centres for processing.
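For a flavour of the computational model involved, here is a minimal Python sketch of a leaky integrate-and-fire neuron, a standard building block that many neuromorphic designs implement directly in circuitry (all parameters here are illustrative, and real chips realise this in hardware, not software). The neuron’s voltage is its own local memory, and it communicates only when an event (a ‘spike’) occurs, which is where much of the energy saving comes from.

```python
import numpy as np

def lif_neuron(input_current, dt=1.0, tau=20.0, v_rest=0.0,
               v_threshold=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron.

    The membrane voltage v acts as the neuron's local memory: it decays
    towards v_rest, integrates incoming current, and emits a spike
    (a single event) only when it crosses v_threshold.
    """
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Leaky integration: state and computation are colocated.
        v += dt / tau * (v_rest - v) + i_in
        if v >= v_threshold:
            spikes.append(t)  # communicate only on events, not every cycle
            v = v_reset
    return spikes

# Illustrative input: weak background current with occasional strong pulses.
rng = np.random.default_rng(0)
current = rng.uniform(0.0, 0.06, size=200)
current[[50, 51, 52, 120, 121]] = 0.5

print("spike times:", lif_neuron(current))
```

Note how the neuron is silent for most of the 200 time steps: unlike a conventional processor clocking every cycle, it produces output only when its input warrants it.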
When will it be a reality?
In the next three to five years, the researchers at UCL expect the neuromorphic chips they have developed to be used in ‘hybrid’ machines, improving the efficiency and performance of conventional computers. Longer term, a fully neuromorphic computer will be fundamentally different, and more powerful for specific applications (from natural language processing to the operation of driverless cars). New programming languages and software will also need to be created to operate neuromorphic hardware.
If the UK is to meet our climate targets and maximise the potential societal, economic and public benefits of the ‘fourth industrial revolution’, it’s essential that we have hardware that can support the ‘net-zero’ agenda and meet our increasing computing demands. Neuromorphic hardware could offer both.
________________________________________________________________
Find out more
UCL held a policy roundtable in December on the potential role of neuromorphic computing in the UK’s technological future, with representatives from government, parliament, professional institutions and not-for-profit organisations, chaired by Alok Jha. The session covered the challenges of current computing, how neuromorphic hardware could intersect with other advancing computing technologies such as quantum computing, and the policy challenges that may lie ahead. Read more about the discussions here.
Finally, if you want to read more about neuromorphic computing, take a look at our short policy briefing here and at the researchers’ webpages here.
The Policy Impact Unit (PIU) provides professional policy engagement expertise and collaborates with researchers to help feed research-based evidence into the policymaking process. www.ucl.ac.uk/steapp/PIU