What’s next in processor design?

After 50 years, Moore's law is running out of steam

Mark Hammond
Deep Science Ventures
5 min read · Jun 29, 2016


In 1971 Intel released a chip containing 2,300 transistors, and the number crammed onto a chip has doubled roughly every two years since (Moore's law), to the point that a chip the same size now holds around 10 billion transistors built with features just 14nm across. Every development cycle now costs twice as much as the previous one because, at this density, designers are fighting against the limits of physics itself. The Economist provides a fantastic summary here.
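
As a quick back-of-envelope check on that doubling claim (a sketch only; the 1971 and 2016 figures are the ones quoted above):

```python
# Back-of-envelope check of Moore's law: transistor counts doubling every ~2 years.
start_year, start_transistors = 1971, 2_300   # Intel's 1971 chip, as quoted above
year = 2016

doublings = (year - start_year) / 2           # one doubling every two years
estimate = start_transistors * 2 ** doublings

print(f"{doublings:.1f} doublings -> roughly {estimate:.2e} transistors")
# ~22.5 doublings -> roughly 1.4e+10, i.e. on the order of the 10 billion quoted above
```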

This might seem like an extremely specialist area to suggest as an opportunity; however, the reason I think it is worth covering is that the solutions are likely to come from tangential areas such as biology and new materials. What's needed is a smarter approach: not just cramming more transistors into a smaller space, but finding ways to increase the computational power of those transistors through more effective or specialised designs.

Improve on transistor design

Although circuits are obviously incredibly complex, the basic building blocks are very simple. A transistor is essentially a switch that is opened or closed by changing the conductivity of its channel, and bundles of transistors form the logic gates from which everything else is built. The major problem with the current design is that these switches leak (electrons even appear on the other side via quantum tunneling), which wastes power, generates unnecessary heat and causes errors. The Economist article goes into more detail on the problem and the new approaches already being tested.
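
As a toy illustration of just how simple those building blocks are, here is a sketch that treats each transistor as an ideal on/off switch and wires two of them into a NAND gate, from which any other logic can be composed. This is a deliberately idealised model; as described above, real switches leak.

```python
# Toy model: a transistor as an ideal switch, combined into a NAND logic gate.
# Real devices are not ideal: the leakage described above means a "closed"
# switch still lets some current through.

def transistor(gate_voltage_high: bool) -> bool:
    """Ideal switch: conducts only when its gate is driven high."""
    return gate_voltage_high

def nand(a: bool, b: bool) -> bool:
    # Output is pulled low only when both series transistors conduct.
    return not (transistor(a) and transistor(b))

# NAND is functionally complete: every other gate can be built from it.
def not_gate(a: bool) -> bool:
    return nand(a, a)

def and_gate(a: bool, b: bool) -> bool:
    return not_gate(nand(a, b))

for a in (False, True):
    for b in (False, True):
        print(a, b, "NAND ->", nand(a, b), "AND ->", and_gate(a, b))
```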

Image: ExtremeTech article on next generation transistors.

Beyond silicon

Principally, better conductors reduce power requirements (and therefore heat) and allow transistors to be switched on and off at a faster rate. It is well known that the materials with the best electrical properties are alloys of elements such as indium, gallium and arsenic, and the big manufacturers are experimenting with these, but perhaps there are new approaches to combining these materials, or to leveraging quantum tunneling effects at small scales, that could overcome their less favorable properties. Graphene is clearly the king of materials in this space, but the difficulty has been in making it stop conducting! Finally, another area of potential is spintronics, in which the spin direction (up or down) of an electron is used to represent data; the major problem here is that the signal is so small that it is difficult to separate from the noise.

Special purpose chips

Generalist chips are by nature far less efficient than chips designed for a specific application (on the order of a 100x efficiency difference). GPUs are a good example of this, and we're now beginning to see units specialised for visual processing. An area that is relatively unexplored is specialist processing within data centers; both Microsoft and Google have built search algorithms directly into reconfigurable FPGAs, but as far as I know there is still potential to speed up many other data center operations such as routing, caching and security. If there's interest we could do a specialist guest post on this in the future.

Opportunity: Identify areas in the data center that could be sped up with application-specific FPGAs.
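
One rough way to reason about which operations are worth offloading is an Amdahl's-law estimate: the overall gain depends on what fraction of the workload the accelerator actually touches. The fractions and speedups below are purely illustrative assumptions, not measured figures.

```python
# Back-of-envelope Amdahl's law: overall speedup from offloading a fraction
# of the workload to an application-specific accelerator such as an FPGA.

def overall_speedup(offloaded_fraction: float, accelerator_speedup: float) -> float:
    """Amdahl's law: the untouched fraction of the work limits the total gain."""
    remaining = 1.0 - offloaded_fraction
    return 1.0 / (remaining + offloaded_fraction / accelerator_speedup)

# Illustrative (assumed) numbers for candidate data center operations.
candidates = {
    "routing":  (0.30, 50.0),    # 30% of the workload, 50x faster on the FPGA
    "caching":  (0.20, 20.0),
    "security": (0.10, 100.0),
}

for name, (fraction, speedup) in candidates.items():
    print(f"{name:>8}: overall speedup ~{overall_speedup(fraction, speedup):.2f}x")
# Even a 100x accelerator only helps in proportion to the work it covers,
# which is why identifying the right operations is the opportunity.
```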

Bio-inspired

It's no secret that chip designers have long looked to the brain for inspiration, and this is beginning to see real-world implementation in two forms: first, neuromorphic chips, and second, borrowing from the brain's support infrastructure of blood vessels to enable 3D processors.

3D Chips: Traditional architectures are limited to two dimensions not by complexity but by the need to get sufficient power in while simultaneously removing heat. Incredibly, IBM is working on a 3D microfluidic system that not only removes heat but also supplies power, using flow-battery principles with two different liquids separated by a membrane.

Image: IBM’s microfluidic cooled and powered chip

Opportunity: What other clever ideas could cool 3D architectures or reduce the need for power infrastructure?

Neuromorphic chips: These chips are modeled on neurons to varying degrees. This typically means connections that change in strength according to how often the neurons on either side fire together (Hebbian learning, which is thought to underlie memory). Both Intel and Qualcomm have chips in production which are extremely low power and appear to be incredibly quick at image recognition. Just as it took some time to identify the right way to leverage GPUs for deep learning, there may be opportunities around properly leveraging this new type of architecture.
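
To make the "fire together, wire together" idea concrete, here is a minimal sketch of a Hebbian weight update of the kind such chips approximate in hardware. The learning rate, network size and input pattern are arbitrary illustrative values, not taken from any particular chip.

```python
import numpy as np

# Minimal Hebbian update: a connection strengthens when the neurons on either
# side of it are active at the same time ("cells that fire together wire together").

rng = np.random.default_rng(0)
learning_rate = 0.1
weights = rng.normal(scale=0.01, size=(4, 3))   # 4 inputs -> 3 output neurons

def hebbian_step(weights, pre_activity):
    post_activity = pre_activity @ weights               # simple linear neurons
    # Outer product: each weight grows in proportion to joint pre/post activity.
    return weights + learning_rate * np.outer(pre_activity, post_activity)

pre = np.array([1.0, 0.0, 1.0, 0.0])                     # illustrative input pattern
for _ in range(10):
    weights = hebbian_step(weights, pre)

print(np.round(weights, 3))
# Connections from the active inputs (rows 0 and 2) strengthen; the others stay flat.
```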

Opportunity: Identify challenges and associated algorithms that benefit from the highly parallel low power architecture of neuromorphic chips.

Image: Qualcomm introduces the Zeroth Neural processing unit.

Quantum

Quantum computers do one thing very well: they find the lowest value of a complicated function. Moreover, the state space they can represent grows exponentially with the number of qubits; a few hundred qubits span more basis states than there are atoms in the visible universe. This could make them incredible for modeling extremely complex systems, although this is still theory. There's a good chance that quantum computing will make current security methods useless, so there are clearly opportunities in protecting against this or in new quantum security protocols. It's also likely that they will massively speed up machine learning algorithms, which are based on finding a global minimum. Finally, the biggest opportunity is likely in modeling physical systems, from how drugs interact with receptors to chemical interactions, materials and other complex systems.
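
To put that exponential state space in perspective, here is a quick sketch of how few qubits it takes before the number of basis states overtakes the commonly cited estimate of roughly 10^80 atoms in the visible universe. That atom count is an assumption of this sketch, not a figure from the article.

```python
import math

# A register of n qubits spans 2**n basis states. How many qubits does it take
# before that exceeds the commonly cited ~1e80 atoms in the visible universe?

ATOMS_IN_VISIBLE_UNIVERSE = 1e80   # rough, commonly cited estimate (assumption)

qubits_needed = math.ceil(math.log2(ATOMS_IN_VISIBLE_UNIVERSE))
print(f"~{qubits_needed} qubits span more basis states than atoms in the visible universe")

for n in (50, 266, 300):
    print(f"{n:>4} qubits -> 2^{n} = {float(2**n):.3e} basis states")
```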

All of the big computing companies are trying different approaches to representing qubits, from supercooled wires to photons. However, the biggest general challenge at the hardware level is isolating the state of the qubits from outside interference.

Opportunity: At the application level, quantum-ready models and algorithms for complex systems. At the hardware level, isolating qubits from interference and measuring their state without it collapsing.
