The future of AI will not be AI, but NI (part II: Beyond digitization)

Quang Nguyen
3 min read · Dec 26, 2019


About 70 years ago, the first transistor was invented, and the computer revolution accelerated. A transistor can be used as an electronic switch that toggles between two states, 0 and 1. A few transistors and resistors can form a logic gate, a basic computational unit; for example, the NAND gate in Figure 4 uses two transistors and three resistors.

Figure 4: A NAND gate with 2 transistors as switches and 3 resistors. Source: https://electronics.stackexchange.com/

With logic gates (AND, OR, NAND, XOR, …), a computing machine can be built easily: transform every problem into a binary one via Boolean algebra, and the machine then "computes" with its gates. The whole computer industry accelerated as we have seen, with smaller and smaller transistors (Moore's law), denser chips (the integrated circuit, or IC), better software, and connected computers (the Internet). The term "digitization" was coined to refer to this binary discretization.
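To make the "everything from gates" claim concrete, here is a small Python sketch (illustrative, not from the article) showing that NAND alone is universal: NOT, AND, OR, and XOR can all be wired out of NAND gates, which is why a machine made only of such gates can compute any Boolean function.

```python
# NAND is a universal gate: every Boolean function can be built from it.
def nand(a: int, b: int) -> int:
    """NAND gate over the bits 0 and 1."""
    return 0 if (a and b) else 1

# Classic constructions of the other gates from NAND alone.
def not_(a):
    return nand(a, a)

def and_(a, b):
    return nand(nand(a, b), nand(a, b))

def or_(a, b):
    return nand(nand(a, a), nand(b, b))

def xor_(a, b):
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

# Print the truth tables to confirm the constructions behave as expected.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", and_(a, b), or_(a, b), xor_(a, b), not_(a))
```

Each derived gate costs several physical NAND gates, a first hint at the overhead discussed below.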

Figure 5: Digitized data is synonymous with binary data (0 and 1)

Computing in binary is, therefore, a historical coincidence. Had the underlying technology not been electronics, we might never have had "digitization", or we might have had a non-binary digitization.

Although switches (transistors) can compute everything, computing in binary is very inefficient. For example, we humans add two small numbers like 24 + 35 in an instant, with a flash of thought. A computer, however, needs at least a thousand instructions:

Figure 6: With digitization, the computer performs a simple addition in a thousand steps

This process is completely artificial and has nothing to do with the way we humans do it. In fact, biological systems are not the only natural computers; nature computes in many ways without digitization:

Figure 7: some examples of natural computation (images from Wikipedia)

Together with the serial computation mentioned above, binary digitization adds another heavy constraint to current AI computation. Both are rooted in the behavior of electronic circuits. The following infographic shows the development steps from materials to AI applications:

Future AI systems, therefore, could go directly from the material level, bypassing transistors and all digital processing, dealing with real-valued signals rather than 0s and 1s, and functioning mainly according to fundamental physical laws.

Part III: Natural thinking

References:
1. Boybat, I., Le Gallo, M., Nandakumar, S. R., et al. Neuromorphic computing with multi-memristive synapses. Nat. Commun. 9, 2514 (2018). doi:10.1038/s41467-018-04933-y
2. Camuñas-Mesa, L. A., Linares-Barranco, B., and Serrano-Gotarredona, T. Neuromorphic spiking neural networks and their memristor-CMOS hardware implementations. Materials 12, 2745 (2019).
3. Pfeiffer, M. and Pfeil, T. Deep learning with spiking neurons: opportunities and challenges. Front. Neurosci. 12:774 (2018). doi:10.3389/fnins.2018.00774
