Carbon Computers making Silicon ones?

An observation from juxtaposing humans and computers against the current growth of AI

AI (Artificial Intelligence) is a big thing in Silicon Valley. Google and Amazon are betting on it, which became very evident when Google bought DeepMind for more than $500m. Tesla is trying its best to roll out a fully autonomous car, and so are traditional companies like Nissan and Delphi. Enormous amounts of data, from point clouds to raw sensor values, are being collected and processed to bring out the best products for VR/AR, autonomous driving, IoT-enabled systems, and more. The progression of innovation is exponential. If you look at all this from a bird's eye view, we humans are creating intelligent systems capable of making decisions to reduce human effort.

There is no doubt that humans are themselves complex machines. We have not yet completely understood humans as a species; new organs are still being discovered. We have an extensive sensory and nervous system that informs the human body's decisions. Our conscious brain is estimated to process data at about 60 bits per second. Neuroscientists have been measuring the brain's capabilities in the same metrics as a computer's, for example estimating that roughly 1.5 GB of data is stored in the human genome.
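
To make that 1.5 GB figure concrete, here is a rough back-of-the-envelope calculation. It is only a sketch under my own assumptions (about 3.1 billion base pairs per genome copy, 2 bits per base, two copies per cell), not a derivation from the article:

```python
# Back-of-the-envelope estimate of the data stored in the human genome.
# Assumed inputs (not from the article): ~3.1 billion base pairs per copy,
# 2 bits per base (A/C/G/T), and two copies in a diploid cell.
base_pairs = 3.1e9      # approximate length of one genome copy
bits_per_base = 2       # four possible bases -> 2 bits each
copies = 2              # diploid cells carry two copies

total_bits = base_pairs * bits_per_base * copies
gigabytes = total_bits / 8 / 1e9
print(f"~{gigabytes:.2f} GB")  # prints ~1.55 GB, close to the 1.5 GB figure
```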

This raises the question of whether we humans are some sort of computational machine, constantly evolving, whose sole aim is to survive in the process. And we are constantly creating "Artificial Intelligence" that can perform similar tasks; with the autonomous car, for example, we are trying to build an AI system that best replicates a human being's decision-making skills.

Humans are a carbon life form creating another life form on silicon, with superior computational and decision-making capability, to perform tasks:

Sensory Input
The hardware community in the tech field is constantly evolving sensor technology to capture richer information: better depth cameras for VR, LIDAR sensors for autonomous cars, and advanced capacitive sensing for better interactions. These technologies let us capture richer information to build up a data bank.
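
As a minimal sketch of that "sensors feeding a data bank" idea, the snippet below polls a few imaginary sensors and logs their readings; every name and value is a hypothetical stand-in for real driver code:

```python
import random
import time

# Hypothetical sketch: poll a few "rich" sensors and log their readings
# into a data bank. Sensor names and values are made-up stand-ins for
# real depth-camera / LIDAR / capacitive drivers.
SENSORS = ["depth_camera", "lidar", "capacitive_touch"]

def read_sensor(name):
    """Stand-in for real driver code; returns a fake timestamped reading."""
    return {"sensor": name, "value": random.random(), "t": time.time()}

data_bank = []
for _ in range(5):            # five polling cycles
    for sensor in SENSORS:
        data_bank.append(read_sensor(sensor))

print(f"collected {len(data_bank)} readings")  # 15 readings in the bank
```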

Humans, on the other hand, have five sensory inputs: sight (ophthalmoception), hearing (audioception), taste (gustaoception), smell (olfacoception or olfacception), and touch (tactioception). Thus the human body has the best camera, microphone, and touch sensor. In fact, taste and smell are senses developed in humans that current technology has not yet matched. All of these sensors capture data rich enough for the brain to make decisions; that is why a single human can drive a car or make a split-second decision. All the big data and neural-network processing happens in the brain within fractions of a second.

Actuation System
Actuation goes hand in hand with a control algorithm, like a well-tuned PID algorithm for a smooth stepper-motor response, or the way an electromagnetic valve operates based on a control loop. Scientists have developed artificial muscles based on pneumatics or piezoelectric materials that have certain desirable characteristics.
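
To illustrate the kind of control loop meant here, below is a minimal PID sketch; the gains, setpoint, and toy motor model are my own assumptions for demonstration, not values from any real driver:

```python
# Minimal PID control-loop sketch. The gains, setpoint, and the toy motor
# model below are illustrative assumptions, not a real system.
class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measurement, dt):
        """Return a control command that drives the measurement toward the setpoint."""
        error = self.setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy simulation: a motor speed that responds proportionally to the command.
pid = PID(kp=2.0, ki=1.0, kd=0.1, setpoint=100.0)  # target: 100 steps/sec
speed, dt = 0.0, 0.01
for _ in range(500):                     # simulate 5 seconds
    command = pid.update(speed, dt)
    speed += command * dt                # crude first-order motor response
print(f"speed after 5 s: {speed:.1f}")   # settles close to the 100 setpoint
```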

Similarly, if you think about the human body, the muscles are high-fidelity actuators. Based on signals from the nervous system, they actuate with utmost precision, and very fast! I wish I had a motor that could operate with such high fidelity.

At this point I want to think out loud: humans are biological computers that are trying to make a silicon-based computer. What separates the two are their idiosyncrasies.

Does this pose a question about the evolution of the human being? Were we an endeavor by other life forms to "make a computer on a carbon base"? Would the AI systems we design, a thousand years from now, make newer systems based on other substrates?
