What is a computer? Something that computes? Or someone?

George McKee
4 min read · Feb 12, 2019


Part 2 of Is “Is the brain a computer?” even a good question?

It’s too late to change the answer, and arguing over it distracts from the really useful questions about the relations between computers and brains. Nevertheless, a deeper look finds that brains stretch the definition of computing, perhaps beyond the breaking point.

This is part 2 of a series of brief essays (sometimes very brief) on aspects of this question. Part 1 contains the introduction and an index to the whole series.

What is a computer? Something that computes?

People (and their brains) that perform calculations have been called “computers” for a long time. According to a short history by David Alan Grier, the specialized job of computing values of mathematical functions can be traced back to a team of three people, led by Alexis-Claude Clairaut, who worked for five months in 1757 calculating the orbit of Halley’s comet. The role persisted until the 1950s, when human computers were finally superseded by electronic devices such as ENIAC and the “mainframe” machines that followed it.

Image from https://www.jpl.nasa.gov/edu/news/2016/10/31/when-computers-were-human/ showing a highly parallel computation being performed by a room of hybrid human-electromechanical computers.
The Electronic Numerical Integrator and Computer, ENIAC, was called a “giant brain” in 1946.

The first electronic computers were each unique, one-off systems. One of them, ENIAC, was created to compute artillery tables that allow humans to predict where artillery shells will end up. But then, how does the artillery shell know where to go? Does it compute its own trajectory?

That’s a question that philosophers have been arguing about for thousands of years. A key issue is the distinction between reality itself and a description of reality. It’s a miraculous feature of computers that they can bridge that distinction, and execute a description of what should happen, making it actually happen. This feature turns out to be missing from fundamental descriptions of reality such as the Lagrangian equation for the core theory of physics, leading physicists to call computation an emergent property. How emergent properties, not to mention “emergence” itself, emerge from physical theories remains an incoherent mystery, as far as I can tell. Not even research in the foundations of mathematics provides much help there. But we have two existence demonstrations, in both ourselves and in artificial computers.

Getting back to warfare technology: the artillery shell follows a path through space that has a mathematical description, written in terms of functions of parameters such as time and velocity. When specific values for those parameters were dialed in, an analog artillery computer such as the Mark 1 Fire Control Computer, used on US warships during World War II, would predict where the shell would land.
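To make that concrete, here is a minimal Python sketch of the drag-free (“vacuum”) trajectory. It is the textbook simplification, not what the Mark 1 actually solved, since the real machine also corrected for air resistance, wind, and the motion of both ship and target; the muzzle velocity and elevation below are illustrative values, not historical ones.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def position(v0, elevation_deg, t):
    """(x, y) of the shell t seconds after firing, ignoring air drag.

    v0 is the muzzle velocity in m/s; elevation_deg is the barrel's
    angle above the horizontal. The path is a continuous function of
    time -- the kind of relationship an analog computer's gears and
    cams can embody physically.
    """
    theta = math.radians(elevation_deg)
    x = v0 * math.cos(theta) * t                   # downrange distance
    y = v0 * math.sin(theta) * t - 0.5 * G * t**2  # height above muzzle
    return x, y

# Illustrative values: an 800 m/s shell fired at 15 degrees,
# sampled 10 seconds into its flight.
x, y = position(800.0, 15.0, 10.0)
print(f"{x:.0f} m downrange, {y:.0f} m up")  # ~7727 m downrange, ~1580 m up
```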

Image from http://www.eugeneleeslover.com/USN-GUNS-AND-RANGE-TABLES/FLOW-SCHEMATIC-COMPUTER-MK-10.html

But hundreds or thousands of years ago, when artillery meant large-scale bows and arrows, catapults, and early cannons, the role of targeting computer was filled not by a complicated device but by the brain of the artilleryman.

A ballista, made from wood and rope, could throw heavy rocks hundreds of meters, over castle walls. [image from http://www.wikiwand.com/nl/Ballista ]

Both the Mark 1 computer and the artilleryman computer could skip the steps a digital computer like ENIAC needed: converting the battlefield situation into numerical measurements, performing arithmetic operations on those values, and using the resulting numbers to set continuously adjustable controls on the artillery piece. They evaluated the trajectory of the projectile directly, in an analog fashion whose mathematical foundations rest on real numbers and continuous functions rather than the integer arithmetic that digital computers use.
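Here is a rough Python sketch of that contrast, with illustrative parameter values and simple Euler stepping standing in for ENIAC’s actual numerical-integration scheme: the digital route chops time into discrete arithmetic steps, while the analog route evaluates one continuous function of the inputs.

```python
import math

G = 9.81  # m/s^2

def range_digital(v0, elevation_deg, dt=0.001):
    """Digital style: march the equations of motion forward in small
    discrete arithmetic steps until the shell comes back down."""
    theta = math.radians(elevation_deg)
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    x, y = 0.0, 0.0
    while True:
        vy -= G * dt  # gravity updates the vertical velocity...
        x += vx * dt  # ...and position accumulates step by step
        y += vy * dt
        if y <= 0.0:
            return x

def range_analog(v0, elevation_deg):
    """Analog style: the range is a single continuous function of the
    inputs -- the sort of relationship a cam profile or gear ratio can
    embody directly, with no intermediate digits at all."""
    theta = math.radians(elevation_deg)
    return v0**2 * math.sin(2 * theta) / G

# The discrete steps converge on the continuous answer as dt shrinks.
print(range_digital(800.0, 15.0))  # ~32620 m
print(range_analog(800.0, 15.0))   # 32620.79... m
```

The digital version never touches a true real number: each step is finite arithmetic on rounded quantities. The analog version’s mathematics lives in the continuum, which is exactly the distinction drawn above.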

Many writers who have spent all their time with digital computers seem to have forgotten, or never considered, that the brains of human computers had ownership of the “computer” term long before modern digital devices got that title. It’s an injustice to erase all that history and to say that those archaic “computers” didn’t actually compute.

Go on to Part 3

Go back to the Index


George McKee

Working on projects in cyber security strategy and computational neurophilosophy. Formerly worked at HP Inc. Twitter: @GMcKCypress