What’s Eating You? Probably Software

Andrew Pederson
Impact Policy
Jun 16, 2015

We sometimes hear that “software is eating the world,” but the specific implications can be difficult to describe before they occur, and they tend to upset managers who are uncomfortable with change and resist adapting to new market conditions. In food, agriculture and international development, the three worlds Impact Program Design intersects, data science is “eating” measurement and evaluation for corporate and international development strategy and investments. Consider, for example, the large programs managed by the World Bank, FAO, USAID, IFC, IMF and many others, which co-fund investments with corporate capital via public-private partnerships.

As Marc Andreessen described, “Software is also eating much of the value chain of industries that are widely viewed as primarily existing in the physical world.” This is the case in food and agriculture, where multinational manufacturers and retailers procure ever more varied materials and products from a diversified global supplier base. What is all this abstract software-cloud-mobile-local-social nonsense about? Paul Ford has 38,000 words and an amusing GitHub repo for you, and Motherboard has a timely meta-summary if you’re unlikely to take two hours to go through Ford’s interactive piece.

Let’s talk more about computing’s physical aspects, since software has been picked apart already this week. The human mind is unmatched at pattern recognition, but because that talent is grounded in the brain’s physical reality and limitations, enhancing it requires external resources. Over time, these external memory and computational aids have taken the form of a variety of objects and technologies.

In Sumer and the wider Mesopotamian world, cuneiform allowed merchants to keep track of accounts beginning in the fourth millennium BCE.

Credit: British Museum London 2005

Andean peoples may have been knotting cords to keep records as early as 2500 BCE; the Inca Empire, Tawantinsuyu, later perfected the quipu, a system of knots, to track accounts across roughly 308,000 square miles and 12 million inhabitants.

Source: http://www.ancientscripts.com/quipu.html

Around the same time, the Sumerian invention of the abacus was a smash hit, and the technology spread through the world, eventually reaching Rome. In parallel, other early writing systems, like Egyptian hieroglyphs and, much later, the Maya script, also recorded events and kept track of assets, though these are more difficult to interpret than straightforward calculations. In Rome, the wax writing tablet survived alongside the abacus and eventually gave way to the scrolls and codices that would become books.

Source: http://upload.wikimedia.org/wikipedia/commons/0/05/Palenque_glyphs-edit1.jpg

Thousands of years later, the English (recently conquered by the Normans) produced the Domesday Book, a public accounting of land, goods and infrastructure that apparently frightened contemporaries more than online privacy frightens us today: they viewed the book as a sure sign of the end of the world.

Source: http://www.nationalarchives.gov.uk/domesday/images/fig01-Domesday-Book.jpg

“Providing definitive proof of rights to land and obligations to tax and military service, its 913 pages and two million Latin words describe more than 13,000 places in England and parts of Wales. Nicknamed the ‘Domesday’ Book by the native English, after God’s final Day of Judgement, when every soul would be assessed and against which there could be no appeal, this title was eventually adopted by its official custodians, known for years as the Public Record Office, and recently renamed the National Archives.”

Since the Domesday Book could easily kill small children and animals if dropped, more efficient and compact computational tools were needed. Unfortunately, manufacturing technology and materials science lagged behind the human desire to increase cognitive capacity. Enter Charles Babbage’s Difference Engine, perhaps the first automatic computing machine, designed to calculate and print mathematical tables; fittingly, its design outran the machining of its day and was never completed in Babbage’s lifetime.

Source: http://www.computerhistory.org/babbage/engines/img/3-1.jpg

As Paul Ford rightly points out:

“A computer is a clock with benefits. They all work the same, doing second-grade math, one step at a time: Tick, take a number and put it in box one. Tick, take another number, put it in box two. Tick, operate (an operation might be addition or subtraction) on those two numbers and put the resulting number in box one. Tick, check if the result is zero, and if it is, go to some other box and follow a new set of instructions.”
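To make Ford’s “clock with benefits” concrete, here is a minimal sketch of that tick-by-tick loop in Python. The two “boxes” and the instruction names (put, add, sub, jump_if_zero) are my own illustration of the idea, not Ford’s wording or any real machine’s instruction set.

def run(program):
    boxes = {1: 0, 2: 0}   # box one and box two
    tick = 0               # the clock
    while tick < len(program):
        op, *args = program[tick]
        if op == "put":             # tick: take a number, put it in a box
            box, value = args
            boxes[box] = value
        elif op == "add":           # tick: operate on the two boxes, result in box one
            boxes[1] = boxes[1] + boxes[2]
        elif op == "sub":
            boxes[1] = boxes[1] - boxes[2]
        elif op == "jump_if_zero":  # tick: if box one is zero, go follow other instructions
            (target,) = args
            if boxes[1] == 0:
                tick = target
                continue
        tick += 1
    return boxes[1]

# Put 3 in each box, subtract, and skip the final add because the result is zero.
program = [
    ("put", 1, 3),
    ("put", 2, 3),
    ("sub",),
    ("jump_if_zero", 5),
    ("add",),   # skipped: box one hit zero, so the clock jumped past this step
]
print(run(program))   # prints 0

Everything a modern processor does is, at bottom, an enormously faster and wider version of this loop.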

We dealt with huge stacks of paper, knots and abaci until punch cards took over (Herman Hollerith’s tabulating cards, standardized by IBM as the 80-column card in 1928), at which point we dealt with huge stacks of punch cards. Still, the principle depends on distinguishing nothing from something, 1 from 0, knot from space. Whatever philosophical dicta the Sumerians held when they invented cuneiform remain mysterious, but perhaps they too would have related to Parmenides’ maxim that “nothing comes from nothing.” Computers remained mechanical or electromechanical until 1945, when UPenn’s ENIAC replaced gears, relays and switches with electronic circuits.

Source: http://upload.wikimedia.org/wikipedia/commons/4/4e/Eniac.jpg

Now, with transistors, the microscopic electronic switches that power contemporary binary computing, shrinking ever smaller and Moore’s Law possibly approaching a plateau, the computing world is looking ahead to quantum computing, where the basic unit of computation, the qubit, can hold many potential values at once rather than a single 0 or 1. But why do we go to such great lengths to physically manifest and scale up our existing, unmatched cognitive capacity?
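For the mathematically inclined, “many potential values” has a precise form in standard quantum notation (this is the textbook definition, not anything specific to a particular machine): a qubit’s state is a superposition of the two classical values, weighted by complex amplitudes.

\[
\lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
\qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
\]

Measuring the qubit yields 0 with probability |α|² and 1 with probability |β|², and n qubits together carry 2^n such amplitudes, which is where the hoped-for speedups come from.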

In 1945 Vannevar Bush’s As We May Think captured this dream as man’s desire for order and easy access to information to guide decisions and make life better, an ethos strongly echoed by Google’s mantra to “organize the world’s information and make it universally accessible and useful.” In other words, to make abstract mental associations physically real and durable in order to expand human cognitive capacity. As this trend accelerates, people will necessarily want to close this gap further by wearing and, eventually, implanting computers on and in their bodies. Don’t believe it? Read Neal Stephenson’s Snow Crash, Charles Stross’s Accelerando, Warren Ellis’ Transmetropolitan and then watch Modify, a documentary on body modification. People will be into it, especially those punk kids down the street who are always Snapchatting on my lawn.

Many fear that the external brain-enhancing calculators, upon achieving “artificial intelligence” and sentience, will rise up to destroy their foolish creators. As the machine-learning researcher Michael I. Jordan has explained to little public effect, we are still a very long way from computers approaching human intelligence, and today’s terminators still can’t stand up unaided. Predator drones, on the other hand, are controlled by humans and are exceedingly lethal, with no need for a computer to do any of the thinking or killing.

Yet we will persist in building and refining these technologies, regardless of the mindless fear that drives many away from science and technology. Jung’s understanding of the subconscious and its language of dreams as “constellations of thought” suggests that we are in some way connected by our thoughts; perhaps over the next 50 to 100 years we will continue physically manifesting the best our minds have to offer. Or we will all perish in nuclear holocaust. Is the outcome really binary?


Originally published at impactprogramdesign.com on June 16, 2015.
