The Beginning of Innovation

The story of how we became human

Shigeru Miyagawa
MIT Open Learning
22 min read · Nov 22, 2019


By Shigeru Miyagawa, Senior Associate Dean for Open Learning and Professor of Linguistics, MIT

From 2001: A Space Odyssey

"God creates … I assemble"
— George Balanchine

How did we become a species so obsessed with innovation? Our ancestors were innovating from the moment toolmaking began, more than two million years ago. When we look closely at the nature and pace of toolmaking over the entire arc of the Pleistocene, we find a remarkable resemblance to a modern phenomenon.

The Beginning of Innovation

In the opening scene of Stanley Kubrick's 2001: A Space Odyssey, an ape-like hominid tosses a sun-bleached bone straight up in the air. It rotates end over end until it reaches the top of its flight, and as it begins to fall, it morphs into a spaceship. This one brief scene shows us the entire arc of human innovation, from the original primitive tool to space travel. What drove our ancestors in Africa to start crafting the first tools out of rock and bone? And what spurred the breakthroughs and innovations that, after millions of years, led to the techno-centric being we have become? As it turns out, our primitive past, techno-centric present, and AI-infused future can all be linked by a single modern phenomenon.

Moore’s Law

The field of electronics, which today seems dizzyingly fast paced, began in a more unhurried manner with the invention of the transistor in the mid-20th century. Years of development by physicists at AT&T's Bell Labs, and independently at a Westinghouse subsidiary in Paris and elsewhere, gave birth to the first consumer transistor product, a radio co-produced by Texas Instruments in 1954. This marked the dawn of the age of solid-state consumer electronics. That first radio had four transistors.

From here, the pace of development picks up, as captured by Moore's Law, which predicted that the number of transistors on a microchip, and with it the chip's speed and power, would double roughly every two years. The key breakthrough that made this breakneck pace of innovation possible was the invention of the integrated circuit. Instead of manufacturing one transistor at a time, the integrated circuit allowed transistors and other electronic components to be etched onto a surface as small as a thumbnail. This drastically cut production costs and made the electronics function far more efficiently because of the components' proximity to each other. Currents travel shorter distances, which has the dual virtue of increasing speed and reducing the heat that is generated. Today, a single microchip contains up to 20 billion transistors. That packs serious power into an awfully narrow space. As it turns out, this idea of packing transistors into small spaces finds an analog in the primordial Moore's Law, where it isn't transistors that are packed in, but neurons, into a specific narrow region of the brain. About 16 billion of them, not too far off the 20 billion transistors in today's microchip. In both cases, miniaturization, of transistors and of neurons, makes it possible to pack such a huge number into a confined area. And the spacing between the components is strikingly similar: around 10 to 15 nanometers for transistors and 25 nanometers for neurons.
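For readers who like to check the arithmetic, here is a quick back-of-the-envelope sketch in Python, using only the figures above: four transistors in the 1954 radio, roughly 20 billion on a chip today, and one doubling every two years.

```python
import math

# Back-of-the-envelope check of the doubling story above:
# from the 4 transistors of the 1954 radio to roughly 20 billion
# on a single chip today, at one doubling every two years.

start_count = 4          # transistors in the first transistor radio (1954)
today_count = 20e9       # transistors on a large microchip today
years_per_doubling = 2   # the Moore's Law cadence

doublings = math.log2(today_count / start_count)
years_needed = doublings * years_per_doubling

print(f"doublings needed: {doublings:.1f}")        # about 32
print(f"years at that pace: {years_needed:.0f}")   # about 64
print(f"1954 + {years_needed:.0f} = {1954 + years_needed:.0f}")
```

About thirty-two doublings at two years apiece span roughly sixty-four years, which, counting from 1954, lands almost exactly in the present.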

Primordial Moore’s Law

In 2011, archeologists Sonia Harmand and Jason Lewis of Stony Brook University were headed for a known excavation site in northern Kenya when they took a wrong turn and ended up in an area never before explored. Out of curiosity they surveyed the area, at a place called Lomekwi, and in a short time found surprisingly large, knapped tools on the surface and below ground. They named the site Lomekwi 3. Eventually they'd learn that these tools were 3.3 million years old.

This was roughly 700,000 years older than what had been thought to be the oldest tool, and it predated the Homo genus by hundreds of thousands of years. Why did it take so long for archeologists to discover it? The reason has to do with a mystery surrounding the Lomekwian tool. After making its appearance 3.3 million years ago, it disappeared into evolutionary thin air, its craft apparently failing to be transmitted to subsequent generations, so it never caught on and spread over a wide area the way any good tool should.

The Oldowan, which until the discovery of the Lomekwian had the distinction of being the oldest tool, dates back 2.6 million years. Discovered at Gona, Ethiopia, the Oldowan tool, much smaller than the Lomekwian, was produced by direct hammering to create one sharp edge. Created by Australopithecus as well as the earliest species of the Homo genus, the Oldowan tool, unlike the Lomekwian, was adopted throughout Africa.

And it didn't just stay in Africa; it migrated alongside our earliest ancestors across a huge span of land, through the Middle East, Europe, and Asia, as far away as southern China and Indonesia. It took a million years for the tool to spread across this huge terrain, and during that time the Oldowan industry did not change much in the way of innovation. Our ancestors were satisfied with its capabilities, and they kept busy pushing into new territories, like Europe, with its freezing climate. Our ancestors survived the cold because this simple tool allowed them to shear hide from carcasses to make clothing that kept them warm.

Nearly a million years after the first appearance of the Oldowan, around 1.7 million years ago, as our earliest ancestors were busy spreading the Oldowan tool across Asia, the Middle East, and Europe, a later hominid, Homo erectus, crafted the first Acheulean tool.

Also discovered in Ethiopia, the Silicon Valley of toolmaking during the Pleistocene, the early Acheulean assemblages resembled the Oldowan, but with one striking new feature: Homo erectus used alternate flaking to produce bifacial handaxes, tools with symmetry. Homo erectus spread the Early Acheulean tool throughout Africa and beyond, but its spread stopped at the Movius Line, which extends from India, through the Middle East, and into northern Europe. On one side of the Movius Line, the Oldowan and the Acheulean tools co-existed; beyond it, in eastern Asia and today's Russia, only the Oldowan persisted, and it did so for hundreds of thousands of years.

A million years after the Early Acheulean tool, a remarkable new tool made its appearance. Unenthusiastically named the Late Acheulean, it was a thinner, more refined bifacial handaxe. This coincided with the emergence of Homo heidelbergensis. This new species possessed substantially more brainpower, and possibly because of this newfound power to compute the world, the Late Acheulean tool showed one feature never before seen: artistry. The toolmakers cared about what others thought of their tools. They sometimes chose core pieces with mosaic color patterns, or worked around a fossil embedded in the stone. These tools were no longer just utilitarian; they were fashion pieces created to interact with the community, to impress others, and maybe even to win a favor or two.

There was even showmanship. At Olorgesailie, Kenya, there is a place where the ground is strewn with hundreds of Early Acheulean handaxes, clearly many more than were needed as tools. One could imagine that the toolmakers were showing off their toolmaking prowess, akin to a peacock fanning its tail.

The first four tool industries, the Lomekwian, Oldowan, Early Acheulean, and Late Acheulean, were often separated by a million years. During this time, tools either disappeared (like the Lomekwian) or persisted without any significant modification. Slow and easy. Then came the Neanderthals. What they did, starting 300,000 years ago, laid the foundation for toolmaking right up to the end of the Pleistocene. Their stone industry, called the Mousterian, involved a remarkable technique known as the Levallois method. By flaking off lightweight, sharp-edged pieces from a prepared core stone, Neanderthals were able to construct spears and other instruments. From the huge Lomekwian tool of 3.3 million years ago, there is a progressive reduction in the size of the tool, right down to the slender, piercing points of the Mousterian, which lasts through the emergence of modern Homo sapiens in Africa 200,000 years ago.

Now the fireworks begin. Hold on to your tools! Around 100,000 years ago, the primordial Moore's Law kicks in, and the innovations come not in million-year units, but in a fraction of the time, every ten thousand years or so. And unlike before, when only one or two types of technologies existed at any given point, all of a sudden there was an explosion of different types of tools across Africa and beyond. These Homo sapiens were much more deliberate in the manufacturing process, transporting material over long distances to sites dedicated to the production of tools. With each succeeding, compressed era, pure utility gave way to the decorative and the symbolic: pierced shells and carvings that served as jewelry, figurative art carved out of organic material, and magnificent paintings of animals and mysterious abstract symbols that adorned the walls deep in caves across Europe.

Neuro integrated circuit

Miniaturizing transistors and packing them into narrow spaces allowed the microchip to amass power exponentially. The microchip sits at the heart of all sorts of electronic devices, from the small RFID chips used to tag items in stores to the nerve center of a spacecraft traveling to Mars. In our brain, neurons play a role that parallels the transistor, taking in signals from other neurons, processing them, and transmitting the output to neighboring neurons. How many neurons does it take to operate a human being? Focusing just on the cerebral cortex, the nerve center that allows us to have interesting thoughts, create music, and gab away with language, the number of neurons firing away in us is an order of magnitude more than in macaques and three orders of magnitude more than in a house mouse. That packs serious firepower into the narrow confines of the cranial cavity. But before we start actually counting neurons, we need to understand how our brain grew to such epic proportions relative to our body weight.

If you had come upon an Australopithecine four million years ago in the savannahs of eastern Africa, you'd think you had come upon a funny-looking chimpanzee, but that first impression would change as soon as you realized it had habitual bipedal locomotion, something you only occasionally see in chimps. If you were really lucky, you'd run into Lucy, who became something of a cultural icon when Donald Johanson discovered her in 1974 at the Hadar site in Ethiopia.

She was a marvel of a pre-Pleistocene specimen, dazzling the paleoanthropologists with an astonishing 40% of her fossilized skeleton intact, and she had a lot to say from a time far, far away, over 3 million years away. While Lucy was roaming the savannahs of Ethiopia foraging for food, her contemporaries in Kenya did something unheard of in hominid evolution: they picked up a large rock and knapped one side to form a sharp edge, creating the first tool ever made by a hominid.

There is a mystery surrounding the Lomekwian, as the tool is called. In later times, every new tool was enthusiastically embraced by the community and transmitted from one generation to the next, each subsequent generation spreading the tool over an ever-wider span of geography, eventually bursting out of Africa into all parts of Asia, the Middle East, and Europe. But not the Lomekwian. It petered out. The Stony Brook University team discovered it in Kenya, but no one anywhere else has found anything like it. Not in southern Africa, where other Australopithecines lived; not in the Afar region of Ethiopia, where our cultural icon, Lucy, lived.

If we look at the brains of these hominids, we get a hint of what happened. These hominids that predate the Homo genus had a brain that was barely bigger than a chimpanzee's, 400 cubic centimeters, a modest 30% of our own. This brain was good enough for creating the first primitive tool, but it may not have had sufficient computing power to transmit the kind of complex knowledge required for toolmaking in a sustained way. They couldn't create a pedagogical culture to pass on the technique effectively. So, over time, the Lomekwian was forgotten, never having spread beyond the small confines of the Lomekwi region. In fact, it's not clear whether these early hominids even used the Lomekwian as a tool.

Darwin suggested that bipedalism freed the hands to build tools, and that the development and use of tools, in turn, led to the enlargement of the brain. The Australopithecus brain did not increase significantly in size during their time on earth, which may suggest that they fulfilled one half of Darwin's promise, creating a tool because their hands were freed up, but not the other half, in which they would use and further develop the tool and, in doing so, earn a larger brain with enhanced cognitive capability.

When we consider what is happening today with the rapid development of AI, we see ourselves at an inflection point similar to the one the Australopithecines faced after carving the Lomekwian tool. The creature that was smart enough to invent the Lomekwian tool didn't have the additional cognitive capacity to convey its manufacturing knowledge; the tool was beyond its capacity to adopt and adapt. In the same way, thanks to the modern Moore's Law and the AI algorithms it spawned, technology is in many ways overtaking our ability to keep up, to adapt. What will happen? We will put some meat on this question by looking closely at the development of tools and the parallel development of the brain.

A million years after the Lomekwian tool came and went, one of the first of the Homo species made an appearance, with the official name of Homo habilis but the more apt nickname of "handy man," for their propensity to create primitive tools and transmit them with determination to subsequent generations. Darwin would be pleased to know that the handy man's brain had grown to 50% of the size of today's human brain, which, by Darwin's conjecture, resulted from toolmaking, and this in turn gave the handy man enough brainpower to persistently transfer knowledge to the members of the community. Still looking more like a chimp than like your neighbors, the handy man would never blend in with your Saturday barbecue crowd, but he did have a less protruding face than the Australopithecines, although the disproportionately long arms were an easy giveaway that this was no ordinary Joe or Sue. The tools they created are called Oldowan, after Olduvai Gorge in Tanzania, where Louis Leakey first discovered Oldowan lithics in the 1930s. The oldest Oldowan tools date back 2.6 million years, and the industry was later taken over by Homo erectus, who appeared 2 million years ago in Africa.

Homo erectus appeared with a brain twice as large as the Australopithecine's but still only 60% of today's human brain. They were a restless bunch, moving out of Africa to all parts of Asia and Europe, spreading Oldowan tools as far away as China (where Peking Man was found) and Indonesia (where Java Man was found). They were also a very successful Homo species, having hung around for 1.5 million years, by far the longest-lived of all hominids. We can't hold a candle to them, having only been here for 200,000 years; if only we could reach back in time and ask a Homo erectus for some survival tips. Their brain, at 800 cc, was powerful enough to take over the Oldowan tool from the handy man and to spread it all over Asia and Europe. But for a million years the tool stayed essentially the same, a single-edged tool shaped by direct hammering. The brain, however, didn't stay put: the use of the tools, and encounters with new environments, led to an upward tick in its size.

Then a funny thing happened that had enormous consequences for everything, but especially for the brain: the hominids started to play with fire. By then, the hominids, starting with early Homo erectus but especially the later species Homo heidelbergensis, had developed a much better tool, with the melodic name of Acheulean, a bifacial handaxe far more efficient for cutting hide, breaking bones, and crushing roots and nuts. That helped with digestion. But fire made it possible to predigest rough food outside of the body, so that by the time you put it in your mouth, it had been reduced to a softer foodstuff with a higher concentration of nutrition. The two organs that consume the most energy are our gut and our brain. The combination of cooking and the use of tools put less burden on the digestive system, in turn turbocharging the brain, and by the time Homo heidelbergensis appeared 800,000 years ago, they had a brain almost as large as ours, coming in at 90% of its size.

There was also another change, an anatomical one. Before all this, we needed a big gut to digest tough, lower-quality food, but with fire and better tools, our gut shrank and, for the first time in evolution, our ancestors got to have something we take for granted: a waist. This must have been when they invented the belt to keep their pants up. Not only did fire make us more fashionable, with a waist, it also made us into more leisure-loving beings. When we were eating only raw food, we had to spend over eight hours a day foraging and chewing just to consume enough calories to survive. Suzana Herculano-Houzel points out that fire made it possible to extract almost 100% of the nutrients from food, instead of just 30% from raw food, so cooked food was not only easier to digest but much more nutritious. Now our ancestors had time on their hands. What did they do with it? Likely they spent it in their communities, further extending the behaviors that would become modern human behavior.

Despite the rapid growth in the size of the brain, the pace of innovation remained fairly modest. Homo heidelbergensis, and the Neanderthals that followed, whose brains actually exceeded ours by 10% owing to an enlarged occipital region, did come up with one transformative innovation. The tools they produced became symbolic in nature, their handaxes becoming thinner, more refined, and often decorated with colored rock or even a fossil set in the center as an ornament. But then something clicked, and around the time we established ourselves as Homo sapiens in Africa and beyond, the pace of innovation hit a dizzying tempo, with one innovation stacked on top of another and a diversity of techniques popping up all over Africa. This was the primordial Moore's Law making its appearance.

Now we are ready to start counting neurons. But how do you count the neurons in a brain? Suzana Herculano-Houzel not only asked that question; being a neuroscientist, she went about answering it in her lab in Rio de Janeiro. The method she came up with is, well, rather unusual. She soaked a whole brain in detergent, which dissolved it into a soup, leaving only the nuclei of the neurons floating in a semi-transparent goo. After counting the nuclei under the microscope in several small samples, she extrapolated the number of neurons in the entire brain. So, how many neurons are in a human brain? 86 billion. As it turns out, that's just the beginning of the story, because this number raises a number of curious puzzles. Amazingly, the solution to these puzzles can be found by looking to the modern Moore's Law: miniaturization and packing in, not of transistors, but of neurons, into a tight space.
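For the curious, here is a minimal sketch of that extrapolation step in Python. The counts and volumes are invented purely for illustration; only the logic, counting nuclei in small samples of a uniform suspension and scaling up, reflects the method described above.

```python
# A sketch of the extrapolation behind the counting method: dissolve the
# brain into a uniform suspension, count nuclei in a few tiny samples of
# known volume, then scale the average density up to the whole suspension.
# Every number below is invented purely for illustration.

sample_counts = [152, 147, 160, 149]   # nuclei counted per sampled aliquot
aliquot_volume_ml = 0.001              # volume of each aliquot
suspension_volume_ml = 500.0           # volume of the whole brain suspension

density_per_ml = (sum(sample_counts) / len(sample_counts)) / aliquot_volume_ml
estimated_nuclei = density_per_ml * suspension_volume_ml

print(f"estimated nuclei in the suspension: {estimated_nuclei:.2e}")
# A further step (not shown) separates neuronal from non-neuronal nuclei
# to arrive at the neuron count itself.
```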

Although the human brain, at three pounds, takes up only about two percent of the body's mass, it is a preposterously hungry organ, consuming 500 kilocalories a day, a whopping 25 percent of our daily caloric intake. Is that normal in the animal world? Not by a long shot. The average daily budget for the brain of other vertebrate species is 10 percent of overall intake, so why does our brain devour two and a half times the norm? Part of the answer is that this is completely normal; the other part is that it is utterly abnormal. When you consider the number of neurons we have, it is completely normal, because each neuron in our brain consumes about the same amount of energy as a neuron in a mouse, a cow, or a monkey. What is abnormal is the sheer number of neurons we have packed into the narrow confines of the cranial cavity.

This has been happening throughout evolution. According to Herculano-Houzel, chimpanzees and cows have brains of similar mass, yet chimpanzees have twice the number of neurons, what she calls the "primate advantage." If we trace our own lineage: Lucy, the Australopithecine from over three million years ago, had 30 billion neurons; the handy man, Homo habilis, from two million years ago, had 40 billion; and the restless Homo erectus that took the Oldowan tool to the far reaches of Asia operated on 50 billion. From there, through Homo heidelbergensis and the Neanderthals, it's easy steps to the 86 billion neurons in modern humans. What is puzzling is that while the number of neurons kept increasing, cranial size did not grow proportionately, especially in the last few Homo species. Rodents, by contrast, follow a scaling rule in which a larger brain comes with larger neurons: the neurons themselves grow as the rodent brain expands. In primates this doesn't happen; the enlargement of the brain is not accompanied by a corresponding enlargement of the neuron, which stays the same size. Relative to an ever-larger brain, a neuron that holds its size amounts to a miniaturized neuron, and this is why primates can pack more and more neurons into the cranial cavity. It is also why the human brain is so metabolically expensive: each neuron consumes the same energy as that of a mouse, a capybara, or a monkey; we just have many, many more of them.
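A few lines of Python make the budget arithmetic concrete. The 2,000-kilocalorie daily intake is not stated in the text; it is simply what a 500-kilocalorie brain that claims 25 percent of our intake implies.

```python
# The brain's energy budget, using the figures in the paragraph above.
# The 2,000 kcal daily intake is inferred from "500 kcal is 25 percent".

brain_kcal_per_day = 500
daily_intake_kcal = 2000        # assumption: 500 kcal / 0.25
neuron_count = 86e9

brain_share = brain_kcal_per_day / daily_intake_kcal
kcal_per_neuron = brain_kcal_per_day / neuron_count

print(f"brain's share of daily intake: {brain_share:.0%}")                 # 25%
print(f"multiple of the ~10% vertebrate norm: {brain_share / 0.10:.1f}x")  # 2.5x
print(f"energy per neuron: {kcal_per_neuron:.1e} kcal per day")            # ~5.8e-09
```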

The story does not end there. As Herculano-Houzel points out, of the 86 billion neurons in the brain, 16 billion are concentrated in the cerebral cortex, which handles much of the information processing for higher-order cognitive activities such as problem solving, language, and interpreting incoming data from touch, vision, and hearing. Other animals have a cerebral cortex, but no animal comes close to having as many neurons in this region as we do. Just for comparison, an elephant's brain has 257 billion neurons, three times as many as the human brain, yet only 5.6 billion of them are found in the cerebral cortex, roughly a third of the neurons in the human cerebral cortex. So it's not just that we have lots of neurons in the brain; the neurons got packed into the region of the brain that controls the behaviors that make us modern humans. And packed in they are, with just 25 nanometers separating them, not too far off the 10 to 15 nanometers separating transistors in a microchip. What amounts to miniaturization, then, made it possible for us to have an immense number of neurons, because the neurons did not grow as the brain grew, and packing the largest number of neurons in the animal world into the cerebral cortex fired up the behavior that made us who we are as modern humans.
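The same quick arithmetic, using only the counts cited above, shows just how lopsided the human allocation is.

```python
# Where the neurons sit: total vs. cerebral-cortex counts cited above.

species = {
    "human":    {"total": 86e9,  "cortex": 16e9},
    "elephant": {"total": 257e9, "cortex": 5.6e9},
}

for name, counts in species.items():
    share = counts["cortex"] / counts["total"]
    print(f"{name}: {share:.1%} of all neurons are in the cerebral cortex")
# human:    about 18.6%
# elephant: about 2.2%, despite having three times as many neurons overall,
#           and only about a third as many cortical neurons as a human
```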

To get here from the primordial primate of six million years ago, evolution took big risks, risks that could easily have led to the complete demise of our lineage. Our ancestors overcame these times of terrible peril, and the way evolution assisted, sometimes serendipitously, is a study in how biology and ingenuity together gave us the gift of life many times over. This brings us to the second part of the primordial Moore's Law: what did we do with all that firepower bestowed on us by the ever-denser packing of neurons into the cerebral cortex? This is the part that takes our ancestors up to the primordial Moore's Law and beyond, and it has to do with the way the brain came to process information fundamentally differently, moving from raw data to symbolic representation.

How in the world did they think of that?

The first tool that persisted, the single-edged Oldowan, emerged 2.6 million years ago in eastern Africa. How did the Homo species come up with the idea? It is particularly intriguing because the earliest tool, the Lomekwian, had appeared and disappeared some 700,000 years earlier, so these early ancestors had to start from scratch to create the Oldowan tool. Or did they?

The Capuchin monkeys of South America are named after an offshoot of the Franciscans who arrived on the shores of the Americas centuries ago. Because they have been studied in the field by many Brazilian primatologists, we know quite a bit about their behavior, and one puzzling behavior in particular offers an intriguing hint as to how our ancestors may have hit on the idea of the Oldowan tool. These small monkeys are adept at cracking things: they place a nut on a hard surface and, picking up a rock that is surprisingly large for their small stature, toss it down on the nut to crack it. Young ones congregate around this show of skill, and it has been observed that larger numbers gather around the best nut crackers. These skilled monkeys not only crack nuts; they also place a rock on the hard surface and crack it into several pieces. Surprisingly, the pieces that break off resemble Oldowan tools, some almost indistinguishable at first glance from the knapped tools created by our earliest ancestors. In areas where the Capuchin monkeys are particularly hard at work cracking nuts and rocks, the ground is strewn with pieces that look remarkably like the tools made by early hominids. Ah, you think, the monkeys were the first to create tools and use them. But here's the rub.

The Capuchin monkeys show utterly no interest in these pieces. Instead, all their attention is directed to the powder that bursts out of the cracked rock; they sniff it and even lick it. We don't know why they do this with the rock powder. Now, suppose that something like this was going on in Africa millions of years ago. One day, a hominid picks up one of these pieces ignored by the others and tries to shear hide or crack a bone for marrow. Voila! A tool! Did it happen like that? We may never know. But the innovations of the earliest hominids are rooted in reimagining something as ordinary as a rock and figuring out a new and better use for it. Every tool is like that. So is language. If Darwin is right, language started out using the same system as birdsong, eventually transforming into musical patterns that included words for referring to objects in the world. This ability to reimagine a preexisting thing is virtually unique to the hominid line, and especially to Homo sapiens; it is the result of a higher-order cognitive capability we gained, first by standing up on our hind legs and then through the growing brain that followed.

Our symbolically mediated behavior came bursting onto the scene some 100,000 years ago, as the primordial Moore's Law was shifting into high gear. It not only formed us as modern humans; it had another, entirely surprising outcome. Our brain tissue is extremely expensive due to the large number of neurons it contains, and the brain consumes 25% of our daily caloric intake. As the symbolic processing of data grew, less raw data needed to be processed, in turn reducing the burden on the brain. Always looking to become more efficient, the brain started to shed unnecessary tissue, so that, after growing for millions of years and more than tripling in size over that time, our brain has actually shrunk by 13% in the last 20,000 years. We lost brain volume equivalent to a tennis ball. At this rate, by iPhone 100, our brain will be equivalent in size to, well, an iPhone 100.

So, what now?

In his book Thank You for Being Late, Thomas Friedman notes that we have managed to keep up with rapidly accelerating technology over the era of Moore's Law by becoming better, more efficient, quicker learners. But with ever-faster innovation cycles, technology will overtake our ability to adapt to the newest big thing. What will happen? Recall the Lomekwian tool of 3.3 million years ago. Invented by Australopithecus, who lacked the cognitive capacity to create a culture that could transmit its technology, the tool disappeared from the face of the earth. If we can't comprehend something, it no longer falls within our control. We may perceive it as useless, even if it's potentially dangerous. How can we think about the future so that we don't shoot ourselves in the evolutionary foot?

The Internet is the single most significant technology of our time. It has brought us convenience, and it has created a society that is immeasurably more complex, with oceans of information that wash over us unceasingly like a tsunami. There are signs that we are starting to lose our grip. Everywhere we see the stress of emotional, physical, and mental exhaustion, from constantly checking social media to job-related emails chasing us into the home after work. As if just staying afloat in our jobs were not enough, we are told that 50% of today's jobs will disappear in our lifetime thanks to AI, and, according to the World Economic Forum, 65% of the jobs waiting in the wings for the younger generation don't even exist today. So, what are we told to do? Not only must we work constantly, we must also continuously update our skills to be ready to shift to new opportunities as the old ones die off. Innocently called lifelong learning, this could easily become a vicious cycle of relentless, forced learning of new skills, leading to more stress, less enjoyment, and precious little of basic fulfillments like genuine human engagement.

To solve this problem, we need more technology, not less. But what kind of technology? We need fire. Two million years ago, when the earliest Homo species were honing their toolmaking skills, the tools they produced, first the Oldowan and then the Acheulean, allowed them to cut and crush roots and more easily digest food. Even so, it took eight hours a day of foraging and chewing to get food down, and that was the upper limit of the time that could be put toward meeting the daily metabolic need. Anything more would play havoc with the delicate balance between metabolic need and the effort required to fulfill it. It was only when our ancestors tamed fire that food gathering and consumption no longer needed to take up most of the day. Instead, time became available for a way of living that left room for creativity, eventually leading to the emergence of modern human behavior.

If we look at our lives today, they parallel the lives of our pre-fire ancestors. While the quantity of data washing over us is immense, the quality is not. Just looking at the emails I received today, fewer than 30% are worth even a glance, comparable to the meager nutritional content of the low-quality, pre-fire food our ancestors consumed. What we need is something comparable to fire, something that substantially increases the quality of data and of everything we do with it, so that we aren't burdened with spending eight hours or more sifting through mostly useless garbage. Right now, it's mostly garbage in and, too often, garbage out. But how do we invent this technological fire? It isn't clear what it will look like. If the story of the Capuchin monkeys is any indication (they create stone pieces that look like Oldowan tools yet show utterly no interest in them), the invention that we urgently need may be right under our noses, staring at us. Let's hope that someone stumbles on it, and fast, so we can look ahead to a more balanced, creative, and humane future.


Shigeru Miyagawa
MIT Open Learning

Professor of Linguistics, MIT. His recent work on evolution and language was featured on the BBC Radio 4 program, "What the Songbird Said."