Takeaways from Homo Deus by Yuval Harari

Corey B · Published in Corey’s Essays · 12 min read · Apr 14, 2017

This is the sequel to one of my favorites, Sapiens: A Brief History of Humankind — and it’s even better. Yuval’s last book talked about how we got to where we are, but this one is about where we’re going. It’s bound to become even more of a Silicon Valley ‘it’ book due to its tech-centric ideas. And just like last time, it’s a thrilling read, almost a page-turner. I read all 400 pages in just one weekend.

Yuval’s argument this time around runs thus:

Humanity’s belief in the value of a human life (first as religious souls, now as humanist human rights) is what brought us to today. Our religious thought has evolved from blind faith to skeptical science-empowered humanism. Western liberal humanism (freedom foremost) reigns today, having shrugged off challenges from Soviet socialist humanism (equality foremost) and Nazi evolutionary humanism (fittest foremost) in the 20th century. But it cannot reign in the future, as science strips away our intersubjective notion that human lives hold meaning. The only religions (ethical structures) left to us now are techno-humanism (cyborgs foremost), or data-ism (data foremost).

Here, I will share the most salient takeaways from Homo Deus in my own words, attempting to use Yuval’s whenever possible. Italics are his words.

I’ll start with his excellent rationale for studying history, cover his new intersubjective examples, the Modern Covenant exchanging meaning for power, the 3 types of humanisms today, and what we’ll believe in the future, ending with lingering questions, both his and my own.

Why Study History?

“Each and every one of us has been born into a given historical reality, ruled by particular norms and values, and managed by a unique economic and political system. We take this reality for granted, thinking it is natural, inevitable and immutable. We forget that our world was created by an accidental chain of events, and that history shaped not only our technology, politics and society, but also our thoughts, fears and dreams.

The cold hand of the past emerges from the grave of our ancestors, grips us by the neck and directs our gaze towards a single future. We have felt that grip from the moment we were born, so we assume that it is a natural and inescapable part of who we are. Therefore we seldom try to shake ourselves free, and envision alternative futures.

This is the best reason to learn history: not in order to predict the future, but to free yourself of the past and imagine alternative destinies. Of course this is not total freedom — we cannot avoid being shaped by the past. But some freedom is better than none.”

The Intersubjective Strikes Back

The biggest idea in Sapiens was the intersubjective — shared fictions like gods, nations, money, or human rights that let strangers cooperate and dominate both the objective and subjective worlds.

“The power of human cooperation networks depends on a delicate balance between truth and fiction. If you distort reality too much, it will weaken you, and you will not be able to compete against more clear-sighted rivals. On the other hand, you cannot organise masses of people effectively without relying on some fictional myths. So if you stick to unalloyed reality, without mixing any fiction with it, few people will follow you.”

Yuval provides a few more intersubjective examples like the gods in ancient Sumeria, which functioned like modern corporations, with temples owning land, priests collecting taxes, and workers tilling the soil — ostensibly all for the god, not for their own sake. Then the Egyptians took it a step further and deified their Pharaoh, allowing the god itself to talk, even if the bureaucracy holding him up changed little as a result.

My favorite is the story of the French Crusader who goes to the Middle East to fight for the Christian god. Considered crazy today, right? What about a modern-day Frenchman going to Syria to fight for human rights? Really, they aren’t that different — both are putting their lives in danger for abstract intersubjective principles. In 500 years, Yuval says, transhumanists might think us equally crazy for valuing human rights.

There’s also the ‘Portuguese Schindler’, Aristides de Sousa Mendes, who saved the lives of over 30,000 refugees, many of them Jews, as Portuguese consul in Bordeaux during WW2, right as France fell. How? By stamping as many visas as he could in one marathon session until he collapsed. Portugal had forbidden this, but the visas remained valid, because border officials had no reason to doubt the stamps. Yet it’s just ink on paper!

Such texts (like scripture, marks, or degrees) distort reality by becoming self-fulfilling prophecies. What gets measured gets managed, and the yardstick you choose determines what gets managed, even if reality doesn’t agree.

“The modern educational system provides numerous other examples of reality bowing down to written records. When measuring the width of my desk, the yardstick I am using matters little. My desk remains the same width regardless of whether I say it is 200 centimetres or 78.74 inches. However, when bureaucracies measure people, the yardsticks they choose make all the difference. When schools began assessing people according to precise marks, the lives of millions of students and teachers changed dramatically.”

The Modern Covenant

Religions are ethical structures, gods not required. “Religion is anything that confers superhuman legitimacy on human social structures. It legitimises human norms and values by arguing that they reflect superhuman laws.” Whatever offers its knowledge as the ultimate truth gains power: meaning and authority go hand in hand, for he who holds the book of ethics always knows what you ‘should’ do.

Yuval offers knowledge recipes for different religions — want to know what is true? Just do the math.

Religious Knowledge = Scripture x Logic. Read the Bible and connect the text’s dots to answer your questions.

Scientific Knowledge = Empirical data x Math. Run experiments, and calculate conclusions.

Humanist Knowledge = Experiences x Self-awareness. Search your feelings, and choose what you know to be true.

When humanism replaced religion as the dominant thought structure, we substituted human rights for eternal souls as the core unit of human meaning. Neither exists, but society works better if we pretend they do. Yet the problem here is that our other savior, science, is increasingly proving that no human rights exist. Religion offers order, and science power, but neither offers truth. We are trading our meaning in the universe for power over it.

I also love this differentiation between religion and spirituality:

“Religion is a deal, whereas spirituality is a journey. Religion gives a complete description of the world, and offers us a well-defined contract with predetermined goals. ‘God exists. He told us to behave in certain ways. If you obey God, you’ll be admitted to heaven. If you disobey Him, you’ll burn in hell.’ The very clarity of this deal allows society to define common norms and values that regulate human behavior.

Spiritual journeys are nothing like that. They usually take people in mysterious ways towards unknown destinations. The quest usually begins with some big question, such as who am I? What is the meaning of life? What is good? Whereas many people just accept the ready-made answers provided by the powers that be, spiritual seekers are not so easily satisfied. They are determined to follow the big question wherever it leads, and not just to places you know well or wish to visit.

Thus for most people, academic studies are a deal rather than a spiritual journey, because they take us to a predetermined goal approved by our elders, governments and banks.”

The 3 Modern Humanisms

Until science erodes our humanity, we will cling to humanism to make sense of our place in the universe. Indeed, the modern era is the story of 3 strains of humanism battling each other: liberal, socialist, and evolutionary humanism.

All 3 took shape in the 19th century, in the wake of the French Revolution, Karl Marx, and Charles Darwin — but we have not come up with any new value systems since. (This was the century in which Nietzsche proclaimed ‘God is dead’, and Yuval says he’s still right — for while millions believe in God, his commandments have nothing to say about modern times, while humanist values do.)

In the 20th century, the 3 humanisms clashed. Western liberal humanism lost ground to Nazi evolutionary humanism in the 30s, which fell to Soviet socialist humanism in the 40s, which in turn fell to liberal humanism in the 90s.

Now it remains the dominant ethos, powered by science. Yuval paints the yin and yang of modernity as science and humanism, ‘reason and emotion, the lab and the museum, the factory and the supermarket’. (Sounds like my subjective/objective framework, hmmm)

What distinguishes these religions from one another? One way is to look at their solutions to inequality: the liberal answer is to value all experiences equally, the socialist answer is to make them all the same, and the evolutionary answer is to prioritize some experiences over others.

Or once again, ask how they define knowledge:

Evolutionary humanism: Natural selection knows best. Use it to prune our race and become supermen.

Socialist humanism: Institutions know best. The Party knows better than you do — shut up and listen.

Liberal humanism: I know best. I should have the freedom to choose however I like.

Crucially, these value systems are predicated on feelings, not on superhuman will (even evolutionary humanism, since we decide what we want to evolve towards, rather than whatever breeds best).

“Our feelings provide meaning not only for our private lives, but also for social and political processes. When we want to know who should rule the country, what foreign policy to adopt and what economic steps to take, we don’t look for the answers in scriptures. Nor do we obey the commands of the Pope or the Council of Nobel Laureates. Rather, in most countries, we hold democratic elections and ask people what they think about the matter at hand. We believe that the voter knows best, and that the free choices of individual humans are the ultimate political authority. Yet how does the voter know what to choose?

Theoretically at least, the voter is supposed to consult his or her innermost feelings, and follow their lead. It is not always easy. In order to get in touch with my feelings, I need to filter out the empty propaganda slogans, the endless lies of ruthless politicians, the distracting noise created by cunning spin doctors, and the learned opinions of hired pundits. I need to ignore all this racket, and attend only to my authentic inner voice. And then my authentic inner voice whispers in my ear ‘Vote Cameron’ or ‘Vote Modi’ or ‘Vote Clinton’ or whomever, and I put a cross against that name on the ballot paper — and that’s how we know who should rule the country.”

Religion says ‘listen to God’, humanism says ‘listen to yourself’, and science says ‘listen to the data’. Yet science tells us nothing about what ‘should’ be, only what is. And when it finally tells us, once and for all, that ‘there are no human rights’, where do we go from there?

“Technology depends on religion because every invention has many potential applications, and the engineers need some prophet to make the crucial choices and point towards the required destination. Thus in the nineteenth century engineers invented locomotives, radios and internal combustion engines. But as the twentieth century proved, you can use these very same tools to create fascist societies, communist dictatorships and liberal democracies. Without religious convictions, the locomotives cannot decide which way to go.

On the other hand, technology often defines the scope and limits of our religious visions, like a waiter that demarcates our appetites by handing us a menu. New technologies kill old gods and give birth to new gods. That’s why agricultural deities were different from hunter-gatherer spirits, why factory hands fantasised about different paradises than peasants and why the revolutionary technologies of the twenty-first century are far more likely to spawn unprecedented religious movements than to revive medieval creeds.”

Future Religions: Techno-Humanism and Data-ism

Yuval uses plenty of facts to back up his assertions about science disproving the exceptionality of human consciousness, so I won’t go into them here. He does mention that it will be a long time before we actually give the idea up — since the more sacrifices we make for a delusion, the stronger it becomes (aka ‘Our Boys Didn’t Die in Vain’ syndrome, in which politicians continue a fruitless war rather than admit that the early losses were for nothing).

At the current rate of technological advancement, it won’t be long before we give up the fight. Angelina Jolie illustrates how close we already are!

In the 90s, she starred in Cyborg 2, a “liberalist fantasy about protecting liberty and privacy from evil corporations”. But in the 2010s, in real life, Angelina sacrificed her privacy by writing publicly about her decision to undergo a preventive double mastectomy to avoid breast cancer. Her health won out over abstract principles. That same dynamic will likely play out for millions of others who sacrifice privacy for power, in their own version of the modern covenant.

Once humanism is dead, we have two options: techno-humanism, which leverages technology to keep humans relevant in the cosmos, and data-ism, which prioritizes algorithms above all. Indeed, humans already struggle amid seas of data — governments cannot keep up. They can only administrate, not plan, because any new structure becomes obsolete before it can ossify. The data moves too fast!

Yuval thinks data-ism is the only realistic choice, offering several studies arguing that humans are no different from biological algorithms anyway. Techno-humanism attempts to keep humans relevant, but in a world of AIs and posthumans, what relevance will we have? Very little. Yuval thinks they will treat us the way we currently treat animals. “It’s not a perfect analogy, of course, but it is the best archetype we can actually observe rather than just imagine.”

Data-ism values nothing but data, and it wants data to be free: shared, networked, and processed (reminds me of Kevin Kelly’s What Technology Wants). Once data is truly plugged into the Matrix, the data will know what to do better than you will. For example, Yuval cites a study showing that once you’ve Liked 300 things on the network, Facebook knows what you like better than your spouse does. That’s the power of data!
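For intuition only, here’s a tiny Python sketch of the kind of mechanism behind such predictions (my own toy illustration, not the model from the study Harari cites; the pages, weights, and trait are invented): each Like nudges a score by a small learned weight, and hundreds of tiny nudges add up to a surprisingly sharp picture.

```python
# Toy illustration: predicting one personality-style trait from binary "Likes".
# The pages and weights below are invented for the example; a real model would
# learn thousands of such weights from data.

weights = {
    "Beach parties": 0.8,
    "Stand-up comedy": 0.5,
    "Quiet hiking": -0.3,
    "Chess puzzles": -0.6,
}

def predict_trait(liked_pages, weights, baseline=0.0):
    """Score a user on one trait as a weighted sum of the pages they've Liked."""
    return baseline + sum(weights.get(page, 0.0) for page in liked_pages)

user_likes = ["Beach parties", "Chess puzzles", "Stand-up comedy"]
print(predict_trait(user_likes, weights))  # ~0.7: leans toward the trait
```

Three Likes tell an algorithm almost nothing; three hundred, each carrying a faint signal, are what the study found it takes to out-guess a spouse.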

Algorithms knowing us better than we do has disturbing consequences. What happens when we disagree with something that literally knows better? We must hand over all control to it, or live inefficiently. And how is it even possible for an algorithm to know us better than we know ourselves?

It’s because our selves are actually two selves: the experiencing self, which lives in the moment, and the narrative self, composed of memories and an ego that constantly spins stories to put those experiences in context. Studies have shown that the narrative self usually dominates, and that it does not remember things as they happened, but as it perceives them.

For instance, it scores a pleasant or unpleasant experience by averaging the peak moment and the final moment, rather than remembering every emotional moment along the way. That’s why doctors give kids candy after shots, and why women’s bodies release endorphins at the end of childbirth: to brainwash us into remembering the experience fondly.
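To make that arithmetic concrete, here’s a minimal Python sketch of the peak-end idea (my own toy numbers, not anything from the book): a longer procedure with a gentler ending can contain more total discomfort yet be remembered as milder.

```python
# Toy model of the narrative self's peak-end bookkeeping: remembered discomfort
# is roughly the average of the worst moment and the last moment, not the
# average of every moment actually lived through.

def experienced_average(pain_per_minute):
    """What the experiencing self went through: the mean of every moment."""
    return sum(pain_per_minute) / len(pain_per_minute)

def remembered_average(pain_per_minute):
    """What the narrative self keeps: the average of the peak and the end."""
    return (max(pain_per_minute) + pain_per_minute[-1]) / 2

short_sharp  = [8, 8, 8]           # intense, and it ends at the peak
long_tapered = [8, 8, 8, 4, 2, 1]  # same peak, more total pain, gentle ending

print(experienced_average(short_sharp), remembered_average(short_sharp))    # 8.0 8.0
print(experienced_average(long_tapered), remembered_average(long_tapered))  # ~5.2 4.5
```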

Scientists have observed this phenomenon, and have even built transcranial stimulation helmets that boost the experiencing parts of the brain while quieting the narrative parts, so that you make only the best decisions for yourself, devoid of past baggage. Quotes from users of the helmet, like the journalist Sally Adee, are telling: “It was a near-spiritual experience — my brain without self-doubt was a revelation. All I wanted to do was return to that helmet.”

With enough data, we can all live in that world all the time. But is that what we want — all efficiency, all the time? To paraphrase Edward Luttwak: “everything that we value in human life is within the realm of inefficiency — love, family, attachment, community, culture, old habits, comfortable old shoes.” Data-ism is science personified, and it places no importance on consciousness, only intelligence. Where does that leave us conscious beings?

Lingering Questions

Big thoughts in this book! No surprise, since it pokes holes in everything from religion to human rights to the very concept of a self. We live in exciting times.

Yuval ends the book on a few unresolved questions:

1) Are organisms truly indistinguishable from algorithms? (Science says yes)
2) What’s more valuable — intelligence or consciousness? (As algorithms increase in complexity, the two are becoming decoupled)
3) What happens to society when non-sentient algorithms know us better than we know ourselves? (cue Facebook, Waze, and Google today)

However, just as with Sapiens’ dubious claim that the atomic bombs forced Japan’s surrender, there are a few small hiccups hidden in Yuval’s ever-smooth, always-explaining prose.

The sorest point for me is when he shoots down the idea that we control our minds by asking the reader to stop thinking for a minute and see how that goes.

True, the average person can’t still their mind for 1 minute — but enlightened yogis can. I’ve met several individuals who claim to be capable of that — indeed, that’s almost the definition of enlightenment. And while I’m neither a brain expert nor enlightened, all signs I know of point to enlightened brains being empirically different from normal ones.

How does that counterexample fit into his algorithmic worldview? And where else in this book might such a fact have gone unmentioned, one that could have changed his argument? Once again, this single chink in his towering argument makes me doubt the rest of his facts.

Still, it’s a thrilling read and totally worth picking up. He goes into way more detail on way more things than I cover here, so be sure to grab a copy and fill out your Knowledge(TM) before the algorithms take over.

Want more of my writing? Read weekly Coreyspondence!
