10 years to shape the future, by Gerd Leonhard | Digital Future Society

Futurist Gerd Leonhard
The Fork In The Road Project
7 min read · Feb 16, 2021

Via Digital Future Society

Gerd Leonhard is one of the leading futurists worldwide, having presented at hundreds of leading conferences and events over the past 20 years and worked with the likes of Microsoft, NBC, Visa, Google, the European Commission, Audi and IBM among others.

As an early voice on the dangers of unregulated ‘big tech’ and exponential growth, and the bestselling author of Technology vs. Humanity: The Coming Clash Between Man and Machine, Gerd has built a storied reputation as a go-to keynote speaker and thought leader on digital ethics, human-centric technology, and the need for a new form of capitalism.

Some argue that technological development is like an evolutionary process in which humans and technology evolve in a symbiotic way, creating both new opportunities and new risks. We created technology, but how is technology recreating us?

20 years ago technology was changing our behavior, like how we drive a car or make a phone call using the Internet. But now technology is changing us. For example, if you wear augmented reality glasses, a virtual reality headset or a brain-computer interface, then the way that you are, the way that you think, the way that you see and hear things changes fundamentally.

Another example is social media: technology so powerful that it can feed us information and change the way we think, react, feel and narrate.

And then the next generation of technology is capable of connecting our brains to computer interfaces or changing the human genome so that we may be able to avoid diabetes.

Twenty years from now we will essentially be converging with technology, because we will use it to stay healthy, live longer, be more powerful, become superhuman.

So technology is no longer on the outside.

A human-centered approach also leads to the question of human control in a society in which machines operate more and more autonomously. How much control should humans have over digital systems? What should the role of humans be in the digital society?

Humans are really not at all like computers. A computer can look at Wikipedia and memorize all of it in one minute, but it does not have the knowledge that is not explicit. It does not have intuition, imagination or, of course, consciousness.

The famous Moravec’s paradox says that whatever is easy for humans is hard for computers and vice versa.

And that will be true probably for at least 20 years, maybe 30 years, until computers reach, in theory, the capacity of the human brain. Nevertheless, many psychologists point out that we don’t just think with the brain; the brain is just one part of how we think. We think with the body. And human intelligence has many different pieces.

I always say that the more we connect with technology, the more we must protect what makes us human, because what makes us human is not technology. It is engagement, relationships, experiences: all of those very unique things.

We have given machines too much control, and we have given them control over the media. So when we go to Facebook and social media, the machine is telling us what’s important, not a person. And it’s telling us what’s important because if we click on an ad, we may become valuable.

Is it wrong to assume that engineers are now shaping new societal models and that policymakers, regulators and society in general are having difficulty keeping up?

Basically, Silicon Valley and China have said that every human problem has a technological answer. And that’s just not true. The real problems of humanity, such as equality, happiness and self-realization, all the things that we’re struggling with, are related to policy and political decisions.

And I think that, in the last 10 years, technology has gone from a tool we use to get the job done to a purpose in itself.

We could even say technology has become a sort of drug or religion, and I think this is wrong because it replaces our own thinking and becomes too powerful to control. So what we need to do now is go back and give control back to humans to figure out, for example, what to do with social media and so on. In 10 years’ time, technology will be virtually unlimited and very powerful. So we have to set the limits, the rules and the ethics. And that is the process we’re in right now.

There’s a quote from a member of Greenpeace International’s Board of Directors, who said in an interview with DFS that there’s a lack of moral principles and that we are having trouble replacing religion as a moral compass. In the face of this lack of principles, what drives development in the digital society?

The Dalai Lama said that ethics is more important than religion. The reality is that religion is one branch of general human ethics.

What we really need now is to define the most basic understanding of what it means to be human. And that basic understanding includes aspects like happiness and self-realization, but also even more basic things like not killing each other, the right to do the things that we want to do, free will and so on.

We need to come up with a rule-based system that governs technology. For example, as I point out in Technology vs. Humanity, we need regulation that promotes the right to privacy, the right to be forgotten, the right to disconnect and the right to not connect: the right to all of those things that have slowly been taken away by technology.

I think the challenge is for the world to be on the same page. Following the European Union’s lead, the principle for advertising and Internet search should be that you have to opt in, not opt out. So if somebody wants my data, I have to take an active step and say yes. We should have more control, and we should get something in return for our participation.

Are we prioritizing economic progress and benefits over core human values?

Well, of course, and this is clearly our biggest mistake — how can you live in a world that has great economic development and progress in financial terms and more and more rich people, but where everything is polarized?

Now, the bottom four and a half billion people have less money than the top 100 billionaires. It’s going to stay this way and this is also what technology is doing.

The Covid crisis has shown us that it doesn’t matter if you’re wealthy, because global issues like pandemics, climate change and others come to impact absolutely everyone, except for maybe the top 0.001 percent, who can go live on a space station later on. We need what I call people, planet, purpose and prosperity thinking. And if we don’t do that, then there’s really no argument for changing the way that we do things.

Experience tells us that when the market for digital products and services is left free in a laissez-faire way, with no government intervention at all, it rapidly leads to a winner-takes-all outcome. Is intervention in digital markets necessary for not only the economic order but also the societal order to prevail?

I think as far as intervention goes, it’s never a yes or no answer.

We’ve seen that the free market does not work when it comes to climate change because there’s no motivation to move forward on things that don’t make money. And so we need intervention.

Technology worked as a free market when it was being built, but now the free market of technology is much more powerful than oil and gas and banks and everybody else. The top one hundred companies in the world are pretty much tech companies, so when we get to that point intervention makes sense.

Intervention is always about finding the point when it’s gone too far and balancing it out.

A free market does not necessarily lead to a good society under all circumstances, and certainly not an entirely free capitalism like in the US, which leads to complete polarization. The balance between those two things is really important: the power of science, technology and business on one side, and the requirements of human needs and society on the other.

How important is this time in history?

We are at a fork in the road and the next 10 years will decide what the future could be like. Heaven or hell.

For example, we want every person’s DNA to be in the cloud so we can fight cancer. But if my DNA is in the cloud, then we have to have an identity, safeguards and the rights to keep it and control it so that we can solve cancer. And if we can solve cancer, can we then also use genetic engineering to create superhumans? So who decides on this?

So it all comes down to one thing. We have all the technology, but will we have the will? And that is really the ultimate question that I think every corporation and every government has to ask.

Originally published at https://digitalfuturesociety.com on February 16, 2021.
