We all love technology.
We all come to depend on technology, and we all want the latest technologies. Most of us feel that technology has improved the quality of our lives, and many would argue that it has made the world a better place by improving health or reducing social exclusion and poverty.
But we also fear technology.
Some people feel they have become addicted to technology. They think that their computer or smartphone has come to dominate their lives.
More generally, we are afraid of losing control of new technologies. We worry about declining privacy in a sensor- and data-oriented world. We don’t like how algorithms are making more of “our” choices and, by doing so, dictating our lives and depriving us of freedom. We are afraid that emerging technology (such as artificial intelligence) will deepen inequalities or even take over the world.
A machine-dominated future is not attractive for many people.
And then there are those emerging near-future technologies that we don’t know what to make of. Consider developments that might significantly increase our life expectancy, allow us to clone ourselves, live on other planets, etc.
Joy. Fear. Uncertainty. Three very different attitudes towards new technologies. And, at different times, each of us has felt all of these emotions. We are building a new world that depends on technology but, most of the time, we don’t know what to make of that technology.
These were just some of the remarks made at two conferences I spoke at last week. As is often the case recently, these conferences were attended by both technologists (coders, engineers) and non-technologists (entrepreneurs, regulators, economists, bankers, lawyers).
And what was interesting is that both groups had one common goal:
We Need to Build Trust in Technology!
But what struck me is that the two groups offered very different solutions for achieving it.
The “non-technologists” generally focused on the need for traditional rules and regulations to generate more trust in technology. They pointed to the applications of new technologies, such as cryptocurrencies, and the need for stricter state-driven control.
This way of thinking raises two questions:
- Should regulation “adjust” to the technology?
- Should technology be made to adapt to the existing rules and regulations?
In this view, trust in technology is a by-product of a well-designed set of regulations that control how technologies are allowed to operate and develop.
Not surprisingly, the “technologists” had different ideas. They didn’t see the solution to the trust problem as “more” or “better” rules.
Instead, they directed their attention to technological and decentralized solutions.
To generate trust, we must empower the crowd. At one point, reference was made to the popular HBO series, Silicon Valley: “Since we no longer trust the current Internet, we have to build a decentralized, peer-to-peer version of it (without the interference of regulators and large corporations taking advantage of our data).”
“Better law” OR “Better technology.” Two different visions of how to build trust in the future world that we are currently creating.
Education and How to Build Trust in Technology
As I listened to this debate, it again occurred to me that “education” is the answer. At least, it is a better answer than just leaving things to either the regulators or the technologists.
Better education is a necessary first step if we want to build more trust in the new world.
To prepare the next generation and help them build a better future, we must focus on four dimensions of learning and education.
Knowledge- and Skills-Based Learning
Teaching has always tended to be “backward-looking.” Transmitting the settled knowledge of the past has been the starting point for our approach to education.
But, we also need to think about other skills and capacities that are important in a world that is increasingly built around software, machines and other ground-breaking technologies.
A lot has been written about the importance of building skills that will help the next generation live with advanced and autonomous technologies and machines by performing the tasks that technology cannot do. These creativity-oriented skills include critical thinking, entrepreneurship, teamwork, the arts, negotiation, etc.
Besides creativity-oriented education, we need to address the issue of “proceduralized” and standardized teaching that merely focuses on the repetition of knowledge and on grades. Instead of anchoring instruction in processes and procedures that take all the flexibility out of education, we must teach the next generation how to continually self-learn. Or, to put it differently, we should make sure that training doesn’t kill creativity.
I’ve written before about the importance of “self-learning.” Based on some of the feedback received, we can now identify eight elements — what I call the “8Cs” of the cycle of self-learning — that are necessary to achieve this goal.
After attending the conferences last week, we can now add a fourth dimension to education: co-learning. Co-learning isn’t a new idea, of course. The concepts of collaborative learning and cooperative learning have been around for decades.
Nevertheless, digital technologies give a new impetus and urgency to co-learning. Students can now collaboratively and remotely learn through online forums and chatrooms.
However, I am more interested in the “face-to-face” in-class experience and the opportunity to start co-learning exercises between “non-technologists” and “technologists.”
Bringing more disciplines together transforms the classroom into a diverse and inclusive forum to not only discuss technology but also (and more importantly) to think about and co-design the necessary infrastructure to build trust.
How I Use Co-Learning
Most of the time, I am a techno-optimist, but I also believe that we have to get smart about new technologies (particularly given the speed and adoption rate of these new technologies). We always have to remain cautious and critical.
I plan to introduce “co-learning” exercises in which both non-technologists and technologists will participate. The assignments will focus on the questions of trust and technology:
- What is needed to increase trust in technology?
- How can technology help us increase our trust in other people?
- How can we make sure that everybody — and not just technologists and a small group of “non-technologists” — is equipped to participate in an inclusive and open process of co-creating our technology-based future?
About The Author Erik is a Professor of Business and Financial Law at Tilburg University and the Tilburg Law and Economics Center in the Netherlands. He is also Head of Governance/Vice-President at Philips Lighting. Erik is best described as a “global futurist” and “cross-cultural strategic consultant.” Erik is a regular contributor to Community Works Journal. He also writes a blog on Hacker Noon focused on his educational and personal interests.
© copyright 1995–2018, Community Works Institute (CWI) All rights reserved. CWI is a non-profit educational organization