An interview with Andrea Bauer

Trust as a social phenomenon; The rigidity of blockchain; The increasing complexity of technology;

With Company
Transformative Times
17 min read · May 25, 2020


We close the week with a chat on trust, blockchain, the role of technology in society, and much more, with Andrea Bauer — innovation consultant, futurist, and strategist in digital media. Andrea founded BEAM Innovation Studio and co-founded the D.DAY Network. She is also the author of The Crypto Economy and Trusting in Mobile Payment. You can follow her on Twitter. On our side, we brought Thomas Walker and Miguel Coutinho.

Have a great week.

Thomas: Thank you so much for joining us today, Andrea. It’s going to be really great to have this chat with you. As you know, we’re going to structure this conversation around trust. For the past two decades or so, even more, we’ve been kind of abstracting certain functions of our real-world experience into the digital world. This reduces friction but also introduces us to new risks. As you’ve done a lot of work in this space around trust in digital payments and cryptocurrencies, we’d like to focus on this: its potential and, as I said, the risks moving forward.

So as we’ve been abstracting trust further and further, through things like the so-called trust economy with Airbnb as a famous example, we also have blockchain and this kind of decentralised trust system, which are examples of trust actually functioning digitally. But on the other hand, we haven’t found a way to deal with the proliferation of fake news. How is trust being digitised? And the points I just mentioned, are they several independent movements or are they different expressions of the same movement?

Andrea: I think that, first of all, it’s super interesting to understand the social phenomenon of trust; when I was writing my first academic book about this topic, it was about, as you said, trust in mobile payment. I was very excited about learning about trust from a sociological perspective. The term itself — trust — describes a social phenomenon. It’s super interesting when you look at the word from a historical perspective and how it was used hundreds of years ago. Its meaning has changed, of course, along with how we adopted this phenomenon within our society. Trust is super relevant to how we navigate through our lives because if there is one thing that is true for all of us, it is that we do not know what happens in the future. So we have to kind of guess what the right directions are that will help us survive in this very unpredictable world.

And so, when you go back hundreds of years and look at the word ‘trust’ from a more etymological perspective, you see that back in the day the German word was ‘truwen’, which described the relationship to the gods. So when people used the word trust, they were actually describing their relationship to the gods. And with the social system changing, this word was used more and more to describe interpersonal relationships. So the phrase “I give you my trust” became more and more like social credit. And then Niklas Luhmann came along and looked at our economy and saw that, more and more, people were also trusting systems.

This is a very different element: system trust is essentially different from interpersonal trust on various levels. And system trust is actually way more resilient than interpersonal trust. A person can disappoint you very quickly and you’re like, “OK, I don’t want to talk to you anymore,” and it’s as if your friendship is over; whereas if you go to an ATM and it doesn’t work, you don’t immediately cancel your account with the bank just because certain interactions with the system did not work out.

So, what is trust? First of all, trust is there to reduce the complexity of the unknown. It is a positive view into the future, a belief that things will work out. So, when we talk about technical systems, it is very much about the way we design our solutions; whether it is the internet, a mobile payment solution, or cryptocurrency as a new form of mobile payment, it’s very much about how we design trust towards such a system. And in this case, we can see that there are different aspects of trust that people find important before they start to use a given system. When it comes to payment services, we can very quickly see that it is about security, but also about usability. So you’re always doing this balancing act. In a way, it is about how much usability you can provide; but if it’s easy to use, it can also be easy to hack. And so there are sometimes contradictory goals when you design a financial service, for example.

So, how does the trust layer influence emerging technologies? When you ask me that, it is, of course, essential. Sometimes people don’t think enough about it, and then they are surprised if things are not adopted. Just look at your walk to your office: how many things have you trusted in order for you to arrive there? Right now, we are in the corona-times, and no one goes to the train station because people are highly aware of the things they are trusting. And so if you do not trust a thing or system, you will not adopt it. And I think this is super interesting. Whether it is your bike, your apartment, your heating system, your computer, your instant messenger, your credit card, all kinds of stuff.

Now, if we look at blockchain, for instance, I thought it was quite interesting to learn that it was very quickly referred to as ‘the trust machine’, or as a trustless system. When you look at the definition of trust, it is about reducing social complexity, because life is super complex. And in the digital realm, and in particular in financial services, what I learned is that, of course, you design the interface in a way that people easily opt in and instinctively trust it. But what you see is that, in the end, in the digital realm you cut out the risk as much as you can, so you’re not faithful anymore. Going back to what I was saying in the beginning: back then, trust was used in the conversation towards the gods, so faith was a synonym for trust. But in a digital world, trust becomes more like the ability to say that something is 100% true. For instance, when you make a digital transaction, you want to know 100% that the money arrived on the other side, and it’s designed as a huge green check mark. And when you see that, it’s losing this element of faith, of trust, and becomes, more and more, an element of control, even in your expectation as a user. This is the result of the digital context, because you cannot just be faithful and hope that the money arrives in the other account; you really must know.

Coming back to blockchain and its definition, it’s funny because it’s called the trustless system. And yes, it might be, because it’s actually more of a control system than a faithful ‘let’s see how it works out’ system. But it’s super important to understand the difference between interpersonal and system trust, because, you know, we’re using one word towards friends, but it’s not the same when you’re using it towards a technological system. A technological system doesn’t know anything about forgiveness, or all these kinds of non-quantifiable social interactions. There is only a right or wrong, a yes or no.

Thomas: There’s a specific phrase in your answer that really stuck out to me, which was “to reduce the complexity of the unknown.” And, obviously, at the moment — and you kind of touched on this as well — with our current circumstances, as you called it, the corona-time, everything is up in the air, right? Including trust and privacy being questioned with things like contact tracing. What’s your perspective on this? Do you see this as a moment of acceleration for digital trust?

Andrea: I mean, in the current corona-time, I find myself trusting the system because you sort of have to at the moment. This is a very absurd time we’re in, and now this digital layer that we have built over the last decades is helping us a lot to get through this phase — and it’s clearly just a phase. It’s helping us, economically and socially, to stay connected. At the same time, what I see — even more important than this layer — is that it shows us in a brutal and enormously naked way how our system has really evolved since WWII. From an economic perspective, from a political perspective: how different countries are coping in different ways, what populism really means, and the different virtues that are preached and how they manifest in our systems.

When we talk about applying digital tools to extend the governmental, political, or economic system, private companies are, of course, way ahead of governmental systems in using those tools. And so when we talk about governments using digital tools, it depends on how democratic the government really is, and not just on paper, and how free a person really is in that country. Digital systems need to be carefully set up because they can give governments enormous power over citizens, especially when we talk about digital money. Maybe you remember when the PayPal account used by Julian Assange’s WikiLeaks was frozen from one day to the next. This can happen super quickly, you know. Independently of what he did, and of course he’s a controversial person, but still, that this is possible: that you can just say this person is a political enemy and freeze their account. You can do this to a person from any sort of philosophical or religious perspective; you can literally just freeze their bank accounts, their financial freedom, you could say.

So what I want to say is that it is very important to look very carefully at those solutions. And I think, when we are talking about this tracing app, and of course it’s still in the making, it kind of made sense to me that they are really thinking about building a smart, anonymous, and optional solution. So there are elements where I think, ‘okay, this makes sense’. Now they also agreed to go for decentralized data storage solutions, whatever that means in practice; you know, this is always a thing. But I think it kind of comes through that they understand that there’s a high danger, because it’s also health data that you are storing there — and this is some of the most sensitive data in the world.

Which brings me to a concept that I think is really interesting, which is discussed in the blockchain community but is, I guess, also independent from it: the concept of self-sovereign identity, which defines the aspects of a technical concept that allow you to really own your data. And when it comes to very sensitive data, financial data, or health data, you can say, ‘ok, I’m staying independent’. So there are 10 rules of self-sovereign identity, and when people try to apply them, they always find out that it’s not that easy, because every use-case is unique, making it tricky to really fulfil all 10 of them. For instance, storing some sensitive data on a phone: it’s about how it is encrypted. Can I go from one service to another service? This implies that there should be some standards on APIs. And so it’s a tricky concept when you try to translate it into reality, but I think this is the way to go.
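To make this a bit more concrete, here is a minimal, purely illustrative sketch of what ‘really owning your data’ could look like at the smallest scale: a signed credential that is stored encrypted on the user’s own device and presented to services on the user’s terms. It is not any particular self-sovereign-identity standard; the issuer, the claims, and the helper names are invented for the example.

```python
# Illustrative sketch only, not a real SSI protocol: a credential signed by an
# issuer, kept encrypted on the user's device, and portable between services.
import json
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# A hypothetical issuer (say, a registry office) signs the user's claims once.
issuer_key = Ed25519PrivateKey.generate()
claims = {"name": "Jane Doe", "birth_year": 1990}   # hypothetical claims
payload = json.dumps(claims, sort_keys=True).encode()
signature = issuer_key.sign(payload)

# The user stores the credential encrypted under a key only they control,
# on their own device rather than on the issuer's or a service's server.
vault = Fernet(Fernet.generate_key())
stored_credential = vault.encrypt(payload + b"|" + signature.hex().encode())

# Later, the user decrypts locally and presents the credential to any service;
# the service needs only the issuer's public key to verify it (portability).
presented, sig_hex = vault.decrypt(stored_credential).rsplit(b"|", 1)
issuer_key.public_key().verify(bytes.fromhex(sig_hex.decode()), presented)  # raises if tampered
print("credential verified:", json.loads(presented))
```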

There’s also another concept, which is called zero-knowledge proof, where you don’t always give all the information about yourself to a service. So in the end, the service is not receiving your real age but understands that you are over 18, and with this you’re allowed to access a certain area online or in the real world or whatever. It’s never the case that someone learns when you were born and where you were born and all those things.
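A real zero-knowledge proof rests on cryptographic machinery far beyond a short snippet, but the interface Andrea describes can be sketched with a simpler stand-in: a trusted issuer checks the birthdate privately and signs only the derived statement ‘over 18’, so the service verifies that statement without ever seeing the underlying data. The issuer, the predicate string, and the function names below are hypothetical.

```python
# Stand-in for the zero-knowledge idea, not an actual ZK proof scheme: the
# service receives only the derived claim ("over_18") plus a signature from an
# issuer it trusts; the birthdate itself never leaves the issuer.
from datetime import date
from typing import Optional
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

issuer_key = Ed25519PrivateKey.generate()      # hypothetical issuer, e.g. a civil registry
issuer_public_key = issuer_key.public_key()    # known to every service

def issue_age_attestation(birthdate: date) -> Optional[bytes]:
    """The issuer checks the birthdate privately and signs only the predicate."""
    is_adult = (date.today() - birthdate).days >= 18 * 365  # rough 18-year check
    return issuer_key.sign(b"over_18") if is_adult else None

def service_admits(attestation: bytes) -> bool:
    """The service sees only 'over_18' and a signature, never any personal data."""
    try:
        issuer_public_key.verify(attestation, b"over_18")
        return True
    except InvalidSignature:
        return False

proof = issue_age_attestation(date(1990, 5, 1))  # hypothetical user data
print(service_admits(proof))                     # True, yet the birthdate stayed with the issuer
```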

I definitely think, you know, that there are technical services that improve our social information system and the information exchange within our national, local, and international communities. And I think a pandemic like this kind of also shows us that we have to react fast. But at the same time, how do we know what sort of information is valid? And then blockchain could be an interesting technology to verify information, or to verify that authors are qualified to say something. We already have these sorts of systems in the offline world, with institutions and titles, and so on.
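As a rough illustration of what ‘verifying information and authorship’ could mean at the simplest level, the sketch below has an author sign the hash of a piece of content so that anyone holding the author’s public key can check both integrity and authorship. Whether that record is then anchored on a blockchain or any other tamper-evident log is exactly the open question raised next; the key names and the example content are invented.

```python
# Minimal illustration of verifying authorship: the author signs the content
# hash, and anyone with the author's public key can check integrity and origin.
# Anchoring the record on a blockchain (or elsewhere) is a separate design choice.
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

author_key = Ed25519PrivateKey.generate()      # held by the author (hypothetical)
author_public_key = author_key.public_key()    # published, e.g. via an institution

article = b"Example statement by a qualified author."   # hypothetical content
digest = hashlib.sha256(article).digest()
record = {"content_hash": digest, "signature": author_key.sign(digest)}

def is_authentic(content: bytes, record: dict) -> bool:
    """Check that the content matches the recorded hash and the author's signature."""
    if hashlib.sha256(content).digest() != record["content_hash"]:
        return False
    try:
        author_public_key.verify(record["signature"], record["content_hash"])
        return True
    except InvalidSignature:
        return False

print(is_authentic(article, record))            # True
print(is_authentic(b"tampered text", record))   # False
```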

I think that blockchain could be a sort of technology to validate sources of information and authorship. But again, the question is: who is validating that? So here we come to the element of transparency, and also, as I understand the blockchain system today, it’s very rigid. And in such a dynamic world as we are in, it’s a little bit too rigid for me. And, of course, there are people who like talking about combining blockchain with artificial intelligence, and artificial intelligence is still in its infancy. It’s just coming up today, though there were already some AI ventures in the past; it sounds great in the beginning, but then you think, yeah, but how? Because with blockchain alone we already have different philosophies and different sorts and types of blockchains. So there is an intense complexity that we are experiencing today.

So how can we choose between all those technologies? In particular, because there’s so much complexity in the system, the only way to answer this is by going back to the virtues and values of our society. And not just writing them down on paper, but really: how do we live in our society? How much solidarity is there? How much technology do we need? Is it just sometimes a conversation with your neighbour? It’s also interesting, when we look at the evolution of technology itself, to understand why we develop all these technologies. Why did we develop blockchain in the first place? Blockchain technology is also a next step in the evolution of how we manage information, and when we look at the evolution of technology, we see next to it the evolution of information exchange and the evolution of resource allocation. So this is why we use that stuff. And I think it has reached a certain level of complexity today. On a micro-level, yes, we have to look at all of the major problems, but also at what we are capable of without technology.

Is it enough when you say ‘stay at home’ and people say ‘I believe in it’? Because sometimes you actually just need to say, ‘There is a virus and it can kill your grandmother, so stay at home,’ and people do it. So, of course, you can instead build a detection system that sees that you are in a risk group, and then you just don’t have access to the airport or to the Central Station.

You can do this, but it costs a lot of energy; also, you know, I would prefer the first version, where people just understand it themselves. But also, since digital systems are not as dynamic, sometimes we need to understand from the beginning that what we are automating is a really normal procedure. We don’t want to corrupt our capabilities as a society, and sometimes you can react faster when you just spread a message and people will react; you don’t have to control them.

Miguel: This more humane perspective on things is really interesting; sometimes we focus too much on the sexiness and hype of technology and forget that, as you said, things like trust are a social phenomenon. And following that, I’d ask if it’s possible to have digital trust, if that’s something that even exists. I believe that it doesn’t, but I’d like you to talk a bit about it.

Andrea: Yeah. I think, you know, digital has become such a normal term, right? In the end, I would just use automation as a synonym for digital, because that’s what it is. Technology and digital systems are all about automation. Sometimes we have more dynamic elements in them, and sometimes they’re more rigid. So you’re asking me, is there an automated trust? Yes, there is, but it’s the same as control. It’s like an equation.

Miguel: It’s more like exchanging information that allows both ends to trust each other.

Andrea: Yeah, it actually is, because it is about the fact that you are not betraying me. In the traditional sense, I would say, back in the day, when you trusted a person there was always a risk of betrayal, something you might lose. When I did my first Bitcoin transaction I really had to trust, because I had no idea how this system worked; but today, these systems are so well defined in particular areas. It’s completely automated and I trust it because it’s so rigid and so well constructed. There’s no element, or very little element, of risk in it. So I would say there is something like digital trust, but risk is an element which becomes less and less present in this concept, which I find very interesting to think about. It says a lot about us as a society, because you go through your day and you don’t recognise how many things you trust, how many systems and people you trust. If trust isn’t there, you talk about it a lot, and if it’s there, you don’t talk about it at all. So maybe this pandemic is a good time; it kind of reminds us that things get shaken up, and of how much uncertainty our ancestors had to live with. And they still made it. Or not, and that was also fine.

Thomas: I’d like to stick with this theme of trust versus control a little bit here. So, I think, as we put our trust more and more in digital technologies, specifically at the moment in algorithms, let’s say social media algorithms, they learn more about us and they actually cater to our own desires. My question is, what’s next? Like when digital systems don’t just control aspects of our digital lives but, as you kind of touched on, potentially aspects of our physical lives too, such as access to an airport or to a train station, as you pointed out. How is this going to play out? Do we need to evolve or change our understanding of trust in these systems, so to speak?

Andrea: What I see at the moment is that people and economists see huge potential for growth in these technologies. And the problem is that it’s portrayed as if it’s already finished, but we are actually in a research phase. Whether it is blockchain or artificial intelligence or drones or whatever, there are so many new technologies that are kind of coming up. And at the moment, on one hand, you hear all these utopias, mostly from neoliberal-driven minds who want to sell you something, this better world. On the other side, you know, there are the artistic constructions, sort of a reminder of a system like Orwell’s, or Huxley’s Brave New World, or something like this.

But the fact is that we are in a research phase. It’s very hard to say how it will turn out. But I’ve been working in technology for 15 years now, and I was enormously inspired in the beginning and still am, but I became more realistic. I’m not so naive. I definitely had my moments in the beginning: oh my God, the online world. Oh my God, mobile is the best. Oh my God, it’s crazy, the iPhone is the best device. I got all the apps, all of them. No one talked about privacy then. Looking into the code, you’d think: what would someone who gets all of this information do? But no, this would not happen. And voilà, then there was Edward Snowden. And he informed us about the reality. And it was a shock. And what I also learned then was that when you think about the worst thing that could happen with a technology, it can happen.

Of course, when you think about the innovation process, you should not limit your creativity; really think about what things you can do, make the world a better place, and think like the sky is the limit. And this is all fine, but we really have to check those systems and transparently learn from them. This is what I also like about the blockchain world: there is a great open-source culture; of course, they have their conflicts, like everywhere. But still, you learn from it and then come up with your own version, and share it with the community. And this is something that is very important with technology, because there is no single solution. Talking about, for instance, self-sovereign identity: every use case might need its own identity protocol. It’s different when you’re in a health context from when you just want to have a shopping cart. It’s a different thing.

So, where are we heading? Technology will not go away, of course, and it will maybe become even more complex. But we need a lot of transparency and a good open-source, learning-from-each-other culture to understand what the really good applications for some solutions are. And always think about what the worst case could really be, and have really hardcore stress tests from time to time. Is it idiot-proof? If some totalitarian comes around, could she profit from it? Because if people can profit from something, they will. And blockchain showed us that, even with the best intentions, there were a lot of people presenting solutions, all these ICOs, who sort of took the money and ran. Regardless of their intentions, I would say we’re just humans, and those tools really need a stress test before they go out and are applied to hundreds of thousands of people.

Thomas: It’s interesting, because some of the language you’re using around stress tests or these worst-case scenario tests sounds like it’s verging on the concept of regulation. Is this the case? And if so, do our traditional institutions actually have the agility to be able to keep up?

Andrea: It’s becoming more and more difficult for traditional companies to adapt. And for some companies, maybe it’s really not needed. But this gap is widening with all the complexity of these new technologies. And, at the same time, they are changing; they’re not finished. Mobile is sort of finished; it’s still changing, of course, but adapting in those areas is sort of easy. Then, to adapt, you have to understand blockchain, then you have to understand AI, then you have to understand virtual reality, because somewhere in that mix there might be a solution exactly for your business. So it needs specialists.

But this is the thing: what happened decades ago in universities is now out in the open economic realm, in the startup world. And this is also a little bit the problem, of course, that they are funded not by the government but by a VC, and this gives the whole research process a completely different dynamic, because you have to not just be profitable but sell the whole thing for X amount of money.

This is tricky. What I just recommended will be super hard to fulfil in a private research context, almost impossible. But at the same time, those technologies can be applied for the worst you can imagine. And some people will apply them for the worst, I’m 100% sure.

Today we talk about blockchain, but I think this term will disappear, because already in the mainstream it’s talked about less. The concept is pretty complex and it’s ever-changing, so you have to really keep track of it. So it’s going to disappear from the B2B solution discussion, it’s going to disappear from some conferences. But it will be applied, of course, so the question is: how much transparency do we have? Is our data safe? Is it really a self-sovereign system? When I delete something, is it really deleted? So there are many questions which are unanswered, but really you can only break it down to the use-case and then make that use-case as good as possible.

Really owning your data is an interesting and enormously needed concept, independent of blockchain, AI, or whatever. Because today you live in a democracy; tomorrow this might change. Things are changing constantly, so you don’t know. And you don’t want your information lying on a corporate server or on a government server if those changes go against you.

Thomas: With this in mind, I have one final question. Imagine we are now in 2025. Look around: what do you see?

Andrea: It’s not very far, actually. What do I see… I see masks, a lot of masks. But what do I see? I think, you know, that from the outside things will look almost the same as today. I think that there will be a little bit more control, and there will be a sort of control system which will try to be as decent as possible. And sometimes it’ll make a small beep, you know, in some quadrants when you go somewhere. Sometimes you will know why, and sometimes you will not know why.

This conversation shouldn’t stop here; share your thoughts in the comments below. We’ll be back next week, talking with Leyla Acaroglu about education, sustainability, and a lot more.

If you have any feedback for us or would like to participate in some way, drop us an email: tt@with-company.com
