The Obsolescence of Trust

Published in Urban AI · Jan 20, 2024

By Hubert Beroche, Founder of Urban AI

Source: Uber

In a recent lecture, American designer Judith Donath presents an idea that may seem surprising: technologies are replacing trust. This is evident, for example, when ordering an Uber or booking an Airbnb. In such cases, we willingly get into a car with a stranger or stay in the home of someone we don’t know. Normally, these actions would require a high level of trust, which is why institutions had to be developed to make them possible. The taxi license, for instance, is a form of institutionalized trust. In addition to attesting to certain skills, it is part of a network of legitimate actors and infrastructures. Therefore, when we get into a taxi, it’s not just the driver we trust but the institutions that support them.

But in what — or whom — do we place our trust when we use Uber?

The answer to this question is actually quite simple: no one.

Uber doesn’t offer trust but efficiency and transparency: efficiency by providing an optimized transportation service, and transparency by displaying an average rating, the number of trips completed by each driver, and the price of the journey. As the researcher Rachel Botsman explains, transparency is, in fact, the opposite of trust.

Trust is a form of belief, of faith. It characterizes our ability to rely on individuals whose future actions we do not know and cannot know. As mentioned earlier, this trust can be institutional, but it can also be personal. In this case, it is proximity (emotional, geographical, etc.) that encourages us to trust. This is what happens when we confide in a friend or lend money to a loved one. In each of these cases, it is impossible to be certain that the secret will be kept or that we will get our money back. But despite, or perhaps thanks to, this uncertainty, we choose to trust. It is therefore the unknown, or more precisely, the asymmetry of information, that characterizes trust.

On the contrary, transparency is what rebalances the asymmetry of information. In the prisoner’s dilemma, for example, if one of the players had access to the other player’s decision, there would be no need to decide whether to trust or not (see the sketch below). More generally, if we had a crystal ball capable of revealing the secret motivations of our loved ones, we would not need to trust them. As Diego Gambetta, a researcher specializing in this topic, summarizes: “If we were blessed with unlimited computational ability to map out all possible contingencies in enforceable contracts, trust would not be a problem.”
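To make the prisoner’s dilemma point concrete, here is a minimal Python sketch. It is purely illustrative: the payoff values and the belief parameter `p_cooperate` are assumptions, not something taken from Donath’s lecture or Gambetta’s work. The contrast it shows is structural: with transparency, the decision is a lookup over known information; without it, the decision has to lean on a belief about the other player, which is exactly where trust would otherwise do its work.

```python
# A minimal, purely illustrative sketch. The payoff numbers and the belief
# parameter p_cooperate are assumptions, not taken from the article.

PAYOFFS = {  # (my_move, their_move) -> years in prison for me (lower is better)
    ("cooperate", "cooperate"): 1,
    ("cooperate", "defect"): 5,
    ("defect", "cooperate"): 0,
    ("defect", "defect"): 3,
}

MOVES = ("cooperate", "defect")

def choose_with_transparency(their_move: str) -> str:
    """If the other player's decision is visible, choosing is a mechanical
    lookup over known information: no belief about them is required."""
    return min(MOVES, key=lambda my_move: PAYOFFS[(my_move, their_move)])

def choose_under_uncertainty(p_cooperate: float) -> str:
    """Without transparency, the choice must be weighed against a belief
    (p_cooperate) about what the other player will do: this is where the
    question of trust enters."""
    def expected_years(my_move: str) -> float:
        return (p_cooperate * PAYOFFS[(my_move, "cooperate")]
                + (1 - p_cooperate) * PAYOFFS[(my_move, "defect")])
    return min(MOVES, key=expected_years)

print(choose_with_transparency("cooperate"))  # pure calculation over a known move
print(choose_under_uncertainty(0.9))          # calculation over a belief about a stranger
```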

In many ways, this is also what Uber aims to achieve by providing users with as much information as possible about its drivers. Access to their ride histories and ratings is intended to inform users about their behavior and reliability, while displaying the price of the trip and implementing a secure payment system make fraud impossible. All this information exchanged between users, drivers, and Uber itself reduces the level of trust involved in the process.

In reality, the majority of the urban technologies and applications we use (Airbnb, Google Maps, Amazon, Deliveroo, and so on) operate according to this same logic. More generally, our information societies are characterized by a quest for transparency. Every fragment of our reality is captured, calculated, and exchanged. The goal is to unleash information and data, enabling each individual to make better decisions and society as a whole to function more efficiently. This vision finds its most advanced realization in the Chinese social credit system. This initiative aims to calculate a score representing the integrity and reliability of each citizen by aggregating their more or less positive actions. Purchases, social interactions, solvency, and even behaviors are among the many parameters taken into account to calculate their “Citizen Score.” The higher the score, the more the individual is considered a “good citizen,” with greater access to services and privileges (such as certain loans, jobs, or urban spaces).

This article by Rachel Botsman is particularly interesting on the (non) role of trust in the Chinese social credit system. Source: WIRED
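Neither this article nor public sources describe a single official scoring formula, so the following Python sketch is deliberately hypothetical: the categories, weights, signal values, and scale are invented solely to illustrate what “aggregating more or less positive actions” into one number means.

```python
# Hypothetical sketch only: the categories, weights, events, and scale below are
# invented to illustrate score aggregation; they do not describe any real system.

WEIGHTS = {
    "purchases": 0.2,
    "social": 0.3,
    "solvency": 0.3,
    "behavior": 0.2,
}

def citizen_score(events: dict[str, list[float]]) -> int:
    """Collapse heterogeneous, context-specific signals (each in [-1, 1])
    into a single scalar on an arbitrary 0-1000 scale."""
    total = 0.0
    for category, weight in WEIGHTS.items():
        signals = events.get(category, [])
        if signals:
            total += weight * sum(signals) / len(signals)
    return round(500 + 500 * total)

# Two very different profiles can collapse to the same number:
print(citizen_score({"solvency": [1.0], "social": [-1.0]}))   # 500
print(citizen_score({"solvency": [-1.0], "social": [1.0]}))   # 500
```

The point of the toy example is simply that the aggregation step throws away exactly the contextual information that, as argued below, is what makes trust meaningful at all.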

But is it really a bad thing? Who hasn’t been betrayed by a close one or taken advantage of by a merchant? Who hasn’t suffered from having placed their trust, perhaps wrongly? In addition to being deceptive and risky, trust is discriminatory: we have a natural inclination to believe those who resemble us. Finally, from a progressive perspective, one might be inclined to rejoice in the replacement of trust with information. Just as science succeeded religious belief, trust, a remnant of an uncertain and tribal world, would be condemned to disappear in favor of knowledge.

Faced with these questions, one might initially be tempted to romanticize trust, to see it as the spice of life, the “little existential leap” that adds excitement to existence. However, one would soon be compelled to acknowledge that this is a pleasant but unsustainable argument at the scale of a society. It might then be argued that trust is precisely “one of the most important synthetic forces within society” (Georg Simmel). Nevertheless, several anthropologists and sociologists have shown that forms of cooperation without trust are possible. In these cases, moreover, trust comes second, as a result of a successful association rather than a prerequisite for it.

So isn’t it time to joyfully work towards the obsolescence of trust?

Source: NewScientist

Before answering this question, we must first note that trust is contextual. An individual may be very reliable with payments but unfaithful in love. A friend may be helpful but unable to keep a secret. Trust, in itself, means nothing. That is why a “Citizen Score” or a trust rating, regardless of the data and algorithms on which it is based, is nonsensical.

Furthermore, it is often rightly observed that trust is our way of acknowledging the freedom of others. We trust because those in whom we believe have the freedom not to conform to our expectations. They always have the possibility to disappoint, betray, or manipulate us. But they can also surprise us, delight us, enchant us. And this is probably one of the most fundamental characteristics of trust: it attests to freedom.

“Abraham Is Ordered By God To Sacrifice His Son,” painting by Harry G. Seabright. In “Fear and Trembling,” Kierkegaard interprets Abraham’s trust in God as an existential ascension toward freedom.

By definition, it is not possible to calculate, rate, or synthesize this freedom. However, what can be done — and what the majority of technologies are designed to achieve today — is to restrict it. Rating systems, behavioral analysis algorithms, and surveillance devices do not reveal our innermost selves. They do not liberate information. They are regulatory instruments, more or less coercive, to bring about a certain future. Going back to our initial example, we do not know if the Uber driver is a good driver, if they know the city well, or if they are truly reliable. But these considerations are rendered obsolete by a combination of control mechanisms (behavioral nudging through a rating system, calculation of an optimized route, vehicle geolocation, etc.) that limit the driver’s freedom and increase the predictability of the ride.

Therefore, information technologies do not make trust obsolete by providing more knowledge but by restricting our autonomy. In other words, these surveillance technologies do not illuminate the future; they simply make it more predictable.

The purpose of these considerations is not to reject information technologies but to emphasize that they should not, and cannot, replace trust. On the contrary, they can strengthen and enlighten it. This is precisely what Eric Gordon and Tomas Guarna propose in Solving For Trust. In this report, the two American researchers explore how information technologies, particularly AI, can contribute to strengthening trust between citizens and local actors, notably by creating more proximity (for example, by enhancing communication channels between these actors or by processing citizen requests more quickly). It is interesting to note that these recommendations can also apply to private urban actors. Fundamentally, ride-sharing systems offer a real advancement and make life easier for many city dwellers by decentralizing individual transportation services. However, it would also be possible to use these applications as tools for building proximity. Among other possibilities, the matching algorithm could prioritize connections between drivers and users who have previously shared a ride, rather than pairing each trip with a different driver, as sketched below. Such an application would thus serve to grow interpersonal trust among its users.
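Solving For Trust does not prescribe an algorithm, so the sketch below is a hypothetical illustration of that last idea: a dispatch function in which a driver who has already ridden with a given user gets a small bonus in the ranking, so repeat pairings are preferred when the extra detour is small. The data structures, the ETA-based cost, and the size of the bonus are all assumptions, not a description of any real ride-hailing system.

```python
# Hypothetical sketch of the proximity-building matching idea described above.
# The scoring terms (ETA and a "shared history" bonus) and their weights are
# assumptions for illustration; no real dispatch logic is implied.

from dataclasses import dataclass

@dataclass
class Driver:
    driver_id: str
    eta_minutes: float  # time for this driver to reach the rider

def pick_driver(rider_id: str,
                candidates: list[Driver],
                ride_history: set[tuple[str, str]],
                familiarity_bonus_minutes: float = 3.0) -> Driver:
    """Rank candidates by ETA, but credit drivers who have already ridden
    with this user, so repeat pairings win when the detour cost is small."""
    def effective_cost(d: Driver) -> float:
        bonus = familiarity_bonus_minutes if (rider_id, d.driver_id) in ride_history else 0.0
        return d.eta_minutes - bonus
    return min(candidates, key=effective_cost)

# Example: the familiar driver is chosen despite being 2 minutes farther away.
history = {("rider_42", "driver_A")}
drivers = [Driver("driver_A", 7.0), Driver("driver_B", 5.0)]
print(pick_driver("rider_42", drivers, history).driver_id)  # driver_A
```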

A final word on the progressive vision mentioned earlier. Since the 20th century, we have known (and experienced) that history is not a straight line. Science does not succeed religion, and knowledge does not supersede faith. These realities can coexist because they operate at different levels of commitment and rationality. The same is true of trust and information. The difference between “trusting that” and “trusting in” probably best captures this phenomenon. The first form of trust is a probabilistic calculation that is refined as information becomes available. The second is unconditional; it sets calculation aside and involves a subjective truth, because it commits the one who gives it. For instance, a teacher may believe that a student will not pass an exam and yet continue to trust in their potential. This latter belief is a form of faith, an intimate conviction, not necessarily mystical, driven by professional ethics, natural optimism, or intuition. This trust is a precious gift, as it opens up new futures for the one who receives it.
