Thinking about trust

By Eduardo Salazar

On April 10, Barclays Research published its 2018 Equity Gilt Study, this year focusing on the economic implications of technology. Predictably, “crypto-technology” [a] became part of the analysis, and Barclays’ position on the subject could best be described as lukewarm. Marvin Barth, Head of FX & EM Macro Strategy at the bank’s Investment Unit, called crypto-technologies “interesting” and “innovative” but equally argued they are “a solution still seeking a problem.”

In support of such assessment, Barclays’ report enumerated four challenges that crypto technologies need to overcome, one of them being “acceptance and trust.” [b]

A few days later, in a completely separate development, a story involving cryptos published by Bloomberg also touched, in passing, on the topic of trust. The piece reported on a group of Harvard teenagers who decided to launch a crypto hedge fund (it seems helped only by friends and family, to the tune of $700k so far) under the name Plympton Capital. To spice up the news a bit, one of the team candidly acknowledged that “[we] don’t necessarily know a lot” (sic) but “they [friends and family] have full trust in us.”

Nice one. But it could equally be argued that if not “trusted” by friends and family, then by whom else?

Still, why the emphasis on trust? Trust, or lack thereof, is arguably the most inescapable component of social relations: at times quite visible (have you ever signed an NDA?) and at other times not so visible, but always there in the middle. Even Nakamoto once said “[the] root problem with conventional currency is all the trust that’s required to make it work.” [c] See, trust once again.

The thing is, although everyone can interpret it colloquially, trust is a very elusive concept.

In what follows, to make things simple, I’ll focus on exchange situations.

Let’s begin by saying some people prefer to speak of “confidence” rather than trust (confidence is, though, a component of trust) when the objective is to ascertain the level of competence in a “market transaction” such as buying a product or contracting a service. Here, it is important to remember that trust is ultimately contextual. In other words, it will depend on the circumstances, roles and expectations of those entering a transaction.

Having this in mind, we’ll keep referring to trust whenever there is no ambiguity.

But what do we mean by trust?

I’ll use a handful of examples to illustrate.

  1. An employer trusts his employee to provide the services he’s been employed for, in the same way an employee trusts his employer to pay his wages regularly and on time.
  2. The employee then trusts his bank, where his wages are routinely deposited, to be a proper custodian of his money and not to go bust by making wrong decisions (e.g. taking undue risks).
  3. In years now gone, my university students trusted that I would show up to my lectures (then on macro-econometrics) and, likewise, I trusted that I would not be facing an empty classroom early in the morning. That said, you could argue a key mediator here is the quality of the delivery: bad or boring lectures generate no interest.
  4. People trust certain brands because their products or services have consistently proven to deliver on their promise. Sometimes, trust could also relate to a product that is not intrinsically of good quality but that instead delivers on an emotional dimension, such as garments youngsters “have to wear” to emphasise they “belong to” whatever (a group, the latest trend and so on). Owning the brand becomes a credential for peer acceptance, or makes a personality statement.

Trust is therefore a precursor to many things, such as credibility, love or relevance.

Pushing things a bit further, I will introduce two concepts involved in the causal mechanism (not necessarily unique, mind you) which explains how trust is formed. Those concepts are reciprocity and reputation.

Before that, however, I should mention that reciprocity (as outlined, e.g., by Ostrom and Walker [2002], but also in Becker [1990] or McCabe [1996] using a game-theory setup) has been criticised as a rather restrictive concept for explaining the mechanisms of trust. That is indeed a possibility in some cases; for example, I can trust that someone might do something for me simply out of courtesy. Hence, we have to keep in mind that we’re focusing on exchange situations.

What does that mean?

In an exchange there are trustors (individuals, consumers) and trustees (entities capable of causing some impact through their products or services) that are linked by a process (in our example, a commercial exchange: goods for money at an agreed price) producing a desired outcome. Let’s say I walk into a shop to purchase a pair of jeans. On the basis of available information (which could be something as simple as “convenience”) I, the trustor, make an evaluation leading to a decision: to purchase jeans stocked by that shop, the trustee. In such a scenario, reciprocity is basically dyadic: it occurs between me and the shop.

Considering the exchange at individual level, it does not necessarily convey any expectation about the outcome. However, the accumulation of “encounters” (regular purchases in that shop) together with the transactions the shop enters into with other individuals (as in Granovetter [1985] possibly, but not exclusively, within the “social network” that I belong to) sets a level of reputation: the shop being considered as selling good quality garments at a fair price.

The important thing here is that reputation becomes a key by-product of trust, as it serves to mitigate the effects of information asymmetry.

Trust at individual or micro level, through reciprocity (here, an exchange) has a cumulative result (at macro level, when “many exchanges” take place) in the form of reputation. Reputation fuels a feed-forward mechanism that influences the trust of other individuals to engage (enter an exchange) with that shop, only limited by the market size and the reputation of any competitors to that shop.
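The micro-to-macro mechanism just described can be sketched as a toy model. To be clear, everything below is my own illustrative construction, not drawn from the cited literature: dyadic exchanges accumulate into a reputation score for the shop, and that score feeds forward into the engagement decision of newcomers who have no direct history with it (the Laplace smoothing and the 0.6 threshold are arbitrary choices).

```python
# Toy model: dyadic exchanges build reputation; reputation feeds forward
# into the trust decisions of individuals with no direct history.

class Shop:
    def __init__(self):
        self.good = 0   # satisfactory exchanges so far
        self.total = 0  # all exchanges so far

    def record_exchange(self, satisfied: bool) -> None:
        self.total += 1
        self.good += satisfied

    @property
    def reputation(self) -> float:
        # Laplace-smoothed share of satisfactory exchanges; with no
        # history the shop starts at a neutral 0.5.
        return (self.good + 1) / (self.total + 2)

def will_engage(reputation: float, threshold: float = 0.6) -> bool:
    # A newcomer engages only if accumulated reputation clears a
    # threshold: reputation mitigates the information asymmetry they face.
    return reputation >= threshold

shop = Shop()
for satisfied in [True, True, False, True, True]:
    shop.record_exchange(satisfied)

print(round(shop.reputation, 2))   # → 0.71
print(will_engage(shop.reputation))  # → True
```

The point of the sketch is only the feed-forward loop: individual encounters update a shared statistic, and that statistic, not any personal experience, is what other trustors act on.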

In the wake of the 2007–08 financial crisis (and the undeniable “crisis in confidence” that brought with it) Nakamoto proposed what was dubbed a “non-trust-based platform.” [d] But it’s a fundamental mistake to assume, as Nakamoto did, that’s indeed the case.

  • Looking at a very basic level, you can’t simply wipe away trust in the system or protocol. In Bitcoin (and other distributed ledger implementations) the protocol is anchored by cryptographic proof. In other words, you either trust that approach or you don’t.
  • So, how do you go about it? Humans use an additional mechanism (frequently in a subconscious rather than reasoned way) to help come to a decision; namely, that the ability to trust is not conditional upon the intentions of any particular entity or individual.
  • For this to hold, the system or protocol has to provide as a necessary condition a guarantee that you can confirm things by yourself.
  • Although that’s indeed possible in Bitcoin, there is still a third party involved at the moment of the transaction. In a POW-based protocol, those are the “miners” everyone talks about. [e]
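The “confirm things by yourself” point deserves a small illustration. The sketch below is a deliberately simplified proof-of-work: a miner must search for a nonce by brute force, but anyone can verify the result with a single hash computation, without trusting the miner. The 16-bit difficulty and the toy header are my own simplifications; real Bitcoin hashes an 80-byte header against a 256-bit target derived from the block’s “bits” field, though it does use double SHA-256 as shown.

```python
import hashlib

def meets_difficulty(header: bytes, difficulty_bits: int) -> bool:
    # Double SHA-256, as Bitcoin does, compared against a toy target:
    # the hash must be below 2**(256 - difficulty_bits).
    digest = hashlib.sha256(hashlib.sha256(header).digest()).digest()
    return int.from_bytes(digest, "big") < 2 ** (256 - difficulty_bits)

def mine(prefix: bytes, difficulty_bits: int) -> int:
    # The miner's costly job: search nonces until the proof holds.
    nonce = 0
    while not meets_difficulty(prefix + nonce.to_bytes(8, "big"), difficulty_bits):
        nonce += 1
    return nonce

nonce = mine(b"toy-block", 16)  # cheap toy difficulty
# Any third party can now re-check the miner's claim in one step:
print(meets_difficulty(b"toy-block" + nonce.to_bytes(8, "big"), 16))  # → True
```

The asymmetry is the whole design: producing the proof is expensive, checking it is trivial, so verification requires no trust in whoever mined the block.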

The promise of Bitcoin and other distributed ledger systems is simply that by virtue of

  • their design;
  • the challenges posed, otherwise known as “in-protocol games” (such as POW); and
  • a number of incentives and punishments,

they should help people replace their trust in individual actors (or institutions) with trust in the system itself. No more trust in bankers or politicians, you only have to trust the code.

That means you have to trust the programmer(s) but, given the platform is open source and community-driven, the argument goes that proper (as in “democratic”) governance and transparency are reasonably guaranteed. For the time being I’ll put aside the debate around this and only pay attention to the protocol. Let’s also focus on the Bitcoin network, simply because it captures the biggest slice of the cryptocurrency market (valued by market cap, in itself an imperfect and possibly misleading metric, but again, let’s park that discussion as well). Now, looking at the pie chart below, when 77% of the hashing power ends up in the hands of 5 parties (each holding over 10% of the share), or about 57% in the hands of just 3 players, the principle of a trustless system simply gets thrown out of the window.

Hashing rate distribution in the Bitcoin network as of April 22, 2018 (4-day average). For more up-to-date information go to https://bit.ly/1oZP923

The possibility of making yourself available to the network is wide open; there are no restrictions there. Even so, the thought that nowadays you could spend a few spare clock cycles to mine a Bitcoin block is a delusion. [f] Because of the quite tangible barriers to entry (cost, quite simply put), the mining leaders are unlikely to be challenged and collusion becomes a possibility; whether it will ever happen is, of course, another matter altogether.
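The concentration argument can be made concrete with a short calculation: how many parties would need to collude to control a majority of the hashing power? (This quantity is sometimes called the Nakamoto coefficient, a term from elsewhere, not from the Barclays report.) The pool shares below are illustrative numbers I chose to roughly match the distribution cited above (top 3 at about 57%, top 5 at 77%), not actual pool data.

```python
# Minimum number of mining pools whose combined share exceeds a majority
# of the total hashing power.

def majority_coalition_size(shares, majority=0.5):
    total = 0.0
    for i, share in enumerate(sorted(shares, reverse=True), start=1):
        total += share
        if total > majority:
            return i
    return None  # no coalition among these parties reaches a majority

# Illustrative shares, roughly matching the article's figures.
pool_shares = [0.25, 0.17, 0.15, 0.10, 0.10, 0.08]
print(majority_coalition_size(pool_shares))  # → 3
```

With shares shaped like the April 2018 distribution, three parties suffice for a majority, which is exactly why the "trustless" label sits uncomfortably.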

In a nutshell: we’re replacing trust in government and institutions (for what they’re worth) with trust in a number of private individuals or groups, under the pretence their objectives will be, somehow, aligned to ours.

It’s not just at the level of pure-play technology. Think about the teens featured in Bloomberg, or the Confido, LoopX or Prodeum scams, or even Savedroid’s recent PR faux pas (take, for example, Stefan Grasmann, an investor in their ICO, reflecting that Savedroid’s CEO always came across as “a trustworthy person”). It’s also a key theme in the attempts at creating stable virtual currencies; I’ll touch on this in a future piece here on Medium. In fact, trust is nearly inescapable for crypto technologies, be it at the level of bare-bones algorithmic design and implementation or their subsequent monetisation.

Unfortunately, as people ride the crypto hype no one seems to pay much attention to this very fact, as if technology were de facto neutral. Just for a moment, think about the recent Facebook data leakage scandal.

Be that as it may, I wrote “nearly inescapable” above because, in theory at least, there might be a way (and by that I mean a “grammar”) that would enable us, somehow, to avoid being too conditioned by trust. For now, take it as a rather reluctant admission that it might be possible, just might.

In the meantime, “in God we trust”?

_________________________

Notes

[a] Blockchain and cryptocurrencies are mentioned by Barclays as prime examples, but they are certainly not the only applications of the technology.

[b] The “verification” that such an obstacle has been overcome happens the moment crypto-technologies become adopted by the wider population.

[c] Nakamoto’s post on the P2P foundation forum on February 11, 2009, quoted by Salazar (2018, p. 4).

[d] Nakamoto’s reply to Sepp Hasslberger on the P2P foundation forum on February 15, 2009. (see also Salazar, op. cit., p.54 and note VII).

[e] POW is an acronym for “proof of work.” The issue is not limited to POW; e.g. POS (“proof of stake”) implementations face the same problem.

[f] Nakamoto knew quite well that an “arms race” was a probable outcome. See https://bitcointalk.org/index.php?topic=12.msg54#msg54

Some bibliography for the interested reader.

[1] Huynh, T. D., Jennings, N. R., and Shadbolt, N. R. (2006) An integrated trust and reputation model for open multiagent systems. Autonomous Agents and Multi-Agent Systems, 13(2): 119–154.

[2] Axelrod, R. (1984) The Evolution of Cooperation. New York: Basic Books.

[3] Ziegler, C. (2009), On Propagating Interpersonal Trust in Social Networks. Computing with Social Trust, 133–168.

[4] Jonker, C. and Treur, J. (1999) Formal analysis of models for the dynamics of trust based on experiences. Multi-Agent System Engineering, 221–231.

[5] Ostrom, E. (1998) A Behavioral Approach to the Rational-Choice Theory of Collective Action. American Political Science Review, 92(1): 1–22.

[6] Falcone, R., Pezzulo, G., Castelfranchi, C., and Calvi, G. (2004) Why a cognitive trustier perform better: Simulating trust-based Contract Nets. In 3rd Int. Conf. on Autonomous Agents and Multi-Agent Systems (AAMAS 2004). ACM, 1392–1393.

[7] Pollock, G. B. and Dugatkin, L.A. (1992) Reciprocity and the Evolution of Reputation. Journal of Theoretical Biology, 159: 25–37.

[8] Burnett, C., Norman, T., and Sycara, K. (2010) Bootstrapping trust evaluations through stereotypes. In 9th Int. Conf. on Autonomous Agents and Multiagent Systems (AAMAS 2010), van der Hoek, Kaminka, Lesperance, Luck, and Sen, (editors). ACM, 241–248.

[9] Bacharach, M. and Gambetta, D. (2001) Trust as type detection. In Trust and deception in virtual societies. Amsterdam: Kluwer Academic Publishers, 1–26.

[10] Gambetta, D. (1988) Trust: Making and Breaking Cooperative Relations, Oxford: Basil Blackwell.

[11] Granovetter, M. (1985) Economic Action and Social Structure: the Problem of Embeddedness. American Journal of Sociology, (91): 481–510.

[12] Salazar, E. (2018) On Cryptocurrency: A Position Paper. Forctis Research Paper Series 001–18. Accessible from
