Empowering Trust in Civic Technology

The current blockchain hype is epitomized by the claim that “blockchains create trust”. Early adopters and evangelists love to throw this blanket statement around as a way to sell decentralized ledger technology to the masses.

These bold statements of techno-solutionism obviously appeal to those grasping for ways to disrupt and fundamentally shift the impact of technology, especially within opaque and even autocratic power systems, with the help of distributed contractual agreements and data immutability. De-hyping the blockchain could fill a textbook of its own, so I will not get into it here; instead I want to focus on the message that technology creates trust. Technology is a medium, and as such it does not create trust. Trust is created by the individuals and communities engaging with technology.

Civic technology describes the technologies used to engage citizens and communities in creating, empowering, and facilitating new social dynamics around collective participation and problem-solving. In a data-driven future, civic technology systems need to serve the needs of growing communities. They need to be decentralized networks that inhibit neither adoption nor future engagement, yet remain ubiquitous and facilitate tangible interactions. This ubiquity is the Internet of Things (IoT) of Civic Tech, in which intelligent objects are connected to aid and empower the citizen.

We created this graph from the chapter “From Social to Civic: Public Engagement with IoT in Places and Communities” by Can Liu, Mara Balestrini and Giovanna Nunes Vilaza. It shows their summary and categorization of current civic tech IoT systems, examined against engagement and potential societal impact.

While there has been a lot of work on trust, its impacts, and its definitions, most of this progress offers insight into trust as an economic construct rather than a socio-cognitive one. By understanding and explaining the cognitive factors involved, we might fundamentally improve our understanding of social perceptions and the actions that follow from them, especially those mediated by technology. In other words, we can improve not just the micro-level experience of an individual, but the macro-level interactions between individuals. In a society where technology is said to be creating change, new facets of education and innovative decision making need to be examined in order to equip these technologies for further innovation. To move human-civic technology interactions toward more engagement and impact, we need to heighten both the demand for trust and its facilitation, viewing this construct as an informational delegate of social decision making.

Trust in decision making, and its outputs of competition and cooperation, is arguably the most important factor in our survival. In previous work on trust (Towards a Conceptual Space of Trust in the Social Neuroscience of Consciousness, draft soon to be in press) we conducted literature reviews on the social neuroscience of trust and grouped our findings into five functional categories: (i) the construction and use of mental models of others and yourself, (ii) risk assessment, (iii) reward assessment, (iv) emotions, and (v) social memory. The first three relate to confidence in the prediction of others’ actions and intentions; the last two relate to valence, the general attractiveness or aversiveness of a circumstance or situation. We added a third metric, accuracy, which captures whether that social confidence is objectively well placed, so we can assess scenarios where a decision maker believes their information is right but it is objectively misplaced.
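To make that grouping concrete, here is a minimal Python sketch of the mapping (the labels and structure are my own illustration, not code from the paper):

```python
# Illustrative grouping of the five functional categories into the two
# prediction-related dimensions described above (labels are mine, not the paper's).
FUNCTIONAL_CATEGORIES = {
    "mental_models_of_self_and_others": "confidence",  # (i)
    "risk_assessment": "confidence",                   # (ii)
    "reward_assessment": "confidence",                  # (iii)
    "emotions": "valence",                              # (iv)
    "social_memory": "valence",                         # (v)
}

# Accuracy sits outside this grouping: it asks whether the confidence a
# decision maker holds is objectively well placed.
CALIBRATION_METRIC = "accuracy"
```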


We construct a conceptual space of trust defined by three dimensions: confidence (C), valence (V), and accuracy (A), each with its own gradient: high (+), low (-), or neutral. We hypothesize five instantiations of the concept of trust: Trust, Distrust, Mistrust, Mis-distrust, and Untrust. Trust is the willingness to make oneself vulnerable, together with the subjective probability that the trustee’s behavioral outcome will result in positive utility (+C, +V, +A). Distrust is the unwillingness to make oneself vulnerable, together with the subjective probability that the trustee’s behavioral outcome will result in negative utility (+C, -V, +A). Mistrust is a measure of misplaced trust, where the trustor’s mental model has low accuracy (+C, +V, -A). Mis-distrust is a measure of misplaced distrust (+C, -V, -A). Untrust is the unwillingness to make decisions about vulnerability at all (-/+C, neutral V, -/+A).
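To make the mapping explicit, here is a minimal sketch (my own illustration, not code from the paper) that classifies a (C, V, A) triple into one of the five instantiations, assuming each dimension has already been reduced to a high, neutral, or low label:

```python
from enum import Enum

class Level(Enum):
    LOW = "-"
    NEUTRAL = "0"
    HIGH = "+"

def classify_trust(confidence: Level, valence: Level, accuracy: Level) -> str:
    """Map a (C, V, A) triple onto the five instantiations described above.

    Illustrative only: the space is defined conceptually, not as an algorithm,
    so the tie-breaking choices here are mine.
    """
    # Untrust: no clear willingness either way, i.e. neutral valence
    # (confidence and accuracy may be high or low).
    if valence is Level.NEUTRAL:
        return "Untrust"
    # The remaining four cases assume high confidence in the prediction.
    if confidence is Level.HIGH:
        if valence is Level.HIGH:
            return "Trust" if accuracy is Level.HIGH else "Mistrust"
        else:  # negative valence
            return "Distrust" if accuracy is Level.HIGH else "Mis-distrust"
    # Low confidence with non-neutral valence is not one of the five
    # named instantiations; flag it rather than guess.
    return "Unclassified"

# Example: high confidence, positive valence, but objectively misplaced.
print(classify_trust(Level.HIGH, Level.HIGH, Level.LOW))  # -> "Mistrust"
```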


Valence, confidence in predictions, and accuracy might seem like abstract concepts, but we creatives use instantiations of them all the time. Interface and experience design, which play massive roles in the creative process, influence how technology becomes instantiated; this is where information architecture matters, where a choice to support multiple languages makes a difference, and where accessibility and even the use of animations can make or break an experience. The choice of how to construct a digital medium is just as important as the message itself, if not more so. Plenty of content has been written about proper ways to design for transparency, communication, and education. If creatives link good design standards with practices meant to empower, they can truly equip individuals with agency over their technology interactions.

Ubiquitous communication is crucial. We are social beings, and we must think of our intelligent technologies in the same light. By creating an ecosystem of key players, humans and machines alike, we can expand social constructs such as trust through the symbiosis of digital experience. These communities have uniquely defined ways of sharing social information, which means they have unique problems as well. Social concepts like innovation, democracy, and inclusion face new challenges where trust might be the only way to empower change. At a communal level, communication drives social constructs like trust from the inside out. Technology must work the same way: designed by the people and for the people, with each community trusting its own domain of confidence, valence, and accuracy to drive its unique social decision making.

Surely most designers and developers have a good understanding of proper design practices, yet we need to reach into this symbiosis between communal value and trust. Along with designing from a point of empathy (another topic with plenty of literature behind it), we need to critically understand and strive to operationalize value systems for these technologies without losing much information along the way. Rather than simply creating tooling, we need to craft storytelling devices and digital experiences that question, understand, and abstract goals, values, and risks, all alongside those within the communities we want to serve.

Want to change the way we finance communal projects? Understand how and why individuals value money and how their trust in financial security shifts when it becomes a common good.

Want to empower individuals to not only get access to healthcare data but to also get paid for it by donating their data? Understand their value of healthcare data, and under which circumstances they trust or distrust those asking for the data.

Want to leverage circular economies in cities to produce more locally fabricated goods and less waste? Understand how global power dynamics influence the value of consumers’ purchases and how to shift trust from global producers to local ones.

Understanding how we are biologically wired to assess social decisions might help creators support their civic technologies: the stories they tell, the social exchanges they facilitate, and the impact they empower. By creating technologies around social interactions, and more specifically around trust, we can better learn how our own social dynamics work. Instantiations of emergent and ubiquitous fields like empathetic AI, affective computing, adaptive and tangible interfaces, and even social robotics will soon be absorbed under the umbrella of ubiquitous technologies. But only when these technologies are created within spaces of collective intelligence, where community values and systems work together with ubiquitous and adaptive technology interactions, can trust truly change not only the way we interact with each other but the way we can ultimately create a better future.
