Why Metaphors are Important in Technology

Audrey Lobo-Pulo
Phoensight
Feb 27, 2021 · 6 min read
Photo by Peter Pryharski on Unsplash

“The human species thinks in metaphors and learns through stories” — Mary Catherine Bateson

As someone who’s spent most of my time in the natural sciences, metaphors and analogies did not come naturally to me. Yet, our vibrant digital ecosystems are teeming with them: “Data is the new oil”, “Technology is just a tool”, or “The Internet is the information superhighway”. What did not occur to me was that these metaphors were inadvertently shaping not only the way I saw and understood technology — but also how it’s designed and used.

As our technological world grows increasingly complex and sophisticated, we are beginning to see not just a digital disruption across our many sectors — but the consequences of how we’ve chosen to think about technology.

Take, for example, “Data is the new oil” — while there have been many articles describing the ways in which data differs from oil, such as its effectively unlimited supply and the way it can be reused to add value — what’s missing is the deeper meta-messaging around its use. The idea that data can be ‘extracted’ as a commodity (similar to oil) carries capitalist connotations, which, taken too far, could result in violations of human rights, depending on the data involved.

Language around ‘data mining’, ‘data pipelines’ and ‘data engineering’ evokes imagery from the Industrial era — one of power-driven machinery and various refining, transportation or warehousing processes. Indeed, even the methodologies, tools and techniques developed during this time were focused within that context. One example is the use of statistical tools to improve production or manufacturing processes (my favourite modern version is the coffee valve defect rate study!).

But embedded in this setting come ideas from other entangled contexts relating to the social and economic systems of the time. And where previously the focus was on defective machine parts and improving production efficiency, these metaphors and analogies have been creeping into the way we work with human data with the stated aim of improving ‘human productivity’.

We need to stop and ask — “What was the original intention and purpose behind the development of these tools and techniques?” and also “Are those same tools and techniques appropriate or applicable in the same way for human data?”

There’s Data… and then there’s Data

Photo by Christian Olaf Adickes on Medium

“We have to be careful of the metaphors we use to make meaning, because metaphors are the language of spirit and that’s how we operate in our fields…”

— Tyson Yunkaporta from “Sand Talk”

For all the classification methods and taxonomies we’ve developed for data, few have really considered how the information collected has been decontextualised in the process of measurement or recording — and, more importantly, what this loss means for the limitations or validity of the tools we use.

If we were to use metaphor to our advantage — and change it to “data is the new apple” — how would we think and work differently with data? For a start, it’s worth recognising that the apple comes from a tree, which belongs to a forest and is in relationship with the soil and creatures of that forest.

Perhaps we could reflect that some data cannot be taken out of its many contexts without losing some of its quality.

Take, for example, an Artificial Intelligence (AI) algorithm trained on human responses in a busy city: it may perform very differently in a rural village in India. Or consider that certain types of data may have a ‘shelf-life’ — data from the 1950s on women in the workforce may not really be appropriate as input for predicting work performance for women today.

Obvious? Perhaps.

But while these examples seem fairly straightforward, what is not clear is where the line lies that separates information that can be used from information that shouldn’t be.

And what makes this even more complex is that many entangled contexts — for example, around societal attitudes, racial justice and economic equality — affect some types of data more than others.

So it’s difficult to really know which contexts affect the viability of our data for a specific purpose. Nevertheless, it’s important for us to not only be aware that they might — but also have an awareness of the consequences of these limitations.

Yet, organisations still use human data to predict human behaviour, such as an individual’s propensity to leave their organisation, with little attention either to the ethical implications for their employees — or to the analytical rigour needed to assess the viability of the models used. Meanwhile, others are discovering that even large datasets and the ‘best predictive AI models’ do little to predict life outcomes in social programs and the criminal justice system.

Technology is a Forest and the Internet a River

Copyright © Audrey Lobo-Pulo (CC BY-NC-SA), 2020

I’ve often heard arguments about the ‘neutrality’ of technology accompanied by the metaphor that “technology is just a tool” — which, in my mind, always brought up the image of a person holding a hammer, seeing most things as nails!

What I’ve been oblivious to, until now, was the relationship between the person, the hammer and the environment — and how the attributes of that hammer change the perceptions within the person and their relationship to their environment… And we haven’t even got to thinking about how this applies to AI yet!

Questions around how these ‘tools’, such as AI, shape our perceptions and change our inter-relationships within society, and our systems such as health, education, economic, legal etc., depend not only on the tool itself — but also our human ability (or inability) to foresee the totality of the consequences in using them.

I prefer the metaphor, “Technology is a Forest” because it conjures up a deeper sense of awe within me — a sense that reminds me to be humble, lest I disrupt the larger ecosystem by lighting a small fire only to create an uncontrollable bushfire!

Compare that to the fire Apple came under when it launched its credit card in August 2019, without really knowing how its AI credit rating algorithm managed to evade third-party vetting and discriminate against women. It’s a tough call, but digital fires may travel faster and further than their real counterparts…

Then there’s the “Internet, the Information Superhighway” — but these days it’s more like a river that’s turned into a swamp of fake news and misinformation. Yet, most of the public debate on the regulation of social media and messaging apps is focused on regulating the content on these platforms — so it’s like making sure the ‘traffic’ on these ‘information superhighways’ is ‘roadworthy’…

But I prefer the ‘river’ metaphor — we need to stop and ask, “What is it about the conditions in and around this river that’s turning it into a swamp, where the fish get sick?” and “What are the conditions required for digital platforms to become healthier places to interact virtually?”

It is in choosing our metaphors wisely that we may be able to better intuit and understand the technologies around us — how we should be using them more responsibly, and how they may be designed and regulated. So the next time you think of a tech metaphor, it might be worth pausing to reflect on what it says about the way you think about technology and society!

“The way in which a concept is delivered speaks not only to the concept itself but also offers a description, between the lines, of what is unquestioned.”

— Nora Bateson from “Small Arcs of Larger Circles”

— — —

Phoensight is an international consultancy dedicated to supporting the interrelationships between people and technology, and is accredited by the International Bateson Institute to host Warm Data Labs.



Founder of Phoensight, Public Interest Technologist, Tech Reg, Open Gov & Public Policy geek. Supporting the interrelationships between people, society & tech.