Responsibility in IoT: What does it mean to “do good”?

ThingsCon
The State of Responsible IoT 2018
Aug 24, 2018

By Prof. Dr. Irina Shklovski

The ThingsCon report The State of Responsible IoT is an annual collection of essays by experts from the ThingsCon community. With the Riot Report 2018 we want to investigate the current state of responsible IoT. In this report we explore observations, questions, concerns and hopes from practitioners and researchers alike. The authors share the challenges and opportunities they perceive right now for the development of an IoT that serves us all, based on their experiences in the field. The report presents a variety of differing opinions and experiences across the technological, regional, social and philosophical domains the IoT touches upon. You can read all essays as a Medium publication and learn more at thingscon.com.

The door refused to open. It said, “Five cents, please.”
He searched his pockets. No more coins; nothing. “I’ll pay you tomorrow,” he told the door.
Again it remained locked tight. “What I pay you,” he informed it, “is in the nature of a gratuity; I don’t have to pay you.”
“I think otherwise,” the door said. “Look in the purchase contract you signed when you bought this conapt.”
…he found the contract. Sure enough; payment to his door for opening and shutting constituted a mandatory fee. Not a tip.
“You discover I’m right,” the door said. It sounded smug.
— From Ubik, by Philip K. Dick. Published by Doubleday in 1969

Philip K. Dick had an uncanny sense of the possibilities of technologies and of their potential impact. I find this excerpt eerie in how uncomfortably close it is to our current technological realities, even though it was written nearly 50 years ago. So many new devices rapidly coming onto the market are smart, connected and eager to help. You may have heard about the Amazon Echo personal assistant named Alexa that constantly listens to its environment and can play music, report the weather or even order products directly from Amazon on command. This device has been in the news relatively frequently as it first took to ordering doll houses at children’s command and cookies on its own initiative, then proceeded to laugh at random, startling its owners, and even recorded snippets of conversation and sent these to recipients randomly selected from the contact list. There is the Google Home personal assistant that can do much the same as Amazon’s Echo, but when two such devices were put together they got into some deep arguments about the nature of the universe. There are smaller, much more specific objects as well: internet-connected toothbrushes, TVs, hairbrushes, mirrors and, of course, many smart locks. These do not yet require a payment every time they open the door, but they can be compromised by enterprising hackers, broken by a simple ‘dumb’ screwdriver or suddenly made inoperable by an errant firmware update.

Complex consumer-oriented IoT devices such as Google and Amazon home assistants or Samsung mobile phones are implicated, with increasing frequency, in sending unexpected types of data to unexpected recipients. Whether the result of clever hacks or of unexpectedly buggy software, such discoveries are invariably troubling and creepy, prompting efforts to reverse-engineer ways to check what our own devices might “have on us”. Of course, smart printers, toothbrushes and TVs are not far behind in constantly phoning home and reporting on their users. The problem is that when physical environments become instrumented with all manner of smart sensors and devices, the flows of data and the decisions about its collection and use become ever more invisible to the end-user. In this situation the onus of morality shifts further towards the developers and designers of the technology in question, because they get to decide unilaterally the rights and wrongs of device behavior. The use of these devices assumes and requires increasing amounts of trust from the end-user. Perhaps it is possible to hold the end-user responsible for the decision to put an Echo device in their kitchen or to purchase a smart toothbrush, but such arguments can only go so far. After all, in some places it is practically impossible to buy a “dumb” TV these days, and who has the time to really pay attention and read all of the interminable end-user license agreements and privacy policies? Such expectations are intractable.

Connected devices are entering our homes, our lives and the ever more intimate spaces of bedrooms, bathrooms and boudoirs, collecting data about incredibly private moments, and consumers are asked to trust these opaque systems with the data they collect. Their usefulness is sometimes questionable and sometimes undeniable, but the crucial question is: who is responsible for the behavior of these devices? There is no playbook and there are no rules of conduct; technology developers and designers at large companies and those working to disrupt and innovate, entrepreneurs, makers, hackers, are all charged with making moral choices and are expected to get it “right”. The EU GDPR has set out a number of guidelines and each member country has a complicated web of laws and policies with regard to data, but, as we all know, laws are complex and yet limited. Besides, can we really expect entrepreneurs, developers, designers or innovators working in maker and hacker spaces to be able to navigate this complex web where even the lawyers get tangled?

Many designers and developers in startups as well as in mature companies are struggling because, as one developer explained to me during an IoT meetup in Copenhagen: “we don’t yet have much of an idea of what is ok and not ok.” In other words, decision making about what is “good and responsible behavior” does not yet have real precedents or pre-existing experience to guide it. Even if IoT developers are attempting to be responsible, what constitutes responsibility is not yet agreed upon. Clearly, communities of IoT developers must come together in collaboration with their stakeholders to develop ideas about what responsibility means in this context. What relations must be considered and what obligations must be taken on and enacted are important decisions, precisely because building new systems requires acknowledgment and renegotiation of the interrelations of responsibilities. At the same time, shifting standards and new regulations continuously shape and structure what sorts of decisions might be made. Who gets to make these decisions and whose values might guide them are also pertinent questions. In a globalized economy the notion of “good” does not work as a local concept, and yet “good” is always contextual, so who is responsible for the moments when “good” pivots and takes on negative consequences? If nobody can predict the future, is it actually worth trying? This stuff is complicated, and what constitutes responsible action does not have a clear answer.

Google offers many definitions of the term responsibility.

When asked to define the term responsibility, Google produces many definitions. This variety of definitions makes one thing clear: as individuals, all of us are enmeshed in a variety of different interdependencies, responsibilities to many others that are sometimes complementary and sometimes at odds, and that must be negotiated. We are responsible in different ways and for different things to our families, friends, neighbors, workplaces, the institutional arrangements in which we take part, the state and even global communities of many kinds. I am responsible to the editors of this RIOT collection for producing this essay by the deadline. At the same time, I owe my family time and attention in these summer months, I owe my friends some thought in absence, I am obligated to my employer to respond to email, and I must also submit reports and deliverables to the European Union. I have made promises that I must fulfill, and more often than not these obligations clash in their demands on my rather finite time and other resources. All promises and obligations are interrelated and at times competing; their fulfillment is a balancing act. What’s more, many of these varying types of responsibilities are reciprocal: my commitment to spend time with friends or family is moot unless they make that commitment as well.

Individuals are always entangled in a diversity of relationships that hold contradictory values and conflicting demands. For example, collaboration is seen and acknowledged as an important value within the IoT community (e.g. open access software, off-the-shelf hardware). At the same time, for many startups the pressures of ‘making it’ in the ever more competitive IoT market push people to focus on ‘survival’, thus privileging some collaborative relationships over others and perhaps even eschewing relationships that previously held significant sway. So how might these notions of responsibility be translated with respect to IoT? Who must be responsible, for what, how and why?

Technologies in general and IoT technologies in particular are, of course, not neutral. They embody and reflect their designers’ values and ideas of what counts as “good” or “responsible.” After all, if a smart lock or a digital home assistant is intended to improve people’s lives, the design of these technologies is driven by someone’s idea of what counts as “improvement.” Amid the calls for building technologies responsibly and for doing good, what does it mean to “do good”? The Dutch ethnographer and philosopher Annemarie Mol says that “It is important to do good, to make life better than it would otherwise have been. But what it is to do good, what leads to a better life, is not given before the act. It has to be established along the way.” Importantly, this does not mean that every developer or designer must figure it out for themselves, separately from others. No matter the emphasis on personal improvement, perhaps it is time to acknowledge that we are never separate individuals, but are instead composed of the many memberships, relationships and social entanglements that span our lives. We might want to hold those responsible for design choices accountable for their positive and negative outcomes. Maybe the engineer responsible for the “like” button on Facebook is worried about addictive behaviors, or the designer who developed the “pull to refresh” behavior is appalled at how it has been used and feels personally responsible. I would like to propose, however, that feeling guilty for these outcomes is not going to get us anywhere useful.

I have no real use for guilt. Instead, let’s acknowledge the problems and try again. This, I think, is a way around the paralyzing realizations of the downright apocalyptic possibilities of IoT that my colleagues and I have previously observed in our analysis of IoT manifestos. If we are calling for responsibility, let’s reflect on what we mean by it and consider who ought to be responsible for what, how and why. Being responsible individually is often lauded as an ideal, but that’s one lonely mountain-top, and I think responsibility ought to be taken on together, as groups and communities. If what constitutes “good” needs to be established along the way, then it needs to be established together. One way to allow for this deliberation along the way is to design with legal scholar Julie Cohen’s idea of “semantic discontinuity — the opposite of seamlessness” in mind: a call for strategically under-designing technologies in order to leave space for experimentation and play. Such intentional building in of flexibility may be one way to offer possibilities for alternatives, for seeking out what a “good life” ought to look like with IoT.

---

Irina Shklovski

Irina Shklovski is an Associate Professor at the IT University of Copenhagen. Although her primary field is human-computer interaction, her work spans many other fields, from computer science to sociology and science & technology studies. Irina’s research focuses on big data, information privacy, social networks and relational practice. Her projects address online information disclosure, data leakage on mobile devices and the sense of powerlessness people experience in the face of massive personal data collection. She is very much concerned with how everyday technologies are becoming increasingly “creepy” and how people come to normalize and ignore those feelings of discomfort. To that end she recently launched a “Daily Creepy” Tumblr to monitor the latest in creepy technology. She leads the EU-funded collaborative project VIRT-EU, examining how IoT developers enact ethics in practice in order to co-design interventions into the IoT development process that support ethical reflection on data and privacy in the EU context.

ThingsCon is a global community & event platform for IoT practitioners. Our mission is to foster the creation of a human-centric & responsible Internet of Things (IoT). With our events, research, publications and other initiatives — like the Trustable Tech mark for IoT — we aim to provide practitioners with an open environment for reflection & collaborative action. Learn more at thingscon.com

This text is licensed under Creative Commons (attribution/non-commercial/share-alike: CC BY-NC-SA). Images are provided by the author and used with permission. Please reference the author’s or the authors’ name(s).
