Responsible and trustworthy IoT

ThingsCon
The State of Responsible IoT 2018
13 min read · Aug 24, 2018

By Dr. Laura James

The ThingsCon report The State of Responsible IoT is an annual collection of essays by experts from the ThingsCon community. With the Riot Report 2018 we want to investigate the current state of responsible IoT. In this report we explore observations, questions, concerns and hopes from practitioners and researchers alike. The authors share the challenges and opportunities they perceive right now for the development of an IoT that serves us all, based on their experiences in the field. The report presents a variety of differing opinions and experiences across the technological, regional, social, philosophical domains the IoT touches upon. You can read all essays as a Medium publication and learn more at thingscon.com.

Why do we want responsible IoT?

For the same reasons we want responsible technology in general — so that we get technology that is useful, whose benefits clearly outweigh its harms, and that we can rely on. We look to the developers and operators of technology to act responsibly, so that we can have confidence in their products and services. Whilst it’s easy to get carried away thinking that this is just an issue for the big Silicon Valley companies, or about personal information, the issues we face with irresponsible technology go beyond these things. This is not a new concern, but one I have been reflecting and working on more recently.

As I wrote last year:

Ten years ago I was at AlertMe, architecting and developing an internet of things system. We were within 6 months of a shipping product — in January 2008 people we didn’t know were buying AlertMe systems online, and receiving boxed kits ready to install. AlertMe set out to create broadband home security, bringing burglar alarms into the internet age, and redesigning them to be rather more useful to householders. We made a hub to connect to your router, and used a secure ZigBee mesh network to link up a mixture of detectors and buttons, both mobile and static devices. We thought about how the consumer would buy and set up the kit, did low power radio for key exchange, designed for hardware sale on the second hand market, considered the different people in a household and their data and privacy needs. We did user research and user testing and field testing and got independent security experts to review our architecture, and we followed standards where they existed (and contributed to development where they didn’t). It was quite a lot of work, but seemed like the least we should do for a connected home product, especially as we were thinking about the security market (we also thought about future extensions which would use the AlertMe platform, such as energy monitoring and control, which also has security and privacy requirements). It took two years to go from a three-word brief to a shipping product (with customer support and sales channels and all the rest of it), which felt simultaneously very fast and frustratingly slow, as it seemed as if the market for such things was about to take off.

Ten years on, there’s nothing much like that available for mainstream consumers. The IoT products we see for the home and for individuals more generally are mostly simple things — perhaps one device, connecting via wifi to the internet or via Bluetooth to some other device such as a phone. Mesh networks are rare, security holes seem disturbingly common, embedded systems hold personal data and forget to get rid of it when they are sold, and connected home systems (with very few exceptions) seem to have forgotten that houses have many different people in them. Not only are the products made much simpler, but they often don’t seem to have thought through what even ten years ago were reasonably obvious privacy and security basics. What happened? Did we oversell the “anyone can make hardware” idea? (When I’ve said “hardware is hard,” I didn’t just mean the manufacturing bit.)

Even though all this was possible a decade ago, somehow there is still work to do in driving up standards, and making it easier and more valuable to make secure and trustworthy systems.

Only a few months after I wrote this, the AlertMe servers (now owned by British Gas) were turned off, disabling the systems of those of us still using it for home monitoring. Requests for the service to be passed to the community of users were not responded to.

I think we did a good job of producing trustworthy technology at AlertMe — by the standards of development 10–12 years ago. Undoubtedly we would do things differently now — different technologies, different values, different regulations. I’d like to think that contributed to the longevity of the service, which was exceptional. And yet how odd it is to celebrate ten years of service, when the vision and functionality of so many IoT devices are designed to form part of the fabric of our homes, our workplaces, our public spaces. Things designed to last many decades, not one. We still have a lot to learn about architecting and creating digital systems that can last, still operational and secure, and be maintained and adapted for the lifetimes we might reasonably expect. The challenges here are partly technical, and partly about business model — having the organisation, capability and capacity to tend to and repair these systems. Even if we opted for a more dynamic society, where our environments change more often, we would need to think about the energy and materials used in replacing our connected infrastructures and things.

(I’d like to learn of other examples of consumer IoT which have operated for over a decade, and which are of greater complexity than a Bluetooth device that connects to a phone.)

Aside from that personal example, the last year has seen some interesting bits of news about the state of responsible IoT. Tesla’s attitude to information after an accident; the black market in smart agricultural equipment repair; fitness trackers and privacy as a feminist issue. In all of the recent focus on individual user experience design, we seem to have forgotten the need for adversarial thinking (although not all IoT devices will be at risk of unlikely hacks, such as sound waves to fool accelerometers), or to consider devices which are used and interacted with by more than one person. Perhaps it’s just because there’s more IoT in the field now, or more visibility of issues when they arise, but it feels like the level of responsibility in practice is getting worse, not better.

Luckily there are more people working on this critical issue now.

A year ago, I wrote an outline of ten aspects of responsible technology for Doteveryone. It includes a mix of areas very specific to digital alongside broader business responsibilities related to tech. Responsible practice isn’t just about the technology itself — it’s about the people who develop, manage and invest in tech, about the users, individually and collectively, and the wider context. Responsibility is more than simply applying an encryption standard, or complying with GDPR — it is integrated through everything an organisation making a tech product does.

In responsible technology:

  1. The business model, ownership and control of the organisation is responsible and appropriate for what the organisation does and the products/services made
  2. Every worker, including suppliers and subcontractors, is paid fairly and has good working conditions in an inclusive environment
  3. The people, communities, projects and businesses contributing effort or information to the organisation are rewarded fairly
  4. The organisation’s products and services make a positive (or neutral) contribution to public and societal value
  5. Risks, systems effects, side effects, potential harms and unintended consequences have been considered, both for the organisation overall and for the products/services — including for potential acquisition, or market dominance
  6. Plans for maintenance and support of products/services into the future, including clarity on how long support and updates will be available for and what happens when they stop, have been considered and published
  7. People can easily find out and understand how the product or service works, how it is supported, what costs there may be, and what happens with data.
  8. The product/service follows relevant standards and best practices — in design, architecture, developing, testing, deploying, maintaining and supporting technology
  9. The product or service is usable and accessible for the range of users who may need to use it, and appropriate support is provided for them
  10. The wider context around the product/service has been considered and addressed appropriately, including thinking about the people who may encounter the service and their lives, the environment and sustainability in terms of energy and materials.

All of these apply well to IoT, as well as to broader technologies in general. It’s not always helpful to silo specific technologies when thinking about ethics or responsibility — people encounter holistic products and services and their effects on the wider world. IoT is both a buzzword (perhaps an aging one now, but still alive), and a big tent, encompassing a huge range of different kinds of tools, toys, infrastructure, and more. Coming up with actionable principles that apply across this range is not easy.

The IOTMark project has been working on codifying responsible concepts over the last year too, and has made some good progress. Many of the above ten things, especially the technical ones, are echoed in the IOTMark principles. Still, the challenge comes when ideals hit reality. Are we setting a gold standard few products will reach, but which articulates our ideals and where we should be aiming? Or a pragmatic one, which will build momentum in the industry and raise the bar a little (perhaps allowing for future adjustment upward later)?

What — or perhaps whose — values do we want to capture anyway?

Is responsible IoT about European cultural values? Or North American, Silicon Valley ones? Or Chinese ones? Or something else, or something more specific? If we are to have practical guidelines for people developing IoT, we need to be able to answer this. There will not be a single global solution. We may think that GDPR sets out how the personal data aspects of IoT should be handled, and that these European values are good for people everywhere. Disagreement comes more quickly if we consider what fair value exchange for information might look like, or whether it’s essential to publish the provenance of all the hardware components, or which software should be open source and under what circumstances, or what acceptable documentation would look like for a mass consumer product some of whose customers may not be highly literate. A well designed, 21st-century mark to help consumers choose IoT products could be valuable, but the IOTMark community have yet to nail down what values it will embody.

Today we hear calls for ethical design in tech and in IoT more than ever. There are many IoT manifestos, and lists of principles (as well as ethical tech ones in general). There are varied technical privacy and security standards, relating to different parts of the IoT or to different application areas. These initiatives are often very siloed, when IoT is always a cross-cutting endeavour, with decisions about hardware, software, data, application area and users intertwined. We need approaches to responsibility that reflect this, and which support collaborative discussion across the teams making and maintaining products.

A year on, Doteveryone has streamlined the 10 aspects to 3 key ones, which we now champion and are building into an initial toolkit which fits in an agile design process. These reflect our values as a think tank focussed on responsible technology, which we believe will be better for everyone in society. Assuming that the business is responsible already (and there are tools and support systems available to enable a business to sort out its employment, governance, and practices in this way), what are the really critical components of responsibility for digital technologies and IoT?

Context — looking beyond the individual user and taking into account the technology’s potential impact and consequences on society

Technology that understands and respects the greater contexts and ecosystems it operates within and the potential impacts — positive, negative or a bit of both — it could have on the institutions, communities and relationships that make up society. This is about deciding on tradeoffs and explaining these to not only the direct stakeholders of your technology but those who might be affected.

Contribution — sharing how value is created in a transparent and understandable way

Determining all of the ways different parties contribute value to a technology product/service; this can include information, and formal or informal labour. Then sharing these value flows publicly — who is involved in them, and what is being exchanged — in a way that is clear and easy to understand.

Continuity — ensuring ongoing best practice in technology and design, that accounts for real human lives

We should be creating and supporting products and services that are safe, secure and reliable for real, messy human lives and situations. Ensuring people with different needs and abilities who might reasonably use a system are accounted for with inclusive design, and that the technology is suitably supported and maintained. Following appropriate best practices for the specific hardware and software elements of a product, and anticipating and adapting to new needs and threats as they emerge.

We think these are practical and reasonable for today’s tech sector to work towards — but they are still principles, requiring thought and effort to put them into practice, not a simple recipe to follow.

From conversations with some tech businesses, simple recipes are what is wanted. But ethics and responsibility aren’t a free lunch — they take hard work. Consider an organisation torn between good treatment of user information and the business model their investors think will give the highest growth, or an individual concerned about practice in their work: knowing what is right is not always obvious, and knowing what to do about it even less so.

It’s somewhat easier for organisations who set out to do the right thing from the start; their values, the people they recruit, the customers and investors they target are more aligned. It’s not an easy ride, even for them — there’s a lot to think about, and practical tools can help even these organisations to make good choices throughout their work.

Still, being responsible doesn’t demand diving into deep philosophy in most cases. It’s often basic common sense: thinking through risks and planning sensibly for whatever you are doing. (If you are designing a lock, think about how malicious people could open it!) Incidents of bad design affect the perception of IoT as a whole. We need to do better at calling out silly mistakes early, helping each other to build better products, learning together, and encouraging others to be part of the responsible IoT movement.

Because consumer trust in IoT is starting to be threatened by incidents like these, and by fears and misunderstandings about the internet companies who are so pervasive in our lives. This is worrying for those of us who would like to see connected technologies delivering valuable services and benefitting society — if trust is lost, the potential benefits will be diminished.

At an event last year I heard many people in an educated, thoughtful audience express genuine concern that Amazon Echo devices are listening to them all the time, and that Amazon gets that information. I am reasonably confident that this is not the case; but it was a real belief, and one which it is very hard to counter. Part of my confidence comes from knowing some of the developers personally; that does not scale as a route to trust.

Trust is not a wholly rational response to the world. It is only possible when the trusted thing or person cannot be perfectly known (if we know something entirely, we do not trust it — we have well-founded confidence in it instead). An IoT system is opaque, complex, and impossible for a person (even a deeply technical developer of it) to know completely.

We can’t engineer people’s trust. That would be manipulative — and sometimes people are right not to trust some technologies.

We can engineer trustworthy IoT products, services and systems, which are competently made, reliable, and honest about what they do and how they do it.

“Those who want others’ trust have to do two things. First, they have to be trustworthy, which requires competence, honesty and reliability. Second, they have to provide intelligible evidence that they are trustworthy, enabling others to judge intelligently where they should place or refuse their trust.” — Onora O’Neill

It is an individual responsibility for each IoT developer, designer and leader.

More than that, changing the landscape — through customers, investors and employees all demanding better IOT development practice — will drive change. It may not be rapid, though.

Creating, using (and for those who commission or buy tech, requiring) frameworks that strongly encourage good practice is helpful too. It doesn’t have to be regulation, with all the heavyweight process and slowness that people expect. We can have governance through open technical standards; improved practice sector-wide through templates for better IoT design, tools to evaluate impact and think through risks. We can make it easier to build responsible IoT, and we can research and showcase how being responsible can realise business value, too.

I’m excited by the new technology concepts, such as distributed machine learning which can keep data on the device but still create powerful, actionable IoT insights, and platforms like DataBox or the Hub Of All Things, which change the data dynamics. GDPR is likely to change the IoT data space in some ways — although it’s not clear how, yet. More people will produce toolkits which help organisations to think about ethical issues, and to use good design patterns. Conferences and online communities are growing around these topics. Existing responsible business tools and certification systems like BCorp and Responsible100 are looking to enhance their technology coverage, and are being joined by the Zebra movement, platform co-ops and tech co-ops more generally. There are new ways of doing things across business and technology that give me hope for greater responsibility in many IoT systems in coming years.
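To make the on-device machine learning idea concrete, here is a minimal toy sketch of federated averaging: each “device” fits a shared model on its own readings and sends back only updated weights, which are averaged centrally. The model, data and function names are purely illustrative — this is not any particular framework’s API, and real systems add secure aggregation, device sampling and much more.

```python
def local_step(w, data, lr=0.1):
    """One gradient step for a 1-D linear model y = w * x on local data."""
    grad = sum(2 * x * (w * x - y) for x, y in data) / len(data)
    return w - lr * grad

def federated_round(w_global, devices, lr=0.1):
    """Each device improves the shared model on-device; only the
    updated weights travel back to be averaged."""
    local_weights = [local_step(w_global, data, lr) for data in devices]
    return sum(local_weights) / len(local_weights)

# Three devices, each holding private readings consistent with y = 2x.
devices = [
    [(1.0, 2.0), (2.0, 4.0)],
    [(3.0, 6.0), (4.0, 8.0)],
    [(0.5, 1.0), (1.5, 3.0)],
]

w = 0.0
for _ in range(50):
    w = federated_round(w, devices)
# w converges towards 2.0, yet no raw (x, y) reading ever left a device.
```

The point of the sketch is the data flow: only the scalar weight `w` crosses the network in each round, while the sensor readings stay local.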

There’s no perfectly responsible, ethical and trustworthy IoT project. There will be compromises and tradeoffs, especially in the tough competitive landscape of consumer ‘things’. The landscape will always include shining examples of good practice, and shocking mistakes and malicious products. But individually, and together, we can shift the balance.

I wonder where we’ll be ten years from now.

Dr. Laura James

Dr Laura James is Entrepreneur in Residence at the University of Cambridge Computer Laboratory, catalysing multidisciplinary research and activities around trust and technology, and Technology Principal at Doteveryone. She has spent nearly twenty years exploring cutting edge technologies and turning them into useful products and systems, in technology and leadership roles in diverse contexts. Laura has been the first employee at a connected home startup, on the management team of an AI startup, scaled a rapidly growing civic tech nonprofit, ran mission critical open source systems for a whole university, and cofounded a community workshop, a startup humanitarian NGO, and most recently a member owned co-operative.

ThingsCon is a global community & event platform for IoT practitioners. Our mission is to foster the creation of a human-centric & responsible Internet of Things (IoT). With our events, research, publications and other initiatives — like the Trustable Tech mark for IoT — we aim to provide practitioners with an open environment for reflection & collaborative action. Learn more at thingscon.com

This text is licensed under Creative Commons (attribution/non-commercial/share-alike: CC BY-NC-SA). Images are provided by the author and used with permission. Please reference the author’s or the authors’ name(s).



ThingsCon explores and promotes the development of fair, responsible, and human-centric technologies for IoT and beyond. https://thingscon.org