Building Trust in an Internet of Things World

Nuala O'Connor
Jul 6, 2017 · 5 min read

It may once have seemed the stuff of science fiction that people would voluntarily wear a device that tracked their every move or talk to a computer sitting on their kitchen counter to order pizza. But here we are, working out (and sleeping) while our Fitbits report back to us, and listening to music and calling an Uber with Amazon’s Echo. On a recent visit to Best Buy, I found that the small appliance section had every manner of smart home device (but no iron — people don’t iron anymore?).

Internet-connected devices have brought much-needed convenience into my life, along with useful data that informs many of my daily decisions. But like many consumers, I worry about how the companies that store data from my devices are protecting it. I wonder how much data they collect, when it is collected, and how it is used, stored, or shared.


In the United States, the answers to these questions are left largely to the private-sector companies creating this new world of Internet of Things (IoT) devices. Our legal framework around privacy is incredibly fragmented, and most IoT devices fall outside of sectoral laws that prescribe privacy protections, such as those for health, financial, or children’s information. While many of us have a certain level of trust in larger brand names, the fact is that not all IoT-producing companies are as advanced on the legal and privacy fronts as they should be, and many fail to fully consider the ramifications of their security programs. The range of devices they are creating is astounding and, in some cases, surprising; everything from connected toasters to internet-enabled sex toys is on the market.

The legal uncertainty around the privacy and security of our personal data in the IoT world is even more disconcerting when one considers the implications of government access to this data. Not that long ago, the idea that granular data about our activities at home could be collected by devices was remote, and the idea that government could or would request such data sounded paranoid. Times have changed. Just this year, law enforcement requested data from both Fitbit and Amazon in separate murder cases. And while we understand the government’s intent to pursue justice in these cases, the rules around government access to this kind of data should be explicit, as they will have a profound and lasting effect on all of us.

In the absence of such clarity, responsibility falls on companies to secure our personal data and to shield what they choose to collect from government surveillance. The right to our personal, private spaces is a foundation of our country, and protecting it is not a task companies should take lightly. If tech companies want to advance daily use of their products, and if they, and we, want to reap the benefits of connected living, devices must be designed to merit the trust and confidence of all consumers. Here are four basic ways companies in the IoT space can build trust:

  1. Build privacy into the design early. Think about the human experience and the digital dignity of consumers. Think through what data you need for your IoT product to operate, and don’t collect more data just because you can. When it comes to data, sometimes less is more. The less data you collect about your customers, the less personal information the government can attempt to collect; the less data you’re storing in an identifiable format, the lower the incentive for outside actors to breach your data. Collecting unnecessary personal data potentially increases your liability in the event of a breach. It also requires that you allocate scarce resources toward even stronger data security measures, which add to your expenses, hurting both your top and bottom lines.
  2. Make secure the default. Companies that decide the benefits of collecting sensitive customer data are worth the risks need to implement sufficient security measures to protect that data. While there is no ‘one-size-fits-all’ solution to data protection, common measures include encrypting data ‘in transit’ from the device to your servers (ideally ‘end-to-end’); placing strict access controls on data at rest and, where appropriate, encrypting it; de-identifying data when possible; and deleting large data stores that aren’t necessary.
  3. Provide clear, contextual notice. IoT devices come in all shapes and sizes, so privacy policies and practices must be communicated in an accessible, understandable format. For a toaster, this might mean the traditional printed manual plus a web-based prompt that appears when the device is connected to the internet. A device that speaks should read the policy aloud in clear language, ideally offering users the option to hear it in multiple languages.
  4. Empower and educate users. Companies can and should go beyond writing and communicating a privacy policy. They should empower consumers to take control of their online privacy and security, and provide them the tools and information to do so. Some major platforms have started down this path, such as Facebook with its privacy “checkup” and Microsoft with its Privacy Dashboard. IoT companies should emulate these examples, exploring approaches that regularly encourage users, in the context of their interaction with the device, to check their privacy preferences while educating them about the implications of their choices.
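To make the first two recommendations concrete, here is a minimal sketch of data minimization and de-identification for a hypothetical fitness-tracker backend. The field names, the `minimize` and `de_identify` helpers, and the sample reading are all illustrative assumptions, not any company's actual pipeline; real de-identification programs involve far more than hashing one identifier.

```python
import hashlib
import os

# Fields the hypothetical product actually needs to operate;
# anything else is dropped before storage (data minimization).
REQUIRED_FIELDS = {"device_id", "step_count", "timestamp"}

def minimize(reading: dict) -> dict:
    """Keep only the fields the product needs to operate."""
    return {k: v for k, v in reading.items() if k in REQUIRED_FIELDS}

def de_identify(record: dict, salt: bytes) -> dict:
    """Replace the raw device identifier with a salted hash so stored
    records are not directly linkable back to a specific device."""
    out = dict(record)
    out["device_id"] = hashlib.sha256(
        salt + record["device_id"].encode()
    ).hexdigest()
    return out

# Illustrative raw reading from a device.
raw = {
    "device_id": "tracker-1234",
    "step_count": 8042,
    "timestamp": "2017-07-06T08:00:00Z",
    "gps_trace": ["..."],  # collected "because we can"; not needed, so dropped
}

salt = os.urandom(16)  # in practice, a per-deployment secret, not per-record
stored = de_identify(minimize(raw), salt)
```

The design choice is that minimization happens before anything touches storage: data never collected cannot be breached, subpoenaed, or misused later.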

As we near the tipping point of internet-connected devices, these are the fundamental actions companies should take to build trust and empower their consumers. Concerns about unlawful government access to personal information and the ability of companies to keep our information private and secure are, perhaps, the biggest impediment to the future of IoT. Policymakers must establish clear protections that reflect the reality of the digital age across sectors, while providing flexibility for industry experimentation; as we wait for these policies, however, the private sector must lead the way.


Nuala O'Connor

President & CEO at the Center for Democracy & Technology. Thinking about individual liberty & the boundaries of self in the digital world.