Unpacking the Issue of Missed Use and Misuse of Data

Robert Kirkpatrick, Director of UN Global Pulse, gives his take on the issue

Data misuse may be at the forefront of recent conversations, but we shouldn’t ignore the harms associated with missed use. Lost opportunities to use big data to achieve the Sustainable Development Goals (SDGs) are probably to blame for at least as much harm as leaks and privacy breaches.

A few years ago, I was leading a World Economic Forum session with top-tier executives keen to hear more about what was then still a new idea: using big data for the public good. I gave them the scenario of an industrial accident in which a plume of toxic smoke begins drifting toward suburban neighborhoods and asked them, “Who has data that could help first responders save lives?”

Someone raised their hand and said, “Well, I’m from a large insurance company. We could let the firefighters know where all the people with asthma are, all the people in wheelchairs, all the elderly, all the children, so that they could evacuate those houses first.” Another participant countered, “OK, but what about their privacy?”, and we soon settled on the idea of a mobile mapping app that colors some houses red, some orange, and some yellow. It wouldn’t disclose why certain houses were given certain colors, but firefighters would know that houses marked as red needed to be evacuated first.

What happened next has had a profound impact on my view of debates over data privacy. Enthusiastic responses gave way to an uneasy silence, and then one of the participants asked, “So…first responders don’t already have access to this information?” What followed was disbelief, and then anger. Once everyone realized that this data could be anonymized and still used to save lives, the fact that it wasn’t already in use became unacceptable. It was the ‘aha’ moment when everyone in the room realized that we are all paying a daily — and largely invisible — opportunity cost in ‘missed use’ of data for the public good.

Recent scandals over data breaches and corporate misuse have understandably made people around the world skeptical of big data and the way it’s being used. The growing public outrage against the wrongful use of personal information has led to a wave of new regulations — from GDPR in Europe, to similar frameworks in Asia-Pacific, Latin America, Africa and other parts of the world, as well as within the United Nations — all of them looking to increase transparency and accountability and to give people more of a say in what happens to their personal data.

So yes, we do need robust regulations to prevent misuse of our data. But we also need regulations to make sure that whenever our data can be used safely for the public good, it always is.

So what would it take for ‘our firefighters’ to have the best information they need to save lives? Do we have to sacrifice our privacy in the process?

Today, implicit in many of the latest regulatory reforms is an assumption that we are dealing with a zero-sum game, in which all innovations in the use of big data come at a cost to our privacy. Indeed, in the case of commercial applications such as targeted advertising, this is quite true: the more detailed the information about our behavior, the better an algorithm is able to predict what we might buy. The problem is that these same data protection regulations — in their effort to prevent misuse — may inadvertently perpetuate ‘missed use’. We need to reframe how we think about risk in a world of abundant data.

As a technologist, privacy advocate and head of an innovation lab inside the United Nations, I have seen firsthand extraordinary opportunities to use new technologies in beneficial ways — without compromising on data protection and privacy. At UN Global Pulse, we work with partners to build safe and responsible applications of big data and AI, and we advocate for policy reform to support this type of innovation. We partnered with Telefonica to use mobile network data to predict the spread of the Zika virus, and more recently with Digicel to inform emergency response to volcanic eruptions in Vanuatu. We worked with BBVA Bank to show how anonymized debit card transactions can monitor people’s recovery from a hurricane in Mexico. We used data from postal flows to estimate countries’ GDP, literacy rates, and carbon footprints, and we used marine traffic data to detect refugee rescue events in the Mediterranean Sea. Our work with big data is guided by instruments such as the Guidelines on Big Data and the UN Personal Data Protection and Privacy Principles.

So what will it take for responsible uses of big data and AI to go mainstream and usher in the long-awaited data revolution for development?

A number of the companies we engage with across different industries do see the opportunity and are keen to unlock the social value of insights from their data through data philanthropy. The mobile industry is furthest along in this respect. Some regulators and policymakers, likewise, have told us that they recognize the untapped potential of big data for the public good, but this hasn’t translated into greater space for innovation in this new field. By and large, inertia still prevails.

I believe that large-scale change will come only when people everywhere understand that, even as our data was being collected and used for years in ways we could neither discover nor determine, it also wasn’t being used to improve official statistics, public service delivery, early warning, and disaster response. In a sense, the lack of innovation here has resulted in a failure to protect the public from what turn out to be preventable harms.

We must account for both ‘misuse’ and ‘missed use’ of data

As privacy regulations continue to strengthen safeguards against misuse, we also need widespread public support for regulatory reform to address the risks of ‘missed use’ of this technology to benefit the whole of society. The stakes are too high not to do so.


Pulse Lab Jakarta is grateful for the generous support from the Government of Australia.