Agreed that it’s exciting to consider how surfacing opaque patterns can help us live richer lives. Still, as we imagine what we gain, it’s worth asking what we lose in the Age of Context.
I think working with data has become fraught. When I started, it felt like the sky was the limit, and I saw it from a very personal angle. Now, if you generate data, the first question to ask is, “How is this data going to be used against you?” While I’m still fighting to show the narrative potential of data, there is a real shadow over it. I’m not sure how to weigh the value of having access to your data, or of saving it, against the potential damage that data can do in the wrong hands. That’s something I wrestle with at the moment.
Consider the scenario in which my Jawbone does more than count steps:
In the future my Jawbone won’t simply count my steps; it will also integrate with other data sets to generate personal health insights. It will have tracked over time that my blood pressure rises every morning at 9:20, after I have consumed my third coffee of the day. Comparing my blood pressure to that of thousands of others in my age range and demographic background, it will know that the levels are unhealthy, and through a notification it will help me make a conscious decision not to consume that extra coffee. Data will yield insight, and that insight will, hopefully, drive action.
I want this.
I want a world where the ambient environment gently nudges me to make better choices. But it would be reckless not to consider the full implications of this. What does it mean when your employer and your insurance company have access to the same health data? What does it mean if your company requires you to wear a fitness tracker at all times as a condition of employment?
On the surface of it, anything that nudges more people to live healthier, longer lives can only be a net benefit to all of us. But honestly, with every revelation about yet another way governments violate my privacy, or yet another security breach, I’ve grown more and more anxious about the digital places I used to naively consider safe. The prospect of having the innermost workings of my body accessible to people I don’t know, who might wish me ill, to parse any way they please… it doesn’t feel good.
So yes to deeper, more useful relationships with our tools. But hopefully we get there with both eyes wide open, and with a full understanding of the risks as well as the rewards.