Ethically Sourced Data

Axonium
4 min read · Jul 5, 2018


Here’s an understatement: your data is out there. Actually, it’s everywhere. You are generating volumes of data, usually without realising it, and unlike the unheard loner clapping in the forest, the waves you are creating are not only heard but recorded… then analysed and sold to a company that re-analyses them in order to make products that lonely clappers are more likely to buy before heading out into the woods.

Meanwhile, we’re clapping away thinking that no one is listening. It may seem a little dramatic, but it has been happening for a while and we are only just starting to realise it. I say “we” as a collective whole, even though some of us have been saying this for some time now…

When I spoke to a friend of mine recently about the Facebook / Cambridge Analytica scandal, he mentioned that what struck him most was the realisation that if you think you’re getting a product or service for free, the reality is that you are the product, and what you thought was the product was actually just the bait. In these instances, your data is the product being traded by data merchants.

In defence of the data-merchant trade, we’ve been complicit. As a whole, we have willingly (albeit incrementally) given up our privacy and data as an acceptable price for convenience. I accept Google Maps knowing where I am so that it can tell me which roads have bad traffic on the way to work. We gave up our data because we didn’t understand its value, but the middlemen did, and that’s when we first started being taken for a ride.

The long gap between the start of this trade-off and the bear of public awareness now rousing from its hibernation gave the middlemen confidence to take increasing advantage of our collective slumber. Their hubris is best demonstrated in a concept central to any discussion about privacy: proportionality. Is what is being given up proportional to what is being gained? Generally, the convenience we embrace is disproportionate to the conditions we have to agree to in order to get it. Not sure what I mean? Have you ever read any of the T&Cs that you agree to in order to unlock that “special feature” or get that “special access”? My guess is that you either read them and said “No”, or you didn’t read them and clicked “I Agree”.

It is a sliding scale, obviously. Most people don’t mind giving up some data in order to serve their own interests. For example, you willingly tell your doctor personal information that you wouldn’t otherwise share in order to get treatment that makes your life better. Rightly so. But what if that doctor gave (or even sold) some of that information to a big pharmaceutical company to produce a pill that you will end up having to pay for? Your information is being transacted without your consent. Worse still, instead of you being paid for your data, that unconsented data is being used to take money from you.

Data is useful. Data is powerful. Data can be used to make more effective medications, help doctors and patients achieve better health outcomes, support law enforcement, empower teachers and students, enable more efficient government spending of taxpayers’ dollars, and much more. Data can do all these things, but my point is that it is your data. Your consent should be required, and if there’s money to be made, you should be getting most of it.

Data can and should be used to improve our individual and collective lives, but it must be done ethically. In order to use data ethically, it must be sourced ethically. In order to source data ethically, consent is key.

The current mess of unconsented data use came about because not enough people thought or cared about the consequences. Fortunately, that is changing. People are starting to see that handing over data by using a service does not equate to handing over all rights to that data, and that the offence of unconsented data use is not lessened just because it can be hidden. The ignorance (and, to our own shame, the apathy) of the populace toward secondary use of data has provided cover for the data merchants to continue.

Consequently, laws have not been written to adequately protect data owners. But what the law allows or prohibits does not, by itself, make something right. Case in point: Facebook / Cambridge Analytica. In that scandal there was no breach of law; there was a breach of trust.

Much like evidence presented in a courtroom, data obtained without proper procedure can be rendered unusable. Such data would sit idle, its latent potential unfulfilled. There is an answer: consent from the data owner.

We need ethically sourced data.

Author: Ian Varughese (Founder of Axonium)


Axonium

An information-rich platform where users access and benefit from their ethically sourced health data in a decentralized, secure and trusted environment.