Personal Data Protection: The Killer Paranoia

How real is the sense of privacy, and is it worth it in hard times?

Sethuraj Nair
The Startup
9 min read · Nov 7, 2020


Photo by Dayne Topkin on Unsplash

Renowned AI researcher and generally jovial Ben Goertzel rarely sounds so miffed. An interviewer had just asked him about the role of technology in tackling the pandemic. Goertzel's view was earnest, yet strong: the progress we should've made by now hasn't been made, since most research is carried out in seclusion.

And he’s so right. Most datasets are all too fragmented, siloed in mutually inaccessible lab records. Precious information lie aching to meet its kind elsewhere in the world, destined never to be aggregated, extrapolated, enhanced. Vital cues that correlate viral genomics with medical indicators languish in the confines of some doctor’s office where it is used, at best, to personalize medication for a single sick man. Any wonder even the finest algorithms struggle to yield groundbreaking insights?

Anyone with a sober head on their shoulders may find it insane that things should go this way. As Goertzel remarked later in the interview, his eyes betraying slight trepidation, all pertinent data should be duly masked and encrypted and made available for smart analysis and machine learning, so that somebody with the right byte-cruncher stands a chance of employing it to the benefit of humanity.
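
What "duly masked and encrypted" could look like in practice is no mystery. Here is a minimal Python sketch, assuming a hypothetical clinical record with invented field names: direct identifiers are replaced with a salted hash so records from different labs can still be linked and pooled, while the clinical features remain usable for aggregate analysis. It is an illustration of the idea, not a prescription.

```python
import hashlib
import os

# Secret salt held by the data custodian, never shared with analysts.
# In a real pipeline it would live in a key-management system; here it is
# generated ad hoc purely for illustration.
SALT = os.urandom(16)

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a salted SHA-256 digest.

    The same identifier always maps to the same token, so one patient's
    records can still be linked across datasets without revealing who they are.
    """
    return hashlib.sha256(SALT + identifier.encode("utf-8")).hexdigest()

def mask_record(record: dict) -> dict:
    """Strip direct identifiers from a (hypothetical) clinical record,
    keeping only fields useful for pooled analysis."""
    return {
        "patient_token": pseudonymize(record["patient_id"]),
        "age_band": 10 * (record["age"] // 10),   # coarsen age to a decade band
        "viral_strain": record["viral_strain"],
        "oxygen_saturation": record["oxygen_saturation"],
    }

if __name__ == "__main__":
    raw = {
        "patient_id": "IN-KL-000042",     # made-up identifier
        "name": "A. Example",             # dropped entirely by mask_record
        "age": 34,
        "viral_strain": "B.1.1.7",
        "oxygen_saturation": 94,
    }
    print(mask_record(raw))
```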

Tragically, this might forever remain a pipe dream. We are high on a massive dose of paranoia surrounding data privacy. It seems that we, the super thin-skinned modern humans, are doomed in much weirder ways than imagined.

Why aren’t there enough debates on how a liberal sharing of data would’ve helped us benefit from the cumulative power of myriad tests and clinical trials? Because not many dare. By now, we might’ve traced the causes of what currently seem arbitrary. Think, for instance, all those random deaths of young persons with next to no latent comorbidity. We would’ve been better poised to peg rightly the pandemic’s demographic distributions, gauge its medico-social and economic impact across populations, at all levels.

But no. We like playing our otherwise useless cards too close to our infected chests. Fear of death hasn't proved persuasive enough. On the contrary, it has invigorated the peddlers of the privacy paranoia. Beware of the state, they tell you, for it wants your data to pull a China on you.

But isn’t data protection important? Sure it is. You don’t want the whole world know whom you called on last night, how scant your bank account is, that private ailment of yours, how bad a student your child is. But behind such insecurities lurk some toxically touchy norms and notions whose numbers grow from week to week. Now, with the mere mention of word privacy one can raise a smokescreen or stir up a state of panic. Many of these engineered concerns are uninformed or overtly malicious, blind blissfully to the great advancements in the fields of encryption, cloud storage, data masking, secure transmission and latest analytical paradigms.

Unpacking the “Private” Data Myth

PII, PHI, PIFI, GDPR, and on and on…

The European Union has recently brought in a radical regulation called GDPR to protect personal and health information. The intention sounds fair, but its manifestation in these times couldn't be worse. GDPR has made it ever harder for analytic and machine learning systems and communities to forage for data. Privacy-sensitivity is at an all-time high. Companies, especially the modest of the lot, don't want to get sued for some inadvertent breach and pay colossal sums in penalty, and balk at the mere thought of data sharing. As many as 41% of small businesses believe they can't embrace Big Data processing in the near future, as their stakeholders stay wary of the strategies to be tailored in the GDPR context.

If it weren’t for such sinister-sounding, acrimonious restrictions, the personalized data problems would’ve been reduced to random folks getting touchy about it — a sound cyber security framework would have dealt with them.

Cyber Attacks

It isn’t hard to beguile the gullible into finding every personal data-breach to be hazardous, poised for the most ominous aftermath. Take the GDPR stipulation that everything from a person’s first name to e-mail address are to be protected. There’re folks who think a noncompliance here might mean passwords getting cracked, e-mail hacked, money and life at once exposed to critical risks.

Heightened regulations heighten panic, as almost none of them is preceded by sufficient public sensitization. This is a grave gap, bound to widen in tune with each lawsuit filed and every dollar paid in penalty.

Political and Social Manipulation

Well, one signs up for manipulation the moment one signs up for democracy; only the means and modes of it may vary from epoch to epoch and reign to reign.

In our time the dark games are played subtly, subliminally. It doesn't so much matter what your name is; what matters is knowing your likes and dislikes, thoughts and affinities. Your digital persona may be amply masked and faceless, all the data well-encrypted, every regulation complied with, yet you could still be exposed to persuasion and propaganda as long as you interact with the cyber world.

There won’t ever be a way to encrypt and protect a hyper-engaged social mind. And it’s not that bad a deal.

The “Authoritarian” Argument

This goes well beyond just nosing around for paltry gains. Here we deal with potential Hitlers and closet Stalins, and go all out to throttle their evil plots. To get hold of your life and mind, you see, these folks adopt all methods of intrusion. They snoop on the patterns of your posts and tweets, swerve spy cameras toward your suspicious ways, and tune up AI bots to say hi to your profiled face before your mom could.

All of it should be angrily resisted, as it is. But not without a jot of hypocrisy. When some investigation hits a dead end, the same prim public lambasts the cops' inability to grab the right footage to net the culprit. We blame the agencies for their disinclination to trace the e-mail trails and scrounge the call history of those we suspect, loathe, or distrust.

And what if one says GDPR could sound just as authoritarian? What if I wish my data to be used in all sorts of altruistic ways? Why block my way?

Who decides the right flavor of liberty?

Discrimination

Here postmodernism meets privacy. We accuse the partisan establishment of being slyly nepotistic, racist, at times patriarchal, and of using personal data as the tacit lens for paying selective attention. Those noxious algorithms, for instance, show women and men different job ads.

Unfair, for sure. But the moment a remark made online sounds politically incorrect, too racially coloured, too anti-women, we want the state to intervene with its set of filters and handcuffs. How blithely we forget that our civil behaviors don't reflect our prejudices and outlooks, which run much deeper, with enough power to precipitate and perpetuate grave troubles in the long run.

Perhaps we must look less and less at the shallow social signs and start addressing the innate complexity permeating the great mesh-work of the subconscious and evolution.

Improper Profiling

All too often, an individual's perception of themselves comes into conflict with those of society and the government. I may have been born to Hindu parents but may not identify myself as Hindu. All the same, we live in one of the most identity-conscious eras in human history. We have all but forfeited the convenience of gender binaries in favour of inclusivity. We tote around a 'negative dictionary' of unutterable words that offend who knows whom. Such changes have been accomplished, to a degree, by the engineering of public and societal perception and intense sensitization around gender, religion and race. We want social security based on our economic, racial or gender standing, want welfare programs to be smart and digital, but aren't prepared for digital profiling.

Erratic Identification and Its Implications

This, perhaps, is the strongest and most compelling of the arguments favouring individual data privacy. Your data can be inadvertently abused or misused by a broken algorithm, inexact analytics, poor data processing. But isn't this too universal to be an issue? Anything and everything can go wrong. The odds of an algorithm screwing up aren't much greater than those of you mislaying your passport, sending a wrong attachment, or forgetting to delete a message that's too private. Perhaps an authority or agency formally obligated to guard your privacy might act more responsibly than you do. It may sound annoying, but it's true.

Forcing Loyalty and Choices

In 2014, Facebook expressed its willingness to invest in internet infrastructure and connectivity in rural India, a plan that later had to be shelved due to a strident critical hullabaloo over the possibility of monopoly and of undermining net neutrality. I still believe it was a shame that the rural poor, who wouldn't have been harmed by such nuances anyway, lost their chance to go cyber-mainstream. Sure, Facebook may have benefited, but there wouldn't have been much to lose for anyone other than its competitors.

Probe a little broader and you'll watch things work the other way. Just see how India (as well as the US) keeps banning Chinese apps citing potential data breaches, which, as is amply evident, is an act of sheer political retribution through economic means. Here the state is asking citizens to think the way it thinks, choose what it does. As an individual, I should find it more bothersome to see my choices taken away in the name of privacy.

Financial Misuse

When money went telephonic first and then digital, with it grew a certain reliance on such incontrovertible fact-lets as one’s date of birth for cross-authentication. It might be to this arrangement that the private data paranoia traces its roots. Oh, they know your mother’s maiden name? Brace for the smart loot, man.

Now, those in the business of security know this is hardly the case any more. Encryption has advanced much since, and so have digital certificates, tokens, multi-factor authentication, biometric identification, and tons of other checks. But someone wants people to stay deluded, to keep thinking an exposed DoB risks revealing not just the state of your age but also that of your bank account.
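
To make one of those checks concrete: a time-based one-time password (RFC 6238), the kind behind most authenticator apps, takes only a few lines of standard-library Python. The shared secret below is a made-up example; the point is that authentication no longer hinges on guessable fact-lets like a date of birth.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute an RFC 6238 time-based one-time password (SHA-1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval            # 30-second time step
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

if __name__ == "__main__":
    # Example secret only; a real secret is provisioned per user and kept server-side.
    print(totp("JBSWY3DPEHPK3PXP"))
```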

One shouldn’t also miss here to note a bit of libertarian duplicity. Some of us are all for laissez faire that tacitly lifts governmental curbs and control over trade and transactions, but then want stern state-guarded regulations in the name of privacy. This betrays an inability to think systemically.

Freedom to Fake

If I’m asked to pick one real cause for the privacy paranoia, it’ll be the perceived threat on one’s freedom to fake.

Not that such freedom isn't important, but it's perhaps the hardest aspect of the paranoia to address and alleviate. So many forces are in action here, from the socio-political to the neurotic. It's of consequence to me to secure my facades, to tuck away my personal ghosts and shadow selves. And I should have the liberty to define the extent of such protections, which places my own version of privacy needs and discretion beyond the realm of logical evaluation. If I say I'm not open to seeing my fever reading become an elemental value in a Big Data effort that may lead to some significant scientific insight, you can't question the rationale or the ethics of my choice; that's that. Still, even if we can do little about it, the effect and influence of this issue can be subdued by a basic appreciation of the arbitrary and often unfounded nature of the fears that lead to it.

So?

These are weird times. The clumsy human, with all his evolutionary baggage and atavistic fears, is already finding it hard to keep pace with his smart creations. Technology has a life of its own; it is capable of redrawing its bounds without warning, of tuning its own trajectory of progress, forcing its apostles to rush back to the drawing board and resume fumbling with a few fresh regulations, almost always to no avail. Imagine adding to this irredeemable mix the extreme postmodernist identity-consciousness, staunch individualism, and putatively sacrosanct mob-sensibilities.

The trouble is, since the present-day problem of privacy protection can seem purely techno-genic, one keeps expecting technological solutions, whereas it really is deeply systemic, its roots tracing right down to human nature and tribal ethos. It's time to take a break from fortifying the digital locks and speak to the collective concerns face to face. If anything can be resolved through dialogue and sensitization, it should be. Remember: machines are only as smart as their data. Digital intelligence and analytical capacities will continue to advance, and so will the potential to synthesize tremendous insights using them.

Life, health, longevity, innovation, sustainability and collective welfare must prevail over isolationism and an ephemeral sense of privacy.

Why die with all the data?

Sethuraj Nair
The Startup

Lover of words. Lover of the worlds, both real and digital.