Not My Keys, Not My Thoughts

The Coming Battle for Privacy in Consumer Neurotechnology

NeuroTechX Content Lab
11 min read · Jul 26, 2021


It’s no secret. Tech companies have amassed vast fortunes by predicting the behavior of online users using rudimentary metrics such as clicks, swipes, and taps. Today, corporate coffers are full and ready to fund research on direct measures of how you think, feel, and react when plugged into the web.

The World Economic Forum predicts that 70% of all new economic value created by tech companies in the next decade will be claimed by digital platforms and services that commodify all aspects of our lives. The majority of this value will be harvested from our personal data, including biometric brain data gathered through neurotechnologies.

Neurotechnologies are steadily becoming more integrated into everyday wearables (e.g. headphones, earphones). More invasive neurotechnologies that have existed for decades in a clinical capacity (for example, to treat conditions such as depression and Parkinson’s disease) are also now undergoing commercial development. Most notable of these are Elon Musk’s Neuralink and the dark horse Synchron.

Advances in neurotech are hurtling forward without adequate space for ethical discussion, particularly around privacy protection. Never mind that our very brains are becoming public commodities. Most of us can agree that predicting and influencing user behavior with overt metrics is one thing, but taking a peek inside unsuspecting users’ heads to make a buck may be a step too far. The fight for privacy protection today, like the pioneering Chilean Neurorights Bill, is a fight for Self-Sovereign Brains in the not-too-distant future.

If you ask us, our thoughts are an intimate expression of self, agency, and identity and should always remain sovereign. At the same time, neurotechnology is an inevitable eventuality with compelling advantages for the future of neuroscience, knowledge-sharing, and health.

Fortunately, recent advances in peer-to-peer decentralized technology and cryptography may minimize these privacy risks while unlocking the benefits of neurotechnology as the digital economy comes to dominate in the next decade.

Consumer loss of privacy: How it started

The first two decades of the 21st century have seen the mass automation of economic engines of value creation on digital platforms engineered to engage and retain users. Personal data troves amassed by software service providers are transmuted into predictive models of human social activity, commerce, and patterns of decision-making to drive optimized revenue-generating products and services.

This first wave of personal data aggregation has so far been limited to external signals of user intention and behavior: an evolution of the infamous Tinder swipe. Essentially, cognition and preference are being inferred from motor metrics alone.

Clicks, taps, swipes, speech, and text have served very well as motor inputs for training machine learning models to predict decision-making, with exceptional success in behavior modification, ranging from purchasing decisions to voting.

Despite this abundant power, Big Tech has continued to test the boundaries of what degree of privacy consumers are willing to sacrifice in return for services. For example, in 2010 Google attempted to transition into more intimate metrics of the mind itself, with eye-tracking technology acting as a proxy for cognitive states to boost ad targeting. At the time, this provoked a strong visceral reaction from consumers and privacy advocacy groups.

For Big Tech, the issue was timing, not the technology itself. Given enough time, the convenience offered by service platforms engineered to maximize feelings of pleasure would eventually outmatch user caution.

Consumer loss of privacy: How it’s going

The pace of innovation over the past decade has resulted in a range of non-invasive technologies that passively capture and use our biometric data. Today, smart watches read our heart rate and emojis respond to our facial expressions. With the rise of consumer neurotech wearables, our brain metrics are now up for grabs.

Low-cost and non-invasive neurotech “wearables” are widely available. These existing devices are capable of decoding both general brain activity (e.g. inferred levels of stress and concentration) and specific thoughts (i.e. mental commands) through electrodes touching the scalp. Current solutions range from relaxation apps that measure a user’s cognitive state to modify virtual worlds, to “pass-thoughts” (as an alternative to traditional passwords), and protocols around “risk reduction” (for example flagging when a person operating machinery is drowsy).

According to the Neurorights Initiative, current neurotechnologies have the potential to infringe on our rights to mental privacy, personal identity, free will, and protection from algorithmic bias. For example, invasive technologies have been documented as disrupting people’s sense of identity and challenging people’s sense of self. One patient undergoing stimulation for depression reported, “it blurs to the point where I’m not sure … frankly, who I am” (Yuste et al., 2017).

Manipulation of brain activity through neurotechnologies risks removing ultimate control over our own decision-making. For example, current neurotechnologies could already learn “brain states” associated with particular spending or browsing habits, in turn making it possible to take advantage of, and even induce with targeted content, favorable brain states.

Informed consent is also a key ethical issue. Tech demos pitched as company “recruitment drives” could in fact be considered veiled PR campaigns for human clinical trials of invasive consumer neurotech devices. In order to uphold citizens’ rights to informed consent, science will have to adapt quickly to navigate this new blurring of science and spectacle. One such concern is mass elective brain surgery to implant devices ultimately controlled by one central company, board of directors, or perhaps even subject to the whims of a single influential personality.

A Catalyst for Rapid Digitalization

The sudden digitalization of society accelerated by COVID-19 has signaled a cultural shift in attitudes towards consumer services powered by biometrics. A new swell of interest in vaccination and digital health records, biometrics for security, automation of civic services, and personalized health has emerged. Meanwhile, regulators are struggling to keep up with this rapid pace of innovation.

A Capitalist’s Carte Blanche

The United States of America, while retaining an overwhelming share (66%) of the digital economy, has not yet passed comprehensive personal data protections, let alone recognized the unprecedented danger of social engineering for consumerism.

The most comprehensive legal precedents have been established by the 2018 California Consumer Privacy Act (CCPA). Experts have pointed out that the CCPA classifies biometric data as generic personal data, leaving regulatory enforcement gaps that may endanger unwitting consumers.

The lack of special protections for this data, such as verifiable informed consent, enables prospective companies to establish precedents for massive data harvesting and social engineering campaigns, potentially raising the political capital required for a future regulatory response beyond the reach of any meaningful government action.

History has shown how powerless our political institutions can be in the face of powerful tech lobbyist groups. But if our government can’t protect us from the data predators of Silicon Valley, who can? Maybe only we can.

A Global Rallying Cry for Decentralization

Personal data regulations in the United States affect a global consumer market of 4.27 billion internet users, whose personal data falls under the jurisdiction of the 2018 CLOUD Act, superseding local and international law such as the EU’s 2016 General Data Protection Regulation (GDPR).

This has led to an outcry and swift legal action, including the shoring up of privacy protection laws in the European Union and the diversion of massive institutional funding to catalyze homegrown startups, attract foreign investment, and fund bleeding-edge research in privacy protection.

A particular focus of internet privacy activism has been migrating user data, financial and civic services, and critical cloud infrastructure from centralized servers to alternative peer-to-peer networks. The overall effect is a rebuilding of the web on a decentralized network of numerous small individual nodes that in time may rival the computational strength, storage capabilities, and stored data value of Big Tech.

Not My Keys, Not My Identity!

The European Commission has led the charge with a surprising announcement to grant every EU citizen a digital ID wallet that serves as a personalized firewall between their most sensitive data and cloud service providers. The wallet uses end-to-end encryption to store and transmit data, so that only the intended recipients of transmitted data are able to read its contents.

The digital ID wallet uses asymmetric cryptography to achieve improved privacy. Each time a new wallet is created, a key-generation algorithm computes a public/private key pair: two mathematically linked strings of letters, numbers, and symbols.

The public key can be freely shared with others on the internet and is used to encrypt messages that only the private key holder can read. In this way, data remains shielded from harvesting operations. The only way for the data to be compromised is if the user gives up or leaks their private key, or if the encryption is broken through brute force, which is computationally infeasible for modern key sizes.
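The mechanics can be sketched with textbook RSA. This is a toy illustration with deliberately tiny numbers, not a production scheme (real systems use 2048-bit keys with padding, and you should never roll your own crypto):

```python
# Toy RSA sketch: the public key (e, n) encrypts; only the matching
# private key (d, n) decrypts. Tiny textbook numbers, for illustration only.

p, q = 61, 53                # two secret primes
n = p * q                    # public modulus (3233)
phi = (p - 1) * (q - 1)      # Euler's totient (3120)
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent: modular inverse of e (2753)

def encrypt(m: int) -> int:
    """Anyone holding the public key (e, n) can encrypt."""
    return pow(m, e, n)

def decrypt(c: int) -> int:
    """Only the private key holder (d, n) can decrypt."""
    return pow(c, d, n)

message = 65
ciphertext = encrypt(message)          # 2790
assert decrypt(ciphertext) == message  # round-trips back to 65
```

Recovering `d` from the public key alone requires factoring `n`, which is what becomes infeasible at real key sizes.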

Cryptographic key pairs grant the user wielding them absolute ownership, or self-sovereignty, over their data. Individuals can store their data on their own devices and choose what they share with external parties without depending on a centralized service to process their sensitive information.

The proposed digital ID wallet regulations have the additional heavy-hitting consequence that all internet service providers operating in the EU will be required to interact with citizens through this digital wallet. This is only the tip of the iceberg in a slew of technological advances in cryptography that are beginning to find a place on the digital battlefield for consumer privacy.

Strength in Numbers

Despite these important advances, asymmetric encryption alone is not sufficient for verifying or enforcing data ownership when multiple parties are exchanging information at very fast rates. Imagine that Bob shares their brain data with Dr. Alice, who promises to provide Bob with a health diagnosis. Dr. Alice opens the dataset and realizes they can’t perform the computation-heavy diagnosis on their own, so they must send the re-encrypted data to a third party, Sam. Ownership of the data cannot be guaranteed once it is re-encrypted with a new key pair, unless an impartial third party keeps records of the original data provider, whom they’ve shared their data with, and the terms of sharing. Today, these legal constructs are built into unwieldy code running on centralized cloud providers that may not always meet definitions of impartiality, given their regulatory or geopolitical backdrops.

Blockchain technology has emerged as a promising counterweight to the centralization of web infrastructure. A blockchain is a distributed ledger that records transactions between parties, and its security scales with the number of unique nodes participating in the network. The greater the participation, the more difficult it is to manipulate the global state of the blockchain or to extract data or information from individual nodes.
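The tamper-evidence at the heart of a ledger can be sketched in a few lines. This is a minimal hash-chain illustration, not a real blockchain (no networking, consensus, or proof-of-work); the record fields reuse the Bob/Alice/Sam example above:

```python
# Minimal hash-chained ledger sketch. Each block commits to the previous
# block's hash, so altering any record breaks every later link.

import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 digest of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, record: dict) -> None:
    """Link a new record to the hash of the chain's current tip."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "record": record})

def verify(chain: list) -> bool:
    """Recompute every link; any tampered record invalidates the chain."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
append_block(chain, {"from": "Bob", "to": "Dr. Alice", "data": "eeg-session-1"})
append_block(chain, {"from": "Dr. Alice", "to": "Sam", "purpose": "diagnosis"})
assert verify(chain)

chain[0]["record"]["to"] = "Mallory"   # tamper with history...
assert not verify(chain)               # ...and verification fails
```

On a real network, many independent nodes each hold a copy of this chain, which is why large-scale participation makes tampering impractical.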

Autonomous Solutions to Data Ownership

A smart contract blockchain solves Bob’s data-sharing problem by running smart contracts: specialized code that executes agreements between agents on a redundant ledger of permissions and transactions that is exceptionally cost-prohibitive to corrupt or modify.

The results of a smart contract are considered valid when sufficient consensus on the computation is achieved among the nodes in the network. This decentralizes the burden of proof and reduces the possibility of a violation of the smart contract agreement.
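The consensus idea can be sketched simply: every node runs the same deterministic contract code, and a result is accepted only if a supermajority of nodes report the same value. The contract function and the two-thirds threshold below are illustrative assumptions, not any real protocol:

```python
# Sketch of result consensus: independent nodes execute the same contract;
# the outcome stands only if enough of them agree.

from collections import Counter

def run_contract(state: dict) -> int:
    """A deterministic 'contract': here, compute a payment owed."""
    return state["rate"] * state["hours"]

def consensus(results: list, threshold: float = 2 / 3):
    """Accept the most common result if enough nodes reported it."""
    value, count = Counter(results).most_common(1)[0]
    return value if count / len(results) >= threshold else None

state = {"rate": 50, "hours": 3}
honest = [run_contract(state) for _ in range(4)]   # four honest nodes -> 150
faulty = [999]                                     # one faulty/malicious node
assert consensus(honest + faulty) == 150           # 4/5 agreement passes
assert consensus([150, 999, 7]) is None            # no 2/3 majority, rejected
```

A single dishonest node cannot change the accepted outcome; it would need to control a supermajority of the network.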

The utility of smart contracts has been beautifully demonstrated by the fledgling decentralized finance ecosystem running on the Ethereum Virtual Machine. Some 55 billion US dollars are locked in smart contracts that execute automated agreements between businesses and individuals for lending, borrowing, payments, insurance, and other complex financial products.

Recently, neuroethics groups have advocated for the use of blockchain technologies as a potential protection against infringements of neurorights (Yuste et al., 2017). The Neurorights Initiative advises that centralized storage of brain data should be restricted, and instead advocates for decentralized storage managed using blockchain technologies. The inherent transparency of these technologies enables data to be tracked and audited, and smart contracts allow for the transparent control of data usage without the need for a centralized authority (Yuste et al., 2017). Competitors to Neuralink, such as inBrain, have taken note and committed to researching this technology for their products.

The benefits of smart contracts do not stop at opening up complex financial tools to a global population. The Ocean Protocol is open-source smart contract software, recognized by the World Economic Forum, that allows users to tokenize their data on a distributed ledger, solidify claims of intellectual property, and control the monetization of their data through programmable royalties and embedded licenses.

Ocean has also introduced Compute-to-Data, a smart-contract infrastructure for executing algorithms on sensitive data without ever revealing the information to the requestor. This way, Dr. Alice can use Sam’s sophisticated analytic services without revealing Bob’s identity or their data.
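The compute-to-data pattern can be sketched as a custodian that runs only pre-approved, aggregate-only computations and returns results, never raw records. This is an illustrative sketch of the pattern, not the Ocean Protocol API; the class, approved-function list, and data values are all hypothetical:

```python
# Compute-to-data sketch: the raw data never leaves its custodian.
# Requesters name an approved computation and receive only the result.

import statistics

class DataCustodian:
    # Only pre-approved, aggregate-only computations may run on the data.
    APPROVED = {"mean": statistics.mean, "stdev": statistics.stdev}

    def __init__(self, private_data: list):
        self._data = private_data          # raw records stay private

    def compute(self, name: str) -> float:
        if name not in self.APPROVED:
            raise PermissionError(f"computation {name!r} not approved")
        return self.APPROVED[name](self._data)

# Bob's raw scores stay on his custodian node; Dr. Alice sees only aggregates.
custodian = DataCustodian([72.1, 68.4, 75.0, 70.3])
print(custodian.compute("mean"))   # aggregate result only
```

In a real deployment the approved-computation list itself would be governed by a smart contract, so Bob's consent terms are enforced by the network rather than by the custodian's goodwill.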

The Ocean Protocol technology stack has encouraged the growth of decentralized autonomous organizations (DAOs) of data providers that pool their data into vaults controlled by smart contracts enforcing community consensus. A neurotech data union would have the ability to decide under what terms its data is used and to enforce consent in real time. For example, Bob can join forces with others to generate highly coveted, pristine-quality neuroimaging datasets reserved for scientific researchers, while perhaps requiring payment for access by marketing or for-profit researchers.
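Such access terms amount to a small, mechanically enforceable rule. A sketch of what a data union's terms check might look like, with hypothetical categories and a hypothetical fee:

```python
# Data-union terms sketch: free access for academic researchers,
# paid access for commercial requesters. Categories and the fee
# are illustrative assumptions, not any real union's policy.

ACCESS_FEE = 100.0  # hypothetical fee (in some unit) for commercial access

def access_granted(requester: dict, payment: float = 0.0) -> bool:
    """Enforce the union's terms for a single access request."""
    if requester["affiliation"] == "academic":
        return True                  # open access for scientific research
    return payment >= ACCESS_FEE     # for-profit requesters must pay

assert access_granted({"affiliation": "academic"})
assert not access_granted({"affiliation": "marketing"})
assert access_granted({"affiliation": "marketing"}, payment=150.0)
```

Encoded as a smart contract, a rule like this executes identically on every node, so no single gatekeeper can quietly waive or alter the community's terms.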

The Campaign for Self-Sovereignty: What Should You Do?

The fight for privacy online has entered a new chapter. Once-niche software, such as asymmetric encryption, has begun to enter mainstream consumer applications intended to shield users from prying eyes. For the first time, public-private key pairs, coupled with decentralized smart contracts that enforce agreements between data providers and consumers, empower individuals to dictate the terms of how their digital footprint is used.

You can contribute to this fight by creating your own digital identity wallet, joining a data DAO such as DataUnion, or the OceanDAO, signing up to participate in research with the neuroscience DAO Opscientia to store your neurotech wearable data, or running a node on a distributed peer-to-peer network.

Vote with your data by using end-to-end encryption whenever possible and cutting off the flow of revenue to Big Tech companies that profit from your digital activity. Failure to coordinate may very well set the path to a dystopian future of mass monocultural thought, where stimulus-reward pairings have replaced the joys of art, creativity, and the pursuit of enlightenment through rational thought and scientific discovery.

There are many unknowns regarding how regulations, technology, and geopolitical interests will unfold — only time will reveal the result of consensus on the direction our digital communities will take in the imminent future.

References

Yuste, R., Goering, S., Bi, G., Carmena, J. M., Carter, A., Fins, J. J., … & Wolpaw, J. (2017). Four ethical priorities for neurotechnologies and AI. Nature, 551(7679), 159–163.

Written by Shady El Damaty, Ph.D. & Sarah Hamburg, Ph.D., edited by Hazal Celik and Garrett Flynn, with artwork by Firas Safieddine.

Shady is a Cognitive Neuroscientist and founder of Opscientia, a community-owned ecosystem that unlocks data silos, revolutionises collaboration, and democratises funding.

Sarah is a Cognitive Neuroscientist (specialising in EEG) with three years of industry experience in emerging digital technologies.

Hazal Celik is a researcher in cognitive science, neurotech enthusiast and consultant.

Garrett Flynn is a creative technologist working at the intersection of neurotechnology, ethics, and interactive media.

Firas Safieddine is a Barcelona-based designer, architect, artist, researcher, and neurotech enthusiast.


NeuroTechX Content Lab

NeuroTechX is a non-profit whose mission is to build a strong global neurotechnology community by providing key resources and learning opportunities.