Neurotechnology: we aren’t ready

Brain Imaging System on the Samsung Galaxy Note. Creative Commons License.

The human brain might just be the final frontier of both modern technology and society. Devices that provide “greater insight into brain or nervous system activity, or affect brain or nervous system function” (Neurotechnologies) may seem otherworldly and futuristic, but what was once the stuff of science fiction movies is now available to consumers. You might be under the impression that you haven’t encountered neurotechnology in your everyday life… and, for now, you’d mostly be right! While you may not yet own a headset that lets you run your dishwasher or preheat your oven with a simple thought, these capabilities are being actively explored, and such devices are no far-fetched future. Neural devices are already capable of stimulating a user’s nervous system, replacing lost functions, or even allowing the user to control external software or hardware — doing so by detecting and interpreting neural activity! (Friedrich et al.) This means that neurotechnologies aren’t just the mind-control devices often portrayed to the public. Their applications already extend to research, therapeutic processes, rehabilitation, gaming, self-monitoring, and more, allowing neural devices to establish themselves in all sectors of society largely under the radar. Is complete integration of humans and technology dystopian? Are we ready for widespread use of technologies that access and reshape human consciousness? No — I’m afraid we are woefully unprepared for the development of neurotechnology devices. They will bring unprecedented privacy issues, worsen existing inequalities and the digital divide, complicate or erode users’ autonomy, reinforce the matrix of domination, and face no existing legislation that addresses the complicated legal challenges they create. That is a LOT of scary things.
The consequences of neurotechnology devices are too dangerous to continue their rapid development without a strong framework that anticipates the unprecedented challenges they will bring.

Consider this — do you want your boss knowing that you are predisposed to depression? One of the most obvious concerns about neurotechnologies is the plethora of privacy issues they generate — if even our thoughts are shared unwillingly, what is left to ourselves? Unfortunately, the collection and distribution of neural data will certainly occur. It is naive to think that this new wealth of data won’t be exploited for profit; it will be another weapon in a company’s arsenal of tools to increase its benefits and gains. These concerns aren’t wild speculation. They are founded in past breaches of technological privacy, as “large corporations already provide third-party access, usually through customer’s unknowing approval, that infringes on patients’ right to privacy” (Khan and Aziz). Unless extremely stringent regulations are put in place to protect data privacy for neurotechnology users, we should anticipate that none of our confidential data or neural activity will be “off limits” to corporations trying to make a buck. Our unconscious brain activity will be up for grabs!

Beyond privacy, neurotechnologies may very well widen the digital divide that already separates the upper and lower classes. These devices are currently being explored for their potential as enhancement tools. As they become commonplace, Luciano Floridi predicts in The Cambridge Handbook of Information and Computer Ethics that “The digital divide will become a chasm… generating new forms of discrimination…. Between insiders and outsiders, information rich and information poor” (Floridi 7). Imagine two evenly matched soccer teams, longtime rivals whose yearly game matters enormously to their communities. One team is from a private school with the funds for neural devices that increase stamina and lessen fatigue in its athletes; the other team can’t afford them. Is this fair? If the wealthier team wins the game, did they really earn it? I know it sounds absurd, but don’t be fooled into believing that this type of inequality will never affect you in your lifetime. In May 2018, USA Cycling announced it was partnering with Halo, a leading corporation in neuromodulation for performance enhancement, to help train its cyclists (CB Insights).

Halo Neuroscience’s Halo Sport headset. Creative Commons License.

That was four years ago, and innovation in the field has only accelerated since. Regrettably, lower social classes are routinely left behind when new technologies are developed, especially healthcare advances (Wexler and Reiner). This will be especially apparent with neural devices, which have the potential to offer unthinkable advantages to their owners: increased memory, improved physical performance, treatment of many psychiatric disorders, recovery from certain types of paralysis, and countless other applications (Waltz). The digital divide is already real and hurtful; these devices will expand it in ways we cannot yet anticipate.

I think one of the scariest oversights in the development of neurotechnologies is the way these devices could make us feel even more detached from ourselves and the “real world”. Do you ever feel like a zombie scrolling through your phone aimlessly? Have you ever wondered if YOU chose the movie… or if you just picked the one the Netflix algorithm targeted you for? I know from personal experience that some of my favorite shows have come from recommendations — but is that actually my taste, or has technology changed it? Now imagine owning a device that lets you perform a multitude of actions “aimlessly”. Suddenly you can draft your essay, browse for a movie, and order your takeout by the power of thought… I’m not convinced those actions are your own. Does the core of who you are change when the powerhouse of your decision making is connected to something “other” than yourself? Early studies have found that these devices do complicate a user’s ability to feel that they are acting and existing separately from their device and from others. In fact, some users felt completely “unsure about the authenticity or authorship of their feelings and behaviors” when using neurotechnologies (Brunner et al.). We should be concerned about the rising popularity of these devices if it means becoming disconnected from our bodies, losing our free will and sense of self.

Artist’s imagining of mind-control. Creative Commons License.

I have never liked being told what to do, and I think I would like it even less if it were a device rather than a friend or parent guiding my actions! If people can’t even tell whether they are acting on their own wishes or simply feeling compelled to do something their integrated device thinks they would enjoy, everything that makes people unique individuals will be blurred with algorithmic guesses. That is not the future I want to live in! The reality is that there is so much uncertainty about the long-term effects of these devices that I’m not inclined to use them, nor should anyone be until more research has been done. It is scary to consider how neurotechnologies could alter human nature if we aren’t proactive in setting strict guidelines backed by thorough and continuous research. I imagine neurotech would be especially damaging to young people, who are just beginning to form their identities and are susceptible to peer pressure to conform!

Just like all products in the technology sphere, the creators of neural devices are — surprise, surprise — overwhelmingly white males. That lack of diversity will introduce biases into the very architecture of the devices! It’s safe to assume these biases could strengthen the Matrix of Domination, which “works to uphold the undue privilege of dominant groups while unfairly oppressing minoritized groups” (D’Ignazio and Klein). We already know there are deeply ingrained issues with algorithms, and that even their creators aren’t always sure how they work (Engineering)! If the algorithms embedded in neural devices are fed biased data… is it that crazy to assume those algorithms may begin to try to correct or change certain behaviors of minority groups? This could reshape the diversity and customs of our society. Floridi issues a fitting warning that the development of Information and Communication Technologies “has not only brought enormous benefits and opportunities, but also greatly outpaced our understanding of its conceptual nature and implications” (Floridi 3). Society simply can’t keep up with and anticipate all the harm that will come from enhancement devices created by a few powerful developers. Further, when flashy companies and innovators such as Elon Musk of the neurotechnology company Neuralink offer users enhancements like improved athletic and academic performance and “superhuman cognition”, it is those in power who get to define what society should view as superior and good. Their ideas of “neural perfection” will rule the framework of neurotechnology for everyone.

This far-reaching arm of the Matrix of Domination, strengthened by biased neural devices, will disadvantage minorities in all sectors — especially employment. A world with widespread neural devices could even mean that an employer won’t hire you in the first place! It’s no secret that employer discrimination is already a problem: a meta-analysis of all available hiring discrimination field experiments in the US since 1989 found discrimination to be consistently high in magnitude over the past 25 years, with no significant progress made toward equal opportunity hiring (Quillian et al.). So why would we think that companies wouldn’t use all this new cognitive information to hire the most profitable employees? They will likely do so even if it means turning down a more qualified individual whose brain activity flagged them for potential problems in the future. It’s not right!

Artist’s Rendering of CyberLaw. Creative Commons License.

Even in a perfect world where neurotech was used for nothing but approved medical applications, what happens when devices malfunction? Yes, not if… when! Even the greatest technologies are susceptible to the occasional error or bug. What makes neurotechnology scary is the question of responsibility: who is liable if someone gets hurt at the hands of a prosthetic limb under only partial control of its user? We don’t know, and that is a problem. Our current legal framework and definitions of “actions” leave us incredibly unprepared for these questions, which means we will need GLOBAL changes to the laws and regulations that protect citizens. If someone is harmed by a user of a neural device, is it the device’s fault? Is the user innocent? What about the manufacturer? My opinion is that it would be too easy for a guilty party to shift the blame onto another. This problem has already arisen in testing: some users reported feeling unsure whether they had controlled their device, while others believed they were controlling a device when they weren’t (Bashford and Mehring).

Above all, I just think the technology sphere needs to take a step back to create and enforce strict regulations and testing for neural devices, because while promising and novel, they could unleash unthinkable harm on our societies and way of life! As Floridi claims, technologies are “making humanity increasingly responsible, morally speaking, for the way the world is, will and should be”, and thus we should remain vigilant and “Give ourselves a chance to anticipate difficulties, identify opportunities, resolve problems” (Floridi 3). To me, it is just common sense to be critical about what we allow to access, interpret, and even alter our brain activity! I hope that as neurotechnologies become more and more relevant, you will do some research of your own and demand high standards from all forms of technology!

SOURCES:

Bashford, Luke, and Carsten Mehring. “Ownership and Agency of an Independent Supernumerary Hand Induced by an Imitation Brain-Computer Interface.” PLOS ONE, vol. 11, no. 6, 2016, https://doi.org/10.1371/journal.pone.0156591. Retrieved March 5 2022.

Brunner, P., et al. “Current Trends in Hardware and Software for Brain–Computer Interfaces (BCIs).” Journal of Neural Engineering, vol. 8, no. 2, 2011, p. 025001, https://doi.org/10.1088/1741-2560/8/2/025001. Retrieved March 7 2022.

CB Insights. “21 Neurotech Startups: Brain Technology, Implantables, and Neuroprosthetics: CB Insights.” CB Insights Research, CB Insights, 21 June 2021, https://www.cbinsights.com/research/neurotech-startups-to-watch/. Retrieved February 23 2022.

D’Ignazio, Catherine, and Lauren F. Klein. “Data Feminism”. The MIT Press, 2020. Retrieved March 9 2022.

“Engineering With The Brain.” Neuralink, 2021, https://neuralink.com/applications/. Retrieved March 5 2022.

Floridi, Luciano. “Chapter 1: Ethics after the Information Revolution.” The Cambridge Handbook of Information and Computer Ethics, Cambridge University Press, Cambridge, UK, 2012. Retrieved February 20 2022.

Friedrich, Orsolya, et al. “Clinical Neurotechnology Meets Artificial Intelligence: Philosophical, Ethical, Legal and Social Implications”. Springer, 2021, https://link.springer.com/content/pdf/10.1007%2F978-3-030-64590-8.pdf. Retrieved March 1 2022.

Khan, Shujhat, and Tipu Aziz. “Transcending the Brain: Is There a Cost to Hacking the Nervous System?” Brain Communications, Oxford University Press, 16 Sept. 2019, https://academic.oup.com/braincomms/article/1/1/fcz015/5570173?login=true. Retrieved March 5 2022.

“Neurotechnologies: The Next Technology Frontier.” IEEE Brain, 26 May 2021, https://brain.ieee.org/topics/neurotechnologies-the-next-technology-frontier/. Retrieved March 8 2022.

Quillian, Lincoln, et al. “Meta-Analysis of Field Experiments Shows No Change in Racial Discrimination in Hiring over Time.” Proceedings of the National Academy of Sciences, vol. 114, no. 41, 2017, pp. 10870–10875., https://doi.org/10.1073/pnas.1706255114. Retrieved February 24 2022.

Waltz, Emily. “How Do Neural Implants Work?” IEEE Spectrum, 24 June 2021, https://spectrum.ieee.org/what-is-neural-implant-neuromodulation-brain-implants-electroceuticals-neuralink-definition-examples. Retrieved February 21 2022.

Wexler, Anna, and Peter B. Reiner. “Oversight of Direct-to-Consumer Neurotechnologies.” Science, vol. 363, no. 6424, 2019, pp. 234–235., https://doi.org/10.1126/science.aav0223. Retrieved February 22 2022.
