From Brain Reading to Brain Hacking to Neurocapitalism

Arturo Di Corinto · Published in The Startup · Feb 28, 2021 · 6 min read

Neurotechnologies imply a conjunction between neuroscience and digital capitalism — neurocapitalism — which has potentially disruptive implications for individual and collective life.

By ARTURO DI CORINTO

January 29, 2021


Would privacy exist if thoughts could be read? Evidently not. It is therefore important to reflect on the scope of the "brain reading" techniques that science now offers. On the occasion of European Data Protection Day, the Italian Data Protection Authority convened experts and scholars to discuss the protection of data processed by neuroscience and used by artificial intelligence.

Today there are already projects to implant microchips in the brain, designed to contain the effects of neurodegenerative diseases, enhance perception, and store memories, amplifying or selectively erasing them. Functional magnetic resonance imaging can already decode different types of brain signals, and tomorrow we may be able to read thoughts and influence mental states and behaviors by acting directly on the neuropsychological sphere. But, to quote a phrase dear to Stefano Rodotà, former president of the Italian Data Protection Authority, "not everything that is technically possible is legally lawful and ethically admissible".

If someone can read our brains, will the right to a defense, the right to be forgotten, and the right to silence still exist? What about freedom of the vote, freedom of religion, and pluralism of information? Obviously not, since every other freedom depends on the secrecy of our cognitive sphere. So the question is: what happens when this sphere is invaded by neurotechnology? How can we remain masters of the data produced by our brain activity?

For the president of the Authority, Professor Pasquale Stanzione, just as habeas corpus was the foundation of the rule of law and habeas data the foundation of the modern notion of privacy, habeas mentem could become the foundation of individual freedom and of the rights of the person in a world of machines that read our brains.

Habeas mentem would be the basis of the "neurorights" needed to avoid a neurodeterministic drift and to outline a legal and ethical framework that reconciles technological innovation with the dignity of the person and with the defense of the sovereign self, which is the prerequisite for every other right and freedom. According to Stanzione, the risk is that valuable innovations, such as those for the treatment of neural diseases, could transform a human being into a "non-person", a subject to be trained, normalized or excluded.

Elon Musk’s project

The central example of his learned talk is Elon Musk's Neuralink: a device that, implanted in the brain of pigs, and tomorrow perhaps in human brains, can read neural activity to identify pathologies. What if it were used for other purposes? With brain-reading technologies, information about people can be exploited for commercial ends, and with semantic interpretation they could become a real "truth serum".

The risk, according to Stanzione, is that today's brain-machine interfaces, built for cognitive enhancement and able to amplify "transhuman" capabilities such as the remote control of objects for paraplegics, could enslave us if used outside the clinical context. Think of Facebook and neuromarketing. As the predictive capacity of platforms grows, persuasion based on profiling would slide from suggestion to subjection, to slavery. The risk is not just the hacking of the brain. Neurotechnologies imply a conjunction between neuroscience and digital capitalism (neurocapitalism) which has potentially disruptive implications for individual and collective life.

Between technology and human rights

Despite those who still believe in the neutrality of science and technology, one of the panelists, Professor Father Paolo Benanti, spelled out clearly that "every technological artifact is a device of power. Every technological artifact implies or denies recognized rights." To make the point concrete, he gave the example of the mechanical tomato harvester, which in California led from 3,000 tomato growers to a handful of monopolists. The result was a depressed labor market, often without protections, for the pickers who had hoped to earn citizenship through that work, and ruin for the small owners who could not afford the machine and lost the competitive challenge.

Benanti then recalled that neurotechnologies are already widespread in society in the form of drugs, and that unlike electronic devices these are immediately available to everyone, with consequences we do not always understand or want. But now that "life, before becoming history, becomes data", algorithms can "cluster" people, making their behavior manipulable by others.
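
To make that clustering mechanism concrete, here is a minimal, purely illustrative sketch, not taken from the article: invented "behavioral profiles" are grouped into segments with a standard clustering algorithm, the basic step that lets a platform address each group with tailored content or ads. Every feature name and number below is hypothetical.

```python
# Illustrative sketch only: clustering invented "behavioral profiles" into
# segments, the basic mechanism behind profiling-based targeting.
# All data and feature names are made up for the example.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Hypothetical per-user features: daily minutes online, share of emotional
# content engaged with, and night-time activity ratio.
profiles = rng.random(size=(1000, 3))

# Group users into four segments based on those features.
segments = KMeans(n_clusters=4, n_init=10, random_state=1).fit_predict(
    StandardScaler().fit_transform(profiles)
)

# Each segment can then be addressed with tailored content or ads.
for k in range(4):
    print(f"segment {k}: {np.sum(segments == k)} users")
```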
Benanti noted that there are already applications meant to improve young people's cognitive capacities, promote full and satisfying social lives, and protect against cyberbullying, but they could backfire. What to do, then? Here is his proposal: "After using algorithmic machines and collecting such intimately private data, that data should be deleted, and the same should apply to the 'cognitive faculties' of artificial intelligences trained with that data." "All consumer items transmit data, so the question is: how do we use them? How do we label them? How do we control them?" We have to decide; otherwise, asks Benanti, "what remains of freedom, awareness and human dignity?"

Paolo Benanti and Arturo Di Corinto

Neural data

For Marcello Ienca, a researcher at ETH Zurich, "the importance of neural data as a correlate of mental faculties, and its implications for self-perception and subjective identity, has probably not been understood". Nor has the epistemological importance of neural data, which is predictive, for example for an insurer or an employer. In short, neural data, even when it is a "simple" biological marker that says nothing about the semantic content of thought, says a great deal about a person's state of health.
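
To see why even "simple" markers can be so predictive, here is a minimal, entirely hypothetical sketch: synthetic features standing in for neural markers are enough to train a model that predicts an invented health-related label better than chance, exactly the kind of inference an insurer or employer might be interested in. None of this comes from Ienca's work; the data, features and label are made up.

```python
# Illustrative sketch only: synthetic "neural features" (e.g. band-power
# summaries from a consumer EEG headset) used to predict a hypothetical
# health label. Data, features and label are invented for the example.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

n_people, n_features = 500, 16          # 16 made-up neural markers per person
X = rng.normal(size=(n_people, n_features))

# Pretend a health-related condition correlates with a few of the markers.
risk = X[:, 0] * 0.8 + X[:, 3] * 0.5 + rng.normal(scale=1.0, size=n_people)
y = (risk > 0).astype(int)              # 1 = "at risk" in this toy scenario

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"Toy 'insurer' model AUC: {auc:.2f}")  # well above chance on this data
```

The point of the toy model is not accuracy but inference: no thought content is decoded, yet the signal still supports a prediction someone else could act on.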

Neural data also has great methodological importance: "Neural data is not read-only, because it can be rewritten through the neurostimulation and neuromanipulation of hybrid brain-computer interfaces." It has been shown that a person's musical tastes can change after brain stimulation: what if the same happened with political beliefs? Hence, according to the researcher, if "neuroprivacy" is the privacy of neural data, mental privacy concerns instead the mental states of the individual, the very states that Facebook exploits in the dynamics of emotional contagion.

If this all sounds too abstract, just think of the use of neuro-monitoring in Chinese schools, described in a TV report on Tg1. In short, pupils wear a neural headset that measures cognitive brain activity to gauge attention and learning: a zero-privacy school that becomes a disciplinary laboratory. The same goes for the neuro-monitoring devices used, again in China, to calibrate production flows in nuclear power plants.

And what if these devices are hacked? The hacking of brain-computer interfaces is a phenomenon cybersecurity experts have already encountered. And yet, as the philosopher Giacomo Marramao recalled during the meeting, raising the theme of identity against any biological and scientistic reductionism, "new technologies can be used not to tame human beings, as the powerful would like, but to collaborate with the planet and improve the quality of our relationship with it". For the jurist Oreste Pollicino, "it may not be necessary to imagine new neurorights, but rather to use the 'old catalogs' of protections for rights that are beginning to be inflated, remembering that the Charters of Rights bring not only balance but also conflict". A conflict that has already begun.

This article was originally published in Italian in Wired Italy: https://www.wired.it/attualita/tech/2021/01/29/privacy-neuro-privacy-garante/?refresh_ce=

Arturo Di Corinto

Teacher, journalist, hacktivist. Privacy advocate, copyright critic, free software fan, cybersecurity curious.