Post by Dr. Giovanni Greco OFM published on the Blog of the Alfonsiana Academy.
In today’s medical practice, neurotechnologies are indispensable: in the rehabilitation of stroke patients, in the treatment of Parkinson’s and Alzheimer’s disease and of mood disorders (such as drug-resistant depression), and in restoring communication to patients with ALS and movement to people with motor disabilities. In particular, brain-computer interfaces (BCIs) can connect the human brain directly with machines, allowing the recovery of many functions [1].
Neurotechnologies are having an increasingly strong impact on our societies, in the fields of medicine and communications, thanks to the possibility of enhancing cognitive abilities; research on these fronts is expanding rapidly and is not confined to academia. These are electronic, optical, magnetic and other systems that interact directly with the central nervous system, either to measure its activity (as EEG or fMRI do) or to modify it: optogenetics, for example, allows specific neurons to be activated or deactivated by light pulses and seems to promise cures for complex diseases; sonogenetics can modulate the activation of specific brain areas with sound waves, and with extreme precision [2]. All these technologies can be used to treat people with serious illnesses, but also to influence the functioning of the brain, and therefore the choices and will of individuals. This is raising important ethical questions all over the world.
One of the most controversial uses of neurotechnology has taken place in China: in Hangzhou, factory workers were equipped with helmets capable of monitoring emotions such as anxiety, anger, and depression while they worked, with the aim of improving company productivity. Such invasive surveillance is raising strong doubts about the protection of employees’ “mental privacy,” their autonomy, and the dignity of workers in general [3]. There is also the concern that new forms of control and power imbalances could emerge [4].
Another emblematic case is that of Neuralink, the company founded by Elon Musk, which is developing implantable interfaces that allow humans to communicate directly with electronic devices through thought. This technology is raising ethical concerns about “mental privacy,” the protection and commercial use of neural data, and surveillance, but also about freedom, manipulation, and even equity: if these technologies were accessible only to certain socioeconomic groups, they would worsen existing inequalities [5].
In all these scenarios, the need for careful regulation of the use of neurotechnologies has led to the birth of “neurorights,” a new category of rights defined as the “ethical, legal, social, or natural principles of freedom or entitlement related to a person’s cerebral and mental domain; that is, the fundamental normative rules for the protection and preservation of the human brain and mind” [6].
The 2023 UNESCO report on neurotechnology warns: “Developments in neurotechnology have profound implications for human identity, autonomy, privacy, behaviour and well-being, that is, the very essence of what it means to be human” [7]. Respect for the person, for his cognitive freedom and for his personal and psychological identity must be guaranteed. This means psychological continuity, that is, the right to maintain one’s mental coherence and self-perception, the way a person thinks of and perceives himself over time. It also means guaranteeing mental integrity in scientific experimentation: safeguarding the natural functioning of the brain against unauthorized external intrusions.
International legislation is trying, at different speeds, to address the many emerging problems [8]. Everywhere there emerges the need to govern the development of new technologies, starting from the principle of the centrality of the human person and the protection of the patient; from this perspective, the definition of neurorights is necessary to guarantee that technological innovations are truly progress at the service of humanity and not a danger to its dignity.
—
[1] https://www.altalex.com/documents/news/2024/10/16/neurotecnologie-neurodiritti-sfida-privacy-mente.
[2] https://www.agendadigitale.eu/cultura-digitale/neurotecnologi-e-diritti-cosi-il-mondo-si-prepara-a-proteggere-la-privacy-della-mente.
[3] https://www.altalex.com/documents/news/2024/10/16/neurotecnologie-neurodiritti-sfida-privacy-mente.
[4] Ernesto Belisario, Giovanni Maria Riccio, Guido Scorza (eds.), GDPR and Privacy Regulations: Commentary, Ipsoa, 2022. The volume comments on the individual articles of Regulation (EU) 2016/679, integrated with the provisions of the decree adapting Italian national legislation (Legislative Decree no. 101/2018).
[5] https://www.agendadigitale.eu/sicurezza/privacy/neurotecnologie-e-privacy-i-passi-avanti-verso-un-futuro-etico-e-regolamentato.
[6] Marcello Ienca, «On Neurorights», in Frontiers in Human Neuroscience, 15 (2021); https://www.corriere.it/salute/ehealth/24_novembre_19/neurodiritti-cosa-sono-a-cosa-servono-e-come-tutelarsi-5b050401-3532-4d0c-b218-775924548xlk.shtml.
[7] UNESCO, «Outcome Document of the First Meeting of the AHEG: First Draft of a Recommendation on the Ethics of Neurotechnology (First Version)», Paris, 9 May 2024, https://unesdoc.unesco.org/ark:/48223/pf0000389768.
[8] https://www.corriere.it/salute/ehealth/24_novembre_19/neurodiritti-cosa-sono-a-cosa-servono-e-come-tutelarsi-5b050401-3532-4d0c-b218-775924548xlk.shtml?refresh_ce.