
Navigating Neurotechnology Regulation | Regulatory Overview

Researchers examine the challenges regulators face in protecting the public amid rapid developments in neurotechnology.

They can “decipher and transform our perception, behavior, emotion, cognition, and memory—something you might say is at the heart of what it means to be human.”

The United Nations Educational, Scientific and Cultural Organization issued this warning in a 2023 report on neurotechnology, the class of devices that interface with neural networks in the human brain. And after Neuralink reportedly announced that it had implanted its computer chip in the company's first human patient, public demand for the regulation of neurotechnology has grown.

Although public attention has only recently focused on the regulation of neurotechnology, researchers Marta Sosa Navarro and Salvador Dura-Bernal have long grappled with this challenge. In a recent article, they discuss a range of risks that neurotechnologies may pose in the future, outline practical challenges in addressing those risks, and summarize the regulatory landscape for neurotechnologies in the United States and abroad.

Neurotechnologies “read and modify the brain” through a process called neurostimulation, the application of electrical impulses to the brain through the skull. Neurostimulation can cause physical harm, such as burns to brain tissue from high-intensity electrical currents. Beyond these physical dangers, Navarro and Dura-Bernal raise a wide range of human rights concerns, including threats to data privacy and the risk that abusive governments could alter the minds of vulnerable populations, such as prisoners.

Navarro and Dura-Bernal acknowledge that regulators face significant difficulties in crafting solutions to these concerns.

For example, Navarro and Dura-Bernal cite the rapid pace of innovation as one of the major difficulties regulators face. At the same time, they argue that regulatory action is critical because a growing number of private companies with direct-to-consumer business models are seeking to sell products that are not yet subject to adequate public regulation.

Navarro and Dura-Bernal also note that the challenge of rapid innovation is compounded by the philosophical questions that regulators must answer to regulate neurotechnologies, including “what is thought?”

They detail how scientists and philosophers differ significantly in their approaches to defining thought, and how these differences can yield regulatory prescriptions that do not necessarily address the problems neurotechnology raises.

For example, if thought is defined as brain activity alone, that definition likely will not suffice to protect against misuse of neurotechnology, Navarro and Dura-Bernal explain. Regulations that prevent a device from interfering with brain activity do not necessarily prevent a device from altering the structure of the brain, for instance by destroying connections in brain tissue so that a person forgets a memory. Although that destruction may not affect a person's current brain activity, it could shape future thoughts by making memories harder to recall. Navarro and Dura-Bernal therefore argue that a broader definition of thought, one encompassing both brain activity and brain structure, would better protect individuals.

Another challenge Navarro and Dura-Bernal identify is how regulators should classify neurotechnology, a choice that determines which regulations neurotechnologies are subject to. In the United States, neurotechnology can be classified as either “wearable technology” or a “medical device,” they explain.

Medical devices are generally subject to more stringent safety standards and privacy regulations than wearable technology, but the classification depends on the product's intended use, the manufacturer's claims, and the risks associated with the product.

Navarro and Dura-Bernal also note that the U.S. Food and Drug Administration (FDA) has typically treated neurotechnologies as wearable technologies, although the FDA's recent draft guidance reflects growing awareness of the pitfalls of neurotechnologies and the need for more stringent requirements.

In the European Union, by contrast, regulators have treated neurotechnology as a medical device since April 2017, even when it has no “intended medical purpose.” The EU also imposes obligations on manufacturers, such as completing certain regulatory procedures before a product is placed on the market. In addition, the EU calls on member states to implement post-market surveillance procedures for neurotechnologies and to specify the circumstances under which regulators could recall, withdraw, or restrict their use.

Beyond these steps, some scholars have hailed Chile as a pioneer in neurotechnology legislation. In December 2020, Chile amended Article 19 of its constitution to include a “right to neuroprotection.” The country is considering further legislation that would implement Article 19 and protect the right to psychological integrity. But Navarro, Dura-Bernal, and other scholars criticize these bills as too vague, underscoring the difficulty regulators may face in defining the contours of protection against neurotechnologies.

Nevertheless, these legislative actions have prompted wider recognition of the need for regulation. In response, the EU and the United States have created a Trade and Technology Council to address the many challenges of upholding human rights and democratic freedoms in the face of this new technology.

But even with the new council in place, it remains unclear whether regulators will be able to keep pace with rapid technological development and prevent humans from ending up as the lab animals of neurotechnology experiments.