Philosophy

Is the Law Prepared to Protect Us from Neurotechnology?

The era of neuro-capitalism

In the era of neuro-capitalism, your brain needs new rights.

Over the past few weeks, Facebook and Elon Musk's Neuralink have announced that they're building tech to read your mind — literally.

Mark Zuckerberg’s company is funding research on brain-computer interfaces (BCIs) that can pick up thoughts directly from your neurons and translate them into words. The researchers say they’ve already built an algorithm that can decode words from brain activity in real time.

And Musk’s company has created flexible “threads” that can be implanted into a brain and could one day allow you to control your smartphone or computer with just your thoughts. Musk wants to start testing in humans by the end of next year. Other companies are also working on brain tech.

This might sound like science fiction, but it’s already begun to change people’s lives. Over the past dozen years, a number of paralyzed patients have received brain implants that allow them to move a computer cursor or control robotic arms. Implants that can read thoughts are still years away from commercial availability, but research in the field is moving faster than most people realize.

Your brain, the final privacy frontier, may not be private much longer.

Some neuroethicists argue that the potential for misuse of these technologies is so great that we need revamped human rights laws — a new “jurisprudence of the mind” — to protect us. The technologies have the potential to interfere with rights that are so basic that we may not even think of them as rights, like our ability to determine where our selves end, and machines begin. Our current laws are not equipped to address this.

4 new rights we may need enshrined in law

Several countries are already pondering how to handle “neurorights.” In Chile, two bills that would make brain data protection a human right will come before parliament for a vote in November, thanks in part to the advocacy of neuroscientist Rafael Yuste. In Europe, the OECD is expected this year to release a new set of principles for regulating the use of brain data.

One of the main people pushing for these new human rights is neuroethicist Marcello Ienca, a researcher at ETH Zurich. "I'm very concerned about the commercialization of brain data in the consumer market," he explained. "And I'm not talking about a far-fetched future. We already have consumer neurotech, with people trading their brain data for services from private companies."

1. The right to cognitive liberty

You should have the right to freely decide whether to use a given neurotechnology or to refuse it.

In China, the government is already mining data from some employees’ brains by having them wear caps that scan their brainwaves for depression, anxiety, rage, or fatigue. “If your employer wants you to wear an EEG headset to monitor your attention levels, that might qualify as a violation of the cognitive liberty principle,” Ienca said.

He added that the US military is also looking into neurotechnologies to make soldiers more fit for duty. Down the line, that could include ways to make them less empathetic and more belligerent. Soldiers may be pressured to accept interventions.

2. The right to mental privacy

You should have the right to seclude your brain data or to publicly share it.

Ienca emphasized that neurotechnology has huge implications for law enforcement and government surveillance. “If brain-reading devices have the ability to read the content of thoughts,” he said, “in the years to come governments will be interested in using this tech for interrogations and investigations.”

The right to remain silent and the principle against self-incrimination — enshrined in the US Constitution — could become meaningless in a world where the authorities are empowered to eavesdrop on your mental state without your consent.

3. The right to mental integrity

You should have the right not to be harmed physically or psychologically by neurotechnology.

BCIs equipped with a “write” function can enable new forms of brainwashing, theoretically enabling all sorts of people to exert control over our minds: religious authorities who want to indoctrinate people, political regimes that want to quash dissent, terrorist groups seeking new recruits.

What’s more, devices like those being built by Facebook and Neuralink may be vulnerable to hacking. What happens if you’re using one of them and a malicious actor intercepts the Bluetooth signal, increasing or decreasing the voltage of the current that goes to your brain — thus making you more depressed, say, or more compliant?

4. The right to psychological continuity

You should have the right to be protected from alterations to your sense of self that you did not authorize.

In one study, an epileptic woman who’d been given a BCI came to feel such a radical symbiosis with it that, she said, “It became me.” Then the company that implanted the device in her brain went bankrupt and she was forced to have it removed. She cried, saying, “I lost myself.”

Ienca said that’s an example of how psychological continuity can be disrupted not only by the imposition of a neurotechnology but also by its removal. “This is a scenario in which a company is basically owning our sense of self,” he said.

Another threat to psychological continuity comes from the nascent field of neuromarketing, where advertisers try to figure out how the brain makes purchasing decisions and how to nudge those decisions along. The nudges operate below the level of conscious awareness, so these noninvasive neural interventions can happen without us even knowing it. One day a neuromarketing company is testing a subliminal technique; the next, you might find yourself preferring product A over product B without quite being sure why.

“Brain data is the ultimate refuge of privacy”

Given the worries about neurocapitalism, I asked Ienca whether neurotechnologies should be taken out of the control of private companies and reclassified as public goods. He said yes — both to prevent companies from inflicting harm and to prevent them from affording benefits only to rich people who can pay for their products.

“One risk is that these technologies could become accessible only to certain economic strata and that’ll exacerbate preexisting social inequalities,” he said. “I think the state should play an active role in ensuring these technologies reach the right people.”


It’s hard to say whether Ienca’s proposed neurorights, the OECD’s principles, or Chile’s bills will effectively keep neurotechnology’s risks in check. But given how fast this tech is developing, it does seem likely that we’ll need new laws to protect us, and now is the time for experts to articulate our rights. Lawmakers move slowly, and if we wait for devices like Facebook’s or Neuralink’s to hit the market, it could already be too late.

“Brain data is the ultimate refuge of privacy. When that goes, everything goes,” Ienca warned. “And once brain data is collected on a large scale, it’s going to be very hard to reverse the process.”

 

Source: Sigal Samuel – Javier Zarracina/Vox
