The Battle for Your Brain

Meet Nita Farahany

This blog post is part of VIB Neuroscience in the spotlight.

Nita Farahany is a futurist and leading scholar on ethical, legal, and social implications of emerging technologies. Her new book, ‘The Battle for Your Brain: Defending Your Right to Think Freely in the Age of Neurotechnology’, is coming out in the spring of 2023. Her publisher is already ecstatic: ‘A rock star academic explores the final frontier of personal privacy: your mind’. At the recent VIB/Cell Press Neurotechnologies Meeting in Leuven, Nita presented her views and worries about the impact of neuroscience on the right to freedom of thought before a group of hardcore neuroscientists.

©Nita Farahany

Nothing is free

“With our growing capabilities in neuroscience, we will soon know a lot more about what's happening in the human brain. As a bioethicist, a lawyer, a philosopher, and an Iranian-American, I'm deeply concerned about what this means for our freedoms and what kinds of protections we need,” is the starting point of Nita’s argument. “We've given up many aspects of our privacy. Your Apple Watch knows when and how thoroughly you brush your teeth. When you search the internet, Google remembers forever what you were searching for. Social networks like Facebook, Instagram, TikTok, and Tinder know what and who you like…”

“By now, most of us realize that anything free that we use on the web is not actually free. Your data is how you pay for the service. Google now commands 92% of all internet searches. Its business model is built on profiling what you search for and using that data for commercial applications. It creates a unique and targeted data set about you, giving advertisers the opportunity to buy uniquely targeted advertising. Meta, formerly known as Facebook, does the same thing: it harvests data on its billions of users and creates psychological profiles that advertisers can use to micro-target their pitches. Shoshana Zuboff wrote a must-read book, ‘The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power’, about how Google and Facebook persuaded us to give up our privacy for the sake of convenience.”

“But my brain, my brain is safe, no? If you think your friend’s dress is hideous but tell her it is a fabulous color on her, you don’t have to worry that anyone will be able to find out what you are really thinking. Right? Well…” It is Nita Farahany’s conviction that with modern neuroscience, artificial intelligence, and machine learning, people and companies will break through the mental privacy barrier. Neurotechnology will allow the average consumer to interface with the rest of the world directly through their brain, giving away their thoughts, their most intimate privacy. Just for the sake of convenience.

I fear we will give up our thoughts to companies, for the sake of convenience.

Sharing your diary

The penny dropped for Nita at the 2018 summit of the Wharton Neuroscience Initiative at the University of Pennsylvania, when the chief strategy officer of a company called CTRL-labs gave a presentation. CTRL-labs was then a New York startup specializing in letting humans control computers with their brains. The company’s vision was to develop a wristband that allows people to control their digital devices: it decodes signals from the brain and translates them into signals that devices understand. “At first sight it seemed a harmless and practical tool,” explains Nita, “but in September 2019 Facebook (now Meta) acquired CTRL-labs, claiming that the technology has the potential to open up new creative possibilities by changing the way we connect. The wristband would become the universal controller for all of your interactions with technology, according to a spokesperson at Meta Platforms.”

“The focus of big tech now is to use neurotechnology as a new, and potentially primary, way to interface with their platforms,” continues Nita. “Apple has hinted recently that they are going to integrate health sensors, like EEG, into their AirPods, and their recent patents suggest that they are gearing up to do so. A company like NextSense was backed by Alphabet’s moonshot division. They think they have the winning recipe for brain health monitoring with their earbuds, and they hope to create a mass market for brain monitoring.”

“The way the technology is going to be normalized for the next generation is that the metaverse is being built from the ground up with neural interfaces. Nearly every virtual reality headset will have EEG and other sensors integrated into it. EMG is replacing the joystick, another way of normalizing the use of neural interfaces. But these devices allow corporations to gather enormous amounts of raw brain data.” Why does this matter? Why is Nita concerned about this evolution? She makes the analogy with the good old diary: “Suppose you keep a written diary. You hand it over to a friend, allowing him or her to read page 23. Your friend reads it and hands the diary back to you. Fine. But what if your friend decides to keep the diary, put it in a drawer, and whenever your friend wants to know something about you, he or she opens the diary, flips through it, and just keeps reading? That would be pretty creepy, no? You meant to share just that one page.”

“Well, if your brain data are registered by your future game console, VR headset, earplugs, or any other device, they are like your diary. The data might be collected for a commendable purpose: an algorithm might check whether you are still awake while driving your car, whether a pilot is alert while flying a plane, or whether a truck driver is fit to be on the road. That's the page in the diary. As a bioethicist, I am a huge proponent of empowering people to take charge of their own health and well-being by giving them access to information about themselves, including this incredible new brain-decoding technology. But what if your raw data is kept in the cloud and reanalyzed over and over again for other purposes?”

A world of total brain transparency

“The registration of your brain data could also be good. For example, the start of cognitive decline could be detected much earlier than today and properly treated. Using a consumer headset, it might be possible to pick up the tiny electrical changes that happen in the earliest stages of glioblastoma. But it also worries me that we will voluntarily or involuntarily give up our last bastion of freedom, our mental privacy. Will people give up data about their brains as easily as they've given up all the rest of their data? Will people sign the consent form as they put on their VR headset with EEG monitoring or plug in their earphones? Of course they will. They think, ‘this internet game is fun’, or ‘this is a great technology to track my concentration or distraction while driving and to prevent a car crash’, not realizing that they might be giving away their diary of registered brain activity.”

“I am worried that we will trade our raw brain data for discounts from car insurers, for free access to social media accounts... or even to keep our jobs. In the wrong hands, access to your brain data could be problematic. Your employer could pick up your emotional reactions. He could discover when you are upset, anxious, mind-wandering, or not paying attention, or what your reaction is to the raise you were recently offered. The Chinese government is already using cutting-edge AI and neuropsychology to analyze facial expressions and brain signals to gauge compliance among opponents during political re-education. Many factory workers in China, as well as drivers of high-speed trains, are required to wear EEG headsets throughout the day. Workers are sent home if their brains show reduced concentration on the job or emotional agitation.”

“We're headed to a world of brain transparency,” claims Nita, “and I don't think people understand that that could change everything: everything from our definitions of data privacy to our laws to our ideas about freedom. In a world of total brain transparency, who would dare have a politically dissident thought? Or a creative one? I am afraid that people will censor themselves for fear of being expelled by society or by governments.”

I fear that people's brains will reveal their sexual orientation, their political ideology, or their religious preferences long before they are ready to consciously share that information with other people.
Nita Farahany at the Neurotechnologies conference

Balancing benefits and risks

Nevertheless, Nita believes that we have an ethical mandate to develop neuroscience and neurotechnology because of the promise they offer to address an enormous amount of human suffering. “I believe in the great value of responsible science and of sharing data, even raw brain data. The more people are wearing interfacing headsets in their natural environment, the more we can improve the technology, the algorithms, the ways to filter out noise from essential information about health and disease.”

“We could collect massive amounts of great data. Imagine the real problems we could solve. There's a company in Israel that has the technology to detect epileptic seizures an hour before they occur in people wearing a simple consumer headset. That could be life-transforming for a person prone to epilepsy. There are so many potential applications for this technology. But why would the average person share their data if there aren't any protections against its misuse? We need to figure out a pathway that enables humanity to maximize its benefits while minimizing the risks.”

“Which is why I think we have to develop an ethical and legal framework that addresses the privacy risks people face in their neural data. We should recognize that not all neural data is alike, and that not all neural data should be governed in exactly the same way. There are ways in which we can develop frameworks that do so. When it comes to privacy protection in general, I think we're fighting a losing battle by trying to restrict the flow of information. Instead, we should be focusing on securing rights and remedies against the misuse of information. Recognizing that identifying information from the brain and the automatic functioning of our brain may not implicate privacy concerns in the same way as our memories or silent utterances would be a valuable first step. And the balance of interests may weigh in favor of society at times, depending on what brain data is being sought and for what purpose.”

We have to develop an ethical and legal framework that addresses the privacy risks people face in their neural data.

“In a way, the European General Data Protection Regulation (GDPR) is a proactive approach to privacy with a worldwide impact. For US-based companies with operations in Europe, it doesn't make sense to have separate business and operating models for Europe, the United States, and the rest of the world. GDPR has therefore set an important floor for everybody's privacy, with vast impact on data privacy. But at the same time, GDPR has had collateral consequences for scientific progress, research, and innovation because of the complexities it adds to processing sensitive data, exporting data, and more. GDPR has focused very much on access restrictions over misuse protections. No matter how much we try to secure our data, or to anonymize it, there will always be breaches and data hacks. Which is why we need to get much clearer about defining harmful uses of data and how we will protect individuals against them. I wouldn't say that we have to give up on all data access restrictions. Sometimes that's the best way to protect people, but it can't be the only way to protect them.”

The right to cognitive liberty

Nita Farahany advocates for the right to cognitive liberty, securing our freedom of thought and self-determination. “The right to cognitive liberty ensures that we have the right to consent to or refuse access and alteration of our brains by others. This right could be recognized as part of the Universal Declaration of Human Rights, which has established mechanisms for the enforcement of these kinds of social rights. The Universal Declaration of Human Rights creates legal and moral obligations for governments, corporations and individuals to abide by. We can recognize the right to cognitive liberty, a new international human right, to give us a right of self-determination of our brains and mental experiences. It would require us to update preexisting rights to privacy to include mental privacy. The concept of freedom of thought must be broadened, because it is currently interpreted narrowly to focus on freedom of religion and of expression. It should be adapted for the coming age of neuroscience and neurotechnology.”

The concept of freedom of thought must be adapted for the coming age of neuroscience and neurotechnology.

“How we implement these rights is just as important as recognizing them. For example, when Joseph Cannataci, the Special Rapporteur on the Right to Privacy, presented the final report of his term, he proposed a set of principles about data minimization. Those principles apply as much to the collection of brain and neural data as to other sensitive or personal data. For example, we should make sure that corporations are transparent about the data they are collecting. According to these principles, we ought to give people local control over the data collected by their devices: they should be able to turn off data collection whenever they don't want it to occur.”

“Besides principles, we also need norms to be implemented. Many organizations have started to propose standards for implementing neurotechnology. The OECD has adopted responsible innovation principles with respect to neurotechnology, and other individuals and groups worldwide are starting to advance similar standards. We need to move forward in a synergistic way. There are promising things happening with neurotechnology worldwide, but also deeply disturbing ways in which neurotechnology is already being used to interfere with freedom of thought. As an Iranian American, as I watch what's happening in Iran right now, it terrifies me to consider what could happen if the technology ends up in the wrong hands. Reading brain data is one thing, but we also need to consider issues such as cognitive enhancement, cognitive diminishment, and whether we have a right to choose either. What about the manipulation and persuasion of the brain? How should we think about those things?”

“We should decide together to continue now the ethical development of neurotechnology by embracing principles on mental privacy and freedom of thought,” concludes Nita Farahany, “such that five years from now we're not writing ‘The Age of Brain Commodification’, as a sequel to Shoshana Zuboff’s ‘The Age of Surveillance Capitalism’.”

Learn more

If you want to dive deeper into brain health and disease, explore some of our resources, like our Alzheimer's facts series, or keep an eye out for interesting news and events where our researchers share new insights. We also have plenty of open positions for people eager to join us on our multi-faceted mission to unravel the mysteries of our brain.

Visit the webpage to find out more about VIB Neuroscience in the spotlight.

India Jane Wise

Science Communications Expert, VIB
