How can technology affect our freedom of thought, and what protections do we need to guard against advanced technologies that can read our minds?
Professor Nita Farahany, a leading scholar on the ethics of automation and artificial intelligence, says emerging brain-reading technologies raise real worries, as do their legal and social implications.
Prof Farahany says "the day might be closer than you think" when we have mind-reading tech, and she is asking: if computers can decode brain activity, what happens to our privacy?
She says companies already use consumer devices that can tell whether someone is paying attention, anxious, drowsy while driving, or about to have a seizure.
“This kind of idea of thought [reading technology] is already here. To get to the place where we would have true mind-reading technology, that is, the ability to decode complex thoughts, is further off, at least for non-invasive technology.
“There have been some really exciting and important research studies published over the past year that have shown that when somebody has an implanted electrode … it is already possible to decode some more complex thoughts.”
There are also electroencephalography (EEG) devices that can help detect prior knowledge in court cases by posing a series of questions, or showing images of familiar and unfamiliar things, to get a baseline reading, Farahany says.
“Then you would show them something they shouldn’t know, like some weapon from a crime scene or something like that, and see whether the way their brain registers that thing you’ve shown them looks more like the thing they know, or more like the thing they don’t know.
“If it looks like something they know, then you’d say ‘aha, here’s guilty knowledge’, something that a person shouldn’t have.”
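To make the comparison Farahany describes concrete, here is a toy numerical sketch of a guilty-knowledge-style test. Everything in it is assumed for illustration: the simulated ERP shapes, the trial counts, and the simple correlation comparison stand in for the far more careful signal processing and statistics a real EEG system would use.

```python
import numpy as np

# Toy sketch of the "guilty knowledge" comparison described above:
# average a subject's simulated brain responses to known and unknown
# items, then ask which baseline the response to the probe item
# (e.g. a crime-scene weapon) resembles more. All data is synthetic;
# this is not a real forensic method, just the shape of the idea.

rng = np.random.default_rng(0)

def simulate_epochs(n_trials, peak, n_samples=200):
    """Simulate ERP epochs: a P300-like bump of height `peak` plus noise."""
    t = np.linspace(0, 1, n_samples)
    signal = peak * np.exp(-((t - 0.3) ** 2) / 0.005)  # bump near 300 ms
    return signal + rng.normal(0.0, 1.0, (n_trials, n_samples))

# Baselines: responses to items the subject definitely knows / doesn't know
known_avg = simulate_epochs(40, peak=5.0).mean(axis=0)
unknown_avg = simulate_epochs(40, peak=1.0).mean(axis=0)

# Probe: the item the subject "shouldn't know" (here simulated as familiar)
probe_avg = simulate_epochs(40, peak=5.0).mean(axis=0)

# Which baseline does the probe response correlate with more strongly?
r_known = np.corrcoef(probe_avg, known_avg)[0, 1]
r_unknown = np.corrcoef(probe_avg, unknown_avg)[0, 1]
verdict = "resembles KNOWN items" if r_known > r_unknown else "resembles UNKNOWN items"
print(f"corr vs known: {r_known:.2f}, vs unknown: {r_unknown:.2f} -> {verdict}")
```

In this simulation the probe response correlates more with the known-item baseline, which is the pattern Farahany's “aha, here's guilty knowledge” example describes.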
So far it hasn’t been used in court cases in the United States, “but very similar technology has been used in India already in a number of criminal cases,” Farahany says.
She says major tech companies are also investing in invasive and non-invasive brain technology to find out if decoding the brain can help us to operate things just by thinking about them.
“While there are some great benefits to being able to do things like that (hopefully people won’t be texting and driving; [instead] they could just be thinking about texting while driving), the downside is suddenly we’re giving access to the most private and sensitive information that we have … to the likes of Facebook and Google and these other tech companies, when there are no safeguards for us against that information ultimately being misused against us, or used to manipulate or change what’s in our brain.”
However, the big tech companies aren’t solely to blame. Users frequently say privacy is important to them, yet give it up all too easily, Farahany says.
“People say they care about privacy but then they don’t read the terms of service when they sign up for a social media account, they freely enter information into search [engines] … they give up their location all the time through the mobile devices they use and carry around.
“People have become so comfortable and used to giving up this data without really seeing what the harms of doing so are, that I’m afraid what’s going to happen when it comes to brain data is that the convenience or the ‘wow’ factor… will mean people will give up that information in exchange for goods and services.”
But people should be more careful with this new technology, she says, because unlike other kinds of information we share freely, no-one has previously been able to “get in your brain”.
“All of those kinds of private thoughts, things we don’t verbalise or speak out loud, suddenly become accessible.
“The chilling effect that would have on society, the way it would change our relationships with other people, with government, with companies, is I think so profound that it’s an area where people need to be a lot more careful with this information and how freely and quickly they adopt this technology.”
There needs to be a right of “cognitive liberty” to safeguard “our last bastion of freedom”, she says.
“That would ensure that there is a space of protection for what happens in our brains, that people can’t access or change what happens in our brains without our permission. And to make that barrier and burden really quite high, much higher than we have for other areas of privacy and other areas of data and information.”
She says that while people are interested in learning more about the threats these technologies pose, she’d like to see governments and legislators move towards greater protection.
“We have given up on almost every other kind of privacy and data about ourselves, but this one space – this freedom of thought – is the one space that I think everybody always assumed would be sacred and would be protected.
“I hope these [moves by tech giants] will start to be wake-up calls for people, and that it will lead to sufficient ground for us to get out ahead of this, because there is still time, just not much. I think with the time we have, we ought to be doing a lot more to safeguard cognitive liberty.”