Brain Data and the Sovereign Self - Episode 13

The Future of You podcast is back with a bang! Season 3 begins with 'Brain Data and Self Sovereignty', featuring Professor Nita Farahany, Neurable's Dr Ramses Alcaide and Neurosity's AJ Keller

Apple Podcasts: https://apple.co/3nz2QRa

Spotify: https://spoti.fi/40sYUQs

Google Podcasts: https://bit.ly/42QrqNk

Amazon: https://amzn.to/3zj3sgj

It's back! And we start Season 3 by exploring brain data - what is it? How can it be collected and used? And who has access to it?

In this episode I'm joined by Nita Farahany, author of The Battle For Your Brain and Professor at Duke University. We discuss how brain data is already being collected and used, as well as the importance of cognitive liberty.

I also find out more about the technology being used to collect brain data. I talk with Dr. Ramses Alcaide, CEO and Co-founder of Neurable, and AJ Keller, Co-founder and CEO of Neurosity: two companies creating brain data products they believe we'll soon be using as often as our smartphones.

Professor Nita A. Farahany's new book 'The Battle For Your Brain: Defending The Right To Think Freely In The Age Of Neurotechnology', available in the US and UK

Ready for Brain Transparency? Nita A Farahany's session at Davos 2023

Tracey's piece in Forbes, 'Forget Media Manipulation And Misinformation via TikTok And Twitter, Neurotechnology Heralds The New Battle For Our Brains'


Neurable website

Neurable discord channel

Neurable on Linkedin

Ramses on Twitter @RamsesAlcaide

Neurosity website

Neurosity on Linkedin

AJ on Twitter @andrewjaykeller

OECD, Neurotechnology in and for society: Deliberation, stewardship and trust

Nature Electronics, Brain Computer Interfaces


Tracey's book 'The Future of You: Can Your Identity Survive 21st Century Technology?' available in the UK and US

2 mins: Discussion with Nita Farahany

  • Examples of permissible and impermissible digital phenotyping
  • The collection of brain data in workplaces
  • The potential for Orwellian impositions of brain monitoring
  • Biomarkers of cognitive decline

26 mins: Discussion with CEO and Co-founder of Neurable, Dr Ramses Alcaide

  • How using brain data could aid early detection of Alzheimer’s and Parkinson’s
  • The data collected by Neurable
  • Integration of brain sensors into everyday equipment

40 mins: Discussion with Co-founder and CEO of Neurosity, AJ Keller

  • The Crown brain imaging device
  • Home brain data collection
  • Flow state and ADHD

51 mins: Discussion with Nita Farahany

  • Marketing and mental manipulation
  • Cognitive liberty as the default
  • The importance of knowing how your brain data might be used

TRANSCRIPT:

The Future of You: Episode 13

Brain Data and the Sovereign Self

Tracey Follows  0:21

Last year I was at a tech conference, speaking to an audience of mostly marketers, but some software engineers and investors too. I was doing a fireside chat and the host asked me what would be a surprising trend to watch, you know, something underestimated or under-emphasised. And I had no hesitation whatsoever in saying it was the brain-machine interface, particularly in the workplace. And I talked about neurotech, highlighting what Blackrock and Neuralink were doing. But more generally, the idea that corporations would have access to our brain data in exchange for the services that they would provide, services we would no doubt need in a market looking for augmented humans doing jobs that involve much more workplace surveillance than today. I even mentioned the laws being passed by the government in Chile for the protection of mental privacy, and the notion of neurorights. That's been on my trends list for over three years now at least. I've talked about it quite a lot in relation to identity and the future of you. But I always get the feeling most people aren't taking this, or perhaps me, particularly seriously. I mean, it sounds like science fiction, right? So imagine my delight when I discovered a new and exciting book called "The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology", by Duke University professor Nita Farahany. I couldn't wait to read it. And what's more, as an absolute expert in all of this, she joins me today to talk in a little more depth about the book, what's going on in the field of neurotech, and her thesis of cognitive liberty.

Tracey Follows  2:02

Thanks for joining me, Nita. I know you're a very busy woman as your brand new book is hitting the book stands as we speak. Congratulations.

Nita Farahany  2:10

It's so nice to finally have it see the light of day.

Tracey Follows  2:13

I bet! How long have you been working on it?

Nita Farahany  2:16

I mean, to be honest, I've been working on it for a decade, but when I really started working on it in earnest was 2019, like November 2019. And so not as long as a decade, but the first time I wrote a proposal for this book was in 2012. So...

Tracey Follows  2:32

Goodness me, yes, you're well ahead of the curve.

Nita Farahany  2:34

Too far ahead at that point.

Tracey Follows  2:36

You can't be too far ahead. It's absolutely superb. You were kind enough to send me a copy, and I'm looking forward to reading it again, actually, because there's so much in there. Just remind everybody exactly what it's called, and the subtitle as well, because that's very important, isn't it?

Nita Farahany  2:50

And we agonised over it, to be honest...

Tracey Follows  2:52

Authors always do.

Nita Farahany  2:53

No, we really did. We agonised over not only the title, but the subtitle. So it's The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology, published by St. Martin's Press on March 14. And it captures what I see as truly the battle, the battle that is ongoing, that many people haven't been a part of and aren't even aware is happening. And it's a battle we can still win, to have neurotechnology empower us. But it is a battle that, unless people join the call to action, is going to be waged without public input and engagement.

Tracey Follows  3:31

Yeah, and I think that's the most important thing, to try and draw the public into some sort of consultation or conversation. I know it's incredibly difficult, but I'm sure if any book can do it, this can, and you can, because you set it out, you know, in highly complex technical detail, but you make it sound so understandable. And I think that's great, so that everybody can become involved in the debate and in the conversation. But to that end, I wonder actually if you might start by telling people what raw brain data is, because actually, once you read the words 'raw brain data', it can, I'm sure, feel quite scary for people.

Nita Farahany  4:08

Almost feels like a steak or something. Right?

Tracey Follows  4:10

Yeah, I like mine well done.

Nita Farahany  4:11

Yes, exactly. You know, there's a lot of different ways in which activity or information from the brain can be captured. But one of the major areas that I focus on in the book is capturing electrical activity in the brain. So as you think, as you meditate, as you do a math problem in your head, neurons are firing in your brain, and they give off tiny electrical discharges as they do so. Hundreds of thousands to millions of neurons are firing in your brain as you engage in everyday activities, and those now can be picked up through electroencephalography, or EEG. What that's picking up is really the summation of all of that electrical activity across the scalp, what can be picked up from surface-based electrodes. And it is, you know, kind of the average of the activity across the different regions of the brain, broken down into different bands, or wavelengths. And those different bands or wavelengths can be both picked up and differentiated with algorithms and software that interpret what it is that is being detected.

So, raw brainwave activity is different than a lot of other data that people are used to, because if I want to know if you're tired, for example, I capture all of that summation of activity that's happening across your brain at any given moment in time, or over a period of time, and then I extract the interpretation that I'm interested in, the interpretation being: are you tired? And I shouldn't say I, I don't do this, right? I am somebody who studies the implications of this technology. Meaning, that's what the technology can do.

But why raw data is different is that you could return to it over and over again. The same algorithm that was used to interpret that data, that activity that was happening at a given moment in time, could tell if you had high levels of fatigue, you know, continuously monitoring; it could also pick up if you're suffering from cognitive decline, and could also be used to decode, you know, even potentially simple words or numbers or shapes or images that are in a person's mind. And the more electrodes, and the higher the grade and quality of what we're talking about here, the more information can actually be extracted. So it's different than other kinds of information, in thinking about it as full-spectrum, rich information to which you can keep going back over and over again.

One example I give in the book to try to help people understand this is, I say, imagine you want to share something with a friend out of your personal diary. And so you hand them the personal diary, and you say, turn to page 32, and you'll see this passage that I highlighted that I want you to read. And they read it and they say, oh, that's so interesting, what an interesting insight and reflection about yourself. And then they hand it back to you, you're done. Right? The alternative would be they also make a copy of your entire diary without your permission. And then when you're not around, they just flip through the pages and read everything from page one to 32, and all the way until the end. Because, you know, there's so much more that you capture in a moment than that one little snapshot that you extract. And so while it's not literally your complex thoughts that you've written down into a diary, the concept that there's a whole lot more there than the piece of information a software algorithm is designed to extract is one of the important differences of raw brainwave data from other kinds of data that people are used to sharing or having commodified by companies and governments.
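To make that 'extraction' step concrete, here is a minimal, illustrative sketch of how one interpretation, a crude fatigue score, might be pulled out of raw EEG. The sampling rate, band ranges and (theta + alpha)/beta heuristic are assumptions for illustration, not Neurable's, Neurosity's or anyone else's actual algorithm; the point is that whoever holds the raw signal can compute this metric today and any other metric tomorrow, while the extracted number alone reveals only what it was designed to reveal.

```python
# Illustrative sketch: raw EEG is "full spectrum" data because the same
# raw signal can be re-processed for different interpretations.
import numpy as np
from scipy.signal import welch

FS = 256  # sampling rate in Hz (assumed for illustration)

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(raw_eeg: np.ndarray) -> dict:
    """Average power in each frequency band of a single-channel recording."""
    freqs, psd = welch(raw_eeg, fs=FS, nperseg=FS * 2)
    return {
        name: psd[(freqs >= lo) & (freqs < hi)].mean()
        for name, (lo, hi) in BANDS.items()
    }

def fatigue_index(raw_eeg: np.ndarray) -> float:
    """One of many possible extractions: a crude (theta + alpha) / beta
    ratio, sometimes used as a drowsiness proxy in the EEG literature."""
    p = band_powers(raw_eeg)
    return (p["theta"] + p["alpha"]) / p["beta"]

# The point of the diary analogy: whoever keeps raw_eeg can compute
# fatigue_index today and any other metric tomorrow; whoever keeps only
# the extracted number cannot go back for more.
```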

Tracey Follows  7:56

Got it, that's very clear. Thank you. How much is that a shift on from something like digital phenotyping, then? When, you know, we've got the likes of Apple now telling us that, using data from our own body and physicality, but also the environment around us, they can predict whether we've got depression or other sorts of mental health illnesses. I mean, I've seen a Wall Street Journal article suggesting that they can, or they want to do that. But that seems like one of the bridging points, perhaps, to some of the stuff you're talking about.

Nita Farahany  8:32

Yeah, absolutely. So first, I think it's important to realise that right now, based on where neurotechnology is today, in some instances you can learn more from digital phenotyping than you can from brainwave data. So you could pick up, for example, if you have my complete online activity, my complete cell phone activity, you turned my cell phone into a listening device at all times, you had everything I've ever typed into Google and elsewhere, you have a pretty rich and complete picture of a lot of what I'm thinking or feeling. So this is additive to that. And the question is, does it give you anything more than all of that already gives you? And I believe that it does. It gives you something more, because it gives you unstated and unexpressed aspects of your cognitive and affective functioning: what you're thinking and feeling. And by that I mean, I try to keep a poker face when I see my friend's, you know, hideous new mustard-yellow couch that they decided was just the right colour. And I do a pretty good job of it, because, you know what, my friend bought the couch, what good would it possibly do to register anything other than support? And your brain states could reveal your disgust.

Tracey Follows  9:50

You're telling me this is going to destroy all possible relationships *laughs*

Nita Farahany  9:55

If we don't have some aspect of cognitive liberty and mental privacy, then it could. You know, there are parts of ourselves that we keep back, that we hold back. And that's important to our own ability to cultivate our identity in the world, our own ability to choose what we share and with whom. And that's the piece that is different than digital phenotyping. Digital phenotyping is trying to get at that. But to the extent that you haven't expressed it, written it down, done something that reflects it, there is still something more that allows you to have mental space and reprieve, even though I think all of that is incredibly intrusive and invasive. And if you think about the principles that I'm laying out, a right to cognitive liberty, to self-determination over our brains and mental experiences, it may apply to some aspects of digital phenotyping. There may be some practices, well beyond neurotechnology, and I talk about a number of those in the book, that cross the line between what is permissible and impermissible in today's world as well. So in some ways, it would claw back some of what we've already done, even while forging new territory with an entirely new category of technology and an entirely new frontier of data that we've never before breached.

Tracey Follows  11:14

That sounds exactly like what we need, what we all need, actually. So let's move on to that then. Because you do give an awful lot of examples in the book, and I'm sure for people who aren't that aware of what's going on, I mean, in the UK, USA, China and beyond, loads of places, loads of sectors, industries, government, education, could you give a flavour of some of your favourite examples?

Nita Farahany  11:41

Or my least favourite, right? Either way, however you want to think about it.

Tracey Follows  11:44

Yeah. Where you think it's really useful? And where you think it's...

Nita Farahany  11:48

Dystopian right?

Tracey Follows  11:50

Dystopian? Yes, yeah.

Nita Farahany  11:51

So I should say this: it was a very intentional move on my part with this book to have it be chock-full of existing examples. And there's very little futurism in the book, in that I am not talking about what may happen 10 years from now. I'm trying to give people a very clear understanding of where the technology is today, and the things that we can decode today, well before we get to the point where you could decode something like complex thoughts, if we ever get there, the kind of inner monologue. That's not what I'm talking about in this book. I'm talking about the very real applications that exist today of decoding brain states, broad and general brain states, and what the risks and benefits of that are.

So I think probably the example that has been most startling, and I didn't expect it at first, and we can talk about why I think it's turned out that way, has been the increasing use of neurotechnology in the workplace. And this has just hit a nerve, literally, with people in ways that I've been astounded by. So already, for more than a decade, there are companies who have been selling technology that enables employers to track fatigue levels by putting brain sensors inside of baseball caps or hard hats or other kinds of devices that a person is using already. And they're shifting into monitoring attention and focus in the workplace. They're already doing that through keystroke logging and other kinds of, pejoratively called, bossware, but now sensors are being integrated into that as well. And that has, I think, a very chilling effect. That's already happening in China; there are reports of workers being required to wear brain sensors as they go about their everyday jobs, whether it's on a factory floor or as a conductor of the high-speed train between Beijing and Shanghai.

Even more chilling, I think, for people is to realise it's not just in authoritarian regimes; it's happening in more than 5,000 companies worldwide, with many more to come. After I gave a talk at the World Economic Forum in Davos, I had the CEO of a large company come up to me afterwards and tell me that they were already using this neurotechnology for lots of the applications that I described in my talk, and were planning on doing so at larger scale going forward. And the heartening part about that is they wanted to talk about the ethics of how to do so, how to do so responsibly, in a transparent way that would be better for employees and more empowering for employees. But it just makes concrete for people that this really is happening. This isn't some theoretical thought exercise by a philosopher.

This is already in workplaces. I think part of the reason why that struck a nerve for people is, first, because most people didn't even know the technology or the capabilities existed yet. But second, I think everybody can relate to being at work, and everybody can relate to the increasing surveillance in the workplace, and how problematic that's already become, how much that's already undermined trust and the relationship between employers and employees. And the power imbalances. There's already such a stark informational asymmetry, but also power asymmetry. As we go into what appears to be kind of a global economic downturn, and people have far less mobility between jobs, the idea that there could be these kinds of Orwellian impositions of brain monitoring in the workplace, and that people wouldn't have a choice to go somewhere else, or to quit and work elsewhere that wasn't using that. I think that's very disquieting. That's very intrusive. And it's very relatable.

You know, I talk about a lot of applications, like governments using it to interrogate criminal suspects. It's easier to 'other' people and say: "Well, I'm not a criminal suspect, so it's never going to be used on me." It's much harder to do that when you understand it's part of the workplace. It's also, I think, disturbing to think about it in children and in classrooms. So, you know, there's a potential value to this; there are studies that show that students who are suffering from ADHD, for example, may stand to benefit from neurofeedback using neurotechnology. But imposing it, like it was in a classroom in China, on students to see whether or not they're paying attention or their mind is wandering, can have a significant chilling effect on their ability to flourish as children, to discover their own self-identity, to mind-wander, which is so critical to growing and changing and figuring out who one is. And so those are some of the chilling examples, I think.

But I also talk a lot about the empowering possibilities of this technology. We treat our brain health as if it's, like, mystical. You know, with our physical health, people know that they need to exercise, they know to go in regularly for a physical exam to get their weight and their height and their cholesterol levels checked, their blood sugar checked. Most people can tell you, you know, their resting heart rate and their blood pressure, but they can't tell you anything about their brain health. And, you know, that is contributing to this growing epidemic of mental illness and depression and neurological disease worldwide.

And the possibility of us being able to quantify our own brain activity for ourselves, not for anybody else to have access to it, right, but just being able to be your own advocate, having access to that information and treating it as if it is like other kinds of health, where you have access to it and you have tools at your disposal to enhance it and to improve it. I think that could be transformational. So I envision cognitive liberty not just as a right against interference by others, but a right to exercise cognitive liberty, to know oneself and to enhance and improve your brain and your mental experiences.

Tracey Follows  18:11

Presumably, once some of that brain data is uncovered, you can start to create biomarkers like we have physically, that would take us to a much more sort of personal biological age than a chronological age. So, like, in the way that cholesterol is perhaps a biomarker for heart health or whatever. We haven't necessarily explored that properly yet, have we, with the brain, but presumably there will be biomarkers coming out of that kind of brain data?

Nita Farahany  18:38

I think that's right. There'll be biomarkers of cognitive decline; there may be biomarkers of brain fitness. There were some researchers a couple of years ago who published a communication in Nature about the possibility of brain benchmarking. We check weight and height for children, but we don't do brain benchmarking. And could we similarly develop biomarkers or other cognitive and affective tests? We test for autism spectrum disorder, but we don't test for much more than that to see if children are growing healthily in cognitive and affective ways.

And so, biomarkers for that, which could also then hopefully identify when somebody is at risk of depression and mental illness and other cognitive or neurological diseases. I talk about an example: there was a group of researchers who tested out the possibility of detecting glioblastoma, which is a terrible kind of brain tumour to be diagnosed with, because by the time it is usually diagnosed, there's very little that can be done medically to extend somebody's lifetime, to have the possibility of a cure.

But there are tiny electrical changes that may occur in the earliest stages of glioblastoma, well before it's currently being diagnosed, and if you pick that up, maybe you could intervene much sooner, such that the course of the disease and the identification of it would be very different. And this all sounds, you know, promising to me. And so the question is, can we get to that future without the Orwellian, dystopian future? Because the terrifying possibility is already happening; Orwellian, dystopic applications of these technologies are already happening. And can we get to the place where the benefits outweigh the risks, and where we reset and do this in a way that could enable us to do that, without also subjecting us to opening our brains up to hacking and tracking by others?

Tracey Follows  20:45

So, yeah, let's get on to what we think, or what you think rather, we need to do about this, and the ethical or legal parameters that we need to put around some of it. One of the things that really struck me when I was reading the book is just how personal, obviously, this data is. You say at one point in the book: "It's the very personal nature of this data, of the profile it can provide of us, because of algorithms that can be used to analyse our brain activity and extract features that are both unique to each person and stable over time." That's obviously what you've just been explaining to us. "But how your brain responds to a song or an image, for example, is highly dependent upon your prior experiences; the unique brain patterns that you generate could be used to authenticate your identity." Now, as somebody like me who's interested in digital identity, and the digitization of the self, if you like, as we go forward into the 21st century, that really struck a chord in me. And I don't know whether you said it, or I was thinking you probably said it: is brain data the ultimate biometric? Tell me, what on earth can we do about that?

Nita Farahany  21:51

Yeah. So, I mean, I write about that because it may be. Certainly people are investing in it. And, you know, there are all kinds of problems with other biometrics for secure authentication of individuals, and passwords are clearly not the answer, and hacking of information is happening all the time. And so it makes sense that governments and companies are investing in trying to figure out whether the functional biometric of the brain would be a good one to use. You can probably tell from where I write about it in the book that I think giving government access to functional biometrics in the brain is a disaster waiting to happen. Because what you have to look at is the functioning of the brain, like somebody literally thinking about singing a song or doing a math problem or something like that, which means you're literally having to give the government access to your thinking in order to unlock your computer, or go through security at the airport, or things like that. And so it may be incredibly valuable as a biometric authenticator, and just not worth the risk, too.
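The 'functional biometric' idea Nita describes can be sketched in a few lines: enrol a feature vector extracted from someone's brain response to a secret mental task, then authenticate by comparing a fresh recording against it. Everything below, the features, the threshold, the cosine-similarity test, is a hypothetical toy rather than a real system; it simply makes visible the problem she raises, that verification requires capturing live thinking every time.

```python
# Toy sketch of a functional brain biometric (hypothetical throughout).
import numpy as np

SIMILARITY_THRESHOLD = 0.95  # assumed acceptance threshold

def enroll(samples: list[np.ndarray]) -> np.ndarray:
    """Average several feature vectors (e.g. band powers per channel)
    recorded while the user performs their secret mental task."""
    return np.mean(samples, axis=0)

def authenticate(template: np.ndarray, fresh: np.ndarray) -> bool:
    """Accept if the fresh sample's cosine similarity to the enrolled
    template is high enough. Note what this implies: every login
    requires capturing a live recording of the user's thinking."""
    cos = np.dot(template, fresh) / (np.linalg.norm(template) * np.linalg.norm(fresh))
    return cos >= SIMILARITY_THRESHOLD
```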

Tracey Follows  23:01

I mean, I can see it happening in the military, because you can make the argument of security and safety. It's normally the way these things come in, isn't it? And then suddenly it's like, well, everybody needs to be secure, don't they, and then it opens up...

Nita Farahany  23:15

And the national security interests justify it, because it's the most secure possible biometric and everything else has failed. Yeah, so, I mean, I put it in there really to help people see how the normalisation of this technology could occur, and the justifications that will be made in order to make it so that you have to make your brain accessible to other people. And I just find that to be wrong and bad and disastrous. We're heading into a place where people have no freedom of thought, where the justification of giving access to your brain data for national security reasons, for secure authentication, suddenly means we drop our final bastion of freedom: the line between the government and our own individual brains and mental experiences.

Tracey Follows  24:09

But you could see it happening in banking, for example. So you can't access your bank account. I mean, if we see what's been happening over the last couple of years...

Nita Farahany  24:17

The temptation will be strong.

Tracey Follows  24:19

Yeah, if we think about what's been happening over on social media, and opining on any sort of wrong thing, and then linking that to, you know, well, you can't make charity payments from this PayPal account, or we're going to take you off the platform and you can't find another platform to bank on. I mean, one can see how these things converge. And actually, identity will play a role in that, potentially.

Tracey Follows  24:48

So that was Nita Farahany on the battle for your brain. Yes, that's your brain. And we'll be back with more from Nita later on in the episode. But first, let's take a moment to appreciate what's going on in the neurotech market. There are now hundreds of thousands of mobile health apps available worldwide that consumers are using and businesses are encouraging their employees to use. No surprise, then, that the number has doubled in the last five years to something like 300,000, with an estimated market value of over $100 billion. And this trend is here to stay. Globally, the market for neurotechnology is growing at a compound annual rate of 12%, and it's expected to reach a value of $21 billion by 2026. Work, wellness and education are some of the key areas of focus. But what can this technology do now? And what are companies hoping it will be able to do in the future? I spoke with Dr. Ramses Alcaide, the CEO of Neurable, whose company makes everyday brain-computer interfaces, to find out more.

Music  25:54

[Neurable promotional video] "What if the future we want the technology that will remake our world, is already inside of each of us? What if the future is our brain, unlocked?"

Tracey Follows

So welcome, Ramses, to The Future of You. Thank you so much for joining me on this episode.

Ramses Alcaide  26:18

It's my pleasure. I appreciate the invitation and I'm excited to be talking about brain computer interfaces and the future of identity.

Tracey Follows  26:25

Oh well, we both are then. So you are the co-founder and CEO of Neurable. I wonder if you can just explain to our audience exactly what Neurable is about and maybe a bit of your origin story as well.

Ramses Alcaide  26:39

Yeah, definitely. So in a nutshell, what Neurable does is, we're a brain-computer interface company. That's just a fancy way of saying that we record brain activity, in our case non-invasively, which means without surgery, and then we can use it to control different environments. The technology was developed at the University of Michigan; it's a technology that increases the sensitivity with which we can record brain data.

Regarding the origin story, it really started when I was about eight years old. My uncle got into a trucking accident, he lost both his legs, and it was a really traumatic time for him, and for myself. And, you know, just put yourself into the shoes of either one of us, imagine yourself there, how much your life changes in that type of situation. And so, you know, I decided from that point on, I'm going to dedicate my life to technology that can help individuals who have any type of impairment in communicating or connecting with the world. And so I went to the University of Washington, studied electrical engineering, worked with the same prosthetics teams that my uncle did. And then I went to grad school and got my PhD in neuroscience. And what I realised there is that the problem is even bigger than just prosthetics. You know, you have people who, for example, don't even know they have Alzheimer's or Parkinson's until 10 years into the disease state. That's awful, right? You have people with ALS who can't even communicate with their eyes. And the real issue here is just being able to unlock the potential of the brain. And to do that, you need to be able to do it at scale. And so the company spun out from the University of Michigan focused on: how do we make neurotechnology a part of everyday devices? You know, how do we integrate this and make it actually functional in a way that people enjoy and can actually use? And that's a really hard challenge, because neurotech, especially non-surgically, is prone to a lot of noise. And so we've been able to solve that with our technology.

My co-founder, Adam Molnar, has actually written numerous chapters on neuroethics, and he serves as an adviser to the Neuro Ethics Institute. So it's a really core part of our company. But I think what matters more to us is that a lot of these concepts already exist, you know, and we've already seen how, if we don't take them into account, they fail. The best example of this is with the internet. You know, we failed to really create privacy and protection for individuals, and it's turned into a nightmare that we have to deal with now. But core rules and laws regarding AI and the protection of privacy, these concepts have already existed. And because neurotech is so new, I think it's part of our responsibility to make sure that we follow in good footsteps and don't repeat the same issues as what happened with the internet.

Tracey Follows  29:31

So what kind of data do you collect then?

Ramses Alcaide  29:35

Yeah, so we collect electroencephalography, which is called EEG. It's basically just data that you collect non-invasively, without surgery essentially, from the top of the scalp. Right now we've partnered with a few different manufacturers of headphones and headsets of different kinds. They collect the data from around your ear using soft sensors, or in the ear using essentially plastic sensors, and then they use our technology to actually make it usable at scale.

Tracey Follows  30:14

Do you see this being integrated into other platforms then? Or is it a standalone app? How do you see it sort of entering into the ever-increasing ecosystem of medical and healthcare apps and devices like this?

Ramses Alcaide  30:30

Yeah, I mean, I see it as like the next iPhone, that's kind of how I see it. You know, the iPhone came out and it enabled you to have the internet in your pocket anywhere you went, and that was insanely powerful, right? And it built industries; for example, Uber wouldn't exist if we didn't have smartphones, right? Having GPS and internet on ourselves, you know, anywhere we go, just made that a feasible thing. And so we're gonna see wearable devices like headphones and earbuds replace a lot of the things that your phones do. And the way you're going to control them, and the way you're going to do your own health monitoring, just like you do with an Apple Watch or with an iPhone that tracks your steps, it's going to be with your hearable device. And so we really see that as kind of the next evolution of the wearable computing devices that you wear. And then long term, when augmented reality devices come on board, what's going to be the mouse and keyboard for them? Well, it's going to be the brain. And, you know, our technology is positioned to fit inside those narratives.

Tracey Follows  31:31

That made me think, actually, about authentication. You talked at the beginning about there not being an identity layer on the Internet, obviously a big problem. Authentication on something that is collecting such valuable data, and is a highly personalised product and experience like this, presumably is going to become really, really important. Whether it's authentication or verification that you are the you that the company thinks is using the product. Because all kinds of things are being hacked at the minute, right? From bank accounts to social media accounts to whatever. What are your thoughts in terms of digital authentication or verification for Neurable?

Ramses Alcaide  32:09

Yeah, so, you know, I'm going to answer from a BCI space first. There's a big difference between invasive and non-invasive. So when you're talking about invasive brain-computer interfaces that require surgery, you can get really personal data and information from that. When you're talking about non-invasive data, it's really kind of at the same level as picking up data from an accelerometer or from a temperature sensor; it's very high-level information, it's not going to be, like, your social security number or anything like that. But regardless, the brain is sacred. So even just from the perspective that human beings connect to that data in that way, I think we should respect it that way. That's our personal belief here at Neurable. And so from that perspective, our main goal is, you know, once again, communication of what type of data we're going to be getting, the ability to have a choice as to whether you want that data removed, and then also making sure that it's anonymized even when we're processing it ourselves, right, because there are still ways of gathering value even though we don't know exactly who it belongs to. And when it comes to authentication, I think that there are better and better ways of doing biometrics now, especially once we start doing AR devices; we can look at, you know, iris information, etc. In the future, there might even be brain data that we can use for authentication, although that still hasn't been proven out as strongly as one would like, and I think it would be an awful user experience right now. So I think we need more of that, and I think it's happening organically. The main thing is, do the companies building these products respect that? And then the people buying the products, are they willing to vote on what matters to them? Because at the end of the day, unfortunately, your dollars vote for everything.

Tracey Follows  33:54

Can you tell me a bit about some of the specific products? Obviously, I've looked at the focus algorithm and the smarter headphones for smarter focus, and then the piece with the Mayo Clinic on taking a break. I wonder if you could just talk about some of the very specific products, or the way it's been packaged as products?

Ramses Alcaide  34:14

Yeah, definitely. So really, ever since Neurable started at the University of Michigan, our goal has been: how do we take neurotechnology that would typically require these large cap systems and make it work at a consumer-ready level in everyday devices, which means headphones, earbuds, glasses, helmets, etc.? That's a really tough challenge, because the smaller you get, the fewer sensors you have. And also, when you move away from the regions of the brain that are creating that data, you start to lose signal, and so the experience becomes worse. So fast-forward seven years of development: you know, we finally got our technology to work inside everyday devices, headphones, earbuds, glasses, helmets, etc. And we started to go out and identify what is a real pain point for people. And what we found out is that there's this huge issue happening, especially because of COVID, of burnout and mental fatigue. And, you know, for example, we met with a call centre; they had a 30% retention rate. Like, that's awful.

You know, we met with college students, and a lot of them are burnt out from classes. You know, air traffic controllers, for example, they get tired; 30% of accidents in the Air Force are caused by fatigue. Like, that's crazy. And so we found this really big need point, which is just something as simple as understanding when a person's focus starts to go downward, and then telling them to take a break. Because by the time we take a break, we already feel tired; we should have taken one significantly earlier. Otherwise, that leads to burnout. And it's already known, but most people think, oh, I'll take a break when I feel tired. That's too late.

And so we went down this direction, and we did a study with the University of Graz. And we were able to demonstrate that we could not only identify fatigue, we could actually identify when a person was going to fail at a task 30 minutes before that moment occurred. And so now you're talking about, you know, pilots, for example: hey, your fatigue is trending to a point that, at the task you're doing, in 30 minutes, if you don't stop and take a break, you're going to get into an accident. And so that saves lives. And even in an office setting, it's just the fact that people aren't getting burned out, you know.

And so we did a study with the Mayo Clinic in office settings, where we had people come in and use our system, taking random breaks, no breaks, and then also our breaks based off of brain data. And we were able to show that, at the end of the day, people actually reported a 20% decrease in stress, which is pretty significant, and a 70% increase in end-of-day happiness, which is really big. And the only thing we did behaviourally is we told them to take a break in the right timeframes. So they were more energised, they made fewer errors in their work, and then at the end of the day, they actually felt accomplished. And they felt energised about the work that they did, instead of feeling burnt out, getting home, you know, pouring themselves cereal with milk and being depressed. You know, it's like, they were like, "Wow, I really crushed it today and I feel energised." How do you give people that feeling more and more?
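As a rough illustration of the break-timing idea Ramses describes, monitoring when focus starts to trend downward and prompting a break before the person feels tired, a sketch might look like the following. The window size, slope threshold and per-minute scoring are invented for illustration and are not Neurable's actual model.

```python
# Illustrative sketch: recommend a break on sustained decline in a
# focus score, rather than waiting for the person to feel tired.
from collections import deque

class BreakAdvisor:
    def __init__(self, window: int = 30, slope_threshold: float = -0.01):
        self.scores = deque(maxlen=window)  # recent per-minute focus scores
        self.slope_threshold = slope_threshold

    def update(self, focus_score: float) -> bool:
        """Add the latest focus score (e.g. derived from EEG band powers)
        and return True if a break should be recommended now."""
        self.scores.append(focus_score)
        if len(self.scores) < self.scores.maxlen:
            return False  # not enough history yet to judge a trend
        # Least-squares slope over the window: a sustained decline, not a
        # momentary dip, is what triggers the recommendation.
        n = len(self.scores)
        mean_x, mean_y = (n - 1) / 2, sum(self.scores) / n
        slope = sum((x - mean_x) * (y - mean_y)
                    for x, y in enumerate(self.scores))
        slope /= sum((x - mean_x) ** 2 for x in range(n))
        return slope < self.slope_threshold
```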

Tracey Follows  37:35

Do you think we'll look back at the 20th century and think it was pretty medieval in terms of our understanding of the brain?

Ramses Alcaide  37:43

Yeah, absolutely. Like, yeah, it's kind of crazy, you know, and I think we're gonna feel that way about cancer too. Like, it's crazy to think that we use radiation to kill cancer. You know, it's like, these cells consume more energy, so they will die sooner, but we're just gonna kill everything inside you, right? And that's essentially a lot of what we do with the brain too, right? Whenever we take a pill, we're just like, well, we think it works here, but it regulates and changes so much. It really is medieval, what we're doing. It just feels modern, but it's not, right? And as we get to understand more, and are able to develop more specialised therapies... there's a revolution coming, and a lot of that has to do with how we give people access to this data more continuously.

Tracey Follows  38:33

I think it's partly the invasive nature of it that people don't necessarily appreciate. This is non-invasive; I think that is a big distinction to make, isn't it?

Ramses Alcaide  38:43

For sure, and that's a big distinction as well. But what I would just say is, we're so far away from the scary stuff, you know. It's like, I just went to this military conference where they were asking us about BCIs, brain-computer interfaces, and their applications in all these areas. And I was with some of the top CEOs in the field. And we all came up there and we basically said: "Look, we're trying to help somebody with ALS communicate." Like, having somebody control a fighter jet is just not going to happen anytime soon.

Tracey Follows  39:22

Super soldiers. Is this like N3 at DARPA?

Ramses Alcaide  39:25

Get that out of your head. Like, that's just not going to happen anytime soon.

Tracey Follows  39:28

But that's happening, right? Is it?

Ramses Alcaide  39:30

No, it's not. We're so far away from that, we're so far away from that, right? Like, having somebody type at a pace that's anywhere close to human-level speech, we're just trying to get there first. And that's on the invasive side, right? On the non-invasive side, you have to worry about it even less. I'm not saying let's not be worried, let's not put these ethics in place; I think it's something that we should do, that we should all be responsible for. But don't let the fear-mongering outweigh the value that can come out of brain-computer interfaces and the real good that they can do for billions of people.

Music  40:08

[Neurable promotional video] "At Neurable we're not waiting for the future. We're building it into the devices we use every day. Neurable, the mind unlocked."

Tracey Follows  40:22

I also spoke with AJ Keller, who, together with co-founder Alex Castillo, runs Neurosity. They've developed an EEG device called the Crown, which sits on your head and uses signal processing to translate your brainwaves.

AJ Keller  40:37

I'm the CEO and co-founder of Neurosity. We are a company that is striving to be the most trusted and loved neurotechnology company in the world. The products that we're currently making are essentially brain imaging for consumers to do in real time at their home. This company was started because I believe that in the future we'll all have access to incredible abilities to image our brains and track our brains, all from the comfort of our own home, whether that's our office chair, or a bed, or a couch.

We'll be able to comfortably learn more about our brains on a very regular basis. I think computers do an incredible job of showing up for people every single day. And giving people the ability to rely on computers is really what's exciting about this; and giving computers the ability to understand humans, giving them this new data set, is just going to create this entire new world. And I think that potential is just so worthwhile and exciting to work on. So that's really where Neurosity came from.

Tracey Follows  41:42

Do you think people are ready for this? I mean, the general public.

AJ Keller  41:47

I love being able to bring a technology that's been done in clinics home to people. You know, it's time to come home, it's time to bring this technology home. Yeah, I think people are ready; we've been travelling all over the world to go to these clinics to get our brains imaged. We only get our brains imaged when we slam our heads into something, like a car crash or a football injury. You know, so I think, yeah, it's time to bring this home, it's time to get this technology in our hands. Just like it's useful to have an iPhone in our hands, it's going to be useful to have a brain computer, a Crown, on our heads.

Tracey Follows  42:26

Yeah, exactly. The Crown. So you've got some beautiful footage on your website; the hardware really looks amazing. But I'm sure some people will think it's a little odd. Tell us how it works, how you wear it and what it does.

AJ Keller  42:42

We wanted to create something that was able to image the entire brain, that was able to scan the entire head. And we knew that if we could create something like that, it would be tremendously useful for ourselves, to write software applications, or for neuroscientists looking to do research; it would give them a lot of help, being able to image the whole brain. Not just imaging it from the forehead, but actually being able to see what's going on at the back of the head, the side of the head and the front of the head, and how all that energy sort of swirls around. We really wanted to be able to bring that to the programmers, give that power to the programmers.

And so the Crown's design had to fit on the back of the head, and we didn't want to put anything on the face of the person. We wanted to create something that looked beautiful, that stood out and that gave the wearer confidence. It's definitely something unique. And it's been really helpful for us to be able to make our own design, because we're not trying to fit it into anything, you know, and we have the ability to carve our own desires and wants into the design of it, so that we can fit many head shapes and hairstyles. So the Crown can go through long hair, short hair; it can be comfortable for people with no hair as well. So really, a lot of the big focus was: how can we image the brain on everyone's head? There has been one customer, one target market in particular, that, if the device breaks, sends us emails while waiting for the new replacement device to get to them.

And it's like, "Where's my tracking number? Like, I haven't gotten my tracking number yet today, and you said you were gonna send it yesterday, like, what's going on here?" These are programmers. These are people who, most of the time, have struggled a lot of their life to voluntarily be able to do their best work. But what the Crown, our solution, provides is this voluntary way to get better at doing your best work and to sort of slip into a focus state, sort of like accidentally getting into a flow state just by showing up and by doing this priming before you work, this mobile app experience we have. That's really the biggest use case we've seen. The really great part about programmers, and why I think it's an amazing thing, is that the more programmers we get to use this, there's going to be this snowball effect, where programmers have to program the things that are around them.

Like, there aren't that many hardware platforms, right? But what programmers can do is write software for the hardware platforms. So they are compelled to make software for the things around them. So my hope is that we have programmers be the first target market, because they can afford it, they're sitting at home with good Wi-Fi, you know, they're stationary, they have space on their desk to put it and keep it charging. They're early adopters, they're receptive to technical difficulties, they're receptive to lacking technical documentation, and they can sort of fill in the gaps. All of those things aside, why it's a great target market and beachhead market is that they really need it, right? They really desperately need it, because they can't work if they're not able to get into that flow.

And then there's this whole thing where we have the Neurosity platform, which has an open, free SDK and API for people to build apps on. So a lot of the organic growth that we've seen from this target market often comes from the applications that they're creating themselves, the way that they're integrating it into their lives. That's something they feel can go viral within their own communities.

And so there are all of these components that make it just the best target market. And I'm a programmer, and my co-founder is a programmer, so it really resonates with us; we know the problems. I'm a programmer with ADHD, so I doubly know the problems. I can see my co-founder, who doesn't have ADHD, sit down and start working, and it's the most amazing thing I witness. And then me, I sit there, and I'm like a little Pong game going back and forth. You know, so I think, you know, just the differences in that and the problems that we have, you know, we're really ripe for disruption in this space. And, you know, programmers right now, we're only turning to, like, Adderall and Red Bulls, just trying to do anything to get focused. So, you know, this is a great target market for us, and they have been really receptive. And definitely thank you to all the programmers, if any are listening, that have supported us along the way as well.

Tracey Follows  47:22

What happens to the data, AJ? I wanted to ask you that, because people do have concerns around privacy and autonomy. Who's collecting what data? Where does it reside? How is it treated? How can you access it yourself? Tell me a bit about the data.

AJ Keller  47:39

When Alex and I started this company, we said we're not going to put privacy on the back burner; we're not going to make the device and then figure out, okay, how do we make this private? We said, from day one, how do we make this something that is using the same standard of encryption and technology that Gmail uses to secure your email? You know, why do you have confidence that your teammate isn't going to see your emails? Right? Like, why do we have that confidence? It's because years of trust and effort have been put into encryption and authentication and authorization. So the Crown is actually the only neurotech device in the world where you have to authenticate and authorise applications to get data from it.

AJ Keller  48:24

So let's say a third-party developer wants to make an application for detecting epileptic seizures and start building a dataset for epileptic seizures, where the users who have epilepsy, if they feel a fit is coming on, or they have a fit, are able to tag it, and we're able to save that data. In that user's experience, the first time they sign into the application, they're hitting 'connect your Neurosity account to this', and they're seeing: 'Do you authorise Joe's epilepsy app to collect raw brain activity and accelerometer data, and to store this data?' And you say yes, yes, yes, and you authorise it. And the Crown is the only device that has that, and that's because we have an entire computer that's running on the actual device.

When it comes to encryption and authentication, you have to have cryptographic capabilities on the device, and a lot of these IoT devices that you see out there don't have cryptographic capabilities on their devices. But the Crown absolutely does. It has a secure enclave that's able to store sensitive information. It's running Linux, the same stuff that Android is running on; it's running the most up-to-date version of that, and we do over-the-air updates to apply patches. And it's incredibly private: once you have claimed your Crown, nobody else has access to that data; only you have access to that data. We just started doing a research programme where you can opt in to submit your data; you don't opt out. We have this thing that I invented called an opt-in culture, where all of our devices and all of our software applications assume that the user has opted out, and we request if the user would like to opt in. And so it's using this sort of methodology, where everything is locked down from day one, and then anything on top of that, you're asking the user for explicit permission that's revocable at any time, and really just using the same stuff that we use in the rest of our everyday lives to keep it secure.
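The 'opt-in culture' AJ describes, denied by default, explicit scoped grants, revocable at any time, can be modelled in a few lines. The scope names and API below are hypothetical and are not Neurosity's actual implementation; they just show the default-deny, explicit-consent pattern.

```python
# Hypothetical sketch of an opt-in, scoped, revocable consent ledger.
from dataclasses import dataclass, field

SCOPES = {"raw_brainwaves", "accelerometer", "store_data"}  # illustrative

@dataclass
class ConsentLedger:
    # Default state: nothing granted. Absence of a record means "opted out".
    grants: dict = field(default_factory=dict)  # app_id -> set of scopes

    def request(self, app_id: str, requested: set, user_approves: bool) -> None:
        """Record a grant only after the user explicitly approves the exact
        scopes shown to them (e.g. 'Do you authorise Joe's epilepsy app to
        collect raw brain activity?')."""
        if user_approves:
            self.grants.setdefault(app_id, set()).update(requested & SCOPES)

    def revoke(self, app_id: str) -> None:
        """Revocable at any time: wipe every scope for this app."""
        self.grants.pop(app_id, None)

    def allowed(self, app_id: str, scope: str) -> bool:
        """Default-deny check consulted before any data leaves the device."""
        return scope in self.grants.get(app_id, set())
```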

Tracey Follows  50:43

So that's the tech: personalised wearable devices that can effectively monitor and analyse your thoughts, your emotions, maybe your focus or alertness, and help provide you with insights about your own brain. As Neurable say, your brain knows best. But what about sovereignty over our own brains and brain data? That's something I touch on a bit in my book, The Future of You. But Nita Farahany has given it a name: cognitive liberty. Yes, this type of neuroscience gives us intimate access to our own brain data. But what happens if governments or companies or our employers, or any kind of rogue actor, gain access to our brain data too? Let's return to Nita Farahany to explore this more.

Tracey Follows  51:30

I can understand that people want to put data in data trusts, and there are biobanks and all of these things going on, and it can be incredibly useful for a community or a locality or a group of people. But actually, data solidarity shouldn't be the default, should it? Cognitive liberty should.

Nita Farahany  51:50

Cognitive liberty should absolutely be the default. And I think if we could get to the place where we had a data repository that I knew wasn't going to be mined and misused, and could be used to address some of the leading causes of neurological disease and suffering, I would happily contribute to it. If I had the choice, and if the default rule was to have the data be my data, and if I had a right to cognitive liberty, and if I knew that I was secure against the misuse of that data. So, you know, if we could design a world in which people could choose to confidently share their brain data for limited circumstances, for limited purposes, for, you know, common-good applications that they believed in, I'd be all for that, right? But, you know, all of those things I said have to be in place to make that possible, and that's not the world we live in right now. It's a world that we could make possible by making the right choices now.

But I think one of the things that I'm trying to make clear with The Battle for Your Brain is just how urgent this dialogue is. How little time we have to put into place default rules that protect individuals, because the battle has arrived, right? It is here. It is already in workplaces, it's already been used by the government, it's already being invested in, people are already unwittingly giving up their brain data. The default, again, is in favour of government and corporate use and commodification of our brain data. But it isn't yet widespread. It's about to be. It's about to be in our headphones and in our earbuds and in our everyday devices. And before that moment occurs, if we put into place rights in favour of the individual, a right to cognitive liberty, we could flip the terms of service and put them in the hands of individuals, rather than favouring corporate and government misuse and abuse of our brains.

Tracey Follows  53:51

Yes, because how much of this do you see in marketing and manipulation? I know you cover manipulation in the book; I think it's one of the most interesting areas. Because we're becoming more and more savvy to media manipulation now, perhaps, in a way we just didn't decades ago. That's perhaps awakening us to some of these issues. But of course, lots of people are being nudged, as we would say in the UK, by behavioural economics departments and teams and behavioural science, and it feels like it's not too much of a leap to move from behavioural economics and nudging to these sorts of attempts to alter your cognitive biases or emotions, or start playing with the subconscious. Can you just tell us a little bit about what you foresee in that area?

Nita Farahany  54:40

Yeah, so I write about what I call mental manipulation. And I try to give a spectrum of that, which is everything from neuromarketing techniques that are just trying to better understand our preferences and desires, and that's not that different from other forms of marketing, as icky as it may seem, to intentionally trying to bypass our freedom of action. Meaning, trying to hijack our neural processes, to have us addicted to platforms, to have us unable to tear ourselves away, to really try to overcome our natural preferences and desires to do otherwise. And I think techniques that are designed to tap into that system in ways that are harmful to us, and kind of meant to take control, in many ways, of how we act and react and behave.

I think those are the ones that we should be the most suspicious of. Those are the ones that are more likely to fall on the wrong side of the line, as impermissible manipulation that interferes with our freedom of thought. And it's the hardest category - that was really the chapter I struggled with the most philosophically: figuring out where other people have come down on this, how the brain actually works, what techniques people are trying to use, and what the concerns with those are. What is it that feels so wrong about it, and can we articulate that in laws and regulations and normative lines that would help us regulate it, to prevent those kinds of encroachments on cognitive liberty? And as begrudgingly as I come to this conclusion, there are some practices that I don't like but that I don't think freedom of thought is the right way to regulate. I think it's too heavy a hammer; trying to ban them would interfere with too much ordinary human interaction.

Tracey Follows  55:10

Because there's a fine line between persuasion and manipulation. Persuasion's fine, isn't it? Kind of? But manipulation is not.

Nita Farahany  56:52

I mean, persuasion is what we're trying to do every day - I'm trying to persuade people to, you know, care about the right to cognitive liberty and to be part of the conversation. And one of the earliest skills that we develop, which I talk about in the book, is a theory of mind of others. When theory of mind comes online, we start using those theories of mind to persuade other people. So we don't want to interfere with that. And we certainly don't want to invite the government into regulating the interactions between ordinary humans, to say, 'Nope, that's on the wrong side of the line, and this is on the right side of the line.' We have to be careful about where we draw that line, and I offer an interpretation of how to do so in the book.

Tracey Follows  57:28

As we draw to a close, I've got a bit of a mad futurist question. You're a futurist as well, so I'm sure you've thought about this. Say we've uploaded our brains - we've done some mind uploading, or downloading, however you want to think of it - and, I know we have to suspend our disbelief a little bit here, say that we can put our brain in some sort of digital substrate. Will that digital brain or mind have the same rights, the same cognitive liberty, as our human meatspace version?

Nita Farahany  57:57

That's a good question, right? I mean, it's basically the AI question too, right? Which is: is the digital representation of self something that is self? Is there something uniquely human about our cognition and our processes? I hope there is something uniquely human about what it means to be human, something that is different from machines. And I also think that the idea that we could fully upload our brains ignores that a lot of our cognition and our affective experiences extend beyond just our brains, right? And so we would have to upload so much more than literally our brains to make...

Tracey Follows  58:40

We'd just have to clone ourselves really, wouldn't we?

Nita Farahany  58:42

Essentially, yeah. And, you know, it's not just your brain and your nervous system, and how you have this extended personality throughout your body and what you interact with - it's your interaction with the environment that gives rise to a lot of the different experiences and mental processes that you have. And so, in that world where all of that could be captured - and we're not just talking about brain uploading, we're talking about the full human experience being uploaded - then yeah, I think that we would have the same rights, and we would have the right to cognitive liberty in that form as well. But you can see that, as a futurist, I have some disbelief about being able to reduce the entirety of the human experience into digital bits and bytes that can then be uploaded.

Tracey Follows  59:25

Yeah, it is different, isn't it - the analogue brain and the digital substrate? So, Nita, what's your final message for people in terms of getting more involved in this subject matter? Understanding more, you know, initiating more conversations around it, so that we can push this agenda of cognitive liberty?

Nita Farahany  59:44

I mean, I hope to make many, many more cognitive liberty advocates out there in the world, to be part of the conversation, and I think the very first step is to get educated. My purpose in writing this book is ideally to spark a global conversation and to make it accessible to everybody - for people to understand what's happening, and to be familiar with the issues that are at stake. Like, what rights are at stake? What does it mean to connect your brain to corporations and governments? What is it that you really are putting at stake in the debate? So first is to get educated. And a good way to do that is, of course, to read my book, The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology - so I hope people will. But I will say, I am far from the only person in this debate. There are so many thoughtful, you know, scholars, advocates, big thinkers.

Tracey Follows  1:00:38

Where else would you recommend people look? Are there journals, good websites?

Nita Farahany  1:00:44

So, I mean, there was a wonderful Nature Electronics piece that just came out that declared the BCI, the brain-computer interface, the technology of the year. The OECD has recommendations for the regulation of neurotechnology; the UN has, you know, dug into this issue. There are thoughtful scholars across the globe who are coming to the table and offering different perspectives on these issues. And I think that diversity of thought is so important to push this issue forward, and it has helped push my thinking forward as well.

But I'd say it's so important for people to get educated, become part of the debate, and then realise that they have choices to make, right? It isn't just at a societal level - like, oh, let's hope somebody passes cognitive liberty. People have to start exercising their own right to cognitive liberty, and they have to treat it differently. They cannot unwittingly give up their brain data just as easily as they've given up all of the rest of their data. They need to make deliberate choices as they buy earbuds and headphones and watches that have a neural interface, and not just think, oh, this is the next cool technology, but figure out the terms of service under which that technology is being offered to them. If their employer is mandating it, demand that it be put into writing exactly what the employer plans to do with that data and what they're collecting, and demand a transparent policy about that. And push back, right? If the employer says, 'I'm going to track your brain activity,' join with others to push back against that and say, 'No, you know what, this actually violates the fundamental dignity of what it means to be human.' So I want people to be much more active in this debate and in this conversation. You start by becoming aware, and you then transition into using that awareness as a tool of empowerment.

Tracey Follows  1:02:36

So really, cognitive liberty is as much a responsibility as it is a right, isn't it?

Nita Farahany  1:02:40

Absolutely - it is as much a responsibility as it is a right.

Tracey Follows  1:02:51

Thank you for listening to The Future of You, hosted by me, Tracey Follows. Check out the show notes for more info about the topics covered in this episode. Do like and subscribe wherever you listen to podcasts. And if you know someone you think will enjoy this episode, please do share it with them. Visit thefutureofyou.co.uk for more on the future of identity in a digital world, and futuremade.consulting for the future of everything else. The Future of You podcast is produced by Big Tent Media.
