In this episode of Technology and Security, Dr Miah Hammond-Errey speaks with Professor Nita Farahany about the increasing emergence of neurotechnology and what it means for national security as well as consumers, policymakers, military forces and nation states. They discuss the importance of ensuring the privacy of brain data as the ‘final piece in the jigsaw puzzle’ of data collection by large technology companies. They also discuss the possibility of identifying, verifying and targeting individuals by their neural signature and why addressing this technology should be a national security priority. They explore the current and potential roles neurotechnology can play in combating information warfare and improving cognitive resilience, as well as the increasing role of AI. Finally, they highlight what to look out for in tech in 2024.

Nita Farahany is a Professor of Law & Philosophy at Duke University and a leading scholar in the ethical, legal and social implications of emerging technologies. She has consulted extensively, including advising DARPA, and has testified before Congress. Nita was on the US Presidential Commission for the Study of Bioethical Issues for many years. Her latest book, The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology, examines the ethical and legal challenges of emerging neurotechnology.

Professor Nita Farahany and Dr Miah Hammond-Errey

Technology and Security is hosted by Dr Miah Hammond-Errey, the inaugural director of the Emerging Technology program at the United States Studies Centre, based at the University of Sydney.

Making great content requires fabulous teams. Thanks to the great talents of the following.

  • Research support and editorial assistance: Tom Barrett
  • Production: Elliott Brennan
  • Podcast design: Susan Beale
  • Music: Dr. Paul Mac

This podcast was recorded on the lands of the Gadigal people, and we pay our respects to their Elders past, present and emerging — here and wherever you’re listening. We acknowledge their continuing connection to land, sea and community, and extend that respect to all Aboriginal and Torres Strait Islander people.

Episode transcript

Please check against delivery

Dr. Miah Hammond-Errey: [00:00:02] Welcome to Technology and Security. TS is a podcast exploring the intersections of emerging technologies and national security. I'm your host, Dr Miah Hammond-Errey. I'm the inaugural director of the Emerging Technology Program at the United States Studies Centre, and we're based at the University of Sydney.

Dr. Miah Hammond-Errey: [00:00:19] Welcome back to Technology and Security. This is the first episode of 2024 and we hope you had a great break. My guest today is Nita Farahany. Nita Farahany is a professor of law and philosophy at Duke University and is a leading scholar in the ethical, legal and social implications of emerging technologies. She has consulted extensively, including advising DARPA and testifying before Congress. Nita was on the US Presidential Commission for the Study of Bioethical Issues for many years, and her latest book, ‘The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology', examines the ethical and legal challenges of emerging technology. We're so happy to have you join the podcast, Nita.

Prof. Nita Farahany: [00:01:01] I'm delighted to be with you.

Dr. Miah Hammond-Errey: [00:01:03] We're coming to you today from the lands of the Gadigal people. We pay our respects to their elders, past, present and emerging both here and wherever you're listening. We acknowledge their continuing connection to land, sea and community and extend that respect to all Aboriginal and Torres Strait Islander people.

Dr. Miah Hammond-Errey: [00:01:21] So, Nita, you start your book recounting a demonstration of technology that changed the way you saw brains as the place of solace. It's been five years since that moment. And have we lost the final realm of privacy?

Prof. Nita Farahany: [00:01:34] Not yet. We are on our way to doing so. But the description you're talking about is really my aha moment: I was at a conference at the University of Pennsylvania's Wharton Business School, where a technologist stood up and talked about how we humans are such inefficient output devices, unable to get what's in our brains out into the rest of our world because we're so limited by our bodies. He was talking about neurotechnology not just as a way to meditate or to improve brain health but, much like a mouse or a keyboard, as how we would interact with other technology.

Prof. Nita Farahany: [00:02:21] And it wasn't just the difference in what we could do with the technology, it was the form that it took. So he was showcasing sensors that were embedded into a watch. That to me was different because, up until then, neurotechnology had had niche applications in form factors that were either bulky and medical grade or were things that we would never wear in our everyday lives, like a stiff forehead band that was, you know, awkward and uncomfortable. And so a watch that could be multifunctional, you could tell time, it was unobtrusive, but also changed the way you interact with other technology was like, wow.

Prof. Nita Farahany: [00:03:00] We're not there yet. That is, we haven't lost our final frontier of privacy, because while a few proof-of-concept multifunctional devices have hit the market just in the past year, the big tech players, Meta, Apple, Google, are all launching their products within the next few years; there hasn't yet been a major entrant by a tech company with an everyday product. When that happens is when I think the final frontier of privacy will have been crossed.

Dr. Miah Hammond-Errey: [00:03:31] How far away do you think we really are? I mean, if they've got plans in the next couple of years, when do you think we'll see one of the major players release something as multifunctional as that?

Prof. Nita Farahany: [00:03:42] Well, I'd say the first crude version of it is the Apple Vision Pro. It uses eye activity to make inferences about what we're thinking, feeling and intending in order to navigate through a screen. It doesn't use direct brain sensors, but it uses what your irises and your eye movements are doing to make inferences about what you're thinking or feeling, and it does so as the interface to the device. And so, even though that's not going to be a mass-market product because of its price point, it is the first major entrant by a major tech company into consumer-based neural interface technology, as the way we operate in something like immersive or spatial computing.

Prof. Nita Farahany: [00:04:27] And then pretty soon thereafter, I think probably in the first quarter of 2025, Meta will launch its watch. So that aha moment was from a company that I thought was going to be acquired by Apple. It ended up being acquired by Meta, and they have really gone all in, investing in that being one of the primary ways that we'll interact with Oculus and their other immersive environment devices.

Prof. Nita Farahany: [00:04:53] Since I wrote the book, generative AI has exploded, and it has also exploded the opportunities for neural interfaces, because it's now possible to launch a neural interface device with basic functionality, like going left, right, up, down, but have it powered by generative AI that learns you and co-evolves with you over time, such that it becomes increasingly more powerful and personalised. And so when Meta, at its 'Meta Connect' conference a couple of months ago, talked about the updates with respect to this device, that's one of the things they highlighted: this has very much become an AI story and a generative AI story, where the technology will co-evolve with the individual.

Dr. Miah Hammond-Errey: [00:05:38] You show in your recent book that neurotechnologies are already more pervasive than we realise. And before we kind of get into some of the future implications, can you explain a little bit about the sensors in neurotechnologies and their applications?

Prof. Nita Farahany: [00:05:52] Most of the devices are measuring electrical activity in the brain. So when a person thinks or does anything, neurones are firing in the brain, and those give off tiny electrical discharges. And when you have a dominant mental state, hundreds of thousands of neurones are firing at the same time. The patterns of neuronal firing that show up across what are called different brainwaves can be picked up by these sensors. It's kind of the average of brain activity across the brain, but those differences in activity can be picked up just by tiny dry sensors, in an earbud or a headphone or a hard hat, that make contact with the scalp.

Prof. Nita Farahany: [00:06:33] So the two big ones that I think will hit the market, and are already on the market in consumer-based devices, are picking up EEG, electroencephalography, the brainwave activity, the electrical activity in the brain, or fNIRS, functional near-infrared spectroscopy, which uses infrared light and how it is absorbed across brain tissue. There are a lot of consumer devices already on the market, but most of them today are not neural interfaces. That is, most of them people use for what's called neurofeedback, that is, to see their brain activity when they're meditating or stressed, or something like that. It can be used medically for things like training your attention if you have something like ADHD. So that same idea of neurofeedback, which is seeing how your brain reacts and then using, for example, a gamified approach to improving your attention over time.

Prof. Nita Farahany: [00:07:33] For the most part, the medical-grade technology has evolved a little bit differently than the consumer-grade technology. So, for example, in a clinic or in a hospital, it's not in the form of earbuds or headphones, which have just a couple of sensors in them. It may be, you know, 128 sensors that are applied with gel using a cap across a person's head to pick up far more precise information about brain activity. That would be an EEG cap. And it's used to analyse what's happening when a person is suffering from epileptic seizures, to better understand the pattern of activity that's happening, or even things like brain tumours or migraines or other neurological conditions that a person is suffering from, and being able to use that for diagnostic purposes, or trying to do sleep studies to understand sleep disturbances. There's also implanted neurotechnology. And this is, you know, a tiny number of people; it's primarily clinical studies that are underway worldwide, where a person who, for example, is quadriplegic or suffering from Parkinson's disease or intractable depression may have implanted neurotechnology that enables them to communicate from their brain if they've lost the ability to speak, or to use their brain activity to navigate across a computer screen and regain some forms of self-determination.

Dr. Miah Hammond-Errey: [00:08:59] Thank you for making quite a complex topic easier to understand. And you've raised there the difference between consumer and medical applications. Can you just also clarify where we're at from a medical and military neurotechnology application perspective?

Prof. Nita Farahany: [00:09:15] In the military, I'm not familiar with implanted neurotechnology being used in that context, but there is a lot of focus on the use of neurotechnology to augment soldiers' capabilities, or for training purposes. So, using it to improve the rate of learning or improve skills like target identification. There are even explorations to try to figure out if it's possible to engage in something like brain-to-brain communication, or to enhance and go beyond ordinary human capabilities to create so-called super soldiers, for example. And there are some more dystopian applications in the military context. For example, the growing claim that there is a sixth domain of warfare called cognitive warfare, the development of weapons that could potentially disable or disorient the human brain, and a belief that that may be one of the major areas of investment by militaries worldwide, to try to develop these kinds of tools or technologies that could make that possible.

Dr. Miah Hammond-Errey: [00:10:20] You've talked about the way that neurotechnology sensors are at risk or already being commodified, just as the rest of our personal data has been. As a society, we're really only starting to deal with what I've described elsewhere as a complete or near-complete data coverage of human lives. Can you describe that interplay, and how are the questions about neurotechnology different from who has access to our data and what can be done with that aggregate data?

Prof. Nita Farahany: [00:10:47] It really builds on that conversation. So, you know, most people understand at this point that services that are supposedly free aren't free, that we're paying for access to, you know, search engines or social media through the aggregation of our personal data. So, every time you hit a like button, that feeds into a system that creates a very precise psychological profile of who you are and what your preferences, likes and desires are. But those are all inferences about you, aggregated across many, many different data points. And they've become very powerful, because they've increasingly been designed to try to hook you or addict you based on what the devices know and learn about you.

Prof. Nita Farahany: [00:11:28] Brain data is kind of like the last piece of a jigsaw puzzle. So, all of that information is gathered from things you've done, right? You've hit the like button. You drove to that particular location, which was tracked via your [GPS] location. But there's still a little part of you that you hold back, right? It's your unsaid emotions or desires. It's how you're delighted by something, disgusted by something, biased about some person or some topic that's presented to you, but you figure out how to push that aside and not express it in what you do. You choose to share that bit of you only with a loved one, with a friend. That's the piece that brain data suddenly makes accessible to other people.

Prof. Nita Farahany: [00:12:23] And it's the same exact people who've been commodifying all of the rest of that information, right? So, if Meta has been aggregating all of your personal data from likes and swipes and clicks and posts, the Holy Grail, the last thing, is: how do you actually feel? What are your inward preferences and desires and unexpressed emotions? And that's what brain sensors suddenly make accessible. And if it's Meta who sells you the watch and it's used to interface with their social media platform, suddenly it's a complete picture of who you are that is accessible and can be instrumented, can be used to advertise to you, but also can create this kind of closed loop where it's like: this is how you feel? Here's what is going to show up in your feed. Oh, and that's how you reacted to it? We're going to subtly change your feed algorithmically, and it's going to become this little closed loop of being able to both understand and commodify, and then shape and change, how you feel and act and react to information in your environment.

Prof. Nita Farahany: [00:13:35] But the question is, can that happen in a way that's aligned with human flourishing, or aligned with what is good for humanity, or good for us, or good for our brains, and can be more self-directed? So, for example, imagine a world in which brain sensors are ubiquitous. They're the way you interact with other technology, but the data that's collected lives on device, it's processed near your device, and the only thing that the company ever sees is the necessary inference to do the thing you need to do, like move left or right; none of the rest of the data about how you react or how you feel is accessible to the company. That's a design choice. That's a design choice that we could demand. It could be that edge processing and edge storage of data, of brain data in particular, isn't just a preference, it's a mandate. And that the most sensitive data that we have, which is this brain data, is kept for us to use to gain insights about our own life and well-being, but is not up for instrumentation by corporate giants who might use it to do whatever they want to do otherwise.

Dr. Miah Hammond-Errey: [00:14:49] And, of course, this is a key concept in your book, cognitive liberty as a design choice. Can you explain to listeners the difference between mental privacy as a relative right and freedom of thought as an absolute right?

Prof. Nita Farahany: [00:15:02] Sure. So, when I describe cognitive liberty, I think first of all, the best way to understand it is truly as a framework, right? Which is to say it is a fundamental right that should guide how we think about what it means to flourish in the digital age. To get there, we need to recognise a set of rights and put into place a set of both sticks and carrots.

Prof. Nita Farahany: [00:15:26] And the biggest and kind of first stick that we urgently need to put into place is an international human rights framework that aligns with cognitive liberty. And that means updating three existing human rights.

Prof. Nita Farahany: [00:15:38] The right to self-determination, which is currently recognised as a collective and political right, which also needs to be an individual right, which includes a right to self-access.

Prof. Nita Farahany: [00:15:48] It means recognising that privacy includes a right to mental privacy, which is a relative right, and by relative rights, what I mean is that there will be times that societal interests outweigh the individual interest in mental privacy. So, it's a balancing act between the needs of the common good and the needs of the individual. But by recognising it as a right, that means that under international human rights law, it can only be restricted if there's a legal reason that that mental privacy should be infringed upon, if the need matches the necessity. So, for example, you want to know if a truck driver is suffering from mental fatigue, that would infringe on their mental privacy, but their interest in mental privacy may not be as strong as societal interest in knowing if they're falling asleep at the wheel. But the only data you could get from their brain is whether or not they are tired. All of the rest of the information would remain with them, because it has to be proportionate to the interest that you're trying to preserve for society.

Prof. Nita Farahany: [00:16:51] Freedom of thought is currently understood in international human rights law to primarily protect freedom of religion and belief. There has been a move to broaden that, to more robustly protect our thoughts and the images in our minds. There's a lot you can glean from the brain, like automatic reactions or fatigue levels, which are not the same as the kinds of thinking and imagery in your mind that we really think of as robust thoughts. This is an absolute human right. That means it's not relative to societal interests: you can never violate the interest of the individual.

Dr. Miah Hammond-Errey: [00:17:28] Can you explain the use of brain biometric data for authentication? How accurate is it and what are the security implications of this?

Prof. Nita Farahany: [00:17:37] Yeah, it's a good question. So, something that's pretty amazing is that, at least according to current research studies, each of us processes information a little bit differently in our brain, in ways that can be detected and potentially serve as a neural signature for each of us. And so, what that might look like is, you know, I have a favourite song, and I sing a stanza of that favourite song in my head, and it's recorded through something like an EEG device. And then to authenticate myself, that is, to say it's really me when I want to unlock my phone or gain access to a secure building, for example, I would sing that same little stanza in my head, and that would become the way I authenticate myself. This is what's called a functional biometric, because instead of something static, like your face, it is how your brain is processing information. But it has the hallmarks of the best possible biometrics: it's hidden, it's potentially unique to each individual, and it's very difficult to replicate, which makes it a good form of authentication.

Prof. Nita Farahany: [00:18:48] It's unclear yet whether it is scalable across the entire population. So, there's a difference between authenticating a person and identifying a person. Identifying a person means you can pick each person out in the world based on their neural signature, which would mean that, you know, across the billions of people in the world, each of us has a unique neural signature. We don't know yet. We haven't run that experiment to know if that's true. But so far it seems that, at least for authentication purposes, that is, matching a signature to a pre-recorded one, it is highly accurate and unique to each person.

Dr. Miah Hammond-Errey: [00:19:25] On consumer devices, do you ever think that threshold for identification could be used to verify an individual to apply lethal force?

Prof. Nita Farahany: [00:19:34] Wow, that's a hard question, since I hope we won't be doing it. So I will just start with: to me, the thought of using neural signatures as a basis for using lethal force against individuals is horrific. But do I think that it could reliably pick out a person well enough that we could say it has appropriately identified that person? Again, I don't totally know on the identification front, but I think on the authentication front, the answer would be that you can pick out a single person based on a previous recording.

Prof. Nita Farahany: [00:20:12] And if we just bring an earlier part of the conversation back to where we are, I don't think it's unreasonable to imagine a future that's coming pretty soon where most people will have recorded their neural signature, because they use it as the primary way that they interact with other devices. And if that's the case, then every person will have recorded their unique neural signature and will be using it pretty regularly as the biometric to unlock or interact with their devices, in which case we'll have a better answer to the question of whether it scales across society so that we can pick out individuals. Also, will it be possible to uniquely pick out an individual, identify them and target them, whether for a military purpose or for any other purpose? The answer is probably yes.

Dr. Miah Hammond-Errey: [00:21:02] Concerning, but I think a really, really important question. In 'Battle for Your Brain', you talk about the potential that neurotechnology offers governments in surveilling not just people's behaviours and actions, but as you've set out, also their thoughts. And I want to kind of go to some of the national security implications here and set out, are there differences between the way that nation states want to apply and are currently applying neurotechnologies?

Prof. Nita Farahany: [00:21:27] The answer is probably yes. We don't know fully what every country is doing when it comes to neurotechnology. What we do know is that there seems to be a pretty significant investment in China in developing this sixth domain of warfare, or cognitive warfare. In fact, the Biden administration in December of 2021 sanctioned four Chinese companies for the development of purported brain-control weaponry, and some declassified and translated documents from the Chinese military suggest that they may be investing heavily in trying to figure out ways to disable and disorient the human brain.

Prof. Nita Farahany: [00:22:09] And of course, there is a lot of conversation about whether platforms like TikTok, for example, are really information warfare platforms, designed both to pick up, for example, biometrics of citizens of other countries and then to disseminate particular kinds of information, amplify particular voices, or try to, you know, sow discord within populations based on certain voices or certain messages that are amplified. And if you think about a heavy investment in neurotechnology devices as being part of that, it isn't hard to imagine this all being part of a bigger ecosystem developed around trying to understand, dominate and potentially affect and disable the citizens of other countries and how their brains think and process information. So, it's definitely something I worry about a lot, the militarisation of neurotechnology and the potential use both of biometrics, but also the hidden and more subtle ways that that information could be used to cognitively shape and reshape how people think, in ways that could really be destabilising to countries.

Dr. Miah Hammond-Errey: [00:23:22] Thank you. I've written a lot about our information environment, which, as the military call it, is really about how technology is impacting content, infrastructure and cognitive resilience. You've touched on it there, but can you describe how neurotechnologies might impact those components of the information environment?

Prof. Nita Farahany: [00:23:39] I think you have to think about this as a broader ecosystem, right? The brain and our mental experiences are not just about picking up how you're thinking and how you're feeling. A lot of self-determination is about the basic building blocks of what it means to be human and the basic human capacities for flourishing. And that's the capacity for introspection; the capacity for mental agility, which includes critical thinking and resilience; and the capacity for empathy and relational intelligence, understanding our relationships with each other.

Prof. Nita Farahany: [00:24:16] But really what we're seeing is an eroding of those categories right now. People are spending far less time being in touch with their bodies and themselves as they externalise their image onto social media platforms, and their sense of self is distorted, even by the filters they view themselves through. Their ability to think deeply and critically is robbed by a short 15-second video and a recommender algorithm that steers them down a particular tunnel rather than exercising critical thinking skills. You know, blaring headlines, misinformation and disinformation all undermine and erode this. And then empathy behind a screen is not the same as empathy and relational intelligence in person with each other.

Prof. Nita Farahany: [00:25:01] And when tech companies or governments, for example, invest heavily in this cognitive warfare regime to try to steer people onto a platform that erodes these fundamental aspects of self-determination, what they're really doing is eroding a person's cognitive liberty. Add to that technology that's designed literally to try to paralyse or undermine or disorient or disable the human brain, whether through purported, you know, weapons that can induce something like Havana syndrome, vertigo and other experiences. You pair that with this world of information warfare, disinformation campaigns, deepfakes, the increasing use of generative AI to do so, and what you see is a future where people really don't have cognitive liberty, where what they're left with is a lack of capacity for self-determination over their brains and mental experiences.

Dr. Miah Hammond-Errey: [00:25:59] Do you have any ideas on how neurotechnologies can be used to improve our cognitive resilience, particularly to things like disinformation?

Prof. Nita Farahany: [00:26:08] Probably one of the most important skills that we need is both introspection and the skills of discernment. So, in a world in which there are significant disinformation campaigns, people have to be in touch with their own bodies. That is, if you have a gut instinct that something is wrong, it encourages you to explore further and to use critical thinking skills to build up resilience to that information. Or if you're presented with a badge that tells you that you're interacting with generative AI, you are more likely to pause and to be able to put up the mental defences that are necessary to then use the skills of discernment.

Prof. Nita Farahany: [00:26:49] But if you lack introspection, if you no longer have a sense of your own body, if you are no longer in touch with the connection between brain and body, it's nearly impossible to say what your gut instinct is, because you have lost the connection to it. If your discernment skills, the kind of critical skills you need to have resilience in the face of changing technology, have become so weakened, both because of the lack of connection between brain and body and because your attention has been robbed from you, you can't exercise the cognitive resilience you need to navigate through that.

Prof. Nita Farahany: [00:27:31] And so neurotechnology can help us by fortifying introspection: it can start to give us a better sense of what we're feeling, help us to visualise through neurofeedback, help us to better meditate, and help us to better sense and reclaim embodiment, in ways that are necessary as the first skill in exercising cognitive liberty. It can even help us exercise greater critical thinking skills, by letting us see what happens each time a notification, for example, pops up on our phone, using that to understand the cost of context switching, and then retraining our brain to reclaim the skills that are necessary to exercise self-determination.

Dr. Miah Hammond-Errey: [00:28:16] I mean, this is so important in our everyday lives, but I'm thinking here about policymakers, decision makers and members of the military and how easily brain data can be weaponised. I mean, even our attention can be weaponised. So, I'm also thinking here of how we might fortify our democratic institutions and our decision-making processes.

Prof. Nita Farahany: [00:28:36] I think that's right. To me, cognitive liberty isn't necessary just for individual flourishing, it's necessary for societies to flourish. If it doesn't become a priority for governments to invest in, you have a significant undermining of the skills and capacities of an entire population. When you see, for example, China having a very different system on TikTok with its youth than it does in the United States, and start to recognise that those differences actually align with differences in mental health, in critical thinking skills and in the capacity to retain attention, you realise that there is a national security imperative to investing in the cognitive liberty of the populace, that the collective cognitive liberty of people is necessary for not just global societies, but local and national societies, to flourish and be competitive. And so, I think it becomes a crucial area for entire societies to be investing in. And that means doing things like starting to recognise that the misalignment of tech companies designed to extract attention, because it drives increases in ad revenue, is contrary to [the] national security interests of a country. Every country should be trying to develop not just a system of laws, but a system of incentives that starts to shift tech companies to redesign their products to enhance cognitive liberty, rather than undermining the cognitive liberty of its people.

Dr. Miah Hammond-Errey: [00:30:13] Can you highlight the top national security threats of neurotechnology?

Prof. Nita Farahany: [00:30:17] I think we could start by saying it can be weaponised, right? It can be used for cognitive warfare in ways that are devastating. But that's almost the more obvious one, right? It's military use of it. It's use for super soldiers.

Prof. Nita Farahany: [00:30:30] I think the more insidious use of it is the more problematic one. And that's everything from capturing the biometrics of individuals, commodifying and using them to instrument and change them in subtle ways that people wouldn't even detect, to using it to create a closed-loop environment where, you know, you have the same tech company that has access to brainwave data or other forms of brain data that's driving the environment that the person is operating in, like an immersive or spatial computing environment, where it almost becomes like people are in a matrix that is operated by a foreign adversary. That's incredibly dangerous from a national security perspective. Unless there's some security and safeguards that are put into place against that, you know, I fear that that's an insidious way of really getting at social control in ways that would be very difficult to counteract and to even detect. And so, to me, cognitive liberty is a national security requirement for countries to start to embed into products and to say brain data is uniquely sensitive and uniquely valuable, and it needs to live on device. It needs to live in the hands of individuals.

Dr. Miah Hammond-Errey: [00:31:45] I want to go to a segment now on alliances. You've just set out, really, the need for cognitive liberty across national governments. How can we engage or collaborate on the protection of human rights at a multilateral or international level?

Prof. Nita Farahany: [00:32:01] So, to me, that's why in 'The Battle for Your Brain' I really advocate, as a starting place, thinking about cognitive liberty as an international framework for updating our existing human rights. I think it is an international priority. I think it is necessary for the basic human condition across the globe. It's not just that each country, pitted against the other, needs to defend the cognitive liberty of its populace. We have existential threats across society, and many people are worried, for example, about growing potential existential threats from AI. The way I think that we ensure that human thinking continues to be protected, continues to be up to the challenge, is by ensuring cognitive liberty flourishes, not by undermining it. And so, you know, if you're going to invest heavily across the world in creating powerful generative AI systems, you should be investing heavily in ensuring that human cognitive capacities continue to flourish. Otherwise, it will be very difficult for us to be up to the task of actually managing the emerging threat.

Dr. Miah Hammond-Errey: [00:33:10] You've raised AI there, and obviously we've seen many efforts to regulate AI nationally and globally in the past few months. What are the interplays between neurotechnology and AI?

Prof. Nita Farahany: [00:33:21] It's a co-evolution story, right? I mean, it's an AI story. While the hardware has become much better for neurotechnology, it's miniaturised, it can be put into sensors that can be put into multifunctional devices like headphones and earbuds and watches or wearable tattoos, it's the advances in AI that power the capacity to decode what's happening in the human brain and to modulate it and change it. And, you know, whether that's to improve signal-to-noise, that is, to filter out noise through AI systems, or the powerful capacity to decode what those signals mean, or the ability to customise, for each individual through generative AI models, a unique classifier that decodes what they're thinking and feeling, this is a deeply intertwined story where advances in AI are enabling the advances in decoding and ultimately changing the human brain.

Dr. Miah Hammond-Errey: [00:34:22] We want to go to a segment called Emerging Tech for Emerging Leaders. What emerging technologies are you watching most closely?

Prof. Nita Farahany: [00:34:29] Clearly generative AI. I am closely watching what's happening in immersive technologies as well. So spatial computing, AR, VR, you know, the kind of broad XR set of technologies. Quantum and synthetic biology and really a lot of what's happening in neuroscience itself. And so, this is brain organoids, for example, the increasing ways in which the human brain is being studied, but also different ways of trying to sustain organs of the human body outside of the human body. These, to me, all converge in some way, which is the reason why I'm watching each of them individually, is because they inform what's happening in each of the other fields. But those are the ones that I'm watching most closely.

Dr. Miah Hammond-Errey: [00:35:15] And what does the future hold for these neurotechnologies?

Prof. Nita Farahany: [00:35:18] To me, what's coming is truly wide-scale neural interface technology. It's brain sensors that are as ubiquitous as the rings that people wear to track their body temperature and the sensors that track their heart rate. It's probably more appropriate to think about how ubiquitous the mouse and the keyboard are, to think about how ubiquitous neural interface will become. I think people should keep their eye on everything brain health related. To me, you know, every major tech company is making a soft entry into this through journaling apps, through mental health and wellness programs. These are all first forays into, or normalisations of, a new world that is coming, focused much more directly on brains and mental experiences. So, I'd say notice, most importantly, how every tech company has a new entry in this area.

Dr. Miah Hammond-Errey: [00:36:31] We've got a segment called Eyes and Ears. What have you been reading, listening to, or watching lately that might be of interest to our audience?

Prof. Nita Farahany: [00:36:38] I've been reading a lot of theories of human flourishing, and I've been going back to everything from Buddhism to John Locke to John Stuart Mill, to more recent work, like work by Jonathan Haidt and, you know, Daniel Kahneman. And the reason is because I'm really interested in how we have thought about theories of human thinking, how we've thought about the preconditions of human flourishing over time, and the assumptions we've made about the sanctity of the human mind as a precondition for the capacity to flourish. Each of them has simply assumed freedom of thought or an inner sanctum, and has built their theories from there. And so, I've been going back to see, you know, what would happen to these theories if it turned out that precondition, that assumption, no longer held true.

Dr. Miah Hammond-Errey: [00:37:37] Will that be your next book?

Prof. Nita Farahany: [00:37:39] It will be part of my next book for sure.

Dr. Miah Hammond-Errey: [00:37:44] What do you do in your downtime to keep sane?

Prof. Nita Farahany: [00:37:47] I hang out with my kids. I have wonderful four-year-old and eight-year-old daughters, and being able to do silly things like, you know, read a storybook to them or just cuddle with them or hang out with them is, you know, the most important part of being human for me and the most important part of my everyday life. And it keeps me grounded because to them, you know, I am just their mom. And it is a great, great job to have in life.

Dr. Miah Hammond-Errey: [00:38:19] As the mom of a toddler, I can say I also think it's the best job. It's wonderful to see the intersections of your work in such a positive light. Lastly, we've got a segment called Need to Know. Is there anything I didn't ask that would have been great to cover?

Prof. Nita Farahany: [00:38:32] Oh, I think we've covered a lot. So no, I think you have been really thorough and we have really covered the full gamut.

Dr. Miah Hammond-Errey: [00:38:40] Thank you so much for joining me today. It's been a pleasure.

Prof. Nita Farahany: [00:38:42] Likewise.

Dr. Miah Hammond-Errey: [00:38:44] Thanks for listening to Technology and Security. I've been your host, Dr Miah Hammond-Errey. I'm the inaugural director of the Emerging Tech program at the United States Studies Centre, based at the University of Sydney. If there was a moment you enjoyed today or a question you have about the show, feel free to tweet me @miah_HE or send an email to the address in the show notes. You can find out more about the work we do on our website, also linked in the show notes. We hope you enjoyed this episode and we'll see you soon.