Dr Miah Hammond-Errey interviews Susan Gordon, former US Principal Deputy Director of National Intelligence, discussing the potential of public disclosures of intelligence to build trust, the nuanced differences between American and Australian cultures that impact technology policy and innovation, especially in areas like AI regulation, and what is needed to make AUKUS Pillar II a success. They also cover the increasing role of private-sector firms in national security – from supply chain decisions to the role of Starlink in the Russian invasion of Ukraine to Chinese infrastructure investment in the Indo-Pacific – how best to harness them and their technologies, the new heights disinformation could reach, and the fact that “2022 was the last time we talked about AI in the future”.
Susan M. Gordon is the former Principal Deputy Director of National Intelligence, where she advised the President on intelligence matters and provided operational leadership of the US intelligence community. She has had an extensive career in the United States Intelligence Community – working as the Deputy Director of the National Geospatial-Intelligence Agency (NGA) and spending 29 years at the CIA, where she also led the establishment of In-Q-Tel, the CIA’s venture arm. She is a fellow at Duke and Harvard Universities and has worked with leading companies and government on intelligence integration, outreach and driving innovation.
Technology and Security is hosted by Dr Miah Hammond-Errey, the inaugural director of the Emerging Technology program at the United States Studies Centre, based at the University of Sydney. Miah’s Twitter: https://twitter.com/Miah_HE
Resources mentioned in the recording:
- (USSC Report) Secrecy, sovereignty and sharing: How data and emerging technologies are transforming intelligence
- (USSC Polling Explainer) Collaboration with trusted allies and distrust in Chinese technology: American, Australian and Japanese views on technology
- (USSC Disinformation Commentary) Dealing with disinformation: A critical new mission area for AUSMIN
Making great content requires fabulous teams. Thanks to the great talents of the following.
- Research support and assistance: Tom Barrett
- Production: Elliott Brennan
- Podcast Design: Susan Beale
- Music: Dr Paul Mac
This podcast was recorded in Washington DC, which sits on the ancestral lands of the Anacostans or Nacotchtank, and the neighbouring Piscataway and Pamunkey peoples. We acknowledge the Native Peoples on whose ancestral homelands we gather and pay our respects to their elders past and present — here and wherever you’re listening.
Please check against delivery
Miah Hammond-Errey: [00:00:02] Welcome to Technology and Security. TS is a podcast exploring the intersections of emerging technologies and national security. I'm your host, Dr Miah Hammond-Errey. I'm the inaugural director of the Emerging Technology Program at the United States Studies Centre, and we're based at the University of Sydney. My guest today is Sue Gordon. Thanks for joining me.
Susan Gordon: [00:00:23] I'm glad to be here.
Miah Hammond-Errey: [00:00:24] Susan M Gordon is the former Principal Deputy Director of National Intelligence, where she advised the President on intelligence matters and provided operational leadership of the US intelligence community. She's had an extensive career in the United States intelligence community, working as the Deputy Director of the National Geospatial-Intelligence Agency and spending 29 years at the CIA, where she led the establishment of In-Q-Tel, the CIA's venture arm. She's currently a fellow at Duke and Harvard Universities and has worked with leading companies and government on intelligence integration, outreach and driving innovation. We're coming to you today from Washington, D.C., which sits on the ancestral lands of the Anacostans, also documented as the Nacotchtank, and the neighbouring Piscataway and Pamunkey Peoples. We acknowledge the native peoples on whose ancestral homelands we gather and pay our respects to their elders past, present and emerging. In the last 20 years, we have seen technology permeate almost every aspect of our lives. As you've previously put it, there is an abundance of data, we are digitally connected and technology is ubiquitous. Can you describe for our listeners what this means for intelligence?
Susan Gordon: [00:01:28] Boy, I love it when my words come back to me. They're very good words. So, let's start with what I think intelligence is. So I think intelligence is fundamentally to know the truth, to see beyond the horizon, and to allow leaders to act before events dictate. And I will argue that that mission has been present since the discipline of intelligence was imagined, certainly in 1947 when the modern intelligence community came to bear, and today. But I think it's useful to ask yourself the question, is that still a valid mission? And I think you'd have to say it is. To know the truth. You bet. To, in the midst of all this, be able to divine that which is, certain is a funny word, but that which is apparent, not that which you simply prefer. So, we're going to say that that's still a modern mission. To see beyond the horizon. Absolutely true. Now, when we thought of those words, we thought physical horizon. I will argue that today it's a digital horizon that you're trying to see beyond, to try and infer intent from data rather than just some secret that you've stolen from someone who is cleverly placed. And to allow leaders to act before events dictate. Again, a great persistent mission. If you take all of that and say, well, I want to do that mission, but I cannot do it the same way I did before. If you put yourself in our shoes: if 1947, when our modern community was formed with the National Security Act, if that never happened and we only thought of the mission of intelligence today, would you do it the same way that you did it in 1947? Well, no.
Susan Gordon: [00:03:02] We were hunters for information, and now we're gatherers. There was not an abundance; only a few people could go and get it and steal it. Well, that isn't the case anymore. The world knows everything today. And leaders made decisions in a vacuum over long periods of time by themselves, to issue forth with some puffs of smoke. And now the clock is forcing those things. So you need the mission, but you can't do it the same way. So you plop into the world that I just described and you say, okay, what do you do in a world where technology is everywhere? Well, you now don't look at whether someone has the capability as singularly important. You say, can they use the capability? Right. You don't look at digital connectedness as just being the way electrons are shared, but you think, oh my gosh, any distance can be reached, any boundary can be crossed, any activity can be hidden. And so you have to think of it differently in a digitally connected world. And in a world of data abundance, this discipline of intelligence is not sacrosanct. There are a whole bunch of people out there that are looking at the same data and trying to make something of it. Well, what distinguishes the craft of intelligence from the opinion of a knowledgeable person? So that's how I put those two together.
Miah Hammond-Errey: [00:04:15] What's one thing about intelligence that the public don't know that you wish that they did?
Susan Gordon: [00:04:22] I wish they knew it wasn’t opinion. That doesn't mean it's always right. Intelligence is inductive, and so it's possibility focused. So how do you make decisions based on possibility? And the answer is you develop a craft around it so that people can deal with what is fundamentally uncertain, with some certainty. So I wish people understood that just because it is not certain does not mean that it doesn't have a standard.
Miah Hammond-Errey: [00:04:52] I want to talk about world events and geopolitical tensions relating to technology. Can you talk a bit about the public disclosure of intelligence leading up to the Russian invasion of Ukraine? And what does this mean for future conflicts?
Susan Gordon: [00:05:05] Yeah, I love it. I love that they decided to do that. So if you'll allow, I'll go on a little bit of an arc, but I'll try and be brief. You'll be happy with that. Same dawn of the digital age: one of the effects of a digitally connected world is that the threat surface extended beyond that which was governmentally controlled. And so what that meant is the people that are being threatened are not just government. And the people that are national security decision makers are also not just in the government. That's okay. The intelligence community held strong for the longest time, trying to keep those secrets. But I think over time several things happened that made us realise that the world was changing. Number one, I think the disclosures by Edward Snowden were really significant, in that they kind of broke open this idea that there were intelligence activities going on broadly about which the American people and our allies and partners had opinions. And we had a hard time explaining what we were doing because we were so used to never talking about it. And it took some of our formers to actually screw up their courage and talk about what we were doing to try and get us over the crisis of that. Move forward in time. You see humanitarian crises. I'll choose the Ebola crisis in 2014, where the National Geospatial-Intelligence Agency figures out that, oh my gosh, if it releases some of its imagery and mapping data, it can positively affect the treatment and containment of it.
Susan Gordon: [00:06:43] It's taking a national security resource and making it openly available, and so that's a good thing. Then you have the Skripal incident, where the intelligence community figures out that it needs to reveal that Russia used a fourth-gen nerve agent to try and assassinate somebody, because otherwise we couldn't stop the counter narrative coming forward. Then you have 2016, the Russian interference in our election, and we have to reveal that because it's the American people that are being affected by it. And so we issue an intelligence community assessment openly about that. And so I think it's just this movement of recognition that you have to be able to share some of the information with the people that are being affected in order to engender trust. And so I think that's why the decision to share information about Putin's intentions in Ukraine was necessary for two things. One, coalition building, so that we got the trust of our allies and partners. And quite frankly, two, for the American people to understand, when the counter narrative came during a time of relative mistrust of government, that we had beaten the allies to the spot.
Miah Hammond-Errey: [00:07:55] What are your thoughts on the use of, you know, technologies like Starlink that are not controlled by government, and the capacity for an individual or a company to shape such large events that would normally be in the national security sector?
Susan Gordon: [00:08:10] Yeah, So that's such a good question. And Starlink is not the only one.
Miah Hammond-Errey: [00:08:16] I mean, just a great example.
Susan Gordon: [00:08:17] You know, there are a lot of big companies that I would argue are some of the biggest non-state actors globally, and they do shape organisations and they do shape activities. Um, I think there are two. Again, not putting this genie back in the bottle, there are two effective counters in terms of mitigating the risks. One is don't let there be only one, right? Because if you grow to use it, rely on it, depend on it, somewhere in that arc you can't be beholden to someone who could decide that they didn't want to continue to support it. So we have a saying that one is none, right? So, you know, you need more than one. I think the second thing is, and we've kind of talked about this a lot more, and that is, I've just said that the threat surface extends beyond governmental control and national security decision makers are outside the government. I think we need to get the companies to recognise that they are actually making national security decisions, whether that is the telecommunications industry that decided that they wanted to walk away from the low-profit baseband communications. And so you see China take that over. But they did that from an efficiency perspective. Had they thought about what they were doing from a national security perspective, would they have made the same decision? So, one is none – got to have more than one player, and two, they've got to recognise that they're playing a role in national and global security.
Miah Hammond-Errey: [00:09:45] How do you see the intelligence community in the US and globally dealing with the challenges of mis- and disinformation and obviously state-sponsored information influence?
Susan Gordon: [00:09:54] Yeah, I think the advantage that the intelligence community has in detecting disinformation is that right now disinformation tends to be single modal. I can create a deep fake, but I'm not nearly as good at making a deep fake be correct at the right time in the right place. I can't make all the signals align, all the metadata align. In other words, truth is actually really hard to counter if you have the ability to look at a lot of different dimensions. And I think right now the intelligence community still looks at the world from a lot of different directions and a lot of different layers, and those different layers provide some protection against single-threaded manipulation. But that said, the more there is, the more it happens, the better it gets. And the slower we are to develop the counter technologies and assurance technologies, the harder and harder it gets to do that at speed, before society is manipulated. And then you have to deal with the fact that the government isn't as trusted as it was a minute ago. And so if something is manipulated, it becomes present in society's mind. If the government is slow, then how do you counter that, if it's not obvious the government is trusted? Which is why, going back to your question before, I think it is just the right trend for the intelligence community to try and share before the false image is created, rather than try and counter a false image that has been disseminated.
Miah Hammond-Errey: [00:11:24] Yeah, I think it's a problem that's only going to get harder.
Susan Gordon: [00:11:27] Really hard.
Miah Hammond-Errey: [00:11:27] Yeah, we're going to go to a segment. It's called Emerging Tech for Emerging Leaders. You've held some really significant leadership roles during big tech developments. Can you give some insight into how you've led others to navigate major tech changes throughout your career?
Susan Gordon: [00:11:45] Boy, it's hard, right? And it's hard, I think, for two reasons. One is because change is hard. And again, we've been so successful with the systems that we've had that evolution is much more comfortable, and so you don't have to make big decisions to walk away from installed base. So, my first strategy is to counter that, which is you have to look at the outcome we require and whether the path we're on is going to yield that outcome. So I don't even start with technology. I start with the outcome that we must have. Define the outcome and then define the things that need to be accomplished in order to do that. And I think that works pretty well. And I think it has worked well for me to go to outcome rather than just capabilities, because I'm sinister and because capabilities from a budgeting perspective can be stretched out, right? If it takes me ten years to develop a capability, oh well, right. But if I have an outcome that must be achieved within five years because it's about advantage, not that. So, my first is focus on outcome, focus on advantage, then back into the things that you have to do.
Susan Gordon: [00:12:50] And then the second is, when you are changing technologies, the presumption is that you're adding risk, right? And so my approach is to show how I have accounted for the risks that drove us to have the technological approach that we had before. And this really came up in the formation of In-Q-Tel. In the formation of In-Q-Tel, the idea that the intelligence community, and the CIA particularly, would give its most challenging problems in unclassified form to an entity that it didn't control was anathema. Right. But you had to give up that control to get the vibrance of Silicon Valley. Well, why did people hate it? And they hated it because they're like, ‘Oh my gosh, we're introducing the possibility that bad technologies will come into our place’. And the answer is no. We are very seldom protecting the technology; what is classified is usually the use. We still control that, and we can always not contract for that technology. So you just have to account for it differently.
Miah Hammond-Errey: [00:13:54] You're introducing the potential of this technology.
Susan Gordon: [00:13:57] Right, you do. I think it's harder and harder every day. I was just out at the RSA conference in San Francisco looking at all the cybersecurity solutions: 3,500 companies. How is any leader today going to adjudicate between those? So, I think somehow we have to change the conversation: not what does the technology do, but what does it allow and what does it introduce, and get to that conversation, because modern leaders just aren't going to be able to adjudicate between A and B.
Miah Hammond-Errey: [00:14:31] I'm going to shift over and talk to you about alliances.
Susan Gordon: [00:14:34] I don't believe in them.
Miah Hammond-Errey: [00:14:37] Do you think there's a role for an intelligence alliance expanding Five Eyes or an individually like a separate alliance with regional partners like Japan and South Korea?
Susan Gordon: [00:14:47] Well, so let me choose the sub-alliance that we've just established with AUKUS. I think there's an attractiveness to that. It isn't exclusive. It's a focus approach, not an exclusion approach. What I like about it as a focus is that it's a particular regional security shared interest that has an operational component, which generally drives you to be willing to share more with more urgency. And so you haven't changed the nature of the partnership. You've just said, I have a particular focus, we have that shared interest, we're going to drive with urgency. Those other alliances are there and they're going to go on. Japan in that construct? Absolutely. If you talk about the focus in the South China Sea or Southeast Asia or the Asian theatre, you cannot do that without Japan and Korea being involved. But the effective operational alliances have to also have a component of: have we been able to align all our interests in that moment, or are there two levels of shared interest, technology development versus operational control? So, I think there is a role for both broad sharing against shared values and narrower partnerships to achieve specific outcomes in particular timeframes.
Miah Hammond-Errey: [00:16:14] I think that's a really, really important distinction actually, and one that often gets lost in the discussion. I wanted to talk about AUKUS and Pillar Two in particular. I wanted to get your thoughts on a range of things, but I'll just throw up innovation to start. Obviously, the Australian Government recently announced ASCA, and the UK Government has started up ARIA, both following on, I guess, from DARPA, at least in spirit. There are some really big challenges ahead for AUKUS Pillar Two. How do you see this space?
Susan Gordon: [00:16:43] So I think it's a great initiative and I think it's really hard. The great initiative part is you have to set some quest of someplace you want to go. I know that sounds dumb, but I'm old, so my example is always the US decision to go to the moon in the 1960s. So, JFK said, ‘We're going to put men on the moon and return them to Earth safely by the end of the decade’. We had no idea how to do that, but it was big. And when we did, we identified the things that needed to be addressed: various critical paths and critical technologies that needed to be advanced, and it required public and private partnership at that time. And when we did all that, we got Tang and Velcro, right? When you did a big quest, you got not only the performance of the mission, but you also got the things. The reason why you need the quest is you can't get to the moon just by building Tang and Velcro, right? So that's why AUKUS is important: by those other entities having a focus, and by aligning that interest, you increase the chances that you're going to get there, not just say, I'm going to develop all the piece parts and somehow they're going to magically come together.
Susan Gordon: [00:17:54] So that's why you do it. The problem I see is there has to be a framework by which all those things come together. I'm a zoologist by training. It's like saying, if I had a petri dish and I put in a bunch of hydrogen and oxygen and carbon molecules with some amino acids, and if I just stared at that petri dish, life would issue forth. That feels to me the risk of this, right? So set good goals, decide what the critical technologies are, but you can't just independently develop those technologies and think at some moment they're going to come together in order to deliver you a capability that is going to be transformative. That requires a framework. And so I think it's exciting what we're doing. I think it's the right goal. I think it's really important that each country develops something so that it fits within their ethos and structure. But at some point you're going to have to come up with a common framework that allows sharing, especially in a world where it's not just the governments that are going to be in control of the technologies that you need. It's private companies that need to be able to work together. But how are they going to do that?
Miah Hammond-Errey: [00:18:56] The next segment we have is called Eyes and Ears. What have you been reading, listening to or watching lately that might be of interest to our audience?
Susan Gordon: [00:19:03] Everything that's going on with large language models. I was telling someone the other day that I think 2022 was the last time we talked about AI in the future. The speed at which generative AI large language models are being integrated into solutions and put into use is absolutely mind blowing, and it is largely happening out of the view of any, I hesitate to say, governor, but any consideration about what are the first principles about how you develop this, so that once you have it, you have accounted for the possibility of misuse and done the most you can. And so probably that is the thing that I am reading most on now. And I can't give you one source, because it's just everything.
Miah Hammond-Errey: [00:19:52] I'd love to hear your thoughts. In Australia, obviously we have the eSafety Commissioner. We have a fairly technologically literate Parliament that's able to legislate and is motivated. What are your thoughts on things like AI regulation, particularly I guess, in relation to some of the social harms that you've just mentioned?
Susan Gordon: [00:20:10] Yeah. So I think this is a really difficult issue for Americans. This freedom of action, freedom of speech, is culturally embedded. I don't know how you can get it out of us, even as we have seen the deleterious consequences of misuse of that information, whether purposeful or not, and struggled to find the solution to that. So, I love what's happening in Europe and Australia, because I think they're interesting models. I don't know that they're easily translatable, and I don't know that they are going to be totally adoptable here, and I think we are really going to be struggling. So, I love what's happening. I don't agree with everything, because I am a product of my environment, but I do think we need some controls. I don't believe that without controls we are going to end up in the place we want to be. I do think this is a governmental thing to do because it's big, but where I think we really struggle is when it gets to actual content. It goes there so quickly, and that is a really hard place for us to go right now in the US, where we do believe so much in privacy and we do believe in independence of speech and thought.
Miah Hammond-Errey: [00:22:42] I think the cultural nuances in different countries are a really significant component, and something I've spoken to other interviewees about is how they see the culture that technology has created in shaping that technology. I wonder if you have any thoughts on that, given most of the technology companies we use in Australia are American?
Susan Gordon: [00:23:03] Well, I think you see the effect of them being American, where our Constitution limits the government, right? It doesn't limit the people. And so, in a world that has grown faster than that which can be defined by our existing laws and policies, you have seen a natural effect of just people going as far as they can go, because legally they can go that far and because it hasn't been prescribed. I think I said before, I think this is the moment where we're going to need the private sector to exhibit some responsibility and recognise that not everything they can do should they do. But man, that's a hard thing to climb. This might be glib, but I've said before that if you made me choose between national security and privacy, I'd choose privacy every day of the week and twice on Sundays, because I think it's so central to free and open societies. And I think that's one of the real rubs that we're trying to address right now. And national security is so much more today than we defined it as even ten years ago. Right. National security now is: are we going to have a democracy? Are we going to have a society? Are we going to have freedom of action for every person? You know? And so I think that's what we're wrestling with. But that's the challenge, and maybe the addition that we need to make to this conversation about data security, because it kind of is a uniquely cultural challenge that we have created because of who we are.
Miah Hammond-Errey: [00:24:37] Do you see that we have an innovation problem or an innovation-in-use problem?
Susan Gordon: [00:24:43] Yeah. I think we have an incredibly vibrant technology community. It's probably why I argue for non-decoupling, because I think it now requires global participation to have this vibrance. That community just needs to have a place so that it can freely and openly compete and do things. So I think we're still pretty good on that front. So I don't think we have a technology problem; I think we have a technology use problem. Um, that's not to say that I don't think the government needs to invest more research dollars again, as it used to do in the past. The private sector just took it on, and we're like, okay, I don't have to spend the money because they're advancing now. I think there are actually some technological issues, the longer-term issues, that the US government really needs to invest in. But other than that, I think the question is how do you get the new technologies into use? When we offer technologies, we're still offering the what: here's the technology, as if somehow magically it's going to get into use. And the companies aren't thinking about how it gets into use; they're just thinking about how wonderful their technology is. And the government looks at this new technology and it's really exciting and it has really great potential, but I don't know how to bring that in because it doesn't fit in the systems I have. So I don't think it's a technology issue. It's how do you bring it in, whether that's changing the budget process, whether that is getting decision makers who have more organisational reach so they can make bigger decisions, whether we need to change the incentive structure so that people don't just get incentives from selling things, they have to actually yield outcomes. I think there are a number of things we have to do.
Miah Hammond-Errey: [00:26:45] You've picked up there, I want to ask you a bit more about it, and that is about decision makers and about decision making. You mentioned it in the context of being able to distinguish between lots of different types of technologies, ethical decision making, being able to justify decisions, is a critical part of public-sector leadership, military leadership. What do you think leaders need to be good leaders in this environment, and how do you think we upskill people to make better decisions?
Susan Gordon: [00:27:13] So I think we have a real generational problem. I don't want to paint everyone with an ugly brush, but leaders in the most senior decision-making slots are farther away from having been technology creators. And so they view new technology as additive risk. And it's harder to distinguish. And you know this: someone who comes into your office and puts a great idea on your desk, is that a good thing or a bad thing?
Miah Hammond-Errey: [00:27:42] A great thing?
Susan Gordon: [00:27:43] No, it's a horrible thing, because it's an idea and I don't know what to do with it. So it just sits on the side. Part of it is we've got to get a little bit more tech savviness in our decision makers so that they don't see it as only additive risk. I would say we also have to get our technology leaders to stop talking about their thing and talk about what their thing does. You've heard so many pitches, and they are all about the capability and none of them are about use. So that's one of the things that has to happen. The other thing is, I think we have organisational structures that put leaders in charge of only a part of the problem. And so if you insert technology into Slot A, it's a good technology and it might be an advance, but if you aren't considering the whole ecosystem, it won't yield the outcome you wanted. And that's why we bought so many things that don't deliver the performance that we expect: because it was this part, not all these parts.
Miah Hammond-Errey: [00:28:47] I think that's one of the reasons that I do think technology is quite transformative for the intelligence community, because I think it challenges that fundamental idea of compartmentalisation.
Susan Gordon: [00:28:56] Well, yeah, I wouldn't suggest that it isn't transformative; I think it does challenge that. But you still have to imagine how it can be done differently, not just how technology can modernise how you do it, right? So, you know, I think you can see it the same way. Robert Cardillo at NGA, when he was the director and I was his deputy, I thought did a brilliant job of imagining NGA in the future, not just imagining how he would modernise NGA for the future. And I think if you can get that jump and then build it, you're okay. If you just try and use technology to change at the edges, to modernise how you currently do it, you'll miss the opportunity of transformative technology.
Miah Hammond-Errey: [00:29:50] We're in another segment now called What Do You Do in Your downtime to Keep Sane. On a personal note, what technology brings you the most joy and what do you do to wind down and disconnect?
Susan Gordon: [00:30:01] Oh, I read, right? And I actually think this is so important. I've talked about all the joys of technology, but when technology becomes a commodity, it's the critical thinking that's going to make a difference. So if I would say one thing: one, reading lets this introvert recharge. But I also think it is the fundamental skill to advance critical thinking. We can't just be a bunch of button-ologists. We have to be thinking about everything from how it can be used to how it can be misused, and there's just nothing like it. I would say, go back to a liberal arts education, but I think that's a bridge too far. So I think being a great reader is probably the right way to go.
Miah Hammond-Errey: [00:30:49] You have reminded me that we've spoken a bit previously about information operations and influence, and I'd really love to hear a little bit more from you about how you see it, its trajectory, and our potential for changing that.
Susan Gordon: [00:31:07] The digital environment is how every intention of adversaries and competitors is going to be effected. Because the casual listener, reader, observer, watcher, especially in free and open societies, does default to trust. They do. It's funny, we'll get back to trust in government. They default to trust. Anything that hits them, they tend to believe. So those who would manipulate information get a receptive audience, because we trust at a fundamental core level. That's how our society, that's how open societies actually work. So I think it's a field day for people who would manipulate information. I think it is absolutely devastating when people who believe they have free will are receiving information that is not true, that shapes them in a direction. And I think the Russians, since World War Two and before that, have had active measures in this shaping be part of it. And now they have these cool tools and everyone's figuring out how to do that. So I think the potential is great, and the potential impact on free and open societies is even greater. So, here's my four-part plan. Number one, the government's got to become more trustworthy. Right. The government has got to be able to be trusted at some level by the average populace. And how do you become more trustworthy? You say more things that are true. You stay above the fray. You do fewer things that are just in your personal interest and more in the common interest. And neatly enough, in our societies, all the power to put those leaders in place rests with the people. So all you have to do is make better choices and you can get that. So: more trust in the government.
Susan Gordon: [00:33:01] More civics education. I think you and I have talked about what happens as time moves on and people get further away from when we were really in crisis. I think we need to remind people about civic responsibility and personal responsibility and what the collective society needs to be. I think there are a bunch of people who think that we don't actually need government. That's silliness. I would agree that I don't think our government systems are as good as they need to be, but the idea that you could destroy them and have any idea what should be built in their place, it would be hard. So we've got to invest our people in civics. Similarly, they need to be critical thinkers, because I will say the truth has its own sound. And if you would actually critically listen to or read some of this stuff, you would say, 'Well, that can't possibly be true.' And most of the time, if you think it can't possibly be true, it's not. And then, technologically, we have got to use some of our technology prowess to really go after ensuring trust and truth, because that's what's under assault and that's what makes us work. And then, finally, I need the citizen and the private sector to recognise that they too are national security and global security decision makers, and instead of just living in this time of presumed abundance and ease, recognise that no, these are conscious decisions we make in order to have the freedoms that have gotten us to where we are. So I think that was a five-part plan.
Miah Hammond-Errey: [00:34:33] Can you just draw that out a little for the listeners? When you say people and companies are decision makers, national security decision makers, can you explain for them what you mean?
Susan Gordon: [00:34:45] So companies are easy. When a company decides to accept money from someone who is trying to introduce their interests over yours, you are opening yourself up to being shaped. When you decide to offshore part of your supply chain and leave us vulnerable in times of crisis, pandemic or conflict, like Ukraine, you have affected national security. When you don't protect your information, you leave yourself open to actions that could have either local or potentially systemic effects. So you are making decisions with all those choices you make about just the conduct of your business. And similarly, individuals, when they decide what they believe and they decide whom they affect, they are security decision makers with the things they're doing. And the coolest thing about our society is that the power does rest with the people. And if they will say, 'Oh, I'm not just taking from this pond, I'm actually responsible for the maintenance of this pond', I think you just get better.
Miah Hammond-Errey: [00:36:00] Thank you so much for joining me. It's been such a pleasure. I think we could have gone on for hours, but thanks so much for coming on the show.
Susan Gordon: [00:36:06] Oh, it's been a blast. Thanks.
Miah Hammond-Errey: [00:36:10] Thanks for listening to Technology and Security. I've been your host, Dr. Miah Hammond-Errey. I'm the inaugural director of the Emerging Tech Program at the United States Studies Centre, based at the University of Sydney. If there was a moment you enjoyed today or a question you have about the show, feel free to tweet me @miah_he or send an email to the address in the show notes. You can find out more about the work we do on our website, also linked in the show notes. We hope you enjoyed this episode, and we'll see you soon.