In this episode Alex Lynch, from Google Australia’s Public Policy team, joins Dr Miah Hammond-Errey to discuss emerging technologies, quantum, Australia’s role and much more. They cover Google’s recent quantum announcement on error correction and Australia’s significant role in quantum research, data localisation, ethics in AI — including Google’s approach — and strategic decoupling. They also talk about the complexity and security of Google’s global infrastructure, data breaches and what it is that should not be automated.
Alex manages Google's public policy engagement in emerging technologies, including artificial intelligence and quantum computing, as well as related areas, including digital lines of communication, technology supply chains and trade and investment. Prior to joining Google, Alex consulted on crisis and strategic reputation management for some of Australia's top companies, having formerly worked as a national security practitioner in New Zealand.
Technology and Security is hosted by Dr Miah Hammond-Errey, the inaugural director of the Emerging Technology program at the United States Studies Centre, based at the University of Sydney.
Resources mentioned in the recording:
- (USSC Report) Secrecy, sovereignty and sharing: How data and emerging technologies are transforming intelligence
- (USSC Polling Explainer) Collaboration with trusted allies and distrust in Chinese technology: American, Australian and Japanese views on technology
- Google’s AI principles: https://blog.google/technology/ai/ai-principles/
- The Declaration on the Future of the Internet
- Digital Future Initiative
- (Book) Fleet Tactics and Naval Operations
Making great content requires fabulous teams. Thanks to the great talents of the following.
- Research support and assistance: Tom Barrett
- Production: Elliott Brennan
- Podcast Design: Susan Beale
- Music: Dr Paul Mac
This podcast was recorded on the lands of the Gadigal people of the Eora nation, and we pay our respects to their Elders past, present and emerging — here and wherever you’re listening. We acknowledge their continuing connection to land, sea and community, and extend that respect to all Aboriginal and Torres Strait Islander people.
Please check against delivery
Miah Hammond-Errey: [00:00:00] Welcome to Technology and Security. TS is a podcast exploring the intersections of emerging technologies and national security. I'm your host, Dr. Miah Hammond-Errey, and I'm the inaugural director of the Emerging Technology Program at the United States Studies Centre, based at the University of Sydney. My guest today is Alex Lynch. Thanks for joining me.
Alex Lynch: [00:00:21] Good to be here.
Miah Hammond-Errey: [00:00:22] Alex manages Google Australia's public policy engagement in relation to emerging technologies, including artificial intelligence and quantum computing, as well as digital communication technology, supply chains, and trade and investment. Prior to joining Google, Alex consulted on crisis and strategic reputation management and was formerly a national security practitioner in New Zealand. We're coming to you today from the lands of the Gadigal people. We pay our respects to elders past, present and emerging, here and wherever you're listening. We acknowledge their continuing connection to land, sea and community, and extend that respect to all Aboriginal and Torres Strait Islander peoples today. So, Alex, 2023 is shaping up to be huge in the tech policy space.
Alex Lynch: [00:01:04] Indeed.
Miah Hammond-Errey: [00:01:04] What are the key issues you and Google Australia or Google Global are thinking about?
Alex Lynch: [00:01:09] It's an incredibly complex space. If you even just read the papers, you'll see concerns ranging from tech supply chains to geopolitical decoupling. You'll see concerns about artificial intelligence and these new large language models coming to market. You'll see people investing in cloud computing as part of that very macro transition from on-premises computing to cloud systems. You're seeing cybersecurity concerns, privacy concerns, a raft of issues, all of which are interesting debates for society to have, and all of which are very interconnected.
Miah Hammond-Errey: [00:01:42] Access to the best tech talent is often billed as a huge issue. And as you've just said there, Google has been the breeding ground for a lot of remarkable innovations. How does that access to tech innovation and tech talent globally work out in Australia?
Alex Lynch: [00:01:56] You can find clusters of really great people all around the world, including here in Australia. About half of our business here is on the engineering side, and they work on a variety of products. Most people know these days, at least in my circles, that Google Maps was conceived as an Australian startup, and we still have a big Google Maps team in Australia working on what is a global product. We have been working with universities here for a long time to make sure they are creating talent that is fit for our local engineering team. Now, we are no longer just competing for high-end technical talent between ourselves, Atlassian and the other big local employers among pure-play tech companies. What we are seeing now is a whole-of-economy uptake of technical talent. Commonwealth Bank, I think, has more than 5,000 technical people, for example, and that is a necessity for all the companies that are transitioning into cloud-based systems and also looking at the assets they have, their data assets in particular, and how to use them. So we've gone from a very high-end talent pool, and it varies by technical area. For example, you used to have to have a PhD to be an AI practitioner; the tool set and the supporting infrastructure have now been built out so that you need a far lower level of technical capability to be a practitioner in AI. Quantum computing is at the opposite end of that scale: you still very much need to do the maths.
We are looking at the spread of this talent across the economy. Increasingly, as large companies, institutions, the private sector and government all think about how they can use these new tools, they are all asking how to get access to that talent, and it's a highly competitive environment.
Miah Hammond-Errey: [00:03:50] We often hear that Australia has an advantage in quantum. Can you explain why this is? And in that answer, could you give us a little bit about the quantum ecosystem in Australia?
Alex Lynch: [00:03:59] Yeah, of course. Australia has something to be proud of in this area. Part of it is that, just as the traditional digital champions in the US were created as a result of a long period of government investment in science and technology, oftentimes through DARPA or organisations like Bell Labs, in Australia we have seen a long-term commitment to quantum science and research, and so Australia has developed a research advantage in quantum. We are working on a quantum computer at Google. On the hardware side, that research is largely taking place in the United States, but we are deeply partnering between those hardware specialists in the US and software specialists in Australia. What Australia has is really great quantum algorithms people. This is just one small subset of Australia's research community, but I'll use it as an example of where we can punch above our weight and why we matter globally. These quantum algorithms specialists are looking at how to make more efficient algorithms, because what we need to know is how big a quantum computer we need to build in order to perform meaningful computations. The quantum algorithm specialists ask: if you want to solve a quantum chemistry problem, how many qubits do you need to enable that computation? And they're constantly refining that and bringing that number down. On the US side, they're thinking about how to increase the number of qubits in the machine so that those computations actually become possible. So there's this two-tiered piece where the Australian researchers are building more efficient algorithms and the hardware people in the US are trying to get to the point where they can compute those algorithms.
Miah Hammond-Errey: [00:05:31] And how does that sit with the recent quantum announcement from Google about improved error correction?
Alex Lynch: [00:05:37] We think this is a huge step forward in quantum science. When people talk about today's quantum computers, they talk about noisy intermediate-scale quantum computers, or NISQ computers. Essentially, each qubit is a single computational piece, like a bit in a binary computer, a one or a zero, although qubits have phase as well as that binary one-and-zero piece, so they're more complex. But the nature of quantum computation is such that any kind of interaction, what we call a measurement, and it could be as simple as one tiny particle bouncing off another, can cause an error, cause a bit to flip, in that quantum machine. So when we're talking about building a computer of meaningful scale, it quickly becomes impossible to just build up the number of physical qubits. You need what we call a logical qubit, which is an error-corrected qubit, something whose answer we can be sure is consistent. The latest announcement essentially says we have reduced the errors in this quantum machine as we increase the number of qubits, which means that eventually we will be able to create very, very large-scale quantum devices that can solve the problems we want to solve. Now, to give you some conception of where the scientists think quantum computing is, as opposed to other areas of quantum which are more near-term: they talk about creating the vacuum tubes of quantum computing, not even the transistors. We have 14-plus billion transistors, if I recall correctly, in the latest iPhone, and at the moment we're getting to one logical qubit. We will need millions or billions of these things to perform meaningful computations.
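[The intuition behind "fewer errors as you add qubits" can be sketched with a classical toy: a repetition code that stores one logical bit in n noisy physical bits and decodes by majority vote. This is not the surface code Google actually uses, and real qubits suffer phase errors a classical bit cannot, but it shows the key threshold behaviour: provided each physical bit is reliable enough, adding redundancy drives the logical error rate down.]

```python
import random

def logical_error_rate(n_bits, p_flip, trials=20000):
    """Estimate how often majority-vote decoding over n_bits noisy
    copies of a bit recovers the wrong value."""
    errors = 0
    for _ in range(trials):
        # Encode logical 0 as n_bits physical bits; each flips with prob p_flip.
        flips = sum(1 for _ in range(n_bits) if random.random() < p_flip)
        # Decoding fails only if more than half the bits flipped.
        if flips > n_bits // 2:
            errors += 1
    return errors / trials

random.seed(0)
for n in (1, 3, 5, 7):
    # With p_flip well below 50%, the logical error rate falls as n grows.
    print(n, logical_error_rate(n, 0.05))
```

[Above the threshold, with very unreliable physical bits, the trend reverses, which is why reducing physical error rates and scaling qubit counts have to happen together.]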
Miah Hammond-Errey: [00:07:11] So, I think that's one step in a really long journey.
Alex Lynch: [00:07:14] It's a long journey.
Miah Hammond-Errey: [00:07:15] You've talked a little bit there about the hardware and the software. In the context of Australia's advantage, can you clarify where we sit in the other parts of that ecosystem?
Alex Lynch: [00:07:27] Yeah. So we have specialists in a number of quantum areas. Quantum sensing is a near-term piece that looks at making very, very tiny measurements. This is where quantum comes into its own, because our world is based upon a quantum substrate, which feels very tangible to us because everything is being measured all the time and collapsing into reality. Quantum sensing allows you to measure very, very small things, gravitational fields or electromagnetic fields, for example. These things could be consistent across the surface of the earth or variable in different places, so you may have a future GPS-esque system based upon quantum sensing, or communication systems that are cryptographically secure based on quantum communication principles. These things are coming down the pipeline sooner than full-scale quantum computers, and we need to understand the principles because they'll give us particular advantages, particularly in the military and defence space.
Miah Hammond-Errey: [00:08:22] Yeah, absolutely. Quantum encryption is often referred to as the Holy Grail, and it's a key area that defence really focuses on. I want to touch here on the conversation about technological innovation and dual civil-military use technologies, drawing out that quantum discussion. Historically, many technologies were developed within the military and shared outward, whereas now that's flipped, and many are developed in tech companies and moved into defence and security. How do you see this evolving, and what does it mean for Google?
Alex Lynch: [00:08:55] Yeah, it's very interesting, right? You are absolutely hitting the nail on the head when you say that the preponderance of R&D in the West, and I'm largely speaking of the US here from our perspective, is no longer government funded; it is primarily done in the private sector, and we see that in Australia as well. That is a change from the past, where, particularly during the Cold War era, it was very government based. It means we need to think carefully, because as private sector companies we have different perspectives and different principles than defence does, and we need to be comfortable having conversations about sharing the technology we put into the world in a way we didn't have to think about in the past. Look at our AI principles, for example: they preclude any kind of use that may result in harm. The military has a very different perspective on the use of harm to defend national sovereignty or priorities. How do we resolve our inability, or our unwillingness, to use technology in that domain, which I think is a principled stance a private sector company should be comfortable with, with the need to explore those military technical uses? A lot of that will be resolved through public publication of these research streams. We might not want to do anything ourselves in that area, but we don't preclude publicly published research from being used for other purposes.
Miah Hammond-Errey: [00:10:16] As a global company, how does Google think about security?
Alex Lynch: [00:10:20] 'By design' is the short answer. We pioneered security by design in all our products: security should not be an afterthought. The question is not how you defend the systems you've created after the fact, but how those systems are intrinsically defensible because of the way you create them. We also pioneered zero trust systems, for example. This sort of cybersecurity innovation has been at the core of what we do because we obviously present a very large potential attack surface. The data we keep just by providing an email service, for example, is very valuable in particular intelligence contexts, and we have experienced people getting access to our systems in the past. So now we take a very proactive, front-foot, innovative approach to security. We've pioneered a number of cybersecurity principles that are now being baked into other organisations, and we're continually looking at how to make sure our infrastructure is defended from the physical all the way through to the digital level, so that we have a very strong record of being difficult to attack.
Miah Hammond-Errey: [00:11:22] I guess, are there any lessons that, you know, that Google has learned that can be kind of drawn, you know, more domestically?
Alex Lynch: [00:11:29] Yes, though I suspect those are very well known by now. We have been the subject of attacks in the past that targeted unencrypted data travelling between bits of our infrastructure. We learned and adapted, so our data is now encrypted in transit and sharded across data centres in different jurisdictions. No single compromise can compromise a piece of data, because an attacker would have to compromise data centres across various jurisdictions to do that. And there is a tension there between discussions around data sovereignty, for example, and our security posture, which very much says this data is more secure if we can keep it away from everybody and keep it in multiple places so no one can get immediate access to it.
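[Alex doesn't detail Google's sharding scheme, but the security intuition, that no single location holds a usable copy, can be illustrated with a minimal XOR secret-sharing toy: each shard on its own is indistinguishable from random noise, and every shard is needed to reconstruct the original.]

```python
import os

def shard(data: bytes, n: int) -> list[bytes]:
    """Split data into n XOR shares: any single share looks like
    random noise, and all n together are needed to rebuild it."""
    shares = [os.urandom(len(data)) for _ in range(n - 1)]
    final = data
    for s in shares:
        # XOR the data with every random share to form the last share.
        final = bytes(x ^ y for x, y in zip(final, s))
    return shares + [final]

def reassemble(shares: list[bytes]) -> bytes:
    out = bytes(len(shares[0]))  # all-zero starting value
    for s in shares:
        out = bytes(x ^ y for x, y in zip(out, s))
    return out

secret = b"customer record"
pieces = shard(secret, 3)            # e.g. one piece per jurisdiction
assert reassemble(pieces) == secret  # only the full set recovers it
```

[Compromising any one "jurisdiction" here yields only random bytes; production systems layer encryption, key management and replication on top, but the sketch captures why spreading data raises the bar for an attacker.]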
Miah Hammond-Errey: [00:12:11] You've amazingly preempted one of my questions.
Alex Lynch: [00:12:14] Excellent.
Miah Hammond-Errey: [00:12:15] I want to ask you about data localisation and whether or not you think it makes us more secure. You've already given a really great answer there about storing data across various jurisdictions. Can you give us a bit more on that?
Alex Lynch: [00:12:27] Yeah, that's right. So we have a small number of very large data centres, but we have edge locations all around the world, including in Australia, where data can be stored locally if there's a requirement for that. Oftentimes that's a latency requirement for a customer: a bank may not want to bounce a signal to Northern Europe or Scandinavia and back from a big data centre; they want a low-latency local solution, which we can provide. But there is also security in making sure that no single actor can get their hands on the information, putting it in multiple jurisdictions so that it can't be attacked that way even if the encryption were broken. That is a very, very secure way of doing things and has protected us, but it may not be desirable in all cases; a government may in some cases want things stored locally. We think security should be the priority overall, and that an Australian person's information is more secure if it is secure against any actor that might want to access it. But while we think our systems are extremely highly secured, that is not the case for all companies or all platforms.
Alex Lynch: [00:13:33] So we can understand the argument, or the desire, for some sets of data to be kept locally. But it's not only security we think about this in terms of. There are huge compute clusters that don't necessarily exist in Australia; the future is not evenly distributed, as they say. For example, we have the first prototype quantum computer in Santa Barbara. We don't have those in every country around the world, and it would be financially infeasible to build them in every country. Likewise, we have these huge data centres with huge computational capacity, not just data storage capacity but the ability to run large algorithms across the data or train AI on it, and they are located in different jurisdictions. So if you have, for example, a legislative instrument that says you cannot take a certain kind of data outside Australia, that means we couldn't compute algorithms on it. Take the attacks being made against Australians' email addresses, or spam sent to their Android phones: we can look at all that information, compute on it overseas, and develop things that protect people's Gmail from spam, for example.
Miah Hammond-Errey: [00:14:43] Yeah, that's such a great example to make it real for people, because so often the difference between computational power and a data centre's storage is really not clear to the everyday person. You've touched on something here and I want to draw you out a little bit more. My colleague Tom Barrett and I recently published a piece examining trust and distrust in technology. We'll put a link to the full piece in the show notes, but essentially we found that across Japan, Australia and the US, more than 80% of respondents trusted American technology, while less than half trusted technology from China. How does Google build and maintain trust?
Alex Lynch: [00:15:17] That's a very good question, and it's a question both for ourselves and for governments. Part of our offering is that, philosophically and from the founding of the company, we respect the rights of the individual: the right to privacy and the right to the security of their own data. There are tensions there, and there will always be tensions, with governments wanting more access and sometimes more capability. A big question I'm asking myself strategically right now is: why should the government of Indonesia choose to work with US companies rather than companies originating from another, more authoritarian jurisdiction that provides them more intrusive access to information on their citizens, for example? I think we have to articulate why we care about these human protections, why they matter to us, and why we're not going to be hypocritical and ride over those protections when we think that's in our domestic interests. We have this Declaration on the Future of the Internet that a number of governments, including Australia, signed up to, I think last year, and it articulates a bit of that vision. But we haven't been consistent and clear about it internationally. People trust us because they think we will act to protect their information; they need to be able to trust their government to do the same thing. If we wanted to compute business information from Indonesia in an Australian data centre, the Indonesian government would have to trust that the Australian government won't just ride roughshod in and access all of that for its own purposes. It's a highly complex environment and evolving rapidly, but this idea of trust, consistency, walking the walk on what you say about your values, is so important.
Miah Hammond-Errey: [00:17:09] Well, you've hit on a topic I really wanted to ask you about, and that is values. Do you think technology is imbued with the values in the context that it is created in? What does that mean for Google and how it operates globally?
Alex Lynch: [00:17:25] Yes. You make design decisions all the time when you design something, right? The protections you build in reflect what you think is valuable, and we think about that in the user's interest. That has largely been a humanistic approach that treats individual human rights as one of the priorities when we're building those systems. What we are interested in looking at now is this: modern AI systems are developed essentially by drawing upon the existing corpus of human knowledge, and that knowledge, we being humans, is riddled with biases, particularly historical documentation. These systems, be it in a legal context or a broad creative-text or LLM context, reflect the granularity of human experience, and that's confronting for a lot of people; it's confronting for us. As we develop this technology, we need to make sure we don't lose track of our values when we put things into market: that we look at the biases in the systems we create and adjust for them, and that we test them with the right people, who can tell us things that we, as people in the software development and production industry, don't think about because we haven't had their life experiences. That's something we think about deeply when we're developing software.
Miah Hammond-Errey: [00:18:45] Okay. Let's talk a little bit about AI and ethics. How do tech leaders like Google balance the race for dominance in something like generative AI with the increasing ethical concerns around the kind of technology? And is this a balance that can be improved?
Alex Lynch: [00:19:01] Yes. You would have seen a lot of priority placed on AI ethics over the past few years in the rhetoric from ourselves, from Microsoft, from many companies and from governments. People are concerned about making sure these systems are developed using principles. Increasingly, we talk about international standards, and there is an international standards effort around AI going on presently; Australia is deeply involved in that. So we look at the structural approach of having principles, and not only principles but internal governance around how we operationalise them, which is hugely important. We look at international standards and standardisation. And when we're developing these new systems, we also need to play the role of talking about what the systems are actually capable of, and whether we can put policy controls around the things a system is reflecting, because, as I said, some of them, particularly the larger models, are built on top of a corpus of information that is human, and it's filled with the human biases you would expect. We have a responsibility to do better and to make sure the future is better than the past. A lot of the controls and the ecosystem we're building from a technical and research standpoint are evolving. There are top researchers in the world at Google looking at how to actually build bias out of these systems: how you look at datasets to assess whether your dataset has bias, and how you adjust for that so your endpoint system is less biased than the data. This is an emerging area of research and technology, and if you are a young person listening to this podcast, I encourage you to look at it in detail and do some research in it, because it is something that we as a society and as companies will be grappling with for many years to come.
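[As a very rough illustration of "looking at datasets to assess whether your dataset has bias": one of the simplest first checks is comparing positive-label rates across groups, a data-level version of the demographic parity metric. The data here is a made-up toy; real fairness auditing uses many metrics and far more context.]

```python
from collections import defaultdict

def positive_rate_by_group(records):
    """Rate of positive labels per group: a crude first check for
    imbalance in a labelled dataset."""
    counts = defaultdict(lambda: [0, 0])  # group -> [positives, total]
    for group, label in records:
        counts[group][0] += label
        counts[group][1] += 1
    return {g: pos / tot for g, (pos, tot) in counts.items()}

# Hypothetical toy data: (group, label) pairs.
data = [("a", 1), ("a", 1), ("a", 0), ("b", 1), ("b", 0), ("b", 0)]
rates = positive_rate_by_group(data)
gap = max(rates.values()) - min(rates.values())
print(rates, gap)  # a large gap flags the dataset for closer scrutiny
```

[A gap like this doesn't prove the data is unusable, but it tells you a model trained naively on it is likely to reproduce the imbalance, which is exactly what the adjustment research Alex describes tries to prevent.]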
Miah Hammond-Errey: [00:20:51] We're going to jump into a segment here, which is Emerging Tech for Emerging Leaders. Can you share some of the emerging technologies you think up-and-coming leaders of today and tomorrow should know about?
Alex Lynch: [00:21:02] Obviously, generative AI is very big at the moment. The thing that I look at coming down the pipeline is biotechnology and the interface between digital technologies and biotechnology. There's a lot of data out there in medical systems and genomics, and marrying digital technologies like AI into that environment is exciting and will cause a flourishing of human endeavour over the next decade. But it also terrifies me. The capability will only increase, and the threshold to entry for this sort of work will only come down over the coming years as the toolset gets built out. Ordinarily I wouldn't be that concerned, but in an era of increasing disconnection, increasing competition, and a lack of cooperation between blocs of powers internationally around this technology of the future, it becomes a more difficult question. I would look at international agreement around the regulation of biotechnology as being almost akin to the arms reduction treaties of the Cold War era. These things could go very wrong, very fast, and we need to make sure we have proper controls around them, agreed controls at a civilisational level.
Miah Hammond-Errey: [00:22:26] Are there any technologies that you can't live without and that you would recommend to up-and-coming leaders or workers in a professional sense?
Alex Lynch: [00:22:35] I think we are bombarded with information, and rather than a technology, which is strange coming from someone at Google, I would deeply encourage all emerging leaders in technology fields and elsewhere to look at how you assess information for veracity, from a media article to a scientific paper. Look at whose interests are being exhibited through it. Why is this information available? Why is somebody writing about it? Why are they writing about it in the way they are? Ask yourself these questions whenever you read something new, and that will stand you in good stead.
Miah Hammond-Errey: [00:23:12] Critical thinking is a critical asset.
Alex Lynch: [00:23:14] And we don't want to automate it. One of the big questions we ask is: what are the things we don't want to automate? I can't remember any phone numbers; I can barely remember my wife's, because all the numbers are in my phone. What are the things we do not want to reduce human capacity for, things like critical thinking? So 'what should we not automate?' is as big a question as 'what could we productively automate?'
Miah Hammond-Errey: [00:23:34] What are some of the key transferable skills that you've seen between the technology and security sectors?
Alex Lynch: [00:23:40] These are consistent across a number of areas: things like critical thinking and clear communication. One thing I think practitioners in the intelligence community are particularly good at is the ability to thrive in uncertain, highly changeable environments with limited information, and to make an assessment that you will back despite a high degree of uncertainty.
Miah Hammond-Errey: [00:24:02] I feel like for some people that lights us up and we leap up and go, 'That sounds fun.'
Alex Lynch: [00:24:05] Yeah, exactly. In a highly changing environment like the world of technology, having that comfort, having the ability to identify trends early just because of your training and your natural proclivities, is so valuable. For anyone wanting to transition between the intelligence community and the private sector, which can be difficult in a lot of ways personally and professionally, there are so many skills you will come away with that are valuable not just in technology, but more broadly across the private sector.
Miah Hammond-Errey: [00:24:46] So we're going to jump to the section on alliances. But first, I just want to explore this idea about mis- and disinformation, and the fact that we are bombarded by it. It's something I've written a lot about, and I know you're really interested in it. Can you talk me through some of the tensions you see there, how it might evolve, and whether there's any work in Google's space on this?
Alex Lynch: [00:25:09] I mean, always. We have a very productive threat assessment group, led by a former ASD person called Shane Huntley, that looks at what we call inauthentic activity on our networks, which is someone purporting to be someone they are not. It's one thing for a foreign diplomat to be under their real name on a service like YouTube or Twitter saying, 'Here are my views.' Hopefully someone has the tools to assess those views, and they're usually labelled as a foreign government entity. But what about the 10,000 people sitting in a shop somewhere at computers, building content to push into social networks? Governments might pay PR firms in foreign jurisdictions, or they might do it directly, but they are looking at how to leverage these tools to propagate their point of view into other markets. So we have teams that specifically look at identifying those networks, sharing that information between companies, taking them down and notifying people about them. What becomes more complex is what happens when people in nations where there is a right to speech reflect views they've heard from a propaganda outlet, or reflect misinformation into the market that they truly believe. The question of what we take down is clear in some cases: vaccine misinformation during a pandemic, very clear. Political speech during an election is a hugely contested issue, and one where we deeply rely on the guidance of electoral commissions and governments. We cannot be the source of truth when it comes to free speech, in a Western state in particular. That needs to be driven by the government and by the people who maintain the integrity of elections, and we work very closely with them to do that.
Miah Hammond-Errey: [00:27:01] So on to the segment about alliances. We ask our intelligence and security leaders largely about nation state alliances, but how do you see Google contributing to the alliance between Australia and the US, as well as to multilateral forums?
Alex Lynch: [00:27:17] So we've recently launched something in Australia that we're quite proud of, which is the Digital Future Initiative, and it essentially looks at it in those terms. Google has some of the best researchers in the world in very niche areas. As I said before, our technical teams are largely located in the US, but they can be global, including people in Australia. So what we look at is how we can encourage the best researchers in the world, particularly in niche areas, to work with Australia: the top people in Australia working with the top people at Google and vice versa, so that we can assist technical development and industry development in Australia. Be that through initiatives like Blue Carbon, where we worked with DFAT and CSIRO to map seagrass ecosystems in the Pacific, which helps partners engage in climate change discussions more effectively because they know the carbon capture capability of their ecosystems. Or, in the broader context, looking at the capability we can bring from Google globally, and how we can work with people in Australia to develop intellectual property here and support Australian industry, Australian research and Australian diplomatic capability. All of these things are questions that we ask: how can we bring the best of Google to Australia so that we can enhance Australia's interests? And for our part, Australia is a great location. I love living here. We have a huge technical team here. We work with Australian researchers and universities, and we develop Australian talent pipelines through school and university programs as well. So how Google can assist the Australia-US relationship is by trying to bridge those gaps and make those connections.
We live in a world that seems abstract and technical, but, particularly as you mature and age into your roles, you see that it is deeply human in so many ways, and that making those human connections is the most important thing that we can do.
Miah Hammond-Errey: [00:29:05] You have worked in multiple countries in tech and security. What are some things we can share with each other?
Alex Lynch: [00:29:11] Sure. I've worked primarily in New Zealand and Australia: on the government and security side in New Zealand, and in the high end of the corporate world and at Google in Australia, so in technology. The things we can learn between those contexts... it's an interesting question, because in a lot of ways we're very, very similar. Looking back at my career history, I can see moments that are bigger in context and that drive particular outcomes. For example, I worked in intelligence through the post-9/11 period. That terrible moment enabled a situation where the intelligence community could talk to one another across jurisdictions in a way that they wouldn't have in the past. You built connections that you might not otherwise have done when, globally, the world was looking at and addressing terrorism. There are a lot of international connections built during that period that would not have existed if the agencies were still primarily looking at nation state actors, and adversarial in that way. But it also meant that nations weren't looking at other things that were happening during that time. And so I think being aware of the context of whatever jurisdictions you're working in, being aware of the priorities you might otherwise have if you weren't in that context, and being explicit about those, is something that we could all learn from this period of recent history. We clearly missed adapting to certain large geostrategic trends because we were stuck looking at counterterrorism policy, or looking at how to adapt to a global economic crisis or a pandemic, when in the background these big geostrategic changes were happening that went largely uncommented on for a lot of that period.
Miah Hammond-Errey: [00:31:11] As you know, I recently published a paper highlighting some of the impacts of emerging tech on Intel. Given your background in both, I'd love to hear your thoughts on those key tensions.
Alex Lynch: [00:31:21] Yes, it is one of those situations where the intelligence community has had a huge advantage in technical expertise for a long time. There are things that the IC [intelligence community] knows that aren't very well known outside of it. And now you have a situation where the technology is evolving so rapidly that you often have areas of specialty outside the intelligence community where the people working there have a deeper technical understanding than the people inside it. So I think, more than ever before, there is a necessity to talk to one another: first, because there is that need to share information from a national security standpoint, but also because of the ability to respond to intelligence problems. I'm talking primarily about the defensive side in this context. We need to actually be able to understand what the threat environment looks like, and I don't think the intelligence community can do that by itself. The corporate community certainly can't. We need to increase that cooperation, we need it to be structured, and we need to make sure that everyone is doing it in an ethical way that their audience, be that the public or their shareholders, is comfortable with, and in a way that is supportive of the national interest and, broadly, the strategic interest globally.
Miah Hammond-Errey: [00:32:43] One of the segments we have is eyes and ears. What have you been reading and listening to or watching lately that might be of interest to our audience?
Alex Lynch: [00:32:50] I've been catching up on something which I'm sure everyone listened to a long time ago, which is Dan Carlin's history podcasts. It's been interesting because I grew up in a time when, you know, my father had collected this beautiful series of magazines. It must have been one of those periodical weekly or monthly magazines on World War Two. So I grew up in my room with this massive history collection about World War Two: individual battles, individual weapon systems, strategy and all those sorts of things, in minute detail. Then I was interested in the Cold War period, and so read into that, and I grew up through the period of the Gulf Wars. So I had this very good grounding in recent history, but never really engaged with deep history or ancient history. Getting back into that in a very pulp way, with great storytelling, has been fantastic.
Miah Hammond-Errey: [00:33:37] Is there anything else you wanted to add to that?
Alex Lynch: [00:33:39] Yeah, I guess one of the things that I've been reading at the moment is Fleet Tactics and Naval Operations. So it's...
Miah Hammond-Errey: [00:33:45] Seriously?
Alex Lynch: [00:33:46] Yeah, it's a book that you have to order from the Naval War College in the US. But it's one of those things that we see in the debates in Australia at the moment: the question of Taiwan, or the question of our near abroad. We do live essentially at the base of a large island chain, and people are aware that maritime operations are increasingly going to be the crux of a lot of geostrategic competition and regional competition. You know, I had no idea really how naval engagements worked, and it's one of those things you kind of need to understand if you want to have a reliable analytical lens or opinion on some of these strategic matters. I felt like there was a huge gap in my analytical framework there, so I've been working on that.
Miah Hammond-Errey: [00:34:32] I want to ask quickly about your thoughts on tech decoupling. We're obviously seeing tech decoupling in quite a number of areas between the United States and China. Where do you think it will go?
Alex Lynch: [00:34:44] Interesting question, and one where the drumbeat has been continuing in the public domain for a long time, from the Huawei decisions, which were a critical issue of trust, to what we now see as a huge intervention in the form of the CHIPS Act in the US, with ramifications that are still spreading through the international environment: the multi-billion dollar supply chain decisions being made to change where things are produced, and the infrastructure and logistics cost of those changes. It's happening, is the answer, I guess. Decoupling at the high technology level is happening, and the ecosystem around emerging technology, technology production and technology supply chains is so broad that you are seeing interventions across multiple fronts, from academic cooperation to manufacturing to standardisation in international standards bodies, with Western states re-engaging with and reprioritising the international institutions that look at technological standards. All of these interventions are happening right now, and the tension is not going away in the near term. I can't see any indication otherwise: the views on both sides of the broad geopolitical divide, so to speak, are hardening, not lessening.
Alex Lynch: [00:36:09] We also have to think about what I like to call digital lines of communication. I'm sure you're familiar with the sea lines of communication concept, right? There are choke points; we look at the vulnerabilities of our logistics from production sources to destinations. And I think about our digital footprint in a similar way. Where are we using compute? Where is that being piped in from? We have a map of subsea cables; we have big technical infrastructure in various jurisdictions. What is the risk in that, both from a geopolitical and security standpoint, but even from a policy standpoint? Where do we want to invest for the future in order to make sure our technical infrastructure, and the global technical infrastructure footprint, is resilient? And what I would love to see, as someone locally, is for Australia to play a big role in that. But we have no idea at this stage how that will play out.
Miah Hammond-Errey: [00:37:02] You've touched on the scale of Google's operations and just how immense they really are. On the flip side, you're balancing the individual privacy of users and security, but there's also the risk created by the huge volume of data held by Google itself. How do you balance that tension?
Alex Lynch: [00:37:24] Collect the minimum amount of data that we need to actually execute on our products. That's a fundamental principle of how we design things. And that's not just about reducing data access risk: every piece of data you collect is costly, in that it sits somewhere and eats up computational capacity or storage capacity, and the electricity you need to run the data center. So it's a fundamental question of operational efficiency as well. We only want to collect the data we need, because we think that is fundamentally where the world is going. We don't need to just hoover up everything; that's not commercial, it's not efficient, and it's just poor design.
Miah Hammond-Errey: [00:38:07] Okay. So is there anything we're just going to the segment on need to know is there anything I didn't ask you already that would have been great to cover?
Alex Lynch: [00:38:15] No, I think you've been pretty thorough, actually. A lot of the things that I was thinking about raising as potential wild cards we've actually gotten to already, so it's been quite fantastic.
Miah Hammond-Errey: [00:38:27] Thanks for such a fun conversation, Alex.
Alex Lynch: [00:38:28] It's been really fun. Thank you. Cheers.
Miah Hammond-Errey: [00:38:30] Thanks for listening to Technology and Security. I've been your host, Dr Miah Hammond-Errey. I'm the inaugural director of the Emerging Technology program at the United States Studies Centre, based at the University of Sydney. If there was a moment you enjoyed today or a question you have about the show, feel free to tweet me @miah_he or send an email to the address in the show notes. You can find out more about the work we do on our website, also linked in the show notes. We hope you enjoyed this episode and we'll see you soon.