In this episode of the Technology and Security (TS) podcast, Jessica Hunter from the Australian Signals Directorate (ASD) joins Dr Miah Hammond-Errey to talk about emerging technologies and signals intelligence. They cover ASD’s role in Australian intelligence, REDSPICE, offensive and defensive operations and the Russia–Ukraine conflict, technology as statecraft and cybercrime. They also discuss alliances, the security of everyday technology, the Optus and Medibank hacks, international standards, and the value of creativity and vulnerability for leadership in intelligence.

Jess is a First Assistant Director-General at ASD, working in the Australian Cyber Security Centre, where she heads the Access and Effects Operations Division. She has worked in the intelligence and security community for almost 20 years, including postings at agencies in the United States and the United Kingdom. She has held leadership roles in offensive and defensive cyber security, cyber resilience, threat assessment and disruption.

Technology and Security is hosted by Dr Miah Hammond-Errey, the inaugural director of the Emerging Technology program at the United States Studies Centre.

Resources mentioned in the recording: 

  • Thomas Rid, 'Cyber War Will Not Take Place'
  • 'Understanding Cyber Conflict'
  • The Economist's Babbage science and technology podcast


Making great content requires fabulous teams. Thanks to the great talents of the following: 

  • Research support and assistance: Tom Barrett 
  • Production: Elliott Brennan 
  • Podcast design: Susan Beale
  • Music: Dr Paul Mac

This podcast was recorded on the lands of the Ngunnawal people, and we pay our respects to their Elders past, present and emerging — here and wherever you’re listening. We acknowledge their continuing connection to land, sea and community, and extend that respect to all Aboriginal and Torres Strait Islander people.


Episode transcript

Please check against delivery

Miah Hammond-Errey: [00:00:02] Welcome to Technology and Security. TS is a podcast exploring the intersections of emerging technologies and national security. I'm your host, Dr Miah Hammond-Errey. I'm the inaugural director of the Emerging Technology Program at the United States Studies Centre, and we're based at the University of Sydney. My guest today is Jessica Hunter. Thanks for joining me.

Jessica Hunter: [00:00:25] No worries.

Miah Hammond-Errey: [00:00:25] We're coming today from the lands of the Ngunnawal people. We pay our respects to their elders past, present and emerging, here and wherever you're listening. We acknowledge their continuing connection to land, sea and community and extend that respect to all Aboriginal and Torres Strait Islander people.

Miah Hammond-Errey: [00:00:42] First up, a little bit about our guest. Mrs Jessica Hunter is a First Assistant Director-General at the Australian Signals Directorate (ASD). She has worked in the intelligence and security community for almost 20 years, including postings at agencies in the US and UK. She has held leadership roles in offensive and defensive cyber security, cyber resilience, threat assessment and disruption. So, Jess, you officially head up the Access and Effects Operations Division. Can you give us a quick overview of ASD and describe what your division actually does and how it fits into ASD and the Australian Cyber Security Centre?

Jessica Hunter: [00:01:13] Absolutely. For those who don't work within the intelligence community, it's obviously a little bit tricky to figure out what we do and what our functions are. So ASD is actually 75 years old – we celebrated our anniversary last year – and we have a very simple mission statement, which is reveal their secrets, protect our own. That enables ASD to do two things: look at intelligence and pull critical information in to inform national security, but also perform our important role in cyber security and protecting the nation. My specific role within ASD under Access and Effects is to collect all of the important data – so it goes to the big data conversation – so that we can actually look at the information and provide advice and policy to government. And the offensive mission enables us to do something with that large amount of data to generate impact and effects.

Miah Hammond-Errey: [00:02:00] That's great. And one of the questions I have is: How does ASD balance operating as a foreign intelligence agency and its increasing domestic functions given this distinction between foreign and domestic intelligence?

Jessica Hunter: [00:02:12] It's a great question and again, going back to our history, ASD has always had that dual function. In fact, the very premise of ASD when it was first established was that communications security function: how to develop the right cryptographic key and material to secure the communications of the warfighter. So it's not actually a new function for us to manage the two – foreign intelligence and the protection of Australia. It's actually been part of our core for the last 75 years and it's built into our functions. We often say at ASD that our values are core, and one of those is being audacious in concept – the intelligence piece – but meticulous in execution. And that's really around compliance and balancing the privacy of Australians.

Miah Hammond-Errey: [00:02:53] Can you give us a clear definition of what signals intelligence is?

Jessica Hunter: [00:02:56] Oh, okay. So signals intelligence is effectively taking technical data – in our instance, offshore or foreign communications – and collecting it, analysing it and then producing intelligence from that data.

Miah Hammond-Errey: [00:03:10] I get the sense that you think about emerging technologies a lot. And so what are the technologies that you and ASD are thinking about the most at the moment?

Jessica Hunter: [00:03:18] Oh, lots. So particularly in the cyber security space, we're thinking about zero trust technology, we're thinking about machine learning and AI and the implications of that. From our defensive perspective, it enables cyber criminals in particular to scale their capability against us, so we then need to look at different technologies to counter, defend and mitigate that. Those are just a range of them. But ultimately we look at the poacher-gamekeeper concept: what we need to defend against, we also need to understand, because those are the technologies our adversaries are using.

Miah Hammond-Errey: [00:03:52] That's really interesting. I'm interested – we've had recent breaches like Optus and Medibank. Have those data breaches changed thinking in ASD or the ACSC? And can you share any examples of structural or operational impacts, or lessons to be learnt from those breaches?

Jessica Hunter: [00:04:10] Those were significant breaches within Australia and I think what is critical in that space, and what we've seen reflected back from it, is the attention that boards are now paying to cyber security. So from the Australian Cyber Security Centre, which is a part of ASD, we have initiated a range of different products, services, advice and guidance as a result of those data breaches. Also the conversation around the importance of protecting the victim, and for industry to work incredibly closely with the ACSC so that we are able to support them to recover and be resilient. And a critical piece that has also come out of this, which the ACSC is now leading, is really shaping how entities – boards, government departments and industry – exercise for a cyber security incident and treat it as a whole-of-department business continuity challenge, not a tech challenge.

Miah Hammond-Errey: [00:05:01] You just mentioned talking with big companies. Based on your experience at the ACSC, which maybe you could share a bit about, can you tell us how far engagement with industry has come?

Jessica Hunter: [00:05:14] What we now find is we have industry as partners embedded within the ACSC in our partnership program, where we are sharing threat intelligence and machine-to-machine information in real time and really getting into the heart of how industry can partner with the intelligence community. This has matured immensely for me in the last 13 years, and really now it's not a question of whether we should invite industry in – industry is already brought in from day dot on any of our incidents.

Miah Hammond-Errey: [00:05:42] And what do you think industry would say back?

Jessica Hunter: [00:05:45] Oh, it's a great question. I sit on many industry boards and also a lot of CISO boards, and industry often says back, 'We're trying to understand how government operates'. They actually really appreciate the ACSC because we often are that translator back into broader government – because we have the technology and the tech credibility from working with industry, we're able to be that interlocutor and that translator.

Miah Hammond-Errey: [00:06:12] Great. Thank you. Next up, we've got a segment on alliances. You've been posted to the US and UK, our key allies. Can you share a bit about those experiences?

Jessica Hunter: [00:06:21] Oh, amazing. I would encourage anyone to do it. A couple of reasons for that. First, it gives you an objective perspective on Australia. So when you're sitting outside of your own country, you can see the incredibly positive opportunities that come and also some ways that you can improve as an organisation and as a nation. So it gives you that perspective. But importantly for me, I was in the UK when the UK established the National Cyber Security Centre, so that was really a new capability within the UK to truly support individuals, businesses, large entities and government. So I was able to be there at the very beginning of that and then help shape the Australian Cyber Security Centre through that experience. What I did learn from both the US and the UK is genuinely how tight the relationship is, particularly on the SIGINT side. ASD's counterparts have incredibly tight relationships, such that there's a level of trust there and a true understanding of how we need to support each other. The conversations aren't around how we deconflict; they're around how we partner. And for me that demonstrates we've moved from a transactional relationship to a true partnership.

Miah Hammond-Errey: [00:07:36] Absolutely. And you've kind of mentioned that the challenges to Australian security all require good alliances. How do you see signals intelligence and the relationship ASD has with NSA and GCHQ contributing to those alliances from a broader national perspective?

Jessica Hunter: [00:07:54] It's incredibly significant, and you would no doubt be tracking, at a very political and more senior level within Australia, initiatives such as AUKUS and the Quad. Ultimately, from the intelligence community, we're able to support those larger initiatives because of that level of trust. Actually in REDSPICE, a key component of our resilience is also leveraging our partners overseas and ensuring that we are using the best and brightest technology and skills, and leveraging their knowledge and scalability in order for us to deliver REDSPICE and resilience. So there you can see it delivering not just at a tactical intelligence level in the partnership, but supporting broader government initiatives at a national level.

Miah Hammond-Errey: [00:08:36] So you've brought up AUKUS there. So how do you see the Five Eyes intelligence sharing arrangement and AUKUS Pillar II interacting?

Jessica Hunter: [00:08:44] Yeah. So my view on that, and from what I've seen on the ground, speaking very plainly and frankly, is that the operational aspects of our intelligence sharing will not change. AUKUS enables us to leverage where there is a gap or a need to do more between the US and Australia, but it will not impact the operational day-to-day relationships that we already have across the Five Eyes. And that's a conversation which, from a SIGINT agency, we are absolutely having with our Five Eyes partners and one that they appreciate and respect.

Miah Hammond-Errey: [00:09:20] The previous government publicly announced REDSPICE with a huge financial commitment, and this has continued with the current government. So what is REDSPICE? Why is it so important?

Jessica Hunter: [00:09:30] REDSPICE is a truly transformative program. It is the single largest investment in ASD and Australian signals intelligence over the 75 years that we've been operating. And really it's a true reflection of the shift in our geopolitical and strategic environment since the Second World War. What we've witnessed is rapid military expansion, increased cyber attacks – which you would have seen obviously in the data breaches and even in the Russia-Ukraine conflict – and a shift to military and economic intimidation. And with the new technology that you've already referenced as well, we really needed to ensure that ASD, as the cyber-ready organisation for the Australian Government, was fit for purpose for moving us into the next geopolitical environment. So what does it mean in terms of what's actually being delivered in ASD? Well, REDSPICE is actually an acronym: it stands for Resilience, Effects, Defence, Space, Intelligence, Cyber and Enablers. And given your PhD is in data science and big data, I think you'll be proud to know that one of the largest investments within REDSPICE is actually in the foundational technologies. It's around things such as delivering new cloud-based cyber and intelligence systems, scaling up our AI and machine learning and delivering AI-supported offensive capabilities. Core to REDSPICE is a huge cultural shift in ASD. We will look to almost double in size, with 1,900 new staff coming into ASD, but importantly for our customers in industry and across Australia, those aren't individuals coming only to Canberra – we will be increasing our national and regional presence.

Miah Hammond-Errey: [00:11:07] And can you give us a bit of an explanation about the offensive and defensive side?

Jessica Hunter: [00:11:12] Yeah, absolutely. So it is critical, that poacher-gamekeeper. So in offensive capability, that's effectively when you are touching a target and making an effect occur. So that is where potentially you might be putting a payload in a network or a payload in some form of communications to change or to generate an effect, whereas defensive is purely in the protective space. And this, as I mentioned before, is the core of ASD's initial intelligence and cyber security function, where we are effectively defending a network or defending a device. And that's where the core of the Australian Cyber Security Centre is focused. How do we protect those networks, ensure that they are hardened, ensure that they are resilient and make sure that other adversaries aren't doing to us what we could do to them in an effect space.

Miah Hammond-Errey: [00:12:00] Can you give me an example of what an offensive or defensive technology application might look like in those environments?

Jessica Hunter: [00:12:09] So for instance, in the Russia-Ukraine conflict, we have often discussed the concept of the wiper, where it has been alleged that the Russians were able to create a piece of malware, or a payload, which could be downloaded onto a network, and that piece of malware was then able to delete or destroy data. So that's a wiper piece of malware. You will have probably seen it more commonly in terms of ransomware. Ransomware, again, is usually a piece of malware or malicious software that is dropped onto the network and will either encrypt the data or delete or destroy it. So that is what we refer to as an offensive or attack action which has generated an effect. The defensive side is: how do I stop that payload or that malware functioning and operating? How do I protect the network and enable it to recover?
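
A minimal illustrative sketch of the defensive side described above, in Python: a naive monitor that flags the burst of file modifications a wiper or ransomware payload typically produces. The directory path and thresholds are hypothetical examples for illustration only, not ASD tooling or guidance.

```python
# Naive "mass file change" detector: a toy example of the defensive idea above.
# WATCH_DIR and the thresholds are hypothetical placeholders.
import os
import time

WATCH_DIR = "/srv/shared"        # hypothetical directory to monitor
WINDOW_SECONDS = 60              # look-back window
MAX_CHANGED_FILES = 200          # alert if more files than this change in one window

def recently_modified(root: str, window: int) -> int:
    """Count files under `root` modified within the last `window` seconds."""
    cutoff = time.time() - window
    changed = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(path) >= cutoff:
                    changed += 1
            except OSError:
                continue  # file removed or unreadable mid-scan
    return changed

while True:
    count = recently_modified(WATCH_DIR, WINDOW_SECONDS)
    if count > MAX_CHANGED_FILES:
        # A real deployment would page an analyst or isolate the host rather than print.
        print(f"ALERT: {count} files changed in the last {WINDOW_SECONDS}s under {WATCH_DIR}")
    time.sleep(WINDOW_SECONDS)
```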

Miah Hammond-Errey: [00:13:00] Again, you've led teams in both. Tell us what that's like and, I guess, do you have a preference?

Jessica Hunter: [00:13:08] Oh, tricky question. So my preference will always be on the cyber defensive side, because that's the side of good and that's the white hat component. But they are truly both interesting functions, and you do have to do a mental shift when you move between them. If you are in the protective realm, it's very much at a national level: how am I helping smaller entities, individuals, small businesses, big businesses and government departments? How do I help them – because they own the network – actually protect their network? Whereas on the offensive side, ultimately we are developing the capability and the access to enable those effects and those operations. So it's a very different relationship with the target and a very different intent. Both are incredibly interesting from a technology perspective, and both must inform each other: in order to generate the most appropriate effect, you need to understand the security posture of where you're generating that effect.

Miah Hammond-Errey: [00:14:05] So your bio gives away your passion for professional development. I'm going to go into the segment Emerging Tech for Emerging Leaders. What motivates you and what are the issues and initiatives that you're passionate about in terms of professional development, leadership and mentoring?

Jessica Hunter: [00:14:19] What motivates me is that everyone who works around me is smarter than me, and I love that. But in particular, what motivates me is role modelling – actually allowing people to truly see others, so that they can take the good and the bad from that. I have been very blessed in my career to have some amazing role models and that kind of keeps me going. The other piece that motivates me is individuals who don't think they have all the answers, who are vulnerable and who are willing to show that to grow the next generation. And probably the third piece for me is individuals who don't realise they're amazing. I spend a lot of time mentoring, from incredibly junior individuals who are still in high school and would love to work at ASD, all the way through to peers across the UK, US and Australian intelligence communities, and I sit in awe talking with them because they don't realise that they're already a leader. They already know how to drive huge change, they are the ones who've helped organisations and a nation pivot, and they sit there so humbly and don't recognise that. That keeps me going every day.

Miah Hammond-Errey: [00:15:32] Absolutely, yeah. Your passion just shines through. Can you share some emerging technologies that you think up-and-coming leaders of today and tomorrow need to know about?

Jessica Hunter: [00:15:42] So I think ML [machine learning] and AI are obviously critical in that space, and that's because I think leaders need to truly get more into the weeds of understanding them, because they ultimately will need to make decisions around risk at some point. The scalability of AI and ML is what most folks go to. But actually, as a leader, if you have that within your organisation or you've invested in that capability, and something like a false positive occurs, or a compliance concern, or it generates an effect you weren't expecting, you will then be asked to explain all the risk decisions and why you chose that path. And I think for some leaders that's scary – it's asking them to embrace technology and understand the risk behind it. So I would actively encourage leaders to think in that sort of space and make sure that they are well informed.

Jessica Hunter: [00:16:37] The other piece in technology is the number of vulnerabilities that are present in technology. We're no longer in a game of solely patching individual vulnerabilities on an hourly basis. As leaders, we need to truly pivot the way that we harden and protect our systems, our data and the data of other individuals. And that, again, requires leaders to embrace a little bit of the technology and actively ask the right questions of their techs and their business continuity side. So those two sort of technology changes and pivots, I think, are opportunities, but incredibly scary if leaders aren't truly understanding the nuances.
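
As a toy illustration of the risk trade-off Hunter describes around false positives: moving an ML detector's alert threshold trades benign events wrongly flagged against malicious events missed. The Python sketch below uses made-up scores and labels; it is an assumed example, not a description of any ASD system.

```python
# Toy threshold sweep: how a detector's alert threshold shifts false positives vs misses.
# Scores and labels are invented for illustration.
scores = [0.10, 0.35, 0.40, 0.55, 0.62, 0.71, 0.80, 0.93]   # model confidence per event
labels = [0,    0,    0,    1,    0,    1,    1,    1   ]   # 1 = actually malicious

def rates(threshold: float):
    """Return (false-positive rate, missed-detection rate) at a given alert threshold."""
    flagged = [s >= threshold for s in scores]
    false_pos = sum(f and l == 0 for f, l in zip(flagged, labels))
    missed    = sum((not f) and l == 1 for f, l in zip(flagged, labels))
    return false_pos / labels.count(0), missed / labels.count(1)

for t in (0.3, 0.5, 0.7, 0.9):
    fpr, miss = rates(t)
    print(f"threshold {t:.1f}: false-positive rate {fpr:.0%}, missed-detection rate {miss:.0%}")
```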

Miah Hammond-Errey: [00:17:19] Yeah. Great. So I wanted to talk to you a little bit about world events and you have conveniently already raised Russia and Ukraine. The war in Ukraine has seen declassification and broader sharing of intelligence that's been traditionally secret. You've also talked a fair bit about using intelligence and technology as a form of statecraft. How do you see this use of intelligence to achieve outcomes in foreign affairs?

Jessica Hunter: [00:17:43] So with the Russia-Ukraine crisis, you're exactly correct: a significant amount of intelligence has been declassified. What I think is probably not as commonly known is that this occurs on a regular basis. As I mentioned previously, in our engagement with industry on a daily basis we are declassifying information in order to protect a victim of a cyber incident, for instance, or protecting a nation state if we're seeing that there's an intelligence risk or challenge there. So intelligence will always inform foreign diplomacy and policy. I think what is critical and underlying in this all is partnerships and alliances. So the intent to declassify information to support another is often around what the alliance and the relationship is and the impact that we're trying to achieve. It is incredibly nuanced, and at times of conflict it will be very, very different to times of contestability or peacetime. And that's where I think it'll be interesting over the next five or ten years to see how that sharing of intelligence shapes diplomacy, based on the type of environment in which we're operating.

Miah Hammond-Errey: [00:18:45] I recently published a paper – the link will be in the show notes – talking about, I guess, how the stakeholders of intelligence, or the users if you like, have expanded out. Perhaps you could give a bit of an overview or insight into the fact that emerging technologies have shifted some of those vulnerabilities and are affecting individuals more directly. And that is a little bit new for ASD, because traditionally your focus has been more on nation states.

Jessica Hunter: [00:19:09] Yeah, absolutely. And probably the most topical example of that, and one which will resonate with many of your listeners, was during COVID-19 and the pandemic and the lockdowns, and the number of cyber criminals in particular who were taking advantage of the fact that individuals were receiving payments and support from the government during this time. So from an intelligence perspective, we had an understanding of where those cyber criminals were operating from – overseas – how they were operating and the technology that they were using. And importantly, then, to your point, the individual victims within Australia and globally. So that was a conversation around intelligence gain–loss: how do we ensure that we not only get that intelligence out to the correct areas and industry – in one of those, we partnered with industry around scams, supporting them to reduce the impact of those operations – but also, importantly, get that information out to protect the victims? And this is a critical role for ASD. We received a new function, our section 7(1)(c) function under the Intelligence Services Act, where we not only produce the intelligence, but we have a role and responsibility to disrupt cyber criminal activity, which means we're ultimately responsible for protecting those individual citizens. So this is a great example of us taking that intelligence and then running disruptive operations to effectively deny the cyber criminals' ability to impact or endanger those individual victims.

Miah Hammond-Errey: [00:20:38] You mentioned data retention a little earlier, and I wanted to ask your thoughts on data retention and data localisation and their role in increasing cyber security.

Jessica Hunter: [00:20:48] I think it comes down ultimately to entities and businesses truly understanding what they need as an organisation and then overlaying that with privacy rules and legislative rules. And what I mean by that is are they particularly focused on accessibility? Are they particularly focused on business continuity? Are they particularly focused on privacy? Those things need to inform their data retention and then you need to go back and look at it again. And often what we find from a cybersecurity perspective is policies are set in place around availability or integrity or accessibility, but then they're not reviewed when new data sets come in or they're not segmented within a network where they've actually determined what is the priority function of that data. So that's what we're actively encouraging entities to do now. That's on the cyber security side. And then on the intelligence side, obviously we have the Inspector-General of Intelligence and Security who looks at the type of data we collect and the oversight behind that, but also the type of data we purge if we need to, and then the type of data that we retain. And there are very strict policies and guidelines around that. Again, going back to the privacy rules and the obligations.

Miah Hammond-Errey: [00:22:01] ChatGPT has emerged onto the scene and we are kind of all following what has largely been bubbling there for a very long time, but is suddenly public. What are your thoughts?

Jessica Hunter: [00:22:11] Oh, I have a lot of thoughts on that. Again, this is the poacher-gamekeeper. Huge opportunities in terms of how you could use some of that data, but so many risks around the ethics of it, the decision making and the biases. How do you defend against it and how do you build methodologies in to protect yourself? It's fairly new for us in terms of how we're thinking about it as well. But opportunities come from crises.

Miah Hammond-Errey: [00:22:44] It's almost like it's the first time that many regular people have really engaged with AI.

Jessica Hunter: [00:22:50] Which is odd, because of all the spam messages that come out that have obviously been generated that way. But you're right, it doesn't have the same sort of impact for them.

Miah Hammond-Errey: [00:22:59] It feels like a tipping point, too, for AI in society.

Jessica Hunter: [00:23:02] Yeah, I think that's fair. Probably three or four years ago everyone was saying AI and ML, but they didn't really understand what that meant for their day-to-day jobs. And I think you're right, it's come to realisation now.

Miah Hammond-Errey: [00:23:14] Coming up is Eyes and Ears. So Jess, what have you been reading, listening to or watching lately that might be of interest to our audience?

Jessica Hunter: [00:23:22] Oh, okay. So this is where I am very embarrassed and reveal I have a whole swag of things I listen to and read, and it'll sound a little bit unusual. I'm doing quite a bit of reading at the moment on cyber warfare and whether it's a real thing or not. What I'm doing is trying to read older books about cyber warfare – so 'Cyber War Will Not Take Place', which is a Thomas Rid book, 'Understanding Cyber Conflict', those sort of 2015, 2017 books – and then I'm juxtaposing that with podcasts around the Russia-Ukraine conflict and how cyber war was used, combined with military and kinetic conflict, and truly trying to understand: did we anticipate it correctly in the past and is that how it plays out in modern warfare? There are a couple of podcasts that I listen to – I think it's The Economist's Babbage science and tech podcast, once a week – and they're really looking at some of those features. So that's satisfying one fascination of mine. The other piece I'm actually looking at – and this goes to REDSPICE and being a leader in ASD and how I help an organisation and a workforce transform and go through such change – is leadership books about being vulnerable. And how...

Miah Hammond-Errey: [00:24:43] Is that Brené Brown or are there more?

Jessica Hunter: [00:24:45] There's variants of that one – variants of Brené – and how to kind of build up resilience, both from a psychological perspective and from a leadership trust perspective. So I'm trying to do a little bit of a balance. And then my third odd piece is very much focused on generative machine learning and augmented decision making, and the power that comes with being able to take a whole range of data – without even algorithms, synthetic data – and put it together to kind of generate outcomes. But then there's the risk that forms from a cyber security perspective: that it can scale content for spear-phishing and interactions that enable malicious content. So I'm a bit all over the place, probably, for some of your listeners, but that kind of gives you some insight into what keeps me awake at night.

Miah Hammond-Errey: [00:25:37] What do you do in your downtime to keep sane?

Jessica Hunter: [00:25:39] I don't have a lot of downtime – I have two small children and I love my job. But I do try to disconnect from the digital world, mainly because, having worked in cyber security for so long, it's a little bit scary when you connect. So I do a lot of exercise, but more importantly, I do a lot of activities outdoors, and I also travel a lot. I'm a child who grew up overseas my entire life, and I do a lot of travelling to stay sane and really appreciate what Australia offers us.

Miah Hammond-Errey: [00:26:13] I want to know – we're in a tech decoupling in some areas between the United States and China. Where do you think it will go? Where might we end up? And do you see there being a tipping point for that technology decoupling?

Jessica Hunter: [00:26:28] So I think we're almost at that tipping point. And this comes back to international standards, which effectively, at a really macro level, set the standard for how technology is produced and what ethics and values are built into that technology, which then cascades down to supply chain – how it's used and how it's implemented. I think that tipping point has already come, and that's due to the imbalance in the value proposition in the international standards bodies: if you look across them, it's no longer a neutral value proposition; there is definitely influence in international standards such that the neutrality has been removed. That's where I would focus. That is a very strategic place to invest time, and often for individuals who are thinking about their widgets and the supply chain and how they use their piece of kit and how secure or vulnerable that piece of kit is, the concept of international standards is too abstract. But ultimately that's where we start to set the norms in the technology space. So for me, that tipping point was lost several years ago, and it's now up to all nations to try and level that out such that the value proposition is normalised, rather than one or two sets of values coming into play and then shaping all of the back-end tech which effectively sits in the new technology. If those norms aren't set in the right place, then we do have all of the chips and all of the supply chain and all of the build-out being influenced without a neutral perspective.

Miah Hammond-Errey: [00:28:05] Thank you. It's a really interesting area. And we've got some polling data which is about to be released – it will probably be released by the time this podcast is out – which does show that the publics in Australia, the United States and Japan have significantly more trust in technology from the United States and lower trust in technology from China. So we talk about decoupling as a really kind of grandiose supply chain, nation-state security issue. I'd almost argue that trust in technology is also about our individual user trust in the devices that we have in our homes and in our lives. Do you think there's a relationship there between the kind of broader decoupling and this individual trust in technology?

Jessica Hunter: [00:28:51] I think you're spot on, on the individual trust. It's what they see and can hear and can read and ingest. Technology is still, it may not be for you, but it's still a bit scary for some people. The concept of reading journal articles about tech, the concept of understanding the sovereignty of where your data resides, into which cloud, into which nation state. That is not an everyday user's vernacular or worry.

Miah Hammond-Errey: [00:29:14] I'm looking shocked and horrified here.

Jessica Hunter: [00:29:17] That is very much in our world. So you're spot on – individuals, I think, unfortunately do overlay some of that geopolitics when they're thinking about, oh, is this a trusted company or a trusted country that has produced this? But it always comes down to what their friends or their neighbours or those who are influencing them have suggested. What I want to say, though, is I don't want them to be fearful of tech, or worry about even where the tech is generated from, because there are so many mitigations they can put around their individual personal device – regardless of where the chip was made, regardless of where the handset was made, or even regardless of which network you're joining when you're overseas and connecting to the Wi-Fi. There are general mitigations they can apply at what we call a local level, at a personal device level, that should help remove some of that fear, and that should ultimately decouple technology from geopolitical conversations for an end user, because they are actually able to control the security of that device.

Miah Hammond-Errey: [00:30:18] The final segment we have is 'Need to Know'. Is there anything I didn't ask that would have been great to cover, or anything I haven't covered that you'd like to share with our audience?

Jessica Hunter: [00:30:27] I think a couple of things. My techs will hate me, but one of my life hacks is getting a password manager, so I think I should share that – just to manage your life if you're a little bit like me and trying to deal with so many different pieces of technology and how you manage those. My techs would say, 'No, no, we're moving to zero trust, Jess, please. No one should have a passphrase'. But I will give you the real-life view that password managers are somewhat handy. I think my other piece of advice is that every five to ten years or so, we worry that technology has got so advanced we're not going to be able to cope. But we actually have that conversation every five to ten years, and what inspires me is that we always – from an intelligence community perspective, from an academia perspective, from a cyber security perspective – find a way to be innovative and pivot with that technology change. And I guess my words of wisdom or advice, or the thing people need to know, is to not lose heart with that and to keep on, knowing that we can do the innovative pivot and the change and that we can work through any of the new technology that's thrown at us.
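
As a small, hedged illustration of the passphrase life hack above: most password managers generate long, random secrets for you, but the Python sketch below shows the underlying idea with a tiny example word pool (the word list is a hypothetical placeholder; a real tool would draw from a much larger list).

```python
# Toy passphrase generator illustrating what a password manager automates.
# The word pool below is a small hypothetical example, not a recommended word list.
import secrets

WORDS = ["correct", "horse", "battery", "staple", "kangaroo", "wattle",
         "harbour", "granite", "lantern", "orbit", "falcon", "tundra"]

def passphrase(n_words: int = 5) -> str:
    """Return a randomly chosen, hyphen-separated passphrase of n_words words."""
    return "-".join(secrets.choice(WORDS) for _ in range(n_words))

print(passphrase())
```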

Miah Hammond-Errey: [00:31:35] And the final question is, you talked a little bit about vulnerability, and I want to ask, in true Brené fashion, what makes you vulnerable? But that doesn't feel right for a tech and security podcast. So instead, how do leaders embrace vulnerability in the technology space? And I'm not specifically saying it's different from regular roles, but there tends to be a different kind of attraction and a different kind of people. So what do you think is important in being vulnerable in an area where your work has huge ramifications outside what you might see every day?

Jessica Hunter: [00:32:10] I think being vulnerable is just being authentic. And when you're dealing with technology in particular, you have individuals who are very analytic in their thought process, very able to make data-driven decisions and who truly want to get to the heart of the challenge or the problem. So being vulnerable doesn't mean being emotional; it doesn't mean, you know, physically doing more hugging and kissing. It actually just means being truly authentic about what you do and don't know. And what I have learned over almost 20 years working with some of the best and brightest technologists in Australia and overseas is that they want you to ask the questions to ensure that you have all the right information, but always admit when you don't have that information or you don't have an understanding – that's the authenticity. And that, in my mind, is what being vulnerable is all about. If you are open and honest and authentic around what you don't know, then you also protect yourself from a compliance and legislative perspective, because you're never going to make a decision without the full knowledge, and you're also ensuring that all of your team and workforce are asking the right questions through that process.

Miah Hammond-Errey: [00:33:20] Jess, thank you so much for joining me today. It's been a real pleasure.

Jessica Hunter: [00:33:23] Thank you. It's been super fun.

Miah Hammond-Errey: [00:33:27] Thanks for listening to Technology and Security. I've been your host, Dr Miah Hammond-Errey. I'm the inaugural director of the Emerging Tech Program at the United States Studies Centre, based at the University of Sydney. If there was a moment you enjoyed today or a question you have about the show, feel free to tweet me @miah_he or send an email to the address in the show notes. You can find out more about the work we do on our website, also linked in the show notes. We hope you enjoyed this episode and we'll see you soon.