Kara Hinesley, Canva’s global Head of Public Policy and Government Affairs, joins Dr Miah Hammond-Errey to discuss her experience at Twitter during the creation of the Christchurch Call after the livestreamed 2019 terrorist attack in New Zealand, the complex relationship between AI, art and artists, AI and IP, AI regulation and technology workforce shortages, as well as building robust civic discourse and debate on digital platforms. They also discuss what differentiates Australian and American tech companies and culture, working to prevent online and offline harms, and navigating a career shift from law into public policy and from the United States to Australia.
Before her current role as the global Head of Public Policy and Government Affairs at Canva, Kara worked at Twitter, including as the Director of Public Policy, Government, and Philanthropy, overseeing policy strategy in Asia-Pacific. Kara was also previously an advisor to the Honourable Ed Husic MP when he was in Opposition, and has a background in law in Australia and the United States.
Technology and Security is hosted by Dr Miah Hammond-Errey, the inaugural director of the Emerging Technology program at the United States Studies Centre, based at the University of Sydney.
Resources mentioned in the recording:
- AI-generated art cannot receive copyrights, US court says (Reuters)
- Submission to DISR consultation paper, ‘Supporting Responsible AI’ (Canva)
- AI Bill of Rights (White House)
- Christchurch Call Story (Christchurch Call)
- Global Internet Forum to Counter Terrorism (GIFCT)
- Measuring The Health of Our Public Conversations (Cortico)
- Twitter health metrics proposal submission (Twitter)
- Lox In A Box (Bagel Store)
- God Human Animal Machine (Meghan O’Gieblyn Book)
Making great content requires fabulous teams. Thanks to the great talents of the following.
Research support and assistance: Tom Barrett
Production: Elliott Brennan
Podcast Design: Susan Beale
Music: Dr Paul Mac
This podcast was recorded on the lands of the Gadigal people, and we pay our respects to their Elders past, present and emerging — here and wherever you’re listening. We acknowledge their continuing connection to land, sea and community, and extend that respect to all Aboriginal and Torres Strait Islander people.
Please check against delivery
Dr Miah Hammond-Errey: [00:00:02] Welcome to Technology and Security. TS is a podcast exploring the intersections of emerging technologies and national security. I'm your host, Dr. Miah Hammond-Errey. I'm the inaugural director of the Emerging Technology Program at the United States Studies Centre, and we're based in the University of Sydney. My guest today is Kara Hinesley. Thanks for joining me.
Kara Hinesley: [00:00:22] Thanks, Miah. Excited to be here.
Dr Miah Hammond-Errey: [00:00:25] Kara is currently the global head of Public Policy and Government Affairs at Canva. She's previously worked at Twitter, including as the Director of Public Policy, Government and Philanthropy overseeing policy strategy in Asia Pacific. Kara was also an advisor for the Honourable Minister Ed Husic, when he was in Opposition and has a background in law in Australia and the United States. I'm so happy to have you on the podcast, Kara.
Kara Hinesley: [00:00:47] Again, this is phenomenal. Big fan, long-time listener. Happy to be here.
Dr Miah Hammond-Errey: [00:00:53] We're coming to you today from the lands of the Gadigal people. We pay our respects to their elders past, present and emerging, both here and wherever you're listening. We acknowledge their continuing connection to land, sea and community and extend that respect to all Aboriginal and Torres Strait Islander people.
Dr Miah Hammond-Errey: [00:01:09] Okay, Kara. First up, what excites and terrifies you about technology?
Kara Hinesley: [00:01:14] I'm going to go with excite first, because I love technology. I love tech. Whenever I got the chance to start working in a tech portfolio when I first moved to Australia about 13 years ago, I was like, oh my gosh, I've made it, this is fantastic. So for me, I'm actually really excited about the direction that things are headed. I've been really lucky to work a lot on some of the nascent AI regulation and to participate in a lot of the government consultations that are going on. I know that there's a lot of concern, and I would definitely acknowledge that there's going to be some really thorny issues of intersection around safety, around compliance, around just overall use. It's going to change how we're able to work and interact with each other. But that being said, I'm really lucky to work at a company like Canva that's doing a lot of really thoughtful work on how it will be integrated into people's lives, how it's really going to put humans at the centre of things and really just act as a tool. So it'll be like we're the composers and AI is just the thing that helps us hit the right notes. So I'm really excited about where this is going and where it's headed. In terms of being scared of things, I'm not, like, frightened. I think that we actually have really good dialogues going from, you know, the actual industry to government to also working with academia and civil society. When you get that perfect mix and when you have everyone working and talking together, you actually come to really good solutions. So I'm actually, I hate to sound like a cheerleader, but a little bit overly optimistic at the moment. So I think we're going to be okay.
Dr Miah Hammond-Errey: [00:02:57] What a great start. So let's talk AI. What areas of AI are top of mind for you at the moment?
Kara Hinesley: [00:03:04] Well, most people are really focusing on LLMs at the moment. So large language models, that's basically your typical ChatGPT. We've all played around with it. We've all had a bit of a, you know, go at making funny puns or looking ourselves up, whatever it might be. But I think with LLMs, that's where a lot of the focus is. There's also a lot of foundational models that are looking at being able to have image generation. There's of course, a lot of research being done into artificial general intelligence. And so I think there's a full gamut. But right now the focus I think, is really looking at how are those LLMs going to be able to condense information, make work life a little bit easier? What kind of job skills can we start looking towards and really making sure that we're getting people prepped for it? So that would be the main kind of focus area that I'm getting to work on.
Dr Miah Hammond-Errey: [00:03:59] How do you balance a strong creative user base as well as employing AI? And can you talk us through some of the tensions, maybe not in Canva, but kind of more broadly?
Kara Hinesley: [00:04:11] That is a big question. And I think that there's a lot of discussion going on between the creative industries and artists and how AI could impact them in their work or displace what they're doing. With Canva, we're very lucky. We have a strong creator community and we're doing a lot of that consultation to make sure that they have their viewpoints heard, that they're letting us know what they're worried about, how they're using the tools and what we can do to make sure that it's working well for them and again, taking into consideration their concerns. So I think that it's been a little bit different in terms of like how the conversation has been going. Again, being in that more intermediary situation, I think that when there's a lot of these conversations at the foundation level and like we have the UK AI Summit coming up quite soon, there's also a lot of conversations going on right now with the EU trilogue process as they're really looking at negotiating what the final AI Act is going to look like, I think that's where we're going to get into the nuts and bolts of what does it mean for actually just ownership? Like a lot of conversations are going around with copyright and with fair use or fair dealing. What that looks like for training data, not to mention even the output.
Kara Hinesley: [00:05:33] So right now we've seen a lot of court cases coming through the US that have basically said anything that has an AI tool used to generate like an image or something, that it's not eligible for copyright, because you know, there's this basic copyright tenet in the law that you need to have human originality, you need to have human creativity, that it has to be a human output. And I'd actually hazard a guess that there's a lot of people that would actually dispute that and say there's a lot of human interactivity that's going into this. And I think we could even get into the whole discussion around prompt engineering and what that starts to mean. But again, I think that we're going to start to see the law have to kind of tweak and look at, okay, well, whenever we had cameras, for instance, there were a lot of these conversations about could this be copyrighted? It's, you know, a photo. It's not something you drew or painted. But then they started to say, well, actually, you know, when you're setting up the camera and you start to figure out what you see through the viewfinder, and then you start to envision what that photo is going to look like, then that's something that you created. That's something that…
Dr Miah Hammond-Errey: [00:06:45] I'm so happy that you went here so quickly, because one of the questions I had was about the use of art in AI. I guess, what are some of the conversations to support artists and creators and protect their IP, but equally enable them to use AI tools, noting they're often trained on existing design and artwork? But at the same time, this is a really complex question about where is human engagement with art?
Kara Hinesley: [00:07:11] Whenever you're looking at art, I think that there's a big question that's starting to come through in the context of the AI conversation, where you're looking at where does something become derivative or even copying, or where is it just influencing? And you hear when you're talking to a lot of artists that they will talk about artists that mentored them or, you know, they saw something that influenced them. And when we're looking at how AI is going to plug into all of that, it could be trained off of certain data and a lot of different images, but we're not quite sure to what point is it just being influenced or is it an actual derivation. So I think that these are things that are continuing to kind of come through. I think that these are bigger topics that we need to really unpack to try to figure out how does this all come through with the neural networks, with all the weighting and training that goes on behind the scenes? I don't think I have the answer here, but I'm definitely very interested to see how it plays out.
Dr Miah Hammond-Errey: [00:08:13] All right. So you've gone there, 2023 has already been huge in the tech policy and regulation space. What are the key issues that you're thinking about from an Australian and global perspective?
Kara Hinesley: [00:08:23] In Australia we're really thinking about what the AI regulation is going to look like. We put forward a submission to the Department of Industry, Science and Resources consultation that's currently under way, and we're looking forward to working with the Australian Government and Minister Husic as they look at coming up with some sort of approach that's going to be tailored for the environment here. I think also globally, one of the things that I'm very mindful of is that we're seeing a lot of both voluntary and self-regulatory regimes or frameworks start to come through that are related to AI, like the Biden White House, you know, had their AI Bill of Rights that was released in October of last year. And now they're also looking at all of these different formulations with a lot of the frontier companies, or frontier models as they're talking about. And we're also seeing the UK approach starting to peek through. But then we've recently seen President Macron in France start to say some things that were a bit counter to Rishi Sunak. So I think that we're just starting to see a little bit of this fracturing across all the different markets, even though they have, I think, pretty shared common law approaches or at least shared democratic values. And what I'm concerned about is that we're going to start to really see a regulatory fragmentation and very inconsistent approaches or thresholds or possibly even definitions, which will make it really difficult for companies that are multinational to operate with any sort of business certainty. So what I'm hoping that we'll get to is some, you know, harmonisation, some international cooperation.
Dr Miah Hammond-Errey: [00:10:14] Access to tech talent is often billed as a really big issue, both in Australia and globally. There are global shortages of qualified tech workers and that gap is predicted to continue to increase. You know, where do you look for the best talent, and how do you see developing Australia's tech workforce? What are the challenges in doing that effectively?
Kara Hinesley: [00:10:36] Yeah, I would say that one of the challenges is probably, and this has been discussed before, that pipeline issue of trying to get people to go into those STEM subjects when they're younger and then stay in them instead of kind of dropping out throughout the pipeline. But I would say I think that people from all different backgrounds actually have a lot to contribute to tech. And so I would say that instead of having to try to find specific tech backgrounds or folks who have certain credentials from a university degree, we can start looking at people from all different walks of life and just kind of put them into these types of roles, where it's not necessarily that you're a product manager or an engineer, but you could be from a PR background and all of a sudden be helping with writing policies that companies need.
Dr Miah Hammond-Errey: [00:11:36] I mean, absolutely. I think embracing a broader perspective of what the tech workforce is is critical. You know, I do still see, and I did just present at the Quad Tech Network on this, that in terms of the workforce, and particularly the cyber security and AI workforce, there are critical shortages in some specific areas. And we need to both expand our view of the tech workforce in a broader sense. You know, as we've just talked about, bringing more people along that journey means we're going to get more representative tech, too.
Dr Miah Hammond-Errey: [00:12:07] Are there differences in the way that Canva thinks about policy issues around emerging technologies? And I'm thinking about this between, you know, an Australian and an international context. Like how do you find working for an Australian company?
Kara Hinesley: [00:12:20] Working for Canva I have to say, one of the things that I really love is that the company and the founders always have this view of: How can we look at policies and how can we look at shaping a conducive environment that will be really great for all the companies that are coming up behind Canva? So they're always thinking about, okay, we were able to get to this point, but we want to make sure that there's going to be the same sort of environment for all the future companies and all the start-ups. And that's been something that I think is really important and sometimes gets forgotten by some of the US multinationals that whenever you're kind of, again, looking at the landscape, you want to make sure that you're going to have a diversity of choices, that you're going to have lots of companies that, when everyone's able to kind of work together on a lot of these problem sets, that you're going to achieve a better outcome. So it's been wonderful when we're looking at kind of public policy engagement, to really think through how can we make sure that the next Canva of tomorrow is able to get to this point.
Dr Miah Hammond-Errey: [00:13:26] You were a former adviser to the now Minister for Industry and Science, the Honourable Ed Husic, during his time as Shadow Parliamentary Secretary to the Leader of the Opposition, assisting with Digital Innovation. What did you learn from that experience that was subsequently really valuable in your roles at Twitter and now at Canva?
Kara Hinesley: [00:13:43] I was really lucky whenever I worked for the Minister's office when he was in Opposition, because I actually learned how to do a lot of things all at the same time. I became a very effective 'Jill of all trades,' pretty much. So I was used to having to be across issues ranging from, I remember, tax and employee share schemes to, you know, skills and migration, to grappling with what was going on in the digital economy, working with him during the IT pricing inquiry. It was a range of everything. And so it helped me become very well acquainted with what was happening on a federal level in Australia, as well as being able to take my background as a lawyer in the United States and really start to see where there were commonalities, but where there were also key differences, and being able to grapple with what it looked like to help actually build out legislation or figure out what self-regulatory frameworks could look like. And then working with industry, and how that actually is like the magic sauce, in terms of you can create all these great ideas in a vacuum. But in practicality, if they're not going to work for the business environment that you're in, it's not going to work.
Kara Hinesley: [00:15:04] I used to say whenever I was working there that it was like going into a dark room, like a dark art gallery. And I knew that there would be a huge painting in front of me, but they just gave me my phone flashlight. And I'm just kind of flashing it around trying to figure out, okay, what is going on? What is this painting? What does it actually have in front of it? And I think that when you get everybody in the room and all of a sudden you start to hear, oh, that's what they're experiencing or that's what they were saying, you know, you start to really understand their problems in a very tangible way. And that's when you can actually start to figure out, like, real solutions.
Dr Miah Hammond-Errey: [00:15:40] I want to ask you some questions about the Christchurch Call. While at Twitter, you were actively involved in the Christchurch Call, which arose in response to the livestreamed 2019 terror attack in New Zealand and focused on eliminating terrorist and violent extremist content online. Can you tell my listeners a bit about that process and the response?
Kara Hinesley: [00:16:01] Going through that process was incredibly challenging because it came about because of such tragic events. And I still remember March 15th, 2019, clear as a bell in my head, and how it all unfolded. And it was incredibly harrowing and really difficult. Now, that being said, the way that former New Zealand Prime Minister Jacinda Ardern approached the issue was, I think, incredibly insightful. And she really again had the foresight of, okay, when we're looking at something like a terrorist attack, how can we approach it with shared and joint kind of resolutions? Whenever you have an incident like that, it's an offline harm, but there are also online harms that came from it. And when she was formulating the Christchurch Call to Action, she came together with the tech companies and basically said, we need to come up with an approach to make sure that this never happens again, and what would that look like on the tech side and also on the government side? And so on the tech side, there were some really key principles that were pulled together. I remember having to fly to San Francisco. We had a meeting actually at Twitter's headquarters on Market Street, and we had all the different general counsels and chief legal officers and public policy folks from all the companies that signed on. And we came together with Prime Minister Ardern's team and we started talking through what this could look like. And there was a really key part of it around the companies looking at how we could examine algorithmic recommendations, and also how could we have an approach that would enable quick sharing of information across all the different companies and a coordinated response if something like this were to happen?
And for those of you not familiar with the Global Internet Forum to Counter Terrorism, it had been a group that had been stood up previously, kind of like a portion of the UN, and it had been this group that was looking at just having a hash-sharing database.
Kara Hinesley: [00:18:16] All the different companies that had signed up to it were able to feed in hashes of content that had been verified as terrorist content. And what kind of came together after that was how could the companies have a content incident protocol essentially, so that whenever there is an incident happening, but it had an online aspect to it, like the Christchurch attacks, that we would be able to quickly not only let all the other companies know, but also start to put all of these hashes into this database and be able to eradicate that content that had been verified as terrorist or violent extremist content, so illegal in the jurisdictions that we operate within and be able to take that down en masse. And that was probably one of the real key elements that we've seen that still benefits today, where when we've had other attacks like in El Paso, in Halle, Germany, that we haven't had this sort of virality effect really take off because this has been working. These content incident protocols have worked through the GIFCT.
Kara Hinesley: [00:19:26] But one of the interesting pieces that I don't think a lot of people talk about with the Christchurch Call to Action is that it also included the media. And I think that sometimes it's forgotten that the media is such an important part of the conversation as the fourth estate, in disseminating information and being able to share and inform people that are looking for, you know, just what's going on. They're trying to figure out what's happening. And one of the key stats, which I think was a bit more indicative of the type of users that Twitter had at the time, was that when we were looking at what kind of accounts were mainly responsible for the proliferation of the videos and the type of content that we had to action at the time, the final stat was that 72% of the accounts that were taken down, or the content that was actioned, came from verified accounts. And at the time Twitter verification was on hold, it was being revamped. And the only way that you were able to get verified was if you were a news outlet or a journalist. And so a lot of it came from, again, good intentions of trying to share news information and what was happening. But also I think having them at the table around these conversations was like, okay, well, how can we safely share that information and how do we start to think through responsible, you know, thoughtful sort of approaches in that way? And I would just say things have changed quite dramatically since I worked for the company. Um, but at the time this was the approach.
Dr Miah Hammond-Errey: [00:21:03] Yeah. I was going to ask you actually what lessons this sort of shared government and industry response provide for other potential challenges, but it sounds like you're expanding that out here and saying it's not just government and industry, it's also media. And now we might need to think about civil society as well.
Kara Hinesley: [00:21:19] And civil society was, and still is, a part of the Christchurch Call. So there were four real key groups that were involved in trying to discuss, okay, what happens from more of an operational standpoint, but then also what's taking place post incident. And so how do you discuss social cohesion programs? How do you talk about dealing with the aftermath and with, like, community trauma and shared tragedies, but also looking at how to kind of, you know, future-proof and build muscle against having something like that happen again. So all those different sectors are imperative and integral to the solution.
Dr Miah Hammond-Errey: [00:22:00] It was a pretty profound response to a really harrowing incident.
Kara Hinesley: [00:22:04] Well, I would say that it was really unique in the fact that it was led by a government that was trying to look at an approach that was very holistic, and not looking at it as a carrot and stick approach, but how can we all move things together? Because I think sometimes people look at tech as a really simple silver bullet solution, but you can't have technological change fix things that are social problems. And when you're looking at these underlying issues that happen offline, we still have to solve them or they're going to keep coming back online.
Dr Miah Hammond-Errey: [00:22:53] Thank you. Integrity, trust and safety have been themes in your career and you're obviously passionate about them. What do they mean to you? As in, why are they important? And can you talk me through some of your career highlights?
Kara Hinesley: [00:23:05] So with integrity and trust and safety, I've been really lucky, a lot of that work I was able to fully realise through elections. So I've worked on like some 20-odd elections while I was working at Twitter throughout the kind of region. And then of course trust and safety work threads through all different policy kind of discussions and approaches. It's basically, I'd say trust and safety is very foundational to how we need to think through dealing with any sort of like content moderation or online experiences. Any sort of Internet governance needs to have trust and safety at its core. The integrity piece, I think is something that really came alive post 2016 US presidential election. And typically, I'd say a lot of the companies really understood that it was incredibly important to look at elections and they always were looking at it from like a participation event. But they started to realise that the misinformation and disinformation problems were something that really needed to have a different approach than, I think previous Internet issues that we discussed, like online safety and, you know, abuse or harassment.
Kara Hinesley: [00:24:23] And so as we started to transition into looking at elections as something that needed to be viewed as, like, a civic event, instead of just kind of like, oh, let's all, you know, tweet like, you know, go vote, or, you know, something a little bit more light-hearted, it started to really focus again on how can we make sure that people are able to understand what's happening, what the timeline is and when it's happening and what they're expected to do. I was very lucky, for instance, working in Australia, I was able to work with the Australian Electoral Commission. And having again experienced electoral processes and systems in the United States, I think that sometimes Australians don't realise how lucky we are to have the AEC here: the way that they are bipartisan, the way that they go about actually having the carriage of elections, or even, like, referendums, which we have coming up. It's very methodical. It's a very, you know, safe, robust system of being able to just look after the roll, being able to get information out to people on how to vote.
Dr Miah Hammond-Errey: [00:25:37] How do you feel about compulsory voting here?
Kara Hinesley: [00:25:39] I have absolutely done a 180. When I came here, I thought that with compulsory voting, I was like, it should be your right if you don't want to vote, like, come on, you know, I was a bit, you know, kind of libertarian on that front. And then actually, I have seen the national dialogue really ramp up and people clue in at these specific times where they know they're going to have to show up or at least get their name marked off. So they might as well, you know, read a newspaper or listen to a podcast or, you know, maybe just have the television on in the background during the news. Like, they do clue in. I think that Australians have a more heightened sense of what's going on in that civic field, and that they are a little bit more interested actually in, like, day-to-day politics than I feel like in the United States. A lot of folks that I talk to, at least, are starting to kind of take a step away, that they're finding it all really challenging in the current environment and with the polarised conversations to weigh in all the time or to try to stay across it. And so I find that the compulsory voting, I know, very controversial, very controversial, but I'm a fan. And besides, come on, democracy sausage, like.
Dr Miah Hammond-Errey: [00:26:56] Hello, win.
Kara Hinesley: [00:26:57] I know it's like highlight of my Sunday or Saturday whenever we have those elections.
Dr Miah Hammond-Errey: [00:27:02] So I want to talk a little bit about the information environment, and how do you see the differences between being an information dissemination platform like Twitter and a platform for the creation of content and design like Canva? How do these two different platforms, if you like, impact our role in countering mis- and disinformation and kind of creating a more resilient information environment?
Kara Hinesley: [00:27:31] One of the things that I've really noticed is just what people come to the platform for. It's like, what is the platform's core competency and how are people using it? At Canva, like you said, people are coming to create. They're coming because they want to do something that's either professional or personal, but they're creating something that's for them and they're getting to do it in a really beautiful, wonderful way.
Kara Hinesley: [00:28:00] And with Twitter, it was all about serving the public conversation. It was all about trying to bring people together, often people that have very differing viewpoints, and trying to share all of that information in a way that is not going to all implode into a dumpster fire. I think that it was always going to be really challenging to try to figure out, how can you have healthy conversations in a space where we've seen, again, a lot of deterioration of our general political discourse, a lot of polarisation on kind of key issues, going through a global pandemic? Like, the world hasn't been an easy place to deal with for the past few years for anyone. And one of the key things that I remember Twitter was trying to do with a non-profit called Cortico, back before the pandemic, was trying to figure out what are the key indicators for a healthy conversation. And I remember that some of the key things that they were looking at were shared reality, shared attention, receptivity and diversity of opinion. And those were the four key areas that, if you could achieve all those things, then you could start to see public discourse really come about in, like, a productive way. And there are certain ways to kind of introduce some of these things, through fact checkers, having certain changes in algorithms, trying to pierce filter bubbles with certain information. But the company was still trying to figure out what was that perfect mix. We were figuring out the ingredients, but we hadn't figured out the recipe. And unfortunately that was derailed. And so I think that there's still a lot of conversation of how can we get things to a good place when we know that there's a lot of information disorder. But I'd say it's definitely different in the public square versus, you know, in your private spaces.
Dr Miah Hammond-Errey: [00:30:08] We ask our intelligence and security leaders about nation state alliances, but how do you see tech companies contributing to the alliance between Australia and the US?
Kara Hinesley: [00:30:17] I think tech companies are playing a really key role in not only developing a lot of the technology, but also partnerships. I think that we need to realise that we need to be able to deal with partnerships in a lot of different changing environments and we need to be facile and dextrous enough to be able to adapt as we see either things change on a geopolitical stage or as we continue to see new emerging technology come through. So I think the partnerships piece, both in the private sector and the public sector, is integral to success.
Dr Miah Hammond-Errey: [00:30:55] What do you see as Australia's tech strengths as a nation?
Kara Hinesley: [00:30:59] Australia is a place where a lot of products get tested out. Australians are early adopters, and there's high market penetration. Australians also love watching video. When I was at Twitter, you know, video ads were a big deal, and Australia was consistently one of the top-ranking countries for watching video on phones. I don't know, maybe we're spending too much time stuck on trains or public transit. But yeah, Australians love video. So I think that because we see a lot of smartphones and a lot of early adopters, this is a great place to test stuff out.
Dr Miah Hammond-Errey: [00:31:47] We've just got two more segments. One is called Disconnected, and it's asking, what do you do in your downtime to keep sane?
Kara Hinesley: [00:31:54] I always was really jealous of all the people that would come in and be like, Oh yeah, I windsurf on the weekends. Or, you know, I've also just been, like, casually, I don't know, building a ship in my backyard or something. I feel like all these tech folks, I mean, they're all overachievers, they're all so overkeen. So I don't have anything that exciting. You know what I like to do on my weekends? I like to binge watch TV and I like to eat at really new, wonderful places.
Dr Miah Hammond-Errey: [00:32:25] Okay, so what's the most recent awesome restaurant you've eaten at?
Kara Hinesley: [00:32:28] So this is actually not a fancy restaurant. This is just a really nice, cool casual eatery. I don't know if you've ever had Lox In A Box? Okay, so I'm a big fan. I miss delis. Before I moved here, I lived up in Connecticut and New York for five years, and so I got really used to deli culture and going to diners. And I used to have a law class that ended at 10 p.m. on Wednesday nights. And I've also always worked while I've gone to school, I put myself through school, so I usually didn't get to eat dinner before this class, and by 10 p.m. I'm, like, starving and I've only had stuff out of the vending machine. So we would go to this deli that was open 24 hours in New Haven, and they would have bagels and pickles and all this sort of stuff. Anyways, Lox In A Box. The first one was in North Bondi, but they've just opened one up in Marrickville and it's like a proper corner store, and they have the best babka that I have ever had in my life. And I've had Russ and Daughters, okay, like, I know good babka. Their babka is amazing.
Dr Miah Hammond-Errey: [00:33:36] So we have a segment called Eyes and Ears, and it asks, What have you been reading, listening to or watching lately that might be of interest to our audience?
Kara Hinesley: [00:33:43] I have been reading a book called God Human Animal Machine, by Meghan O'Gieblyn. And I'm halfway through, so nobody tell me the ending yet, but it's fascinating. She's talking a lot about AI, but also about what it means to be human in a world of digital technology, and about a lot of the religious metaphors that come through in the way that we discuss the mind, the body and also tech. And right now, she's just gone through a really beautiful discussion about how we used to compare ourselves to machines, and now it's these digital comparisons. So talking about, you know, our bodies being hardware, our brains being software, and what that all looks like. She basically said that all eternal questions have become engineering problems now. And so, anyways, I think that conversations about AI and information systems have all really started to touch on the mind's relationship to the body, the question of free will, and what it will all look like as we continue to grapple with all of the emerging technology that's coming at us. So this book is very interesting.
Dr Miah Hammond-Errey: [00:35:03] I did have a question about, you know, saying you've worked in multiple countries in tech and security, and are there things we can learn from each other, particularly from a cultural and diversity perspective?
Kara Hinesley: [00:35:15] One of the key things that I think that it would be good for us to learn how to do is how to disagree without offending each other. I think it would be incredible if we were able to figure out and become more comfortable with ways to have a healthy debate. Having worked with a lot of folks, especially in like a lot of European countries, like some of my counterparts in France or Germany, they are a lot more comfortable sometimes coming forward and saying, I have a counter view to what you're saying, or I don't agree, but we're able to kind of have the back and forth without it being personal. And it would be really incredible, I think, if we were able to start thinking through how to have civil discourse and move back into a space where we feel comfortable with disagreement.
Kara Hinesley: [00:36:12] I was raised in the South. I'm from Dallas, Texas, originally, and there was always this saying that you don't talk about politics and you don't talk about religion at the dinner table. And to a point, it does really help. I think sometimes when you have sensitive issues that are deeply personal, if you just don't talk about them, then you don't ever have to deal with the tension. But at the same time, then you never really learn how to talk about the tough stuff. And I wish that we could get back to a space where we could start to figure out what civil discourse looks like in our society.
Dr Miah Hammond-Errey: [00:36:51] You know, you mentioned that research about things that create a healthy conversation. And as we start to come together more as a policy community, with tech companies, with government, with academia, how can we bring these threads together to create a healthier conversation? How can we bring the existing research, the knowledge of the algorithms, the understanding of regulation together to try and actually solve some of these problems? I think that's my hope for 2023, 2024.
Kara Hinesley: [00:37:24] Yeah, I agree with you. I think that most of the time people just need to feel valued and they need to feel safe. And if you have those two elements, then they're willing to have the conversation. Even if you're not agreeing. I hope that we can get to that spot where people will feel that psychological safety. They'll feel vulnerable enough to open up with each other across the dinner table and that they know that their opinions are valued even if they're not always the same.
Dr Miah Hammond-Errey: [00:37:50] Yeah, that's a really aspirational and beautiful place to leave it. Thank you so much for joining me.
Kara Hinesley: [00:37:55] Thanks, Miah.
Dr Miah Hammond-Errey: [00:37:57] Thanks for listening to Technology and Security. I've been your host, Dr Miah Hammond-Errey. I'm the inaugural director of the Emerging Tech Program at the United States Studies Centre based at the University of Sydney. If there was a moment you enjoyed today or a question you have about the show, feel free to tweet me @miah_he or send an email to the address in the show notes. You can find out more about the work we do on our website, also linked in the show notes. We hope you enjoyed this episode and we'll see you soon.