
Closing Cybersecurity Blind Spots: Civic Engagement and Policy Innovations with Betsy Cooper


About This Episode

On this episode of To the Point, hosts Jonathan Knepher and Rachael Lyon sit down with Betsy Cooper, founding director of the Aspen Policy Academy and a longtime leader in cybersecurity policy. Betsy brings a wealth of experience from her work at the Aspen Institute, UC Berkeley’s Center for Long-Term Cybersecurity, and her time at the Department of Homeland Security.

In this insightful conversation, Betsy discusses the importance of bridging the gap between technologists and policymakers, revealing how the Aspen Policy Academy empowers experts—and everyday citizens—to make a real impact on cyber policy. She shares practical advice for getting involved in civic engagement, details her organization's latest cyber civic engagement initiative, and unpacks some of the biggest policy blind spots facing communities today. 

Tune in as the trio explores how AI and cybersecurity are reshaping the threat landscape, what roles individuals and small businesses can play in shaping policy, and why now, more than ever, your voice matters in the digital conversation. 


      Rachael Lyon:
      Hello, everyone. Welcome to this week's episode of the To the Point podcast. I'm Rachael Lyon, here with my co-host, Jon Knepher. Jon, hi. We're recording on a Tuesday this week.

      Rachael Lyon:
      This is kind of fun.

      Jonathan Knepher:
      I know, it feels weird not being Friday this week.

      Rachael Lyon:
      So I know I've got Tuesday brain versus Friday brain. So maybe this will be an even more fun conversation. We'll see.

      Jonathan Knepher:
      I sure hope so.

       

      [00:46] Bridging the Gap: Technologists Entering the Policy Arena

      Rachael Lyon:
      So excited to welcome to the podcast Betsy Cooper. She's the founding director of the Aspen Policy Academy. She joined Aspen's Cybersecurity and Technology program after serving as the Executive Director of the UC Berkeley Center for Long-Term Cybersecurity at the University of California, Berkeley. She's also served at the U.S. Department of Homeland Security as an attorney advisor to the Deputy General Counsel and as a policy counselor in the Office of Policy. And she's also worked for over a decade in homeland security consulting, managing projects for Atlantic Philanthropies, if I'm saying that right. I'm embarrassed now, Betsy.

      Betsy Cooper:
      You got it.

      Rachael Lyon:
      Philanthropies. I'm having a day. The Prime Minister's Strategy Unit and the World Bank, among many others. Welcome. Welcome, Betsy.

      Betsy Cooper:
      Thank you so much. Delighted to be here.

      Rachael Lyon:
      All right, John, jump in, babe.

      Jonathan Knepher:
      Yeah. So, Betsy, let's kick this off. Through your Aspen Policy Academy, you're training technologists to engage in policy. What have been some of the biggest mindset shifts for the people who are joining for the first time?

      Betsy Cooper:
      Yeah, well, first, thanks for having me. And it's always so exciting. Cybersecurity is obviously near and dear to my heart, so talking cyber policy, it's always a good Tuesday when we get to do that. So technologists come into our programs really interested in getting involved in policy, and usually they haven't done it before. And so here are a few things that we tend to have to tell them. First, put your bottom line up front: tell the policy person what your ultimate goal is and what you want them to do about it.

      Betsy Cooper:
      As technologists, we tend to meander. We might be thinking about things and trying to tell a story. The policymaker might not have time for that. They might leave before you get to the end of the story. So put your bottom line up front. Another fun one that I see, especially with cybersecurity technologists, is that the style of email you write is very different. Scientists and technologists tend to write very blunt one-sentence emails.

      Betsy Cooper:
      No pleasantries, maybe no hello, certainly no goodbye. And if you do that in the policy context, you're going to be seen as rude. So in this sort of situation, we tend to recommend people send a nice, you know, introduction. Dear so-and-so, my name is blank and I want to do X, Y, and Z. Here's my authority, or why I'm interested in this. It's a totally different style of writing. And then I think finally, teaching people that they do have skills to bring to bear that can make a difference in the policy process. That's both their technical skills, so they deeply understand cybersecurity or AI, but also the way they think about solving problems, which can be really helpful to policymakers.

      Betsy Cooper:
      So going through a process of agile design and thinking about how to actually use product testing to improve things: technologists tend to do stuff a little bit more in the software world, and policymakers can learn a bit from some of these hard planning skills. So my favorite point is: yes, we have plenty that we can teach you to make you better, but you have plenty to teach them, and you should be in the conversation.

      Rachael Lyon:
      I love that. And if you don't mind, Betsy, I'd love to take a little step back too. For those that may not be familiar with the Aspen Policy Academy, could you just give us a quick overview of how it came to be? It looks like, you know, you guys cover tech, environment, capacity building. There's a lot that your organization is doing.

      Betsy Cooper:
      Yeah. So we came to be out of my work at UC Berkeley running a cybersecurity center. At the UC Berkeley Center for Long-Term Cybersecurity, I was the executive director. And one of the key things we were doing was trying to fund faculty to do important research. One of the things that I quickly realized was that we were aiming not just to do academic research, but to have it impact the policy process. And even for our faculty members and researchers who wanted to do that, that skill set wasn't really taught. And so it was very difficult for them to figure out how to actually engage on that front and get involved in policy. And so one day I sort of had an aha moment where I realized there was a big gap that we needed to fill in the cybersecurity space and beyond: we needed an incubator, like the startup incubators that exist, but for policymaking instead of startups, to help people learn how to get involved in policy.

      Betsy Cooper:
      So like a startup incubator, we need to give people a boot camp where they learn how to get involved. We need to give them the opportunity to collaborate. We need a high tolerance for failure. We need to transmit the key skills that somebody needs, like the bottom line up front I was just talking about. So I got very excited about this idea and thought, I should be doing this instead. So I brought the idea to some funders at the Aspen Institute, and seven years later, the rest is history. And so what we do is try to help improve the democratic process by getting subject matter experts and ordinary people to advocate for change within their communities. So for subject matter experts, like the cybersecurity experts that hopefully are listening today, we run programs.

      Betsy Cooper:
      We have free webinars open to the public on things like how to write an op-ed or how to submit a public comment. We have short courses. So we have a recruitment open for our environmental policy accelerator, for instance, for people working on environmental problems to get coaching. And then we run a science and technology policy fellowship, sort of our flagship fellowship, every year, in which 15 amazing people come in for a month of immersive training. So some of them will be cybersecurity, and some of them might be other tech and science topics. And then with, sort of, more ordinary people, we run programs that are actually designed to help them tell governments what they think, to share their ideas more broadly. And so we're expanding a lot in the cybersecurity space on that front as well. But more generally, the goal is to help people understand the skills of civic engagement.

       

      [06:45] From Academia to Advocacy: The Birth of the Aspen Policy Academy

      Betsy Cooper:
      There's a lot of civics courses out there that teach you the branches of government. We don't want to teach you just the branches of government. We want to teach you how to interact with the branches of government to advocate more effectively for change in your communities. And so it's the marriage of those two things that I think makes our program really unique.

      Jonathan Knepher:
      That's fantastic.

      Rachael Lyon:
      Yeah, it really is. Yes.

      Jonathan Knepher:
      So maybe you can go into some more detail on how you keep your training up to date, with all the things that are changing so rapidly across cybersecurity, AI, and, you know, the privacy landscape, which has been changing rapidly recently.

      Betsy Cooper:
      Yeah, so it's definitely a changing field, for sure. And one of the benefits of our program is that we aren't teaching people cybersecurity in a sense; mostly the people, at least on the subject matter expert side, are coming in knowing cybersecurity, and they want to learn how to get involved in policy. And so that narrows the playing field of what we need to teach people. We're not teaching people the beginnings of how cybersecurity might work or the intricacies of artificial intelligence or algorithms. Folks are coming in with that technical expertise, and we're teaching them how to translate it. And so that helps us not have to constantly be at the forefront of both of these fields. That being said, there are tons of changes happening, as you mentioned, whether it be new AI and cybersecurity programs at the federal level, or each individual state or locality having their own programs. You might have something like the incident that happened recently in St. Paul that really changes the policy trajectory in different places. So we do have to constantly be innovating and adding new examples. We like to draw from a lot of case studies that pull from things like St. Paul, so that people actually get to see what it would be like to try to work in and change some of these spaces. But I think the key is that the basic skills that we teach, like communicating clearly or how governments are structured, those are things that are long lasting. And then we build technology examples on top of them, and that helps it stay fresh while also focusing on a core skill set that we think everyone should know.

      Rachael Lyon:
      That's amazing. Gosh, the path of policy is pretty astronomical in how it works, particularly with new things coming online. I'd be interested in your perspective on the policy blind spots you're seeing, particularly in cyber, and what's a path to close those?

      Betsy Cooper:
      Yeah, so I think first, most people, as I mentioned at the top, don't realize that they have value to add. So I'd say the biggest blind spot is just: get involved, get off your butts and come share your information and experience. Because the average policymaker knows very little about cybersecurity. The average policymaker has taken that one-hour cybersecurity training course that we all have to take to keep our jobs, and that's probably all they're thinking about, right? And you are sitting in your chair, probably, you know, shocked by the number of times you log into a government website that doesn't have multifactor authentication, or seeing some of the out-of-date systems that are being used. I remember, in the not so distant past, I was at a university and saw that their stadiums, for instance, at a public university, were still using Windows 95, folks.

      Betsy Cooper:
      I think I started my career as an elementary school student using Windows 95. And that's still, you know, that's still what's being used, right? So if you see something, you should be saying something: you should be writing a letter, you should be submitting a public comment, you should be going to your city council meeting. And so that's a huge blind spot. Then I think the other thing to understand about government is that there are lots of structures in place to prevent things from happening quickly. So a lot of technologists will head into the policy space and get frustrated very quickly because they want to make a change. And it seems very obvious to them that this change should be made, that, you know, multifactor authentication should just be turned on or whatever. And then they realize that there are lots of reasons why it's not that easy to do.

      Betsy Cooper:
      Well, maybe the vendor contract didn't include that, and then you have to go back through the contracting process. Or maybe there's a reason that, you know, some customized systems are built on top of the non-multifactor-authentication system, and you would have to unwind all that. So go in with a vibe of curiosity: try to understand the problem. Don't assume that you know better than the civil servants who have been working on these problems for years. Ask a lot of questions, engage them so that you have the full landscape of the problem, and then work within that landscape to try to improve things. So it may take longer than you expect, or there may be hurdles that you wouldn't find in the private sector. Some of those hurdles are there to make sure governments don't make huge mistakes and lose billions of dollars in scams in the first place. So let's work together to try to improve that, rather than making assumptions about how government should or shouldn't work.

      Rachael Lyon:
      I love that.

      Jonathan Knepher:
      So, Betsy, looking ahead, there are a lot of emerging policy challenges and opportunities coming forward. What areas should technologists be preparing for right now?

      Betsy Cooper:
      Yeah, great question. And I think the simple answer is AI. So AI is a huge policy topic right now, as I'm sure anyone listening to this show will know. And the intersection between AI and cybersecurity policy is probably going to be the biggest growth area at the federal level. People are going to be grappling with these questions: how will AI affect cybersecurity, how will cybersecurity affect AI, and how should the two work together? But even at the more granular level, think about your city and state and all of the implications this could have for your community. So I live in the city of Napa, and we, as far as I know, don't have a city chief AI officer or anyone playing that role. But the city is going to have to grapple with suddenly getting double or triple the number of job applications, or more, because people are using AI to apply to these things at scale. Or procurement processes where you used to get one or two bids: you might get, you know, hundreds or thousands of bids, because anyone can use AI to create a passing application.

      Betsy Cooper:
      Or the opportunities to reach folks at scale: can you actually use marketing that uses AI to reach your different community members, translate, you know, everything that you're doing into Spanish, for instance? So there are both opportunities and challenges, and people need to be thinking about that within their communities. And then you layer the cybersecurity piece on top of that and start thinking about, for instance, malicious actors at scale, targeting communities and being able to personalize in a way that they've never been able to before. Or vendors being able to create fake websites that make it much harder for a small community to figure out: is this even a legit vendor that you want to give money to? So I think there are huge opportunities for people with cybersecurity expertise to get involved in the policy conversations around those topics.

      Rachael Lyon:
      It's wonderful in cyber, too. I think we'll get a little bit to this in terms of diversity, but when you look at the industry as it's maturing more and more, just the diversity of voices and perspectives that come in. For example, we've had conversations with folks that are currently CISOs but have PhDs in, I don't know, mythical studies, things like that, or some music PhD, and they bring that lens, right? They look at it a little bit differently, and that's just so critical, right, as we look at how we shape policy ahead. I'd love to shift gears a little bit. I understand your team's launching a cyber civic engagement initiative.

      Rachael Lyon:
      We'd love to hear a little bit more about that, and what you're hoping to have it drive.

      Betsy Cooper:
      Yeah. So I think if you begin from the premise that people need to talk more about cybersecurity, then we need to get folks engaged civically in doing that work. So take, you know, the average person: they, or their family, have now experienced a scam.

      Rachael Lyon:
      Yes.

      Betsy Cooper:
      So we have repositories of stories of people going through these effects, and yet policymakers are not being exposed to that. And that's what this cyber civic engagement program is intending to do: to try to close that gap. This program is really inspired by the work that Craig Newmark Philanthropies and Aspen Digital are doing on their big Take 9 campaign. This is pausetake9.org, and the whole thesis behind the campaign is that people need to stop: you need to take nine seconds to think about cybersecurity before you click on something. And the campaign includes lots of other tips, things like multifactor authentication, or compartmentalizing your data so that if someone gets in, they don't get the keys to the kingdom. That's all focused on the individual: what can you do for yourself to protect yourself? It's a public awareness campaign. But for a large subset of those people, the reason they got scammed in the first place was that some institution had a failure that led them to actually get scammed.

      Betsy Cooper:
      So an example of this would be: I have a former nanny. This former nanny was trying to get on the Social Security Administration's website. Google had a hit that turned out to be a scam website, and it was the top hit. So the nanny clicked that instead, gave a bunch of sensitive information over to the scammers, and then had to try to individually unwind and cancel a bunch of bank accounts and, you know, go through that whole process because of the risk that this information was misused. Sure, it would have been great if my nanny had caught that the website she was going to was not the Social Security Administration website. But also, why did Google let that ad show up in the first place? That's an institutional failure. Then you layer on top of that government institutional failures. There are lots of stories of people being scammed when they try to log into a government website.

      Betsy Cooper:
      And maybe they get misdirected, or maybe that website later gets hacked. Well, why wasn't the city's infrastructure set up to try to prevent that? And wouldn't it be great if the people who live in that city stand up and say, hey, we don't want to take this anymore, we want you to take cybersecurity more seriously? So the cyber civic engagement program is particularly targeted at people in that situation. We currently have a pilot open, so for listeners, if you're interested in this, we would love you to consider joining our pilot. And then we will officially launch later in September and have a bigger public engagement. But the goal will be that we're running training sessions in which you come in and learn how to advocate for stronger cybersecurity in your community. So we'll be giving you examples and tips that you can use.

      Betsy Cooper:
      And then the top folks who participate in that program, who are really interested and passionate, can engage with us to get one-on-one coaching or potentially work on projects, so we can work together for them to get into their community and advocate for change. And then as we go on, we're going to add more resources. So we'll have tip sheets that anyone can download, and we're going to have train-the-trainer sessions, so that individuals who want to learn to run our training can actually get involved. So there's a huge growth pattern for this. But the ultimate goal is to give ordinary people who have been scammed a voice to share with their communities why it's so important to take cybersecurity more seriously, and hopefully help make that change as a result.

       

      [18:32] Cyber Blind Spots and the Urgency of Civic Engagement

      Jonathan Knepher:
      So for our listeners who are, you know, maybe ordinary, everyday citizens: what's the top thing that they should be doing right now to play a role in shaping those policies?

      Betsy Cooper:
      Sure. So first, find a weakness. You know, is your utility not using multifactor authentication when you log in to pay your bill? Are you seeing an outdated resource being used, or outdated advice being given? If you're a small business owner, are you able to get small business cybersecurity support from your community? And if not, what is the gap? So the first thing is to identify a problem in your community, a place where stronger cybersecurity is needed. Second, figure out who the stakeholder is, who's in charge of that. So if it's your utility and you want to contest the bill, it's probably the utility that you need to go to. Maybe if it's a broader program within the city, you need to go to your city council or your mayor. So find the right stakeholder that can actually solve that problem, and then go talk to them. If it's your mayor, maybe go to a city council meeting.

      Betsy Cooper:
      At least here in California, there are open fora in which you get the chance to say anything you like at your city council meeting. So go take advantage of that. Write a letter. Government officials tend to have public addresses, so you can send them a letter advocating for change, and offer to meet with them to figure out how to help. So it's really all about figuring out what the problem is, figuring out who can solve it, and making sure you actually go talk to them about it. And that's what our training programs will break down in more detail.

      Rachael Lyon:
      And you had mentioned the St. Paul attack earlier. When we talk about communities and getting involved: I believe it was a ransomware attack that had disabled some payment systems, and St. Paul chose not to pay, which is what we tend to advocate for. But I'd be interested in your perspective on that municipal hack and what other municipalities could learn from it or act on.

      Betsy Cooper:
      Yeah, so it's been interesting watching the drip of information come out of St. Paul. And, you know, so first, have an incident response plan. Be prepared to understand that it's not if you're going to get hacked, it's when you're going to get hacked, and be prepared to understand how to handle that. Some of that is what you do with your systems, and some of that is how you market and share information with the public. I think one of the challenges that St. Paul faced was that in the early moments, they basically shut down all city systems, didn't really explain what happened, and left people in the dark wondering what to do next. And, you know, better communication to the community might help people be happy that these precautions are being taken, rather than upset that all of a sudden they can't pay their bills and don't know what to do next. Really, you know, having within your incident response plan an idea for what you're going to do in terms of ransomware is obviously very important, because that's likely to be one of the things you most likely face.

      Betsy Cooper:
      So is your plan to pay or not pay, and under what circumstances? And how are you thinking about who you're engaging? Are you calling your counsel first? Are you calling your security vendor first? What, for your organization, is the right order of operations for these things? So really building that out, I think, is incredibly important. And then, you know, St. Paul is a big institution, but we can still learn lessons for, you know, small and medium businesses that will not have the resources of an entire city. So, you know, being clear about what systems you need, having those systems disconnected from the Internet if they don't need to be connected to the Internet, training your staff to be aware of some of the risks, which, by the way, is getting harder as AI comes around. Those are all things where, you know, I'm sure lessons will be learned from St. Paul, in terms of getting people to take the ordinary precautions to prevent the extraordinary result that they faced.

      Rachael Lyon:
      And just a quick follow-up on that. When you mentioned small businesses, I think a lot of them think: why would I be targeted?

      Betsy Cooper:
      Right.

      Rachael Lyon:
      Why should I actually worry about this? I'm just this little mom-and-pop store. What do you have to say to them on that kind of thinking?

      Betsy Cooper:
      Right.

      Rachael Lyon:
      Is it, is it a matter of when, not if?

      Betsy Cooper:
      Well, I think, you know, even prior to AI, you could make the argument that, oh, my little convenience store, we're too small, they're not going to target us. That's not going to be the metric anymore in the age of AI, because now it's not an individual human writing an email to try to get you clicking on it. It's a digital system being instructed at scale, through prompts, to write millions of emails, hoping that some of them get clicked on. And which ones are they going to be targeting? Well, A, everybody, but B, the people that are most likely to click, which are the ones who aren't trained. It's the small and medium business owner who's never really thought about cybersecurity, or their staff: you know, the cashier who was never given a training when they're using the digital payment system, for instance. So you're probably right that you wouldn't have been top of the list when a single human needed to decide to target you. But when you can target millions of organizations simultaneously at scale, everyone's at risk.

      Betsy Cooper:
      And then the ones that are going to be the favorites are the ones that are most likely to click, because those are the easiest targets to, you know, convert on.

      Rachael Lyon:
      Is it bad to say I kind of long for the days of the poorly worded, grammatically incorrect email that you could generally readily identify as a scam?

      Betsy Cooper:
      Can we? I do as well.

      Rachael Lyon:
      Reminisce about the old days.

      Betsy Cooper:
      I got a scary one a few months ago that included a photo of my house. It was an extortion scam, but it included a photo of my house. And it was only after some research that we figured out it had been pulled from Google Photos. So they'd done a good enough job to figure out I lived at this particular address, and pulled a photo to make it seem scary. And then you're in the moment of panic, and it takes a second; you have to pause and take nine to make sure you realize how they could have pulled this together. But, you know, that's a really simple scam, and it still included some weird typos and stuff. Imagine they're able to mine every Facebook post you've ever made to figure out exactly what you're personally interested in: that I'm a fan of the Buffalo Bills and college gymnastics, and target me with tickets for the national championship of college gymnastics rather than just Taylor Swift, you know. So we're entering a world where not only are the typos no longer the same signal that it might be a scam, but they're able to tailor it to your personal interests such that you feel personally invested in the outcome. And that's a real problem.

      Jonathan Knepher:
      Yeah, that's definitely scary. And, you know, these AI models are getting more and more efficient, so that barrier to entry is just continuing to go down.

      Betsy Cooper:
      Which, by the way, is why another place for advocacy is advocating to the AI companies to not allow this use of their services. So the AI company will be able to see the prompt: create one million emails to one million small business owners, specifically tailored to their interests, asking them to do X, Y, and Z for a ransomware target. And they should be able to see that prompt and not allow it to be submitted, as they do with many other sorts of prompts that aren't allowed. So I think that's another area for advocacy: not just asking governments to regulate this, but also advocating directly to the companies to prevent that outcome too.

      Jonathan Knepher:
      Yeah, but a lot of these models are becoming, you know, readily available to run locally. Like, it feels like that's going to be a never-ending battle.

      Betsy Cooper:
      That's a fair point. And, you know, so then how do we handle that? And that's a broader point than just cybersecurity. The New York Times this week published, trigger warning, a story about suicide following AI therapy. And one of the key points was that the person very clearly had suicidal ideation and was putting that information into the chat, and it wasn't flagged. And so should we be allowing local transmission on those sorts of things? Because something could be done if you know that, right? So yeah, I think that you're absolutely right. This is a chicken-and-egg problem.

      Betsy Cooper:
      And this is why regulation is probably going to be required to sort some of it out.

      Jonathan Knepher:
      Yeah, so if we kind of take a step back on this as a whole, right: especially as you're going down to state and local governments, education, you know, young people engaging with this technology, what are the urgent policy gaps?

      Betsy Cooper:
      Yeah. So just to give a second of context before I answer the direct question: the program that you're mentioning is Rising Civic AI, and it's actually starting in a couple of weeks. We've got about 25 folks representing cities and states from across the country, Democrat and Republican, rural and urban, who are focused on AI within their particular jurisdiction. So they might be the Chief Data Officer or the Chief AI Officer, et cetera. And we're doing two things with those folks. First, these are all folks who have been hired within the past year-ish and are onboarding into government. And we're providing them with training to figure out how to better navigate government, so that they can use their private-sector skills on AI to deploy it for good.

      Betsy Cooper:
      And then second, we're creating a community of practice so that they can share within their own community of people who work on these topics what they're seeing and how they're approaching it. So maybe the city of Boston will have a solution for receiving way too many resumes for a human to ever be able to sort them. And that solution could then be shared with the city of Fresno, for instance. So one of the things we're trying to do is to create that moment where these communities are able to talk to each other and build towards a future where they all have shared knowledge about how AI is affecting their work. So I think some of the gaps that we're seeing in this space, certainly, again, when people come into government having worked in the private sector, there's a real learning curve for how government works and how to navigate it. The language, the acronyms, the culture, how to write the emails, even working in a physical space all the time when you've been working remotely for a long portion of your career. So giving people the chance to talk about that, to learn from each other, I think is really important. And then governments are generally pretty far behind on what tools their employees have access to.

      Betsy Cooper:
      And so there's a real learning curve as to how AI could make your work more efficient and what the cybersecurity risks are when you do that. Simple things such as training government employees that they can't put sensitive stuff into a ChatGPT. We need to teach people that. So these Chief AI Officers are going to be in charge not just of cybersecurity for systems and thinking about that in the AI context, but also teaching the rest of the staffs of these cities and states how to use AI appropriately in that context. And so I think there's a lot of literacy that will need to happen to make sure that that teaching is effective.

      Rachael Lyon:
      Do you see in the future, kind of, you know, decade to decade, we kind of see the changing roles of security. Where the CISO used to roll up to the CIO, but now that's changing. And the Chief AI Officer, as that becomes more and more of a common role, how do you kind of envision that impacting what we consider today's CISOs? And does it change the overall kind of cybersecurity framework? Right. Of how companies operate?

      Betsy Cooper:
      Yeah, it's going to be really interesting. So most jurisdictions today don't have a Chief AI Officer. So that means the slate is blank for where such a person will slot in. I think you're absolutely right. It's going to become a more popular position. But where does it report into? Does it report into a CIO? Does it report into a Chief Technology Officer, or legal, or operations, or admin, and keep going. And where that person sits, because bureaucracies are so hierarchical, is going to matter, I would argue, even more in the government context than it might matter for a private sector company.

      Betsy Cooper:
      Like, the private sector is much better at dotted-line reporting than governments are used to doing. So I think that's going to be a real challenge. I think, you know, as we've seen at the federal level, we really have not figured out where cybersecurity fits structurally into our government. I mean, that's why we have CISA. But then we have several White House roles, and they're all trying to claim ownership over cybersecurity. And you see that as a microcosm at different states and cities. But here's the good news. I think that this new push towards Chief AI Officers, if we bring a security component into that, can help elevate the priority of cybersecurity within government and give it a better seat at the table.

      Betsy Cooper:
      So, you know, today, the cybersecurity reporting infrastructure may leave you several roles deep in the organization and not at the table when the C-suite or equivalent is talking. Everybody's talking about AI, and those roles are tending to be at a higher level. And so that may mean, if they're taking security seriously, they'll be in the conversation. So maybe what we need to do is not worry about the structure itself so much as prepare the people taking on the Chief AI Officer roles to care about cybersecurity, to prioritize that as one of the things in their portfolio. Maybe that will be more effective than a structural fix.

       

      [33:02] AI’s Expanding Role… Opportunities, Risks, and Policy Gaps

      Rachael Lyon:
      In this case, I like that. It makes sense. Shifting gears a little bit and kind of coming back to some of the enablement that you guys are doing at the Policy Academy. Looking at the media landscape today, it's very, very different than it was 10 or 20 years ago, when you had to kind of validate news and things like that against credible sources or differing opinions. And you've spoken a lot about the media's role in shaping kind of understanding of cybersecurity in the public realm. You know, how do you think we can reframe narratives to be more authoritative, but also constructive and inclusive?

      Betsy Cooper:
      Yeah. So, I mean, there's so much to unpack there. But I think let's begin with the premise. The premise is that the average person thinks of cybersecurity and they think of one of two things. Hacker in a hoodie, or that really annoying course that they had to take every year for two or three hours in order to get the cybersecurity credential to get access to their laptop. Like, that's what the average person thinks. That's a very narrow view. It's not a very fun view, and it's certainly not a very inclusive view, because that hacker in a hoodie is generally a young white male, which, you know, is certainly not the case as we know in reality.

      Betsy Cooper:
      I mean, let's not forget our good Nigerian scammers. Right. So what do we do about that? So first, the media does play a really important role in narrative setting. And I think, you know, the Hewlett Foundation historically has done a lot of work trying to encourage new narratives in cybersecurity, bringing together journalists to help them understand that cybersecurity stories do not just need to be "St. Paul got hacked, here's what happened, and here's how it's recovered." They can be stories about people, they can be stories about places. They can have a narrative that can help you actually build that.

      Betsy Cooper:
      Then second, you can add a layer of, it's not just the hacker in the hoodie, from a who-is-involved-in-cybersecurity perspective. So telling the stories of people who are running programs, you know, like, there are some amazing folks running programs across HBCUs, doing cybersecurity clinics, or there are huge opportunities to tell the stories of women who are running pen testing organizations and how they're actually building new ways of thinking about things from different perspectives. So we want to encourage media to tell that narrative story as well. But, you know, I don't want to just give it all over to the media and blame the media for these stories. We as cybersecurity professionals need to do a better job of telling our stories, too. And so for listeners, I'd encourage you, this is where your own personal LinkedIn posting comes into play. So talking about something that you've seen, a new style of scam that might be affecting people, or telling the story of, as I did with my nanny, a personal implication. If we're not doing that, if we're not out there talking about the human aspect of our field, then we can't really expect the media to pick up on that.

      Betsy Cooper:
      And so I understand that many folks get into cybersecurity because they love the tech, but many of the cybersecurity problems occur because of people. And so we need to bring the stories forward of the people in order to be able to get them invested in the tech.

      Jonathan Knepher:
      I think about, too, like, you know, there's a lot of polarization on a lot of these topics, right? Like you mentioned, you know, should there be more filtering of what's going into ChatGPT and things. And to me, that sounds like a fairly controversial subject. How do we make sure that things are framed to be balanced, without overreacting or under-regulating? Because I think both of those are probably going to be problems as we go through this, absorbing AI into all of civilization.

      Betsy Cooper:
      Well, I guess I'd question the premise that the goal should be to be balanced. I think the goal should be to have a debate. So on these controversial subjects, we need to be in a position where we can engage in reasonable arguments about what we think is right, and then as a democratic collective, come to an agreement through policy that actually enacts what we've established. So let me break that down. Yeah, so first we need to actually be in a place where we don't discount the other side out of hand, and we actually are willing to listen and engage in a conversation. And I think there's something about the way the Internet works, and let's be honest, clickbait is probably the answer as to what it is, that has led us to trying to get the zinger and to trying to shut down the conversation rather than really listening to the other side when we're engaging in discourse.

      Betsy Cooper:
      Like, forget political discourse, just discourse generally, you know, on a Reddit thread or in the comment section of the New York Times or whatever it might be. Right? So we need to really work on that skill set. And that is one of the skill sets that we try to teach people in our programs, to be willing and able to listen to something that you viscerally disagree with and to hear the arguments that the person on the other side is making. So if you deeply disagree that we should have content moderation on the prompts being submitted through AI, listen to the other side as to why that might be the case, and then develop your own arguments in contrast and actually engage in a conversation. So that's what we need to do as individuals. And then as a society, we need to have venues in which we're collecting those opinions. So historically that was the public comment process. Historically we would have opportunities where new regulations by government would be posted on a website, and you'd be able to type, or you'd be able to attend a public meeting and express your opinions on it. Well, AI is changing that too, because now anyone can, at scale, create fake comments or interfere with that process.

      Betsy Cooper:
      So it's becoming a lot harder to tell what is an actual human. Right? We need to create new venues, we need to create new infrastructure that brings us back to that place where there was a public square, and now it's going to be a digital public square in which we can engage in some of these tough conversations. But the biggest mistake is not having that conversation. The biggest mistake is not beginning from a moment where we're actually going to engage in this debate, and just shutting each other down out of hand.

      Rachael Lyon:
      Right.

      Betsy Cooper:
      That's going to be the least productive way to develop our civic square.

      Rachael Lyon:
      That's a big challenge, Betsy.

      Betsy Cooper:
      Sure is. But I think, you know, there are programs out there that are trying to tackle aspects of this. There are citizens' assemblies, these locally based opportunities where you get a group of citizens to engage on a particularly tough problem and make recommendations. They're very expensive, so you need to figure out a way to do that. But, you know, I think one of the things that we've found as public commenting has gotten more visceral and angry is that public servants are trying to cut back on the opportunities for it, because they don't want to deal with the visceral nature of it. So if we can begin from a place where we're trying to create a positive environment, where you know the other person, where you're engaging on a personal level, that's going to make a big difference. So I think we can use technology to create these.

      Rachael Lyon:
      Sure.

      Betsy Cooper:
      So one example of a project from a few years ago: we had a fellow who created what he called DemDoc, which is basically a digital platform that people can use to collectively write government documents, sort of like a Wikipedia-style document-writing process. What if we used that to build opportunities for people to contribute to a process like writing the city's AI policy, or whatever it might be? So there are tools. If we want to invest in technology, let's not just invest in AI to do the work for us. Let's invest in the technology that gives us the civic square that we all deserve, so that we are able to actually engage on that front as well.

      Rachael Lyon:
      And this is kind of a, maybe a left-of-center question, observation. You know, I love my TikTok, I'm not going to lie, Betsy. But the wonderful thing about TikTok is I can skip over things that I don't want to hear or see, and I have mostly, like, animal videos and things that I enjoy. But, I mean, for the most part on social, you're gravitating towards those things that reflect your worldview and kind of getting further and further away from what the other side may think. And does this become, then, a social media platform opportunity? Or is this something where, you know, yes, there's free speech, but also, how do we hear both sides? I mean, do we need to help more with technology to stand up these communities, or give them fuel to grow?

      Betsy Cooper:
      I mean, I think it depends on the level you're talking about. I think, you know, a government like the city of Napa offering the opportunity, on a particular policy they're crafting, for public engagement through a DemDoc is a really great way to get a subset of the community involved. And so, for instance, the city of Napa passed Measure G, which is a bunch of money to go out to city priorities. They've created a very small task force to advise on that. But what if they got the city together using a platform to be able to engage like that? That would be one level of creating that. And I think you're right that when we're talking about big macro policies, like how we should feel about controversial topics like LGBT rights or vaccines or other areas, that's really where the example you gave of TikTok comes to bear. I think, you know, in the city of Napa, on really small local issues, you have enough mechanisms to be able to reach people, but when you're talking about big social issues, you're increasingly likely to only see the things that agree with your perspective, which makes it hard for you to develop arguments on the other side, like I was suggesting.

      Rachael Lyon:
      Yeah.

       

      [43:58] Reclaiming the Narrative: Media, Misinformation, and the Digital Civic Square

      Betsy Cooper:
      So one concrete tip for listeners is get out of your bubble. So I regularly visit, on the same topic, both CNN and Fox News, and see what they're both writing on the same topic. And I look also at their homepages to see what they're prioritizing. And that helps me understand what people on both sides of the aisle are thinking about a particular topic. I think we can do the same for many local issues, if there are multiple papers. So what are the Washington Post and the Washington Times saying differently about a particular topic? But I do think you're pointing to a broader question, which is a platforming issue, which is that the whole structure of social media is to encourage you to spend as much time as possible. And so it is incentivized to provide you the things that it thinks you want, which can be great puppy videos. Or, you know, I've had situations, for instance, where social media thinks that I want to see more videos of children with cancer, because I have a family member who has a child with cancer and I was reading that intently, and now it's showing me other families with kids with cancer. So that can have really negative effects even more broadly than that.

      Rachael Lyon:
      Exactly.

      Betsy Cooper:
      But I think our solutions need to be, one, we need to decide as a society whether this is the direction we want to go. You know, we control whether these social media platforms go in that direction. Two, if you don't like it, then we need to work to figure out a new norm for what happens, at least for young children, so that they aren't being fed that sort of mono-directional perspective. And then finally, we need to decide, is this the place we want media and the news to live? More and more people are getting their news only from TikTok, which means they're only seeing one side.

      Rachael Lyon:
      Right.

      Betsy Cooper:
      We have the choice. There are other places you can look besides TikTok. But I know those puppy videos are just so attractive, you know, so that's a personal choice we gotta, we gotta take on and an institutional choice we have to work on. If this isn't the society we want, then we need to do something about it. And that's where policy comes in.

      Rachael Lyon:
      Absolutely. Well, I love that you're helping, you know, kind of lead the charge and empower folks, saying, you do have a voice, you can have a voice, your voice matters. Because I think a lot of people just don't realize that. You know, what could little old me contribute to something? And it turns out it could actually have quite significant impact. You just have to take that first step and get involved. Yeah.

      Betsy Cooper:
      And on cybersecurity in particular, do not underestimate the fact that you have tremendous knowledge that your local city, your local community, your local school board, your local utility could really, really use. Like, you know, I would say this of any field, but in cybersecurity, uniquely, you have knowledge that your community could benefit from. So stand up. It's time to stand up.

      Rachael Lyon:
      Love it. I love it. Well, thank you, Betsy Cooper. Thanks for joining us today. And I do want to encourage all of our listeners to go to AspenPolicyAcademy.org to learn more and see how you can get involved. It's a wonderful website, and it's a great pathway forward so you can start shaping policies and be part of the conversation. So thank you, Betsy. Really enjoyed our chat.

      Betsy Cooper:
      Thanks so much for having me. Appreciate it.

      Rachael Lyon:
      All right, and Jonathan, I'm going to do the drum roll again. What are we encouraging all of our listeners to do?

      Betsy Cooper:
      Drumroll, please.

      Jonathan Knepher:
      Smash that subscribe button.

      Rachael Lyon:
      That's my favorite part. Yes. Smash the subscription button and you get a fresh episode every single Tuesday right to your inbox. So until next time, everybody stay secure. 

       

      About Our Guest

      Betsy_Cooper_TTP-336

      Betsy Cooper, Executive Director, Aspen Institute

      Betsy Cooper is the founding Director of the Aspen Policy Academy. A cybersecurity expert, Ms. Cooper joined Aspen’s Cybersecurity & Technology Program after serving as the Executive Director of the Berkeley Center for Long-Term Cybersecurity at the University of California, Berkeley.

      Previously, she served at the U.S. Department of Homeland Security as an attorney advisor to the Deputy General Counsel and as a policy counselor in the Office of Policy. She has worked for over a decade in homeland security consulting, managing projects for Atlantic Philanthropies in Dublin, the Prime Minister’s Strategy Unit in London, the World Bank, and other organizations.

      In addition, Ms. Cooper has clerked for Berkeley Law professor and Judge William Fletcher on the Ninth Circuit Court of Appeals. She completed a postdoctoral fellowship at Stanford’s Center for International Security and Cooperation (where she currently is a nonresident affiliate), as well as a Yale Public Interest Fellowship. Ms. Cooper has written more than twenty manuscripts and articles on U.S. and European homeland security policy. She is also a Senior Advisor at Albright Stonebridge Group.

      Ms. Cooper earned a J.D. from Yale University, a D.Phil. in Politics from Oxford University, an M.Sc. in Forced Migration from Oxford University, and a B.A. in Industrial and Labor Relations from Cornell University. She speaks advanced French. She is based in the San Francisco Bay Area.