Episode 121

Understanding the "Who" in Cyber with Dr. Margaret Cunningham

In this episode, we explore the intersection of cybersecurity and human behavior with returning guest, Dr. Margaret Cunningham, Principal Research Scientist for Human Behavior at Forcepoint X-Labs. For public and private sector organizations, cybersecurity has always been addressed as a technology-first challenge. However, as cyber threats evolve, the lack of behavioral science becomes a growing issue in today’s threat environment.

We discuss the challenge of calculating and addressing risk, the importance of understanding human behavior vs. controlling it, and why most organizations fail to effectively measure and understand the true impact of cyber solutions. Additionally, we look at how the pandemic has created opportunities for expanding and diversifying the cyber workforce, and why it’s critical for us to open the aperture of traditional security to include experts in fields such as human behavior.

Episode Table of Contents

  • [00:36] Getting to Know Our Guest, Dr. Margaret Cunningham
  • [07:48] Shall Statements
  • [12:55] Not a Superficial Fix
  • [16:35] Start With Measuring Things
  • [22:13] A Benefit COVID-19 Provided
  • [30:53] Track the Change, Show the Value
  • [36:26] Resilience According To Dr. Margaret Cunningham
  • About Our Guest

Getting to Know Our Guest, Dr. Margaret Cunningham

Rachael: We've got Dr. Margaret Cunningham on today. She was here back in October 2019 for Episodes 48 and 49. And by the way, if anyone hasn't listened to those, push pause, go bookmark them, and make sure you listen, because it's amazing, talking about the anatomy of a human breach. We're going to do a little jumping-off point from that discussion, fast forward two years, and look at this crazy world we're in today and how we're dealing with it.

Eric: Margaret, what are you a doctor of?

Margaret: Yes, so I have a Ph.D. in applied experimental psychology, which is, basically, I'm a professional at measuring human stuff. But it's more than that because the applied part of that degree means I'm really good at measuring human stuff in the real world versus in a laboratory which is a completely different story.

Eric: Okay. Sounds a little more complex than my marketing degree, Rachael.

Rachael: But this is, I mean, this is what's so fascinating because you've done this for the Department of Homeland Security. And then you've come over here to the cybersecurity world. We were talking a little bit, I think last week. Are you now a cyberpsychologist? I mean, is that like a thing now?

Margaret: Rachael, I get called a lot of different names.

A Niche Field

Eric: I feel like we could use a few hundred of those.

Margaret: So, it's funny because I've had a lot of different job titles because it's a very niche field. I actually came from more of a physical security and technology integration background with Homeland Security. I was supporting the Human Systems Integration branch, and that's a mouthful. It's a bit different, but now I'm applying that same toolset to the cybersecurity domain.

Eric: I'm fascinated by this. I think with everything that has been going on in the past couple of decades in cybersecurity, we need to change it up, and looking at it differently is so important.

Margaret: Yes, it's kind of remarkable because I find myself laughing a lot, because I think one of my best skills is writing. So, in the absence of all of my technical skills, it's being able to communicate my findings, talk to different types of engineers, and software developers, and security folks, make those connections, put it on paper, and help people see the vision. And it's not something that we think about very often.

Technology Is Everywhere

Eric: No, and I know you've spent a lot of time recently thinking about the lack of behavioral science expertise in the analytics and software piece. And one of the things we've talked about a lot on the show, which I've spent a lot of time on, is tech. Like, tech is going to fix this world we live in as we move through digital transformation and become more and more online. But we're not injecting a lot of behavioral science into the tech, even though we expect the tech to fix some of these massive problems we have.

Margaret: Yes. It's very funny because I think I've probably said this on almost every podcast I've done. But I don't think that we can separate people from technology anymore. No matter what your job is, you're touching technology. So, it's not like it's this unique corner of the universe that is special. It's actually everywhere, all day, personal life, professional life. And by ignoring the behavioral impact, and also some of the design things that we need to think about, we're not doing a very good job.

Eric: I'm going to go off on that. I don't know if it's a tangent or just a line of thinking. To me, it's almost like separating HR from the workforce. You just can't do it. Like, it's a societal thing at this point with technology. It's just kind of built-in, in the way we interact with one another, interact at work with systems. Everybody has to have some level of capability awareness and know what they're doing in tech.

Mislabeled Things

Margaret: Yes. What I find pretty frequently is that what companies will do is they'll take someone who is maybe an engineer who's interested in the human stuff. Then they make that person the one that does those things, even though they might not necessarily have the same training or background.

Eric: And what are those things that you speak of?

Margaret: So imagine a company that's coming up with behavioral analytic software. And they don't have anybody who understands human behavior the way a psychologist or a sociologist, or whatever, would. So they just sort of guess at what these behaviors mean. They label them something that makes sense, but it's not really an accurate reflection of real human behavior. I know that's a little fuzzy, but we get a lot of mislabeled things because of that. Like, this person is disgruntled. If I got a product built by engineers and they said this can show me when you have a disgruntled employee, I would be wary of that indicator.

Eric: Because the engineers who built it probably don't understand the human mind well enough to know what a disgruntled employee looks like, what the markers are. Right?

Skipping Over Cybersecurity Tools

Margaret: Sure. Yes, and they might get some of it, but it's not going to be as good. When we're dealing with some of these, I would call them supplementary indicators in our world. Those supplementary indicators have to be very good for them to be useful.

Eric: Okay. Rachael, I feel like I've seen this even generically in tech when I'm working with organizations, where the mission just comes first. I mean, not even looking at behavioral technologies. They'll skip over cybersecurity tools or capabilities because they don't want to slow the mission down. So, a very rudimentary example: they buy the tech, but then they won't use it because the mission has to happen.

Eric: And nobody looks at the balance, the risk balance, the risk equation, I would say. I mean, very, very few people look at that. Even when they understand it and study it, and they say, "Okay, this is a conscious decision of what I'm going to do with this technology, how we're going to implement it." Pick something as simple as like DLP. "We'll just monitor. We won't block because we can't stop the general's email." They'll talk about it. But the second there's a problem, they shut the whole thing down. Because human behavior says we can't be impacting the mission. It's crazy.

Shall Statements

Rachael: That's a great point. And I love that you brought up risk because, Margaret, you had mentioned this earlier too, about calculating risk. And getting to a point where you're like, "Okay, this is risky, but then what do you do about it?" I thought that was a really fascinating point.

Margaret: Yes. It's really funny because you think about what it means to not really cross the line but come right up to it. So, this seems very suspect. This doesn't seem quite right, but I'm not confident that I can make any sort of decision or act on it because the rules aren't there. In our world, we're very dichotomous. It's good or it's bad. There's not enough of that nuance. And organizations aren't necessarily prepared to deal with that person who's coming right up to the boundary.

Eric: Well, we see this in the government all the time. It's very black and white. You can do this. You can't do this. Look, the regulation says you can do this, you can't do this. And they implement technology that way or they don't because of that. It's really interesting.

Margaret: Yes. You know, Eric, you're probably really familiar with shall statements, and that you shall do this. And it's written into so many government documents and if you've been on the other end of developing these documents, it's should I use should or shall. And it really is baked in from the very beginning, even for an RFI. It's very rigid. I mean, that's how we're built.

A Manual for Everything

Eric: Well, the whole bureaucracy, not that it's bad or good, not a judgment statement here, but the whole bureaucracy is built that way. Everything is written down. I remember when I was in the army, there was a manual for everything. What color ink can you use? What if I use gray? Well, gray wasn't an option. What do you do?

Eric: So, I get it when you get to infosec tools, cybersecurity tools. And I think one of the outcomes we see is that people buy a lot of tools because it's the low-risk move. I'm going to buy these tools. But, Rachael, as you were saying when we were talking about risk, they don't necessarily think about risk in terms of what they're trying to accomplish. They think more in terms of what the tools do.

Eric: And does that map to what I need? And can I put that up on my list of things accomplished so I can move on to the next level? But did you really make that agency, that organization, that company on the commercial side a safer place to work? Did you protect the information better, the IP?

Rachael: And to Margaret's point, which you've made in the past as well: if you look across the department and you see the majority of people using this workaround, hey, maybe we actually need to go back as the agency and rethink how we deliver these things, if everyone so obviously needs them. But we're not giving it to them, so there are tiny crimes, as you like to say, sometimes.

The Downside of Human Creativity

Margaret: Yes, I mean, it's like, Eric, you're saying, we just keep buying stuff. And we're saying, "Okay, well, this one is to deal with this problem." It's like whack-a-mole, but with the technology stack. And the hope is that with all this different stuff, people will fall in line, and that's going to make the difference, and everyone's going to stay in their little lane and follow the rules. But the reality is, when we have more and more stuff layered on, we just get way more creative.

Margaret: That's what I love about people. They're resilient. They're creative. They have so many different ways to get around things. But when you make people very, very creative by layering all this stuff on top of them, the only thing you manage to do is completely lose visibility.

Eric: So, I know you're not a clinical psychologist, though I wish you were. We'd have much deeper conversations about this type of thing. We've been doing this in infosec for decades. The adversary clearly has an asymmetric advantage over the defender, for all the reasons. There are no silver bullets. The adversary only has to be right once. They get as many attempts as they want, et cetera, et cetera, et cetera.

Eric: At what point does human psychology, Margaret, say, "We've got to change the rules of this game. We've got to change things up. We can't just keep doing the same old." Somebody puts a zero-day out and we create some toolset, or we create some capability, essentially, to deal with it, because the next day another zero-day comes out or another technique comes out. At what point does the human mind say, "Enough is enough, let's figure out a better way of doing this?"

Not a Superficial Fix

Margaret: I think some of it comes down to cost, because what I do know is that people make mistakes in predictable ways. So, if we took the time to understand why we're making certain types of mistakes and what the situations are, then we could deal with some of those issues, like social engineering attacks. But it's not something we can expect the human to solve on their own, because we're always going to make those mistakes, because we're human.

Margaret: I mean, there's no possible way that you can increase my attention span by showing me a security awareness training video. But to cope with those predictable mistakes, you have to go deeper into the system and start designing error-tolerant systems that can withstand the mistakes that people make. That is not a superficial fix.

Eric: Okay. So, if I'm a government customer and I just got the CISO job for Agency X. And I've realized, over time, my agency has spent more and more money. We've created more and more tools. We have more and more people focused on cybersecurity, infosec, call it what you will. We're not getting any better, and I want to make a difference. I would argue there aren't a lot of options out there for me outside of the tool space. I mean, I can take a step back from a management perspective and say, "Okay, how are my people organized? What are we doing? Where are we spending our time? Can we spend our time here? Can we spend our time there?" But there aren't a lot of options.

Predict the Types of Mistakes People Are Going to Make

Margaret: In the tool space, yes. It's funny because I do think there's a serious advantage for people who decide that they want to understand behavior instead of control it.

Margaret: And for the same reason, we need to understand and predict the types of mistakes people are going to make or what seems weird. So, let's look at anomalous behavior, which doesn't necessarily mean bad. It's just different. But the more we can identify those patterns, the sooner we can understand that somebody's made a mistake, the better our system can recover.
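As a rough illustration of the "anomalous doesn't mean bad" point, here is a minimal sketch of baselining a user against their own history. The feature (daily download counts), the threshold, and the function name are hypothetical, not Forcepoint's actual analytics:

```python
# Hypothetical sketch: score today's activity against the user's own baseline.
# "Anomalous" here just means "different"; a high score is a prompt for review,
# not an accusation. The feature and the cut-off are illustrative only.
from statistics import mean, stdev

def anomaly_score(history: list[float], today: float) -> float:
    """Z-score of today's value relative to this user's own history."""
    if len(history) < 2:
        return 0.0          # not enough baseline to judge
    sigma = stdev(history)
    if sigma == 0:
        return 0.0          # perfectly flat history; no basis for a z-score
    return (today - mean(history)) / sigma

baseline = [12, 9, 14, 11, 10, 13, 12]   # made-up daily download counts
score = anomaly_score(baseline, today=55)
if score > 3:                             # arbitrary cut-off
    print(f"Unusual for this user (z = {score:.1f}); worth a human look")
```

The design point matches what she describes: the pattern flags a change from the person's own normal, and a human still decides what it means.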

Margaret: Is he buying it, Rachael?

Eric: I'm trying to think. I'm a new manager. And I'm now responsible for the protective posture for this agency I just took over as a CISO. I'm trying to apply that and say, "Okay, what do I do tangibly?" It's very easy for me to go out and buy the latest, right now, it's Zero Trust. Last year was machine learning and artificial intelligence. Go back to 2010, '12, it was next-gen firewall, and we're going to collapse the stack.

Eric: I mean, the industry does a great job of marketing to people what they should buy to solve their problems. The problem is, as we saw with Sunburst or what we're now calling Holiday Bear, these agencies don't even know how badly they've been penetrated in some cases.

Eric: The best recommendation I've heard from the experts, myself included, is burn the whole thing down and start over. But that only works if you have enough money, time, and staff, and only until the next breach comes in.

Eric: You're not making yourself any better, but we keep buying tech. I'm trying to think through, if I were in charge of an organization's defense, how I would take what you just said and do something about it.

Start With Measuring Things

Margaret: I think it starts with measuring things. And I don't know that most places are very good at measuring the outcomes that we want. Right?

Eric: Agreed.

Margaret: If I want to buy something, and I'm anticipating that this thing that I buy is going to make an impact, I better have something to compare that to. Because if you just add stuff, who knows what's going on? You don't know if it's making it better, or worse, or what have you because there's no measurement involved. I think that's actually a critical issue that we are facing. We say we're data-driven in organizations, but are we?

Eric: Well, and even when we are, right? We received 352,000 phishing email attempts last month. Outstanding. How many got through? Well, we're not sure. What are you going to do about it? I'm going to put an RFP on the street for a better email tool. Awesome. And speaking of outcomes, I've yet to see an RFP, RFI, RFX really talk about outcomes. They talk about speeds and feeds, technology. Do this, do that, where they're almost prescribing what they want you to do, but not in the sense of a specific outcome.

Rachael: Is it a lack of knowing what that outcome needs to be? I mean, is that part of the problem?

Looking at the Wrong Outcomes

Eric: So, I'm not a psychologist. I personally think it's really hard. I think it's difficult for a cybersecurity professional today to really make a difference. And when they are making a difference, it's almost like insurance. You're not using it. You have it. That's great. You're protected. But you can't go somewhere and say, "Hey, I protected this agency from cyber attack," because it's almost like proving a negative. You stopped everything, but what was everything? Whereas it's really easy, Rachael, to buy a tool.

Eric: I implemented X. We reduced the number of phishing emails from this to this. And those were the results. That's tangible and that's measurable. That's something everybody else in the industry can respect and see. But did it make a difference? No, because the five critical ones just got through, and it really didn't matter that we stopped 30% more this year, because five got through, and then we were had. So, I think psychologically speaking, it's really hard as a manager to prove a negative. We were fine this year because of us, because of what we did.

Margaret: Yes. And there's something we really have to work on making those things visible somehow. I get asked all the time, "Margaret, why would you care about that behavior of a person? Show me exactly why." And I get asked to prove it all the time. And I'm like, "Oh my goodness." Because the reality is most people are good. So, for me, when I'm looking for bad actors, it can be difficult to prove it because there are so few. And we get into that same situation sometimes with things like phishing because the number, the scale is enormous. It's a real challenge, but I think we might be looking at the wrong outcomes sometimes.

Understanding Outcomes

Eric: I agree with you. We're not even understanding outcomes. The other thing I'll say on the insider threat business, we do a lot in that area so we've got some decent experience. Even when we stop somebody, whether it's child pornography, corporate espionage or sabotage, harm to self, or harm to others, we've had some major cases, Rachael, where you can't talk about it. You can't prove it. Did you save five lives or did you save 50? I don't know. When the FBI raided their house, they had three Claymore mines, six hand grenades, and two semi-automatic rifles. Who knows what we saved? But you can't prove it and you can't talk about it, so it is a really hard problem.

Margaret: Yes. I did say I might not have all the answers.

Eric: Yes, I may have to replay that. I may have to replay that. Okay. So, let's talk resiliency. You were just at an ACT-IAC event. Was it a workshop or a panel? How would you characterize it?

Margaret: It was a series of round tables, we'll say, on different topics. I was on the workforce panel, and then when the report came out last week, I sat down with a few other folks and talked through some of the results. I think we touched quite a bit on innovation and what innovation means for the government, especially in light of the pandemic.

A Forced Innovation

Eric: And what did you hear? What were the results from that?

Margaret: Well, it's funny because it seems like there is some sense of innovation and change because it was forced. So, we had to do these things to keep the lights on.

Eric: Because of COVID?

Margaret: Yes.

Margaret: I mean, yes, so the workforce had to shift dramatically. So there's this sense of yay, we innovated. We did this. We've had some success and a little bit of looming fear that when this is over, we're going to revert back to normal.

Rachael: Exactly. Yes, fall back into other habits. Right?

Margaret: Yes.

Eric: What you're saying is we'll shut down all the web access, the VPNs, things like that, and we'll go back to the office, working from the corporate office, and data centers, and everything else?

Margaret: Yes.

Eric: I don't buy that.

A Benefit COVID-19 Provided

Margaret: Yes. And I don't really buy it either, but I do think that there are some pockets of people who would love to have that because it is what they've been doing for 25 years or so.

Eric: Comfortable. Yes, they know it. And I know you've spoken on the innovation of the workforce quite a bit. It's comfortable, so I get why they would go back to that. But I think there will be an equal number who like working from wherever. They like the greater accessibility that COVID provided to them. And candidly, they don't really think about security in most cases, so they assume it's going to be there or it's taken care of by somebody else, I think.

Margaret: Yes. There's an additional benefit, because I lived in D.C. for a long time, and I'm in Texas now. A lot of these cities are really expensive, and we can expand our workforce dramatically to people who have varied skill sets, or even very specific technology-related skill sets, when we couldn't do that before. And I think that's really, really wonderful.

Eric: So, as a hiring manager, one of the things that should have come out of this is my hiring pool has now increased significantly. I can pull somebody from Fargo, North Dakota if they have the right skill set and I can provide them with capability to connect. And they can go on vacation to Tampa, Florida if they want to, and they can still work, and that works great.

Diversity in Cyber

Margaret: Yes, and there's not really a good reason to not do it, except for if it interferes with the department's mission, for instance, or if the cost is extreme. And on that, I don't really know because I don't have a deep understanding of how much it costs to do it differently. But I do think there's a major shift coming.

Eric: Okay. So, you talk about a gatekeeping culture or a reputation in infosec. Give us a little more on that one.

Margaret: I don't think it's as bad as it used to be, although I'm still relatively new. But there is a history of the field being less friendly to women sometimes, as there is in STEM in general.

Rachael: Yup. True.

Margaret: And a lot of people know each other. So, if you're not necessarily in that group or in the clique, per se, and you're joining as an outsider, it can be difficult to get included. It's challenging. I think it's pretty common in technology fields, but I do think that security has sort of a club mentality. I don't know. You know it when you see it. Rachael's laughing because I think she might understand.

Rachael: Yes, I love this topic, though, because it kind of gets to what we talk a lot about: the workforce skills gaps, but also diversity in cyber, because it has been this kind of insulated thing for so long. And to your point, as we start opening the aperture and understanding behavior, and coming at cyber from a very different lens, because we have to. You had this great blog that was talking about philosophy or linguistics. You know what I mean? Or anthropology. But it's so smart, because when you look at social media domains and disinformation, and how people react to things that trigger their belief system, those are very important things to understand as we look at how we address cyber ahead.

Hire Outside Select Communities

Margaret: Yes. And I mean, really, if you're coming into this world and you're like, "You know what? I understand the difference in language between bots and real people." That's really, really cool and very relevant for security. But that same person might not know what a packet is. Then when they're trying to talk to somebody who's hyper-technical, or really into container security or whatever, they get kind of an eye roll because they don't know that part. And it's not everywhere, but it does happen.

Eric: I, fully, fully agree with you. I would not argue that at all. I do think it's gotten better over the last decade-plus. We've seen a lot more, I mean, just the sheer number of jobs and the amount of opportunity in the space requires us to pull from outside of those tight, select communities, and I think that makes us better.

Margaret: Yes. And I can't be an expert in everything. I am never going to understand malware, for instance.

Eric: Do you need to though?

Margaret: No, I don't.

Eric: Exactly.

Margaret: Because I know somebody who understands malware and I can be like, "Hey." And I think that we really need to start doing that better and recognizing what we don't know. That sort of ability to ask questions or say, "Hey, I'm not quite sure what you mean by that," and clarifying and finding other experts to work with, I think that's going to make a huge difference.

Fresh Eyes Are Also Valuable

Eric: I've heard you say before, "Experience is valuable, but fresh eyes are also valuable." And I think that's a really, really pertinent statement to this conversation. The experienced eyes, if you will, haven't made this problem any better over time. I mean, we're getting worse and worse here. So, it's time for a fresh view, in my opinion. We've got to look at this problem differently.

Margaret: Yes, and I'm a good editor. I can look at somebody else's stuff and say, "Hey, this and that is wrong." And I think most people are very good editors. What really takes a little bit more, maybe courage is the right word, is to be the creator and be willing to have other people throw darts at your work. I think when we have new people, people with different ideas, coming into our world, and they're saying stuff that seems weird, when we throw darts, we should also offer alternative solutions. And I think that's true for any industry. But really, when you get somebody who's new and they've got some strange idea, instead of just poo-pooing and squashing it, be a little bit mindful and work with them on it. Help them shape it and be more constructive. That's what I mean.

Rachael: Sorry, I was going to say, it's like that improv thing where there's yes-and, instead of just automatically shutting it down because you never know where it could go.

There's No One Way to Solve a Problem

Margaret: Yes, and I try and do that. I'm not going to name names, but sometimes people ask me some very strange questions about what I can do with cybersecurity and people. And I get it. It's exciting. It's fun. It seems like it has magic dust on it and I can do whatever. But I try not to make people feel silly for asking the question because it's not going to help. It's fun.

Rachael: Because what you do is fascinating, though. I mean, I have so many questions and I've known you for how many years now. There are just so many layers to what you do, and when we look ahead at how critical it is and how much we don't know. That seems kind of daunting, I think, in a lot of ways to have to admit that and really have to kind of rejigger how we go about this whole thing. And where do you start?

Margaret: Yes, and it's funny because if you had three Margaret-type people sitting in a room and you asked us the same question, we would have our different theoretical orientation and perspective, and there's no one way to solve a problem. If there was, we'd be done. Something else to keep in mind.

Eric: Okay. So, I want to take you back to the beginning. New manager, CISO of an organization. We've been buying tools for years now. It hasn't worked. How should we think through that problem? Give us a couple, two or three things in your mind, Margaret. Understanding humans and their behavior, understanding the industry and the problem set, how do you think through that problem? Where do you start?

Track the Change, Show the Value

Margaret: I would tell that person to start with something simple and sort of cut the complexity down by a lot. And what I mean by that is pick one thing that you want to change. It could be a click problem, or a web problem, or whatever. Pick that one thing, understand the outcome that you want, measure it, do whatever intervention it is. If it's technology, cool. If it's something else, whatever. Track the change, show the value. And then do it again.

Eric: And again and again. Rinse and repeat.

Margaret: Yes, and again and again. And what's funny is when you come into a room and you say, "Hey, I want to look at this one thing," people will give you really dirty looks. And then they're like, "Okay, but let's add this and this and this and this." That's actually the mistake, because you can't do it all at once. By trying to do it all at once, you're doing nothing.
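A minimal sketch of what "measure, intervene, measure again" might look like for one metric. The numbers and the phishing-simulation click-rate metric are invented for illustration, not from the episode:

```python
# Hypothetical sketch of "pick one thing, measure it, track the change".
# All figures are made up; the point is the before/after comparison itself.

def click_rate(clicks: int, delivered: int) -> float:
    """Fraction of delivered phishing-simulation emails that were clicked."""
    return clicks / delivered

before = click_rate(clicks=480, delivered=4000)   # baseline month
after = click_rate(clicks=300, delivered=4000)    # month after the intervention

change = (after - before) / before
print(f"Baseline {before:.1%}, after {after:.1%} ({change:+.1%})")
# Without the baseline measurement, "we bought a tool" says nothing
# about whether the outcome actually improved.
```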

Eric: Can't argue with that at all. What else? No, I like it. I mean, I'm a keep it simple, stupid guy. I like basic.

Margaret: So, for somebody new, don't do a bunch of stuff when you first start.

Eric: Yes. Sit back, listen, find one problem, sink your teeth into it, solve it.

Margaret: Yes.

Eric: And then iterate it.

Different Perspective Is a Good Thing

Margaret: Find your angriest employee and listen to them. And find the person who makes you angry or gets you a little bit uncomfortable because they may be telling you something that you really need to hear, but you don't want to.

Margaret: That's just sort of a personal thing. I'm fascinated by people who I don't like or who don't seem to like me because I'm like, "Oh, I need to figure this out." Not so that they like me, but just so I can understand a little bit better because that usually means we have a very different perspective, which is a good thing.

Eric: Yes. Not a bad thing at all.

Margaret: Yes, it's a good thing.

Eric: So, Rachael, you've been in the business a while. A major problem you'd like to solve. We've got Dr. Margaret Cunningham here. She's going to give her perspective on it. Give us number two.

Rachael: Well, Margaret, and I think you've asked these questions before. What would it take for people to stop clicking on crazy things? I mean, what would that take? Do we need a keyboard that shocks people every time they make a mistake?

Eric: No crazy things.

A Perspective From Dr. Margaret Cunningham

Rachael: I mean, what would it take so we don't click those links?

Eric: An internet outage.

Margaret: We have to take all the links away, all the buttons away, and the mouse away because clicking things is fun. We actually get some feedback from clicking things and it's fun. We like it. We're always going to click the wrong thing. I mean, the best thing that we can do is beef up the protections in that area and figure out a way that when someone clicks the thing, you have ways to mitigate the impact within your system.

Rachael: Versus just making everything unclickable.

Eric: And they're aware and comfortable enough to report it. I don't know if you remember episode 48, 49, when we had Arika on, Rachael, as the cohost.

Margaret: She clicked.

Eric: Remember, we talked about, she clicked on this link.

Rachael: That's right. She clicked on it.

Eric: She knew it was bad. They send something out. She never reported it. I shamed her into reporting it.

Margaret: There was a lot of shame in that episode.

Eric: There was, I actually felt bad as I listened to it in preparation for today.

Rachael: But I think you said at least the company didn't go out of business. Right? I mean, that was the good thing about it.

What People Say and Don't Say

Eric: Well, exactly. I was watching 60 Minutes on CBS, and they were talking about the Sunburst breach and how FireEye figured out that SolarWinds was the attack vector that got into FireEye to steal the Red Team tools. According to Kevin Mandia, it was an employee in IT who noticed that another employee had two cell phones set up for his two-factor authentication. He didn't really say it, but my suspicion is that was against policy.

Eric: Somebody in IT contacted them. The employee said, "No, I only have one cell phone." And they started diving into the problem with some of the best IR people in the world, and they traced it back. But it was the psychology of that IT person, that inquisitiveness. Who knows what their background was, Margaret? I mean, they could have been coming from anywhere. But that's what solved that problem and broke it open, which I thought was fascinating.

Margaret: Yes. It's just what people say and what they don't say, what they're willing to share, what's the truth, how easy it is to get information out of people. Those are all really important to our world. We don't think about it very often. And one of the reasons why is it's very difficult, and I'm sitting here saying we're always going to click. We're always going to tell. We're always going to do this. And so people think, "Well, then why bother working there? Let's just make more technology." But it's in understanding all of that, that we can make technology that works.

Eric: So, Rachael, it's time to wrap up. Last question for Dr. Cunningham.

Resilience According To Dr. Margaret Cunningham

Rachael: That's a tough one. I think, here we go. We hear a lot about resilience, and I think for a lot of people out there, there may be some confusion about what that means. So, since you've been on a lot of panels and discussions about resilience, I would love to hear your perspective on what that means today.

Margaret: I will keep it simple. I think that resilience is the ability to adapt to change, successfully adapt to change.

Eric: Interesting.

Rachael: Okay. I like it.

Margaret: And I know that that is different from like technology being resilient to things and not failing. But I really do think that it is that capability to adapt.

Eric: I like that.

Eric: I really do. I think a lot of people would talk about technology all day long. But it's so much more than technology.

Margaret: Not my area.

Eric: Yes, you saw it with NotPetya in Ukraine. Right?

Margaret: Yes.

The Resilient People of Ukraine

Eric: Technology went away, yet Ukraine was able to get back online. The people were resilient. They adapted to the tech going away. They got the power online. Society started to function again. I would say they were very resilient and it wasn't about tech once it went away.

Margaret: No. And I would even argue that the people in your organization reflect the resiliency of your actual organization. So, it's not separate. When your organization can withstand change and adapt, your people are better at adapting and dealing with that change. In turn, because the people are doing a good job, and because they have an organization that's capable of supporting them, it's a positive feedback loop. But when the organization fails, the people are set up to fail. And when the people fail, it goes in the other direction, too.

Eric: Margaret, you always open my mind. I really, I mean, love having you on the show.

Margaret: I love being here.

Eric: These topics, it's so much better than talking about speeds and feeds or the number of data packets we can inspect. It's the direction I think the industry needs to go in, really understanding people. So, thank you for your time.

Margaret: Yes, you're welcome. And I'm always happy to talk. Rachael, Eric, it's been a pleasure.

About Our Guest

Dr. Margaret Cunningham is Principal Research Scientist for Human Behavior within Forcepoint X-Labs, focused on establishing a human-centric model for improving cybersecurity. Previously, Dr. Margaret Cunningham supported technology acquisition, research and development, operational testing and evaluation, and integration for the U.S. Department of Homeland Security and U.S. Coast Guard.
