Episode 49

The Anatomy of a Human Breach w/Data Scientist and Behavioral Psychologist Margaret Cunningham Part 2

Part 2 - Data Scientist and Psychologist Margaret Cunningham breaks down the "human" factors of a cybersecurity breach.

Welcome to Forcepoint’s To the Point Cybersecurity podcast. Each week, join Eric Trexler and Arika Pierce to explore the latest in government cybersecurity news and trending topics. Always covered in 15 minutes or less.

Now let's get to the point.

Arika Pierce: If you didn't listen to last week's episode, hit pause and go listen to it first. You can hear all about some personal and professional security threats; we had a really good discussion. So, Margaret, you said this on last week's episode: you're a data scientist, but your background is as a psychologist. That's fascinating, because you don't necessarily meet a lot of psychologists who work in cybersecurity. But when you talk about how we really have to look inside the brain, at how we think, and approach cybersecurity that way, especially in this day and age, I'm assuming that's the bridge between your background and your cybersecurity work now.

Margaret Cunningham: Yeah. I actually worked as a psychologist looking at how technology impacts performance, and I started out in physical security: how does this sensor impact first responders' ability to do their job, and how do we measure that? Then I started working more closely in software, but the focus stayed on human performance capabilities and limitations. If we can understand what we're good at as people and what we're bad at as people, we can design our technology to secure and support people, versus trying to have people fit the mold of whatever technology is built. And, you know, people are never really going to change. What we do have control over is our technology and how we build it. So I advocate for building for the people instead of trying to get people to fix the tech somehow.

Eric Trexler: How do you do that?

Margaret Cunningham: Well, you hire people like me! A lot of times we don't really consider what the technology is doing in the wild. We've got people sitting together figuring out very complicated ways of ingesting data, making it go more quickly, deciding what it's going to look like. And those challenges are so difficult from a technology perspective that we often forget about the people who end up using it in the field.

Eric Trexler: We forget about why we're deploying technology so many times, and who we're deploying the technology to. Yes. It's crazy.

Margaret Cunningham: Yeah. I mean, you've all heard of a startup that had a fantastic idea. They build this really cool capability, and then it sits in a warehouse forever because it just doesn't work out in the real world. Millions of dollars are spent on tech that never sees the light of day.

Eric Trexler: Well, I see it in cybersecurity all the time: "We're going to solve all your problems. Just deploy my technology. It'll be perfect." I believe I've heard that.

Arika Pierce: Margaret, I know we started talking about this a little bit last week, but I want to go a little deeper on it. It even relates back to the conversation we had about me. As background: I got an email at work, and it was a phishing email, as I later learned, but I did indeed click the link, right? Technically I was breaking the rules that I had been taught through the trainings and so forth, but it was really because I was multitasking; the email looked real, it said I needed to verify my Microsoft 365 account, and so on. So when we think about rule breaking, to me that was just a human error, right? There was no malicious intent. It was sent to me, I assume, maliciously, but my error wasn't malicious. But let's talk about the different types of rule breaking that occur and how we break that down from a human perspective.

Margaret Cunningham: Yeah. I actually love rule breaking. I love understanding why people break rules. We like to label the rules we break in a way that doesn't seem so bad, you know: workarounds! We have these very handy workarounds that we all do. Sometimes we learn them from our coworkers. Sometimes we engage in these behaviors because we're collaborating with someone externally: we've got to share, we're not supposed to share, but we've got to do the job, so we share. We do all these things so that we can get our jobs done. But in that same vein, those workarounds expose our organizations to a lot of risk, right? So that's one...

Arika Pierce: I have a workaround. I would love to share it with you all, but I don't want to get in trouble!

Eric Trexler: No, we see this actually in the cross-domain space. It's bizarre: when customers can't implement cross domain due to timing, the sheer amount of regulatory requirements to do it in a timely fashion, or a lack of budget, what they end up doing is moving CDs from one network to another and maybe running a virus scan. We see it all the time, because they want to get their jobs done. They aren't employing the proper technology, but they just want to get their jobs done. It's light rule breaking, almost. And the government sometimes accepts it as okay, because they want to get their jobs done.

Workarounds vs. risky rule breaking

Margaret Cunningham: Yeah. So this light rule breaking is really, really interesting, because how do we tell the difference between light rule breaking that doesn't really have any bad intent behind it, besides "I'd like to get my job done," and the type of rule breaking that someone's doing because they're stealing from you, right?

Eric Trexler: Or even light rule breaking, not knowing the consequences. I mean, Arika, I don't think anything happened from you clicking on that link, correct?

Arika Pierce: No, no. Nothing happened, at least that I'm aware of!

Eric Trexler: But imagine if the company went out of business as a result.

Arika Pierce: And I'll share: I have a friend who had a business need to send a company document to a contractor who had a Gmail account; it was someone who is an independent consultant/contractor. When they went to send the email from their work computer, there was an automatic message basically saying you're prohibited from forwarding this type of document to a Gmail contact, and if there is a business need, contact your supervisor, and so on. However, if you send that same email from the Outlook app on your cell phone, it goes through. It's a workaround that bypasses the block, because that automatic check isn't in the Outlook version on your cell phone.

Eric Trexler: It's really difficult to protect mobile devices, but they could've sent it through a personal email account. There are a lot of workarounds, as I mentioned on the cross-domain side.
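The gap described above is easy to picture in code. Here is a minimal sketch of such a client-side policy check; the domain list, function name, and message are hypothetical, not any actual product's behavior. The point is that a rule enforced in only one client turns every other client into the workaround.

```python
# Hypothetical sketch of a client-side "don't forward to webmail" check.
# If only the desktop client runs this, the mobile client is the workaround.
BLOCKED_EXTERNAL_DOMAINS = {"gmail.com", "yahoo.com"}  # illustrative list

def outbound_check(recipients: list[str], has_restricted_attachment: bool) -> bool:
    """Return True if the message may be sent, False if it should be blocked."""
    if not has_restricted_attachment:
        return True
    for addr in recipients:
        domain = addr.rsplit("@", 1)[-1].lower()
        if domain in BLOCKED_EXTERNAL_DOMAINS:
            print(f"Blocked {addr}: if there is a business need, contact your supervisor.")
            return False
    return True

# Blocked in the desktop client; the same message sails through from a client
# that never runs the check -- one reason such rules are often enforced server-side.
outbound_check(["consultant@gmail.com"], has_restricted_attachment=True)
```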

Margaret Cunningham: Yeah. I mean, people find a way to get it done. And that is a testament to how resilient and fun people are, because we always find a way to get it done. But it also illustrates what a huge challenge we have from a security standpoint, because the more barriers you put up, the more creative people get. So, you know, how are we going to deal with something like that? We're just wonderfully creative!

Eric Trexler: I see that in cybersecurity in, I'll call it, the depth of technology that people try to implement, and also the depth of requirements at the same time. People always try to overcomplicate things, thinking it's going to be better: "If I cover for this contingency, and this one, and this one, it will be as perfect as I can make it." Margaret, I'd love to hear your perspective as a behavioral psychologist on where that sweet spot is. When I was in the Army, "keep it simple, stupid," the KISS principle, really kept us alive and made things workable. Cybersecurity is the opposite.

Margaret Cunningham: From my standpoint, sometimes it's better to observe the workaround than to try to block it from every which way. Obviously we want to decrease risk and risk exposure, but if we can understand patterns of chronic workarounds that a lot of people in an organization are using, that's potentially more useful than forcing 800 different new types of workarounds that are going to be even more difficult to spot. So, the simple message: not all workarounds are bad, but understanding them for what they are helps us differentiate between a workaround and a malicious actor.

Eric Trexler: So I know some of the work your team has done has been around data stockpilers and peer-based anomaly detection. And really what I'm hearing you say is that it's about understanding the environment and being able to recognize an anomaly when you see one: somebody or something that sticks out.

Margaret Cunningham: Yeah, so, peer groups: what's cool about people is that we understand them much better in groups than we do as individuals. What if I noticed that an entire marketing department is engaging in the same workaround? Okay, well, that actually tells me there's a business need for it within that group of people with that job function; that makes sense. It's either an issue with the policy or an issue with some sort of business process. Then you can start to understand why they're doing it and what it looks like, and you can understand the risk better than if you just said, "Oh, there's a lot of people breaking the rules." So it's simplistic, but it's very useful.
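Margaret's peer-group idea can be made concrete with a short sketch. This is an illustrative toy, not Forcepoint's actual algorithm, and every name, count, and threshold below is made up. It flags two different things separately: a department whose collective behavior sits well above the organizational norm (a shared workaround, likely a policy or process issue), and an individual who sits far above their own peers (the anomaly, such as a data stockpiler, worth a closer look).

```python
# Toy sketch of peer-group anomaly detection -- hypothetical data throughout.
from collections import defaultdict
from statistics import mean, stdev

# user -> (department, weekly count of a monitored action, e.g. external shares)
observations = {
    "alice": ("marketing", 24), "bob":   ("marketing", 22),
    "carol": ("marketing", 25), "dan":   ("finance", 1),
    "erin":  ("finance", 2),    "frank": ("finance", 19),
    "grace": ("finance", 2),    "henry": ("finance", 1),
}

by_dept = defaultdict(list)
for user, (dept, count) in observations.items():
    by_dept[dept].append((user, count))

org_mean = mean(count for _, count in observations.values())

for dept, members in by_dept.items():
    dept_mean = mean(count for _, count in members)

    # A whole department well above the org norm suggests a shared workaround:
    # a policy or business-process issue, not one bad actor.
    if dept_mean > 1.5 * org_mean:
        print(f"{dept}: group-wide pattern (mean {dept_mean:.1f} vs org {org_mean:.1f})")

    # An individual far above their own peers (leave-one-out z-score) is the
    # anomaly worth a closer look -- e.g. a potential data stockpiler.
    for user, count in members:
        peers = [c for u, c in members if u != user]
        peer_mean, peer_sd = mean(peers), stdev(peers)
        if peer_sd > 0 and (count - peer_mean) / peer_sd > 3:
            print(f"{user} ({dept}): {count} vs peer mean {peer_mean:.1f}")
```

Run as-is, this flags the marketing department as a group-wide pattern (a business need to investigate) and only frank as an individual anomaly, even though frank's raw count is lower than every marketing employee's; that's the value of judging people against their own peer group rather than a single global threshold.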

Agencies must move towards understanding ‘the good’

Arika Pierce: So, Margaret, what advice do you give, based upon everything we've talked about today and also last week, to organizations, be it government or private sector, that are really trying to figure this out? That are trying to get people to stop clicking the links and doing these workarounds? I know there's not a perfect system, but if an organization said, "Okay, we want to take a step back and really look at our security profile in a different way," what do you recommend? It has to be something beyond the trainings. The trainings, I think, have become such that they don't really trigger anything in us on a day-to-day basis.

Margaret Cunningham: The trainings are kind of a cop-out, and they put the responsibility on, maybe not the wrong people, but they sort of kick the can down the road in a way that's not useful. A lot of times companies say, "I want to understand who is bad. I want to understand all the bad stuff happening." I'd argue that it is even more valuable to understand what everyone is doing, and most people are good. If we can emphasize understanding people in a more holistic, pattern-oriented way, we get a better understanding of what people are doing with technology, where our biggest risks are, and how to contextualize bad or risky behavior within the sea of resilient, adaptive, positive, good behavior that most of our employees and most people exhibit. So that's critical.

Eric Trexler: Once again, focused on the good, though?

Margaret Cunningham: Yeah. I mean, it's absolutely impossible to say we're only going to understand shades of bad, that I only want to know stuff about the top 20 horrible people, when your organization has thousands of people who are, for the most part, keeping the ship straight, making everything work. Understanding that good stuff, I think, is really the magic.

Eric Trexler: You know, it's interesting: in cybersecurity we tend to focus on the bad, the black side of the art, right? What's happening that shouldn't be, as opposed to the whiter, good side, you know, the white hat hackers as opposed to the black hat hackers. But we do the same thing in life, don't we, Margaret? I mean, you're the psychologist, but what does a coach typically focus on? As a parent, what do you focus on, the bad or the good?

Margaret Cunningham: Yeah, I mean, what are we watching the news for, right? It's a 24-hour news cycle, and occasionally we'll get the happy news story. But we're fascinated by the bad because it's strange. We're fascinated by what's unique, what stands out, what's strange. Normal stuff, normal life, is not nearly as interesting. So, you know, that's fine for the news; that's fine for somebody who's trying to get you to tune in. But if we're really trying to create better, more robust, more resilient security, then, you know, the day-to-day good stuff is 90% of the pie.

Arika Pierce: Well, in leadership training they tell you that you should actually be focusing on the higher-performing people on your team, even though our inclination is always to focus on the lower performers. So, a little tidbit there for ya! Well, thank you, Margaret. I think you've given me, as well as our listeners, a lot to think about. I do think this is something we'll hear more of, and I think your approach, especially as a behavioral psychologist, is quite fascinating. I have to say, that's one of the things I find really interesting about Forcepoint: you all have such a diverse range of backgrounds, which I think really brings a new sort of thinking to cybersecurity. I'd never thought about a behavioral scientist looking at cybersecurity or technology, but it makes sense.

Margaret Cunningham: Yeah! We have a lot of fun.

Eric Trexler: Arika, what do you think a cybersecurity behavioral psychologist does for fun?

Arika Pierce: Watch Dateline?!

Eric Trexler: Help us out, Margaret! What do you do for fun when you're not studying people in cybersecurity?

Margaret Cunningham: Oh, I really like to take my dog on walks and get my brain cleaned out and fresh, because sometimes I do tend to have to look into some of the deeper, darker, weirdo things out there. So, you know, I just enjoy time with the family. That's what I'm up to these days.

Eric Trexler: We hear that over and over. We had a discussion a couple of weeks ago with Chris Krebs; as you know, he rides to and from work to clear his head because he's in the thick of it. Sounds like the same thing. We get that a lot.

Arika Pierce: Yeah, I think it's important in this business, because there's so much bad. I'm sure most organizations don't have meetings to talk about how great their cybersecurity policies are going; they have the meetings to talk about the bad, about what has to be fixed.

Thanks, everyone, for joining part two of this episode. This was a great discussion. Please continue to tune in every week, rate us on iTunes or your podcast platform of choice, and let us know what you want us to talk about. Until then, thank you! Thanks, Eric. Thanks, Margaret. Have a great week!

Thank you for joining us on the To the Point Cybersecurity podcast, brought to you by Forcepoint. For more information and show notes from today's episode, please visit www.forcepoint.com/govpodcast, and don't forget to subscribe and leave a review on iTunes or the Google Play Store.

