

Is there a risk management gap in government cybersecurity?


Risk management is a fundamental principle of cybersecurity, as the alternative would be the pursuit of total security, which is unaffordable and unachievable for an organization as large and complex as the federal government. This week former Department of Homeland Security Undersecretary Suzanne Spaulding joins the podcast to discuss the risk management gap the government faces and how government officials must make security investment decisions when the benefits and outcomes are uncertain.

… and don’t forget to sign up for upcoming episode alerts!

How to Listen

Episode Table of Contents

Episode Introduction: Cybersecurity Risk Management Gap

Arika: Hi, and welcome back to 'To the Point' Cybersecurity. I'm your host, Arika Pierce, joined by Eric Trexler. Hi Eric.

Eric: Hi Arika.

Arika: How you doing this week?

Eric: I'm great, the sun is finally out.

Arika: Finally, finally.

Eric: On the east coast.

Arika: Yes, on the east coast. I heard it actually hasn't been out on the west coast either, so we're all looking forward to it. This week, this is episode 25, and wow, I guess we're hitting our first quarter of episodes. We are excited; we have a great guest today.

Arika: We have Suzanne Spaulding, who was the former Under Secretary at DHS responsible for cybersecurity and critical infrastructure. Definitely had a very large job there, and now is a senior advisor for Homeland Security at the Center for Strategic and International Studies.

Arika: Hi Suzanne, thank you so much for joining us this week.

Suzanne: Absolutely Arika, happy to be on.

The Government Cybersecurity Risk Management Gap Approach

Arika: We're excited because we've had quite a range of guests, both government, industry, academia. But you really are just a few years removed from your role at DHS, and given all of your responsibility I can imagine that you have such a great perspective in terms of challenges, in terms of what government's doing well, what it could be doing better.

Arika: And today we'd really like to focus, just for the first part of the podcast, on risk management. Just start off by giving your thoughts on how government approaches risk management in terms of its cybersecurity strategy. I know this is an area that you have talked about quite a bit in terms of your expertise.

Suzanne: Yeah, thank you. I guess I approach it from two slightly different perspectives. One is the work that DHS and my old organization does to help what we loosely call the non-governmental parts, private sector, academia, et cetera, understand how to do cybersecurity risk management.

Suzanne: And then, there's how the departments and agencies [inaudible 00:02:31] et cetera approach cybersecurity for the federal government. And I would say that,

I think the government has traditionally taken an approach to cyber risk management similar to what I so often see in the private sector. And that is an IT-focused approach, meaning it is focused heavily on identifying and addressing vulnerabilities as the primary element of that risk management.

Focus on government and non-government data confidentiality

Suzanne: And my concern has always been that we have to always remember that when you are assessing your risks, there are really three factors that go into it. Vulnerability is one of those, but threat is another, and, too often neglected but perhaps most important, is consequence or impact. And so, to the extent that the government has traditionally focused at all on understanding consequences as a part of the risk assessment.

Suzanne: It is been heavily weighted toward prioritizing large stores of data, right. So, we have, as a country, so it's a government and non-government, really focused on confidentiality of data. When we think about cybersecurity, and, we really need to think more broadly, right, about both confidentiality, but also access to data and integrity of data.

I argue that, in cybersecurity, our traditional focus on threat and vulnerability, to the neglect of really understanding consequences in risk assessment and mitigation, presents a real challenge: we are never able to fully and effectively address cybersecurity.

Suzanne: And I think it is furthered by the FISMA process, the Federal Information Security Management Act, which imposes requirements on all departments and agencies. That process is almost entirely an exercise for the IT staff. And again, it is mostly focused on vulnerabilities.

Understanding data integrity

Suzanne: And so, I think that while it's valuable, it is an incomplete way to do effective cyber risk management. We were starting to get at some of this when I left DHS, in the wake of the OPM breach. We put out a call to all the departments and agencies and said, we are gonna really try to focus on risk-based cyber risk management here; start by identifying your high-value assets. And what we got back from departments and agencies, not surprisingly, were assets that were large data stores, databases. Again, they were thinking about the last breach, the OPM breach, where the issue was the confidentiality of sensitive personnel information.

Suzanne: So, we had to go back out to them and say, no, think more broadly about high-value assets, including assets where you are dependent on the integrity of that data, and what would happen if that was disrupted, and where you are dependent upon being able to access data to do your mission essential functions.

Eric: Did they understand that?

The integrity of communication between your SCADA system and your operational machinery

Suzanne: Well, what we got back then was a more sophisticated set of responses. I think once you explain that to people, they broaden their thinking. They do begin to understand it. And I think integrity of data really is the issue, or a large part of the issue, with respect to industrial control systems, for example: you wanna be able to trust the integrity of the communication between, for example, your SCADA system and your operational machinery.

Suzanne: So, thinking about the industrial control systems that government owns and operates, as well as these broader categories of disruption, I always tell them: start by thinking about your mission essential functions. Don't start by thinking about your IT networks. Start by thinking about your mission essential functions; think of this the way you think about COOP, it's a continuity of operations issue.

How different perspectives affect the understanding of cybersecurity

Eric: But there is a breakdown, at least I have found in my career, between IT, and we will throw cybersecurity in there in a second, and the business or the mission of the agency in many cases. Some do it really well. But in a lot of cases, and this is a commercial problem almost as much, the business and IT don't necessarily understand each other. They come at it from different perspectives and they don't understand, to your point, the importance or the risk. Like, what is mission critical?

Suzanne: Right, and I always say, look, I had brilliant IT experts on my team at DHS, really smart, capable folks. But they don't always understand the impact of a network disruption on your ability to do your mission essential functions. Or if you are in business, they are not necessarily gonna understand the impact, any more than an electrician can tell you what the impact on your business is gonna be if you lose power for a week or several days, right?

The impact of bringing in the right people

Suzanne: You need to bring in your business operations people, your communications folks. You need to bring in the folks who understand what it takes to do your mission, if you are talking about government.

And then they need to work collaboratively with the IT folks. But it starts with: what are the disruptions, from whatever cause, that would prevent us from doing our mission essential functions?

Suzanne: Then you ask the IT staff: which of those could be caused by a cyber incident? And then, when we go back to mitigation, how do we reduce that risk? It's not just an IT solution that you should be searching for. Think paper ballots, think hand cranks, think about doing your operations differently, redundancy, resilience. Some of those are gonna be non-technical solutions that your IT staff would never come up with, but your business folks, your operations folks would.

Eric: Because they understand the business.

Suzanne: Right. [crosstalk 00:08:43] Go ahead, Arika.

Government and businesses are moving towards aligning with IT

Arika: Well, I was gonna say, Suzanne, in terms of aligning business with IT, do you feel as though that is starting to happen? I know we have seen efforts to elevate the roles of CIOs especially, both in government as well as in the private sector, so they are at the table when these business decisions are being made.

Arika: Do you see the tide sort of turning and going in that direction more and more, and if so, are the policies becoming more effective in terms of moving that way?

Suzanne: I think, yes. I do see it moving that way. We never move toward change as rapidly as we should or need to,

Eric: Right.

How not being technical can help build new programs and new operations

Suzanne: ...with the sort of urgency that the threat requires, but I do see it starting to change. And I think, again, I talked with my colleagues, particularly my colleagues around the Department of Homeland Security, but also around the government, at senior meetings at the White House and elsewhere, urging them to engage in the cybersecurity discussion despite not being technical experts, because they were experts in their mission.

Suzanne: And they were responsible for understanding the degree to which their mission essential functions could be disrupted by cyber. I do think that engenders a dialogue between the IT staff and the policy folks and the operations folks, the mission folks, that is much more constructive and much more fruitful. Once they start thinking in those terms, then as they put in place new programs, new operations,

Arika: Right.

Suzanne: They are much more likely to be thinking, right from the outset: now, how could these be disrupted? And what can we build in on the front end to make this new program or activity more resilient against cyber risks? Then you bring in your IT people at the beginning. Your CISO.

Eric: Love that.

Suzanne: The CISO is a part of those conversations.

Security As An After-thought

Eric: Too often we find security is an afterthought. It isn't built in from the beginning; sometimes it's never built in at all through the lifecycle, and it's rarely thought about from the start.

Suzanne: Yeah yeah. I do think...

Arika: It's more of a 911 versus the 411 right?

Eric: Yeah, imagine building a building and not having security cameras or a fence around it or locks on the doors and windows, and either never adding them or saying afterwards, okay, maybe we should put some locks in here.

Suzanne: Yeah, yeah, yep. Or putting in those cameras and locks, the most high-tech you can find, thinking that you have now secured your building, when you haven't considered that those very sophisticated high-tech locks and cameras are connected to the Internet and could be disrupted through malicious cyber activity.

Suzanne: So, we have long argued that CSOs and CIOs and CISOs all need to talk to each other much more frequently, and again, that's another part of enterprise risk management.

Cybersecurity risk management is bound to fail

Arika: And Suzanne, I was reading an article that came out a couple of years ago, put out by the Brookings Institution, that goes right to what you are saying. Basically, it stated that cybersecurity risk management is bound to fail, and the theory there is that managers, the business part of either government or a commercial organization, are incentivized to under-invest in security measures because, as you said earlier, the benefits aren't always visible until your mission is at stake, or you can't deliver certain types of programs, or you can no longer operate because of some sort of cybersecurity breach or attack.

Arika: They also talk about how the more you invest in cybersecurity, the less the business, and I'd put quotes around "business," can invest in other areas that they feel are probably more critical. So, from what I'm hearing, this continues to be the issue we have to overcome to make progress, especially in government. It's finding that balance: how do we make sure the business is incentivized to recognize security as an important part from the beginning, so that the risk management strategy is not bound to fail?

Who should we hold accountable?

Suzanne: Right. I think that's right, and so a number of things come into play there to address the concerns that the authors of that report and others have raised. When they talk about managers not having much incentive to invest in security measures, in part that's because they have traditionally not been held accountable for those failures, because we have relegated that, if you will, or consigned that, to the IT staff.

Suzanne: So the business manager, the program manager, is not held accountable. Whereas if you make it the program manager's responsibility to understand how their program, their operational activity, could be impacted and disrupted by cyber, then it is much easier to hold them accountable if they haven't taken the steps and worked collaboratively with the IT staff to do something.

Suzanne: There are ways to make them more accountable that I think are not unfair. They also say that one of the reasons is it's very hard to know what the cost of failure will be, and there I would push back a little bit. I think there is a lot we can know about the cost of disruption now. We don't have anywhere near the kind of data we have for, say, natural disasters or fires, things for which we have all kinds of actual, real data. But we are getting an increasing amount of data about the costs of disruption from things like [inaudible 00:14:49] and WannaCry and other kinds of malicious cyber activity.

The Formula For Risk Management

Suzanne: You can borrow from the data that you have about other disruptions to your operations, those caused by storms, fires, or other kinds of events for which there is much more data. So I think we can do better, even if we don't try to assign specific numbers to threat and vulnerability in that so-called formula for risk assessment.

I agree that trying to assign a number to assess the threat, and even assigning a number to ever-increasing vulnerabilities, is a huge challenge.
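The "so-called formula" referenced here is the conventional risk-assessment framing, sketched below. This is a common shorthand for the three factors Spaulding names, not an official DHS equation:

```latex
% Risk as a function of the three factors discussed above:
% threat (likelihood of an attempt), vulnerability (likelihood it succeeds),
% and consequence (impact if it does).
\[
\mathrm{Risk} \;=\; \mathrm{Threat} \times \mathrm{Vulnerability} \times \mathrm{Consequence}
\]
```

Her point is that the first two terms resist precise numbers, while the consequence term can often be estimated from data about other kinds of disruptions.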

Suzanne: But start with understanding what the impact, the consequence, will be. There are ways to do a better job of assessing what that will cost, and there are also ways of understanding how well you have done at preventing it. Because one of the other criticisms is that if we do a good job, nothing happens. And how do we know whether the absence of something happening means we did a good job, or that no one was attacking us?

Helping decision makers understand the risk in terms they can understand

Suzanne: In cyberspace, that's not the case. If you have good audit logs, good perimeter detection devices, and knowledge of what's going on and visibility into your network, you can tell whether you have blocked malicious activity. You can catalog the malicious activity that you have detected, more quickly, as a result of the investments that you made. And because you are then able to articulate what the impact of a disruption would be, you can begin to make a return-on-investment business case.

Suzanne: So, it will always be a challenge to attach precise dollar figures. But if we move the discussion from impact to IT networks to instead focus on impact to mission essential functions, that will help decision makers understand the risk in terms they can understand and do a better job of risk calculus.
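The return-on-investment business case described above can be sketched numerically. A minimal illustration, assuming the classic annualized-loss-expectancy framing and entirely hypothetical dollar figures (nothing below comes from the episode):

```python
# A sketch of the return-on-investment case for a security control,
# using the common annualized-loss-expectancy (ALE) framing.
# All figures are hypothetical, for illustration only.

def ale(single_loss_expectancy: float, annual_rate_of_occurrence: float) -> float:
    """Annualized Loss Expectancy = cost of one incident x expected incidents per year."""
    return single_loss_expectancy * annual_rate_of_occurrence

def security_roi(ale_before: float, ale_after: float, annual_control_cost: float) -> float:
    """Return on a security investment: risk reduction minus cost, relative to cost."""
    risk_reduction = ale_before - ale_after
    return (risk_reduction - annual_control_cost) / annual_control_cost

# Hypothetical: a mission-disrupting incident costs $2M; monitoring cuts its
# likelihood from 0.5/year to 0.1/year and costs $300k/year to operate.
before = ale(2_000_000, 0.5)  # $1,000,000 expected annual loss
after = ale(2_000_000, 0.1)   # roughly $200,000 expected annual loss
print(f"ROI: {security_roi(before, after, 300_000):.0%}")
```

The sketch mirrors Spaulding's argument: once the impact to mission essential functions has a cost estimate, the blocked and detected activity cataloged from audit logs can be turned into an avoided-loss figure that decision makers can weigh against other spending.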

Eric: I agree with you, and I think this is a scenario where it just gets overlooked because we are all so busy. I mean, how many people actually go back and say, this is what we did this last month or this last quarter, this is what we prevented, and understand the potential gravity of what could have happened had they not had anything?

Suzanne: Right, so

Eric: It doesn't happen.

Seeing through malicious activity

Suzanne: It's one thing to go ahead and say we detected 10,000 attempted malicious attacks, but most of them are junk, so that's not particularly meaningful, right?

Suzanne: But there is technology out there now, and it is not prohibitively expensive, that really allows you to see much more about the malicious activity you have blocked or detected, compare it to alerts and bulletins and to threat intelligence that is being shared on a much larger scale today, and make a much clearer case to a board of directors, or in this case, for government, to a Cabinet Secretary or CIO or program managers: this is what we think we have been able to prevent in terms of impact to what you come into work every day to do, which is perform your mission.

Eric: Agreed. And I think there is a regulatory component to it also. I mean, if we are talking about weather impact, if you live near the coast, in a hurricane zone, there are certain regulatory requirements, building codes, that you need to adhere to.

Nobody Needs Cybersecurity Until They Are Attacked

Eric: Arika, last week we talked about seat belts in the podcast.

Arika: Right.

Eric: Two weeks ago with Ricky George, imagine what cars would be like if they didn't have safety features. Or houses, if they didn't have reinforced glass or special roofs in hurricane zones. There is a basic minimum that we can regulate, or should be regulating, so that you don't need that seat belt until you need it. When you have an accident, you need the seat belt, right?

Eric: Same thing in cybersecurity. Technically, nobody needs cybersecurity until they are attacked. Unfortunately, everybody is attacked all the time. We do need a basic level of security capability, and agencies and businesses really vary in what they provide.

Suzanne: And DHS has been given this really kind of extraordinary authority, binding operational directives, to direct other departments and agencies on some of those basic things they have to do for cybersecurity. The first binding operational directive we issued after Congress gave us that authority was a requirement to implement patches for critical vulnerabilities within 30 days. It's such a basic step of cyber hygiene, but it was not being done.

Suzanne: The response was very good, and in fact they have significantly shortened the time frame. I forget what it is now, but it is much shorter than 30 days. Those binding operational directives have been effective. The most recent one that got a lot of attention, of course, was: make sure you don't have Kaspersky products in your systems.

Involve the public and devote the resources

Eric: Suzanne, with the FISMA reports being what they are, meaning there are many agencies that are not, how do I put this?

Eric: where they would want to be or where the American people would want them to be. I know we were making some progress. What would your guidance be?

Suzanne: To the departments and agencies? Or to the American public?

Eric: I don't think the American public can do a whole lot, so let's go out with the department and agency level.

Suzanne: Well, I will say, the departments and agencies respond to pressure like anybody else, and Congress responds to constituents. So I think if the public were more engaged in this conversation, it could make a difference. But really, one of the things we have to be honest about is that if we take this threat seriously, as we should, and members of Congress talk about the seriousness of the cybersecurity threat all the time, we have got to devote a level of resources [inaudible 00:21:17] with the seriousness with which we approach this threat.

Where are we? Where are we then?

Eric: Where are we? Where are we then?

Suzanne: Again, I think this is not yet an issue that has strong public currency. I think members of Congress feel enough pressure that they wanna talk about it, and maybe put in legislation. But providing serious resources to DHS and to individual departments and agencies, to really upgrade their systems from old legacy systems, and to give them the resources they need, particularly the staff resources, to do the things we are asking them to do,

Suzanne: would mean taking money from other places. And so, the authors of your report talk about how business managers have little incentive to invest in security versus new programs and activities. Frankly, Congress is in the same boat.

Arika: That's a good point.

Suzanne: It's harder to justify going home and saying, I voted to put a lot more money into departments and agencies, into the federal government, to improve cyber security within the federal government versus I put money into some new program or activity that's gonna impact our district.

It will take a serious, significant cyber incident to bring about change

Eric: When does it change?

Suzanne: Well, people always say it will take a serious, significant cyber incident to bring about change. We have had a number of serious, significant cyber incidents, including the OPM breach, and I do think that each of them brings about some steps forward, some measure of change.

Suzanne: I don't know what it's gonna take, really, to get the level of seriousness and the sense of urgency that will result in a real allocation of resources that matches our rhetoric.

Eric: It seems to be a common theme we hear on the podcast.

Arika: Got to say the same thing. Yep.

Eric: We need a catastrophic event.

Arika: It's unfortunate, yep.

Eric: We haven't had one yet but we need something significant enough to force the government to change.

Arika: Well, I think also to Suzanne's point to also have that public outcry, that's also going to put that additional pressure on the government from the constituency standpoint, to also force that change.

Eric: Right, because if you look at the last couple of events, they have been pretty serious, pretty costly. But I don't think the American public has really rallied behind any of them and said, "We demand a change."

Arika: Right.

The S in IoT Stands For Security

Suzanne: Yeah. I will say, I do think that bit by bit this public education and awareness is beginning to have an impact, and I think we are seeing it mostly in the private sector, where slowly but surely, the makers of new products with wonderful connectivity and the benefits of connecting to the Internet are starting to talk about the ways in which they have gotta protect your privacy, for example.

Eric: You aren't talking IoT, are you?

Suzanne: Yeah.

Eric: Internet of Things?

Suzanne: Exactly. Right, like the S in IoT stands for security.

Eric: Right. Right.

Suzanne: But I actually do think that some of the more responsible developers of the Internet of Things are anticipating the blowback that comes from incidents that involve their equipment, if it's not adequately secured or protected. And I think it will help if we talk about it in terms the public can understand: talk about it in terms of privacy, talk about it in terms of reliability.

Suzanne: Everybody, I think, understands that if they are gonna buy an autonomous vehicle some day, they wanna be confident that someone can't hack into it and cause it to do something that puts the occupant in danger.

Eric: Right.

Talk about cybersecurity using words that people understand

Suzanne: So, when you talk about cybersecurity, that is an abstract term that means very little to most people, and I think we have to start talking about it using different words, in terms that people understand.

Arika: That's absolutely right. Actually, we had a lengthy discussion in our last podcast about autonomous cars and the security measures that will have to be in place for the public to really trust those new technologies as we start to move toward them.

Arika: It sounds like we have made progress but we still have ways to go.

Eric: We have a long way to go.

Arika: Well, and thank you to all of our listeners who have tuned in every week. Please continue to subscribe to the podcast, feel free to rate us on iTunes, and do send us a message to let us know what you want to hear us cover, in terms of government cybersecurity or cybersecurity as a whole.

Arika: So, until next week. Thank you again and thank you for tuning in.

Listen and subscribe on your favorite platform