[01:48] Principles for Board Governance of Cyber Risk
Rachael: Let me welcome to the podcast the World Economic Forum's Head of Governance and Trust for the Center for Cybersecurity, Daniel Dobrygowski, and Forcepoint's Chief Legal Officer John Holmes.
Rachael: So the two of you have worked on this really awesome report as part of an initiative with the World Economic Forum: the Principles for Board Governance of Cyber Risk. I think we can all agree that's a really critical topic today, because the role of business leaders has markedly changed in the last few years. Daniel, since you're from the World Economic Forum, do you want to give us a little more background on that report?
Daniel: Yes, absolutely. Like you said, this is fundamentally a leadership issue now. Cybersecurity underpins so much of what we do that you can't really talk about having a digital economy versus a regular economy. It's all digital. So from the World Economic Forum's perspective, leaders need tools to understand how to operate and make decisions in this digital world. One of the areas we recognized, probably six or eight years ago, was that corporate boards especially have a really important role to play in setting company culture, asking the right questions of executive teams, and just making sure that cybersecurity is a priority.
Daniel: So we've developed this body of work called Cyber Risk and Corporate Governance to help educate board members, to help them understand what kinds of questions they need to ask. We bring together a multi-stakeholder group: people representing governments, and people representing businesses at all different levels, from CISOs to general counsels like John to chief risk officers. Together we work on developing a body of guidance, an education resource for corporate board members.
Daniel: In this last iteration, we wanted the opportunity to bring together best practices from around the world. So we worked with the National Association of Corporate Directors in the US and with the Internet Security Alliance. We worked with our partner companies like Forcepoint, Swiss Re, and others, and we brought together their best practices for how their boards approach this issue.
Daniel: We looked at how leading boards can help other board members understand what cybersecurity's all about. What we came out with was a set of six principles that can be applied globally to integrate cybersecurity into the business, and into thinking at the board level.
The Six Principles of Cyber Risk Governance
Eric: So Daniel, you came up with these six principles, and we'll cover them in a second. But as you were speaking, the thought that came to my mind is: is this really a priority for a lot of companies out there? I'd love to know, in your experience, what percentage of organizations, what types of organizations, really do care, and which ones don't? Because I run into a lot of peers in the industry, and I work with a lot of organizations. I'm more on the government side these days, but global. And there are a lot that just don't think about this as a board-level or company-level significant issue.
Daniel: Right. Well, at the Forum we're very much focused on crafting a better future. So the point is not so much that everyone cares now; it's making sure they know what to care about in the near future. Not to get too deep into the weeds of our philosophy, but there's a book by our chairman called The Fourth Industrial Revolution.
Daniel: That explains that right now, everyone in the world, governments, businesses, no matter what your industry, is going through this incredible transformation. And it's all related to different types of technology that are impacting our world. So whether you think you should care about it now or not, you have to. And that's what we're trying to explain with some of the work that we're doing here.
Shaping the Standards of the Future
Daniel: So as we put together these principles for boards, and as we put together a lot of these other work streams around cybersecurity and other technologies, what we're doing is shaping the standards and the norms of the future. You might say that some industries don't pay a lot of attention to cyber right now, and there are a number that don't. But they're going to have to. So we're playing ahead of the ball, and we're there with guidance like this for when they're ready to pick it up and work on it.
Eric: You're providing the principles, the guidance, the capability so there's information, there's capability available when they're ready.
Daniel: Yes. Exactly.
John: Let me jump in here. Since roughly 2015, looking at this issue from the perspective of my role as an advisor to boards, from a corporate secretarial standpoint, and as a GC, it's a vastly different landscape than it was five or six years ago.
Eric: You're seeing evolution?
John: Absolutely evolution, but driven by fear, as it oftentimes should be. With all the high-profile breaches, you don't go to an NACD or World Economic Forum session, and you haven't for at least six years now, without a robust discussion around the board's fiduciary responsibilities. An understanding of cyber risk is now significant table stakes to participate on a public company board, or on any other sophisticated enterprise's managing body.
[07:08] Disruptors to Cyber Risk Governance
John: And if you read any public company's disclosures, the 8-K and 10-K filings, go scroll through any of them. Where there used to be a lot of discussion of potential disruptors to their articulated strategic plan, a pandemic for example, you don't read any of them today without a discussion of cyber risk as a potential existential threat to their ability to achieve their strategic business goals.
John: So I think it's really well understood, maybe at the higher end of the food chain in terms of size of enterprise. And you know that large government customers have a mandate to pay close attention to this, and standards are mandated.
Eric: We'll get into that.
John: All right. But when it comes to private enterprise, we don't have the same level of standardization, so you don't necessarily know what good looks like, or even have a proxy for it. Which is why this tool that the NACD, the World Economic Forum, and the Internet Security Alliance produced matters. It's the first time, I believe, and Daniel, correct me, that those three really influential organizations have aligned to give private and public boards guidance on what good looks like.
John: Not exactly what they're supposed to have installed in their security stack; that's not the point of this, because there isn't one answer to that question. It's going to be largely dependent on the types of threats you're facing, your type of organization, and your industry. But at an important, higher level, it gives boards a playbook for what they should be paying close attention to.
A Game of Speculation
Eric: That's what I love.
John: You and me, both.
Eric: To our listeners, you've got to read the report and look at the richness of the content there. Let me play a game of speculation here, which, John, I know just knowing you, you're not going to like. Daniel, you'll love this one. Colonial Pipeline.
John: I only look backwards, never forwards.
Eric: No problem. A few months ago, Colonial Pipeline. You think their board was asking questions? Pick on somebody.
Daniel: Looking back, based on results, you think possibly not.
Eric: Well that's the easy answer.
Eric: "Hey, what is our cyber risk posture?" Something that easy, do you think they had those discussions?
Daniel: Now I don't work with that specific company, but based on the industry.
Eric: And pure speculation.
Daniel: Yes. You look at the extractive industries, and the oil and gas industry as a whole, and historically they haven't been asking these questions. In fact, we've just recently developed a body of work, funny how timing works sometimes, specifically on oil and gas, to help boards in that industry understand these issues.
John: Thank you.
Daniel: Yes, you're welcome. Maybe it would have been great if we came up with that two or three years ago, but we have it now.
Eric: Yes, they probably wouldn't have been able to do anything at that point, but at least it's there now.
Daniel: Yes. So if you have companies looking at what happened to Colonial Pipeline, they say, "I don't want that to happen to me. What do I need to do as a board member?" Well, the Forum has guidance for them specific to their industry, as well as this general guidance that works for anybody.
Eric: Okay, so let's move away from Colonial Pipeline, because I don't want to pick on anybody. They were clearly a victim here. John, in your perspective, if you're an oil and gas company now in 2021, you're in the board meeting. How has that dialogue changed? What are they asking now, compared to a year ago?
John: Well, Colonial Pipeline is a good example, because these events tend to be change agents across an entire industry vertical. When one of your peer competitors gets targeted, that creates anxiety in the boardroom. It tends to drive people towards questions like, what does good look like? Hence the product we produced to help people identify exactly that.
John: I suspect that their board is now demanding more than the sort of periodic cybersecurity readout, every two, three, or four years, that has been common since 2015 or so, where you check that box. It needs to be a lot more continuous in terms of the board's evaluation, questions, and demands, in terms of agenda items with respect to this type of threat. "What are we doing? Have there been any breaches? How have we changed our security posture in light of the ever-evolving threats?"
John: I'd like to believe that not only oil and gas, but every industry and boards are now, from a good governance standpoint, asking those questions. Not every now and again, as a checkbox item to meet some de minimis regulatory requirement. But more so because they know that this is table stakes to be in business. In the same way that they have to constantly be evaluating the competitive landscape, the products, the demand for those products, and the financial performance.
Paying Close Attention to Cyber Risk Governance
John: All the good things that companies have been paying close attention to as measures of goodness since the dawn of time. They now have to pay attention to their cyber posture, because if they're not, then they almost don't have permission to do business.
Daniel: I need to add to what John's saying too. What you're asking as a board is really, really important, but equally important is who you're asking. There are a lot of companies that just say, "The CISO gives the report, and that's all I'm asking about cyber." But you need to be asking other members of the C-suite, other executives, what their responsibilities are on cyber and how they're fulfilling them.
Daniel: John has a great article about what to ask your general counsel about cyber. But boards also need to ask the CEO themselves, they need to ask chief risk officers, and other people in the stack and people who run business units, "What are the cyber implications of what you're doing?"
John: Yes, and one of the principles in the paper talks about organizational design, making sure the board is looking at organizational design. And what if you find that your information security team is buried somewhere in an administrative function?
Eric: In sales and marketing?
John: Probably not the best place. I don't want to cast aspersions on my sales and marketing brethren, but you wouldn't typically bury a function like information security under some extraneous functional remit. It really does need to be front and center in terms of org structure as well.
No Company’s Invulnerable
John: Such that everybody at the table, not just board members but across the C-suite, has a hand in and an understanding of where we're going from an InfoSec standpoint, how the threat's evolving, and what we're doing to try to address it. No company's invulnerable, and we all know that. And the vulnerabilities are constantly changing.
John: The threat vectors are constantly changing; the bad guys are getting better, more well-funded, more persistent. Perfection is not available to us, which is why it's got to be a constant evaluation. You're trying to keep track of what good looks like, because it's changing from board meeting to board meeting.
Eric: I always look at it as risk management, but really resilience. You know the adversary has a first mover advantage, you know they only have to be successful once. And you know there are no magic silver bullets or magic potions that are going to protect you. So how do you make a resilient business and really address risk? I see a lot of challenges in that.
Eric: But are there any parallels? I mean InfoSec, cybersecurity, the impact to the business, risk management around a cyber framework. John, you mentioned you've seen a significant increase in focus since 2015.
Eric: Are there any parallels, if you go back over your careers, or look at business over the decades, over history, where business materially changed like that, that we can learn from? I don't know what that would be. I'm thinking of something like telecommunications, or the advent of the computer, which really changed the way we do business. Could even be the fax machine. What do you think?
[15:11] Evolutionary Threat
John: I think you're talking about evolutionary threat, which is a little bit different than business productivity evolutions.
Eric: Well, I don't want to talk about the threat. I want to talk about where corporate boards had to change their prioritization due to market changes, or the evolution of business. You wrote these Principles for Board Governance of Cyber Risk. Twenty or thirty years ago, was someone writing principles for board governance of the advent of telecommunications into the business?
Rachael: Kind of like the Internet.
Daniel: Yes, I think that's right, and I would even go further back. Board governance is not some ancient area of law that's been with us for thousands of years. You can look back at big changes in the technology available to us. Look back at electrification and the kind of transition we went through.
Daniel: No one knew, when we first started wiring houses and businesses, what you'd be able to do with this stuff, just like no one really knew what you'd be able to do when we started wiring things to the Internet. And just like with electrification, there were excellent, great results: huge increases in productivity and in the ability of people to work and learn and do all sorts of things. There were also significant risks. People would cause fires. People would die of electrocution.
Create Systems for Cyber Risk Governance
Daniel: And we had to create systems, some business, some government, some interconnected, that helped make sure these things were safe. This is where, if you look on a light bulb and it says "UL Listed," that comes from.
Eric: And it's in the whole process now.
John: And Daniel, you're right, I was going back further as well. From a regulatory standpoint, you go back to 1933 and '34, when the SEC was established, and you start talking about what governance looks like: what types of reporting and information public companies have to provide to the investing public, enforced if they misinform that public, to allow investors to make good investment decisions.
John: That didn't exist prior to the Great Depression and the crash of 1929. So the government did respond. Not swiftly, the government never responds swiftly, and it's rarely ahead of risk. But it was a fair response to a significant global catastrophe resulting, in part, from poor information provided to the investing public. And so you get the '33 and '34 Act requirements that result in what we know today as all your 8-Ks, 10-Ks, et cetera, giving investors the information they need to make good decisions about their investments.
Eric: The reason I ask is, at the operator level, we're still not seeing risk translate down. So John, I'm hearing the boards are now talking about it, the C-suite is talking about it. We're seeing it in the government. You saw it with Sunburst.
Eric: You saw massive change come out in the executive orders and in what the government is focusing on. At the senior levels, they're talking about it a lot. At the operational levels, though, they're still not sure how to deal with it. They're still not organized and structured appropriately. The funding levels aren't there.
Eric: "Okay, so I have another priority number one. What do you want me to do with the 13 I have over here?" It's not built into our way of doing business, and John, I think your example is really well positioned here. Reporting our financials to the SEC on a quarterly basis is just something businesses do today because they have to. We're not doing that around cyber risk.
Rachael: I want to take this in a little bit of a different direction, though. I'm thinking about Kaseya.
Eric: She was not pleased with my direction.
Rachael: I love your direction! But I also wonder, because you hear a lot, and I use the term whistleblower for lack of a better term, but you're hearing about Amazon, and recently Kaseya, that there were people at the operational level raising their hands, saying, "We're seeing what you could call technical debt here. We're sacrificing security for the speed of getting products out the door." Allegedly.
Eric: Agreed. "But we don't have funding, we don't have prioritization. We have the wrong people. I don't have time. The business isn't enabling me. We've got great technical operators."
Eric: "But what do they focus on? What do they do? I've got 52 priority ones. We're not aligned, is what I'm saying. Top-down, bottom-up, both."
Cranking Out Pintos
Rachael: No, okay, I will give you that, absolutely we're not aligned. We were talking before, and I love the Pinto, I thought those were great cars. But are we getting into companies cranking out Pintos? It's kind of like GDPR, where a lot of companies said, "You know, I'm just going to wait and see what happens. Maybe it's cheaper for me to just pay whatever fine comes versus actually updating my infrastructure."
Rachael: How do you solve that? I mean, that's a big question. It's always about the risk calculus, the financial calculus, that businesses have to do. When does it become more advantageous for them to make those investments? And I don't know if we're there yet. Are we?
John: Business is not risk-free, and we've spent trillions on cybersecurity. So I don't want to cast stones at anybody in particular, even those who may have been breached. In many instances I can think of, without giving specific names, some of the companies that have had high-profile breaches are companies that have made some of the largest investments, as a percentage of revenue, in their security posture. And as I mentioned before, a well-funded, frequently government-backed hacker who has the time, resources, and will to go get something out of your digital domain is probably going to ultimately be successful.
Eric: John, I hate to do this, but I've got to correct you. They absolutely will. They go after NSA, they go after DHS, it doesn't matter. And they will be successful in some way.
John: Even more terrifying. But Rachael, I do think you're right. You're going to make a decision on a new product introduction, for example, or a new release of a product. You're making a number of calculated decisions about the market, about the efficacy of the product, and yes, about the security of that product. There should be a stop gate in anybody's new release process where they're thinking about product security. And if you're selling to the US government, you have to; there are standards you have to meet.
Eric: "If you want to deal with us, this is how you're going to comply."
John: That's right. So new product introductions at least have the benefit of being evaluated against the current threat environment. What worries me more, and maybe where the threat is more profound, is legacy hardware and legacy software, where the source code was originally written many years ago and we've just been iterating on it. Obviously it wasn't written with an eye towards the threats of five or ten years later, and if you still haven't updated that base source code, that's where you often find the most significant vulnerabilities you're not even aware of, because the types of threats present today weren't a consideration when the original lines of code were written.
John: So there's a constant looking back, as well as looking ahead, to try to shore up and review your code base, and to think about your hardware vulnerabilities as well. That is a really pernicious problem, and a costly one that requires constant review.
[23:24] Staying Ahead of the Threat
John: And again, I'll just reiterate, I don't think any product is invulnerable. They all have vulnerabilities, and you realize new types of vulnerabilities constantly. So it is an iterative, constant drumbeat of review and shoring up and sending out patches and trying to make sure you're staying ahead of the threat.
Eric: Can we take a segue here? We should probably cover the six principles. I think they're outstanding. We're talking about risk and we're talking about government, but we can talk through the six principles and why they're important.
John: You bet.
Eric: I'm not qualified to read them out.
Daniel: Why don't I go through them? They're not necessarily in priority order, but we tried to keep some of the important ones at the top. The first one we focused on is that you have to view cybersecurity as a strategic business enabler. You can't just push it off to the IT department and say, "This is your problem and I don't want to hear about it again."
Daniel: And this is what we talked about earlier, where you need continual interaction and to really think strategically about cyber. Like John said, there are always going to be pluses and minuses to any new product or business line, or whatever it is, and you think strategically about the financials of it. Boards, and anyone in the business, need to think strategically about the cybersecurity implications as well, and also about how better security can enable you to run a better, more resilient business.
Economic Drivers and Impact of Cyber Risk
Daniel: The second one is to understand the economic drivers and impact of cyber risk. This industry has been trying to get to a quantification of cyber risk for a long time, to understand what values we can apply to it. And when you're talking at the board level, we want to keep furthering that conversation. We need to be able to put a price tag on some cyber risks, or otherwise equate them with the other economic bases that boards use to make their decisions.
Eric: How do we do that? Do we look at it and say, "Okay, if our systems go down we're out of business for 10 days. How much revenue do we do over a 10 day period? That's our dollar quantifiable risk." Or pound, or whatever your currency is. Is that the easy way to do it, or are there better ways to look at that?
Daniel: I think that's part of it, but there's also the likelihood of whether this will happen or not, which is the difficult part. Understanding what it costs us if our systems go down, you can calculate that. Understanding whether that's going to happen, the probability bit, is much harder.
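The back-of-envelope calculation Eric and Daniel describe, impact weighted by likelihood, is essentially the classic annualized loss expectancy (ALE) formula from risk analysis, not a method the speakers themselves specify. A minimal sketch, with purely illustrative numbers:

```python
# Classic risk quantification: annualized loss expectancy (ALE) equals
# single loss expectancy (SLE) times annual probability of occurrence.
# All figures below are hypothetical, for illustration only.

def single_loss_expectancy(daily_revenue: float, days_down: int) -> float:
    """Revenue at risk if systems are down for `days_down` days."""
    return daily_revenue * days_down

def annualized_loss_expectancy(sle: float, annual_probability: float) -> float:
    """Expected annual loss: impact weighted by likelihood."""
    return sle * annual_probability

# Hypothetical company: $2M revenue per day, 10-day outage, 5% chance per year.
sle = single_loss_expectancy(2_000_000, 10)   # $20,000,000 impact per event
ale = annualized_loss_expectancy(sle, 0.05)   # $1,000,000 expected per year
print(f"SLE: ${sle:,.0f}  ALE: ${ale:,.0f}")
```

The hard part, as Daniel notes, is the probability input: the impact side is largely arithmetic, while the likelihood estimate is where frameworks and expert judgment come in.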
Eric: I could take it back to Steve Grobman from McAfee; he has a probability-of-cyber equation: probability equals opportunity times risk.
John: I'm going to attempt an analogy here, Eric, to answer your question. My home that I'm working from today, I've got basic ADT perimeter security and a couple of security cams. And that seems sufficient for me, given the assets inside my home, and that's a calculation I've made.
Adequate Security Posture for Cyber Risk Governance
John: If somebody really wanted to get in, they could. If they had the time to stake the house out, know when I'm not here, know when the dog's not here, they could get in. I'm aware of that. But I think my security posture's adequate given the assets I'm trying to protect and what I use the assets for.
John: If for whatever reason, you were like, "Hey Holmes, can you house the Hope Diamond at your place this weekend?" I would be very nervous that my security posture was not adequate as of that moment. Because the asset I was trying to protect when I structured that posture wasn't designed with respect to knowing that I'm going to become a major target because of the asset that is now within the perimeters of my security defense.
Eric: I think the important thing for me, when you say that, John, is that you've actually had that consideration. You've thought about what adequate protection is for your environment, and I see that missing in most of the conversations I have with customers. They don't think in that manner. "What is adequate? What are we doing? How do we address risk here?" Many times they don't have risk in the equation at all.
John: Well, that's what this principle's about. They should. And again, there's no one standard that says, "If you're in this industry vertical, you likely have this type of intellectual property, or this type of PII from your customers," et cetera, and therefore, "That equates to this type of technology in your security stack, and you should..."
John: That type of detailed standard doesn't exist. Because it's just much more subjective in terms of where you are and what you're trying to protect. But you should be having the conversation. And it's the board's duty and management's duty now to have that robust conversation about the exact nature of their business and the risks that are likely to threaten their business.
John: They've got to be asking the questions, and they need to have independent auditors come in and help advise them. But that's the nature of this principle, I think, Daniel: the conversation has to be happening. "What's the right balance of risk and economics for us to be in business?"
Eric: I think one and two are in the perfect order. I don't know about you Rachael, but hey, understand this is a priority for the business. And then understand the drivers into the priority. Number three.
Daniel: Right, and I think the next two follow as a pair. You need to align cyber risk management with your business needs; we all recognize that cybersecurity is not an end in itself, it needs to serve the wider business. And the fourth is that you need to ensure your organizational design supports cybersecurity, sort of the flip side of that.
Daniel: If you've aligned cybersecurity with your overall business needs, then you also need to make sure your organization is built on resilience and security principles, and that you create the right kind of cybersecurity culture. Those two flow together as a pair, and if you do them both right, you end up with a more resilient organization.
The Ability to Influence
John: Yes, so it's not just the amount of investment you're making in third-party vendors. It's also how you're staffing your InfoSec team. Do they have access to the CEO, the CLO, and the trust officer? Do they have the ability to influence, to course-correct, if those security professionals identify an area where they're under-resourced, whether in people or technology, or a significant risk that isn't being addressed? The point is that they're having the conversation all the way up to the audit committee, and then to the board level when there are decisions to be made.
Eric: And we saw a lot of this in the government with Sunburst. In many cases, they weren't organizationally structured and focused enough to even determine what was happening on their networks. We're seeing them evolve now with enhanced logging requirements: just store the data so we can at least go back and see what happened. So we are seeing some change as they understand the risk and the need to know what it means when an adversary's inside their business. These principles are great. I couldn't argue with one of them if I tried. They're on point, they're outstanding, from my perspective.
Daniel: That's good. Well, I think John can attest to this: we had a great group of very strong personalities, leaders in business, arguing about these for several months. So hopefully we got to the point where they're widely acceptable. That's the way we work.
More Than Just Principles
John: I think it's notable that you even invited me to the party, Daniel. When the World Economic Forum did its prior iteration of this back in '17, I don't think it was as broad in terms of participation. At this point, it's much better understood that these aren't just principles to be created by a group of chief information security officers.
John: This time, you had representation and opinion, and you invited it, from virtually across the traditional corporate C-suite. Which is, again, reflective of this all-hands-on-deck ethos that's now pretty well understood, at least across sophisticated companies: this is not just "point to your IT organization or your InfoSec organization." Everyone has a hand in this, and decision-making spans the board and the C-suite at this juncture.
Daniel: Yes, that's a good point. Just to digress a little, watching the evolution of the people engaged in this: the Forum runs a multi-stakeholder process. That means we try to engage everyone who has an interest, who has a responsibility, who can benefit from whatever public good we're focused on. And that universe has greatly expanded in the three or four years since we worked on the last iteration of this body of work. You have many more roles, many more people, geographically speaking as well, coming together to talk about these issues.
[32:25] The Next Principle Around Cyber Risk Governance
Daniel: So that's a good sign; it means more people are engaged. It also gets us to the next principle: boards have to avail themselves of cybersecurity expertise. That doesn't mean every board needs a cybersecurity expert. I'll say that even though it kind of limits our future career opportunities, so you know it must be true. But it means boards need to listen to people within their organization, to other board members, and to people outside their organization, and gather that expertise wherever it lives.
John: Yes. And you'll note that the principles don't mandate a cyber expert on your board, or even strongly suggest that you're not following best practices without one. I think that's fair for today, although you are seeing a lot of Fortune 100-type companies actively recruiting our peers in the industry for board representation, because a lot of people are getting ahead of it and believe they do want that specific voice on the board, in the same way you have somebody with deep accounting and financial expertise chairing your audit committee.
John: If the current threat environment persists, it's likely you will start to see cybersecurity moved out of the audit and risk committee into its own committee, just given the nature of the risk to the enterprise. That's quite conceivable if we don't do a better job of collectively defending business. But in the near term, you are seeing a recommendation.
Your Cybersecurity Expert
John: There's a risk of just having one person too, Daniel, who's your cybersecurity expert on the board who sits in the corner. It's even better to have the entirety of your board trained up. Understanding the basics of where they need to be spending their time and the questions they need to be asking when it comes to cyber threats.
Eric: I'll tell you, the one that underscores this for me is when you watch congressional testimony on the Hill. You see some of the questions coming from the members, and they really don't understand them. It's so illuminating. They can't fully comprehend the questions they're being given to ask, or the responses coming back, because they don't have that expertise.
Eric: The military's really evolved over the last decade also. We had generals and admirals from the signals branch predominantly leading cyber, but they didn't know cyber, they knew radios. And everything was treated as a military problem: cyber protection teams, platoons, organized in a military structure. It's like, "I don't think you quite understand this" sometimes. Another area where you have to have that background.
Eric: But even you, John, coming up from a legal environment, you don't have a cyber background per se, or you didn't 20, 30 years ago.
Eric: You weren't trained in it, you learned it. You now know what questions to ask. You know how to basically position questions to understand risk better, because you're more informed these days.
John: General counsels and chief legal officers have to meet their obligation to advise their boards about what proper governance needs to exist for them to manage the company and meet their obligations to shareholders, yes. You're absolutely right.
Strain of Legal Practice Around Cyber Risk Governance
John: No, I did not talk about cybersecurity in law school, I did not study cybersecurity in law school. Although within the last decade for sure, you started to see a distinct strain of legal practice around cybersecurity. And a lot of that's because we got a whole host of new laws related to breach reporting obligations. GDPR and other regulatory approaches mandating some type of data protection, which resulted in what we're now calling cybersecurity law. But when I was in law school, that was not a course of study that one could focus on.
Eric: But every board member should have a level of responsibility to educate themselves, I believe. If you don't know it, ask questions. Learn. Read. Understand. Legal, marketing, whatever your discipline, your background may be, understand the implications of InfoSec, cybersecurity, what the adversary's doing to your functional area.
Daniel: Right. And I think that's true of all leaders, but especially board members. We choose board members for their leadership ability, their experience in business, their judgment, and their ability to educate themselves on these issues and read up on them quickly. Cyber's just another issue to which they can apply their discernment, their leadership ability, and their capacity to understand.
Daniel: So that's what this principle's all about, and I think this also is one that's generalizable. You talked about people in Congress in the US, obviously with the Forum we work with a lot of senior leaders in government, whether it's ministers or heads of state. And all these people, they came to their job, they might have had an expertise once in the past, but they're all generalists.
Making Better Decisions on Cyber Risk Governance
Daniel: They're able to be successful because they can take in knowledge and make good decisions based on that. So this report and a lot of the other work we've been doing is just another example of one thing they can take in, and it will help them make better decisions. They don't have to be experts on this. They just have to take their job seriously, and take their responsibilities seriously.
John: That's a great point. Great point.
John: Okay, so number six.
Daniel: Yes, number six. This one's important to the Forum, as you can imagine, as an international foundation. That's to encourage systemic resilience, encourage collaboration. So it's just, recognize that you're part of a greater whole. Improving your cybersecurity makes everyone in your industry and in your nation, in the world as a whole, more secure. Because anybody can be a vector of attack to anybody else. And in order to do that, people have to collaborate.
Daniel: So putting together these multi-stakeholder bodies like we do at the Forum, engaging in some of these other information sharing networks. These are things we want boards to encourage. And members of boards, directors themselves, are a great vector for sharing information, because many people sit on multiple boards. They collaborate with their other board members, they can bring good practices from one company to another. So we're trying to foster that thinking through this last principle.
John: We've discussed this, but that's another hill to climb. I think that could be another evolutionary step for collective security. Right now we're relatively siloed. In law enforcement, you certainly see information sharing coalitions. You see it in some industries, but you don't see it in most today.
[39:15] Companies Are Operating Independently on Their Cyber Risk Governance
John: Today, most companies are, again, operating independently when it comes to defending their organization from cyber risk. We believe it would be appropriate, again as a next evolutionary step, for there to be much more of a group-sourced approach to how the threats are evolving and what is working effectively to defend against those types of threats.
John: Now, there's going to be some resistance. People worry about sharing that sort of information, so we need to foster and engender new ways of bringing enterprises together. To allow them to share in a way that's not reputationally damaging and doesn't expose some sort of intellectual property crown jewel; that doesn't need to happen. There are ways, and that's maybe the subject for a different podcast with Daniel, for us to come together in industry verticals and other cooperative approaches to defending ourselves better by sharing information.
Eric: Do you think it'll happen though? At some level, don't companies say, "If I'm better than my competitors, that's a competitive advantage for me. I'm not going to share, because I'm the best at InfoSec in my sector."
Rachael: Yes, but then you get hit, and then your point of view changes though, doesn't it? And then you get hit by the SEC, because you didn't disclose in enough time after you knew about it, and all these other things. I don't know if you could be that arrogant and still succeed in business.
John: At some point, it becomes so painful collectively that you're driven to come together and don't battle one another on that particular front. I'll hearken to something.
Collective Pain Threshold
John: Sometimes industries see patent disputes, and you get whole patent wars that arise within an industry. At some point, the cost overwhelms the benefit of trying to continue to prosecute that war. You find détente, and everyone cross-licenses. Similarly, there's going to be a collective pain threshold where companies are going to say, "You know what? It's better for us to cooperate on defense and battle one another in the market than it is to try and differentiate ourselves in this particular area."
John: Again, knowing this is table stakes, every company's got to be investing. Every company's got to be trying to protect themselves as a condition of being in business. This isn't necessarily the area where you want to differentiate yourself. That being said, today you can differentiate yourself, especially when it comes to product resiliency. That's not going to change, and that's not what I'm talking about. I'm not talking about sharing intellectual property with respect to who can deliver the most cyber-resilient product to the market. That's an area where you actively compete.
Eric: You can compete on that.
John: When it comes to the best, most effective, most secure product in the market that's best for consumers, great. That's where the competition should happen. I'm talking about internal cooperation and a shared defense strategy when it comes to the types of pernicious threat actors that you see every day. The kind you've seen for a long time in government, which are now sprawling outside of government and outside of critical infrastructure, with nation-state attacks actively targeting private enterprises. That's where private enterprises need to take a page from the government and do a better job of information sharing on defense.
No Company’s Strong Enough
Daniel: Right. Just to add on that, no company's strong enough to go it alone in this. If you're a hiker, there's that saying that you don't have to be faster than the bear, you just have to be faster than your slowest friend. That doesn't apply here. The bear has an appetite for everybody. And in this case, collaboration among like-minded or similarly situated companies is the only way to get stronger.
Daniel: There's pre-competitive work that needs to be done so that everyone can be more secure and more safe from a business perspective, but also from the perspective of doing the right thing for the wider society. These are risks that spread beyond individual businesses, and we need to work together in order to limit their impact.
Eric: And I would argue that there's no government in the world able to protect its organizations. The organizations have to come together and protect themselves. Compared to the US government, China's probably more capable, honestly, of protecting its companies.
John: Because of the nature of their government, yes.
Eric: The nature of the government, the Great Firewall, the authoritarian society. They're probably the most capable, but even they aren't invulnerable. Organizations do need to work together, because it's not like you can call DHS and say, "Hey, I'm under attack, stop it." That's not happening. It's just not going to happen. So how do you come together? I do think the financial industry, from my experience anyway, is the best in the world at doing this, where they do work together. But to your point John, they compete on the product that they put out into the market.
Please Stop This
Daniel: Yes, and I think in the financial industry too, they've also recognized that they need to work together as a group. But also with the government. So yes, you can't call up DHS and say, "Please stop this." But working beforehand, if we're talking about resiliency, including that as part of your planning and part of your recovery plans, it's really important. That's something that's often neglected.
Eric: I'm not saying don't work with the government. Industry-agency partnership is really important, and we haven't figured it out yet. That's got to be part of that resiliency, part of that plan. Don't rely solely on the government is what I'm saying.
John: I like the idea of more government and private sector partnership on this; that's what you'd like to see continue to evolve. President Biden's executive order on cybersecurity does have a number of interesting things in it, and it's a good beginning. It had been some time since we'd last seen a focused executive order around cyber. But one of the things it suggests is some sort of cybersecurity safety review board that would be a public-private partnership, which I think is a really great idea.
John: Similar to NACD and ISA and World Economic Forum coming together to publish principles of what good looks like, constantly evolving. Good may look different, but at least that sort of government and private sector partnership could produce standards. Response playbooks, other things to give enterprises a good idea as to what best practice looks like at a current moment in time. And there's lots of benefits to that.
[45:56] Better Defense for Cyber Risk Governance
John: Not only is it better defense, but it ought to insulate those enterprises who are meeting the standards and following the guidelines promulgated by that organization, giving them better liability protection as and when they do get breached despite their best efforts.
Eric: That would be a great world. We're not quite there yet. But I think it would be a great world. I think it's very aspirational, John.
John: Give Daniel another month, he'll get us there, don't worry.
Daniel: We'll get there. Yes, exactly. Well it's August, so maybe two months.
Rachael: So we're kind of coming up on time, but one of my favorite questions to ask, and Eric knows what this question is. I would love to get the perspective from the two of you. Given all this madness out there and the seemingly impossible challenges to overcome, do you have optimism for the cyber path ahead? Daniel, do you want to start?
Daniel: Yes, I'll start. I think yes, absolutely. Look, I've seen over the last decade or so in the space increased cooperation, increased interest at the highest levels in the topic. And I think that that's the first step toward having an eventually more successful result in all of our cybersecurity endeavors here. So yes, I think I see more and more people cooperating. I see more and more people having an interest in this, and having the right kind of interest, asking the right questions. So yes, I feel positive.
Daniel: This is something that's going to be a constant struggle. But I feel like we're going to get better at it and improve our processes and improve our ability to cooperate.
John: You know, when we were kids, my great fear was this nuclear end times. And we've managed to survive despite that being an existential threat to the world. Similarly, cyber threats are at that same magnitude in my mind. In terms of living life and having a global economy the way we do today, they're on par with massive kinetic warfare-type threats at this point. It's just another massive threat that we have to evolve to protect ourselves against as a society. And understand that the threat is in fact at that level, and develop tools and strategies collectively to learn how to live with it.
Rachael: So a resilient society, I guess.
John: Requires constant understanding and evolution.
Eric: I think what I'm hearing, Rachael, is, I'll compare it to food. Societies from the beginning of time up until maybe the last 200 years were all focused around food and just making sure we had food to eat and survive. We're not that bad. We are going in the right direction, is what I'm hearing you say, Daniel and John, and there's a lot of promise out there. We're so much more advanced than just figuring out where the next meal is coming from.
John: So I guess the desire to survive is an inherent instinct, and it marks humanity's need to constantly evolve. So I think survival, definitely, will drive the next level of evolution. This is a question of survival when it comes to the global economy.
Eric: It really is.
The Dystopian Future
Rachael: It'll be interesting to chart. I was watching Blade Runner 2049. I always wonder about the decisions we're making today, the choices of how we evolve and move forward. Are we getting to a fork where it's the dystopian future, where everything is dark and the sky is scorched? Or does it continue to be sunny skies and all the other good things? So it's an interesting time to keep track of for sure, and I would love to revisit this conversation and maybe see where we are in the next six months or so.
John: I'm going to focus on the sunny skies.
Eric: You got to keep a smile on your face and you got to keep going towards that North Star.
John: That's right. Keep getting better.
Eric: John, thank you so much for joining us today, for the time. We do not talk about risk enough in cyber, we just don't. And really appreciate your insight, your thoughts, and your time.
Rachael: All right, well thanks, you guys, and everybody, don't forget to smash that subscription button. We're all about reaching more listeners, and you get a fresh new episode every single week in your inbox. And who doesn't want to hear from Eric, myself, John, and Daniel on a regular basis? Until next time, guys, stay safe.
About Our Guests
Daniel Dobrygowski is an attorney and educator with close to two decades of experience at the intersection of technology, civil rights, law, and policy. He came to the Forum as a Global Leadership Fellow and was one of the founding staff of the Forum's Centre for Cybersecurity. Previously, he practiced law with international firms in San Francisco and Washington, DC, in the areas of antitrust, consumer protection, IP, and privacy.
He conducts research and publishes in the fields of cybersecurity & resilience, digital trust, election protection, internet rights, and corporate governance. Daniel holds an MPA from Harvard University’s Kennedy School of Government, a JD from the University of California, Berkeley, School of Law, and a BA from the Johns Hopkins University. He sits on the board of the Cyber Risk Institute.
John D. Holmes is Chief Legal Officer and Corporate Secretary at Forcepoint, a global leader in data-first cybersecurity solutions that protect critical information and networks for governments and enterprises across the globe. As Chief Legal Officer, John leads the company’s legal and regulatory affairs, intellectual property creation and protection, litigation, M&A, ethics, and compliance programs. He is also primary advisor to the Forcepoint Board of Directors on cybersecurity law, privacy, and associated regulatory compliance matters.
John started his legal career in private practice with Thompson & Knight, before embarking on his corporate career with Motorola, followed by Freescale Semiconductor where he served as Vice President, Legal and Government Affairs, until the company’s acquisition by NXP in 2015.