Demystifying Security's Wizards - with Tony Sager
Joining the podcast this week is Tony Sager, Senior Vice President and Chief Evangelist for the Center for Internet Security. He shares insights from his 45+ years on the front lines of security risk, including nearly 35 years at the NSA. Risk was a big theme of the discussion, particularly looking at risk through the same lens we use for other risky domains, such as the great work being done by the Cyber Safety Review Board. (He also shares color on the power of being okay with the risk of being wrong sometimes.)
He also shares his perspective on moving to incentive-based cyber models (such as what's been done in Ohio and Connecticut), and the criticality of translating technology, attacks, and attackers into public policy and market incentives. And it wouldn't be a great cyber discussion without addressing the growing sophistication of cyber criminals and their organizations, which have really become the de facto success path for organized crime today.
[02:03] Cybersecurity, Where Are You?
Rachael: Welcome to the podcast, Tony Sager, senior vice president and chief evangelist for the Center for Internet Security.
Tony: Thank you very much. It's great to be here.
Eric: How many years did you spend up at NSA, Tony?
Tony: Just short of 35 years.
Eric: We're going to give you the extra couple of months because we appreciate your service to the country. A fellow podcaster, you have 36 episodes out on the Cybersecurity, Where Are You? podcast. I listened to three episodes this morning alone, driving up to New York, and what great content.
Rachael: In your time in cybersecurity, before there even was cybersecurity, you've really seen the gamut of change and evolution. I was reading your blog, I think from February, where you were talking about the Cyber Safety Review Board, which was being announced, and you were joining the board. I really loved this characterization of wizardry that you talked about in cybersecurity.
I’d love it if you could share that with our listeners because I found that fascinating.
Tony: First, thank you for reading that, Rachael. Yes, it's a phrase I've been using recently.
Wizardry is the model that I grew up in: this idea of brilliant people inventing the perfect, mathematically verifiable technology for operating systems and so forth.
There's a lot of great technology and brilliant people that preceded me, in terms of the basics of computer security, computer science, networking security, et cetera. That's how I made my living.
Impressed by Security Wizards
Tony: I grew up among these really smart people with great ideas and all that stuff. It occurred to me at some point in my career, as I got more into testing, that people are really impressed by security wizards, but they almost never do what we recommend they should do.
They're really impressed. Certainly, in the context of my work, testing a network system and finding all these flaws, you give a lovely bit of PowerPoint to the general, and it's so cool and great.
He turns to his poor IT people, who are underfunded and under-equipped: yet another report of stuff they know they need to fix, but can't afford to, or the boss won't pay attention to. So, what I say is that wizardry is a great thing. It's great that we have smart people involved with this. But wizardry turns out to be better for job security for old guys like me, and it's not very good public policy.
You can't really run a country or economy on wizardry.
If you treat the bad guy in particular as a wizard or a magician, your only defense is to hire your own magicians, and they're in short supply. You have to pay them a lot, and they speak a funny language. This idea of treating this as wizardry has gotten us to a certain point, but what's happening now, I think, in the big sense across the industry, is a healthy thing. It's uncomfortable for all wizards, by the way, but it's the right answer. We don't ask for wizardry so I can safely buy a lamp without getting electrocuted, or fly on a plane, or manage public health.
Building Risk Management into Our Decision-Making
Tony: We build a lot of risk management into our decision-making infrastructure. We codify things, certify people, establish a body of knowledge, certify products, and we write building codes. All this stuff is there to help us manage risk. That's the big shift we've seen over the last few years in the industry.
Everybody's a cyber person now; they just don't know what they're supposed to be doing. Lawyers, insurance companies, regulators, and so forth. It's important to get the historical context. There's a phrase I use in my talks: we've got plenty of wizards; we need more security mechanics to build the stuff that we have not put in place yet.
That's where that comes from. Part of growing up in the business is that you grow up in a certain model, and then you realize we're not really changing things at the rate we need to in order to function as a society. So, what am I missing? It was about that transition for me personally, and then my observation of the industry.
Eric: You said it makes the wizards uncomfortable. What are they uncomfortable about?
Tony: Wizardry, I guess. I joke about it, but it's actually true that there's a certain prestige that comes with being a wizard. They have to pay you well. The boss doesn't really understand you, but he knows he needs you. There's a position you hold in the pecking order of jobs and in your role in the organization, and there's a discomfort when that is threatened. There's a blog post I'm working on, Rachael, which you might enjoy; look for it in the next few weeks. I say this with great affection.
Why Security Wizards Are Better at Managing Their Own Security Risks
Tony: I grew up in this business, but security wizards are sometimes better at managing their own risk than they are at helping everybody else manage theirs. So, what is this risk they're managing? The loss of prestige, the loss of high-paying jobs. A lot of it, though, is about the risk of being wrong. You ask me for my advice; I tell you to do this, but I don't like to be wrong, so I'm going to tell you to do this, that, and the other thing.
You're going to do a hundred things? I'm going to give you a list of 150 things. That's how, in this industry, we wind up with these gigantic control catalogs, complex frameworks, and all this stuff. There's a natural conservatism that comes with this: I don't want to be wrong, and if something does go wrong, I don't want to be the one who's told, you didn't warn us about so-and-so. I say it half-jokingly, but I mean it seriously. There's a risk management equation at play here at the end of the day.
It's not my risk, and I get it; I like prestigious jobs and getting paid. But at the end of the day, we're trying to manage other people's risks. What do they need to make a decision, not what do I need to make me feel better about what I've said?
It's really something that we all need to be thinking about here. At the end of the day, this is about how you as a citizen make decisions that are inherently risky. To fly, to drive across a bridge, to eat food at a commercial facility.
Mechanisms of Public Policy
Tony: You don't become an expert in all those things; you count on people who are certified, on knowledge that has been codified into building codes, or whatever the model happens to be. I'm a bit of a liberal arts mathematician, too. I've studied history a lot, looking at how we wound up with safety regimes, how we developed mechanisms of public policy or civil engineering or medical practice and all that. I think it's an important lesson for all of us in this new IT-oriented cyber age.
Eric: If I can paraphrase just to make sure I understand, it's almost the democratization of cybersecurity. As more people join the community and more people know more, those extreme wizards, those experts, feel somewhat threatened because they're not the end-all-be-all. Is that what you're saying?
Rachael: Demystifying a bit?
Eric: Pulling the curtains back?
Tony: I call it mainstreaming. That's probably not a great term for it. Again, please, I love wizards; I grew up around them. But it's about putting their work in context. It's not enough. In the naive model that I grew up in, in my military context, I have this network that supports my military operations, so I've got to make sure the bad guys can't get at it. What am I going to do?
I'm going to bring in these smart folks from NSA or wherever, and they're going to mount an attack, pretend to be the bad guy. They're going to give me a list of things they found, I'm going to fix them, it'll all be good, and we'll move on.
[09:53] A Symptom of Larger Problems
Tony: Well, we've all learned over time that any problem you find is just a symptom of larger problems. If there was a neat arc to my career, roughly speaking, it's that I came into a place that was just extraordinarily interesting for its time, where the cyber stuff kind of grew up. Just for context, my career has been spent working on defense, but as one of the pretend bad guys. If you have a system where there are a lot of risks involved, you're wondering: what are the bad guys thinking, and how might they take advantage of this, or undo me, or get around me?
But those bad guys won't actually tell me what they're going to do, so I'd rather hire my own pretend bad guys to go look at this. We call that, in the business, red teaming, black hats, and all that kind of stuff. I was lucky enough to come into that, starting primarily in math and then in computer science.
Roughly speaking, the first third of my career was learning this technical craft, which at that time really meant NSA, or inside government; that's where the action was. The middle third was about moving into management, getting to see lots of this kind of work at scale. At one point, and I say this very humbly, I might have been running the biggest vulnerability-finding machine for defense in the US government.
Eric: I'm sure you were.
Tony: I saw lots of things fail, and how we could make them fail. But I can say with a straight face that I'm one of the few lifelong defenders who spent most of his career inside an intelligence agency.
What Nations Do to Each Other
Tony: At that time, that lets you see what nations do to each other: how we go after other people, how they come after us, how they go after each other. I got to see a lot of failure at scale. What really struck me was that I was seeing the same things over and over again.
Whether we go find it or they go find it or they find it in somebody else.
It wasn't like millions of unique attacks happening every day. It was millions of repeats of the same old garbage over and over again. What's going on here? Are we not learning? That was tied to what we talked about earlier. I realized that if just pointing out vulnerabilities were the key to fixing them, I should have been out of work 20 years ago. The idea was, if I just bring in these hotshots and they point out my problems, I'll be inspired or scared or something, and we'll go fix them.
Much to my surprise, once I got to see this at scale, it was: wait a minute. It's not inspiring people, and it's not scaring them. They just get overwhelmed by it; it's just more noise in their day. These are complex networks doing expensive, risky things, and business owners or military leaders have lots to worry about. They're not breathlessly waiting for me to tell them what to do. They've got a hundred other things they're trying to juggle.
Clearly, something was missing in the basic industry model of find flaws, inform people, and walk away with no responsibility to fix them.
Managing Security Risks: Best Job Ever
Tony: That's the best job ever; that's the job I had. If you can get that gig, you should take it, because it's great prestige, it's great fun, and you're around clever people who appreciate what you do. But it turned out that by itself was not enough to fix those problems. I just felt so lucky to be in such a unique place to learn the technical craft, to have the opportunity to see across all these different sorts of jobs and what was happening at the national level, and to get to work with the vendors.
The first third of my career was learning the technical craft; the second third was seeing things at scale. To be honest, I spent the last third of my NSA career asking, what are we doing wrong? Why are we seeing these same problems over and over again? It had to do with this observation: just knowing about a problem doesn't get it fixed by itself, because fixing things is about incentives, about authority, about how I buy stuff and how I operate it.
That became my quest for that last third. I was still running vulnerability analysis at NSA, but really focused on how to create knowledge from this testing and analysis that can more directly drive solutions. How to talk to policymakers, to industry, to standards organizations, and put more information out there that can prevent the problems we keep finding. That was part of the discomfort of wizards, too. You hire great people, you train them to be on the red team, for example, to go find more problems.
We Need a Good Story
Tony: The goal is that it shouldn't take a national-class asset to go find that patches are missing, that our configurations are sloppy, that we haven't managed change or administrative privilege well. That's a lesson we all know now. Sometimes, what I tell people is that in the early days of red teaming, we forget why we were doing it back then. I call it drama.
Eric: We need a good story.
Tony: You have to convince the senior decision-makers and the generals that they have a problem; therefore, we need drama. Oh, your military operations won't execute. There's a noble purpose, but we keep doing it, and by itself it doesn't produce enough information to actually solve the problem. You're doing it for a legitimate reason, to inspire people to do something, but actually creating information that would drive policy, that would drive purchasing, is a different class of problem.
If you don't connect those, then you don't really go down the solutions path. I can tell you what happened. The NSA red team worked for another organization, under a Navy captain. The red team pretends to be adversaries, nobody knows you're coming, et cetera. The blue team worked for me, the friendly face: they show up, how are you doing, let's scan your systems and we can work together.
It was so broken back in those early days. There's a well-known story that can't be fully explained in public: the red team had found a particular problem that scared everybody to death across a really large swath of the government, and my blue team, I guess, was tasked to go fix it.
Tony: So, I go to the Navy captain: "I know something about this problem. Can my team get briefed, so they know what to do?" He goes, "Sorry, you're not authorized to hear the problem." I said, "But we're supposed to go fix the problem." He said, I know what you mean, but — and he really meant it.
He said, "No, the senior person at the Pentagon said close hold; no one else should know." And I said, can I get invited to a meeting with this guy? I can't believe he's an idiot. He wants this problem fixed now, and he wants NSA to bring every tool it's got to bear.
Well, I happened to meet this guy and became friends with him years later, and he laughed uproariously when he heard this story. He said, "Of course I wanted everybody that could fix this to know about it." But it was a sign of the times. The idea was: here's a problem in a live DOD system, we can't let anybody know.
What I also observed back in those days was that by over-close-holding, by holding the information too tightly, we were actually crippling the defense, not the bad guys. They know what we have and what we're running; they've done their homework and their reconnaissance. So, the question became: how do I openly share this in a way that makes sense and can really empower defense? Some of that was the reasoning behind what came next. I don't know if you've followed it; it's sort of a lost history now, but it was in June of 2001.
[18:06] Believing the Defense Department Is Never Going to Solve Security Risks
Tony: As an outgrowth of all the security testing, really clever people who worked in our group put together what we then called the NSA Security Guides: here's what we think is the best way to configure a Windows desktop, or a web server, a mail server, et cetera. I got permission to release that to the public in June of 2001, through the early days of nsa.gov. That was more than just being a nice person.
The idea was, given the background I was growing up in, that I thought the Defense Department was never going to solve this problem by itself. There was also very conscious activity to, for example, outsource a lot of non-warfighting functions to commercial suppliers, or to allies, or other people. Our perimeter was disappearing. There's no nice, neat, tidy "let's secure the Defense Department." It's, wait a minute, what about all our suppliers? No one gets paid in the DOD, no goods get moved to the right spot, without commercial partners. All these things were happening for all kinds of good economic reasons, but now you've essentially dissolved your perimeter. We have to help everyone get better.
That's just the nature of this universal connection, this network that we all live on. I was seeing this in the late '90s and early 2000s, and 2001 was a big watershed for us in putting this content out in public. I also wanted to open up what we were doing, and I observed at the time that open source was really taking off.
One Good Call
Tony: The idea was, the government isn't going to pay everybody to fix this; we're going to have to find market-friendly, community ways to improve everybody. If we wanted to be out there participating, we had to bring our share. Our share was: we've developed all this work, and we're going to put it out there. This is what we do, and this is our contribution to the cause. That's the way I thought of it.
If I made one good call in my career, that was it. I pushed this idea of, let's get outside the building, let's share what we're doing. We always got back more than we gave, and people appreciated it. Back then, it was mysterious for NSA to be out in public, participating in standards groups, providing content, going to industry events, giving talks.
It was, I think, a real sea change in the way we approached the mission. I had a part in that, based on what I described: realizing we're not going to solve this through the red team. That's just a data point that allows us to understand something that we then have to translate into action.
Eric: To this day, if you go to nsa.gov, NSA is still releasing information for the betterment of the public, where you can get information and understand how to better protect your systems and why. That's continuing today.
Tony: I watch that carefully too, because it feels really good. Many of the kids who came up in our group are the real leaders there now.
Security Guidance
Tony: It is so amazing to see all the great public-facing work they're doing out there: giving talks, attending conferences, participating. And watch the behavior in government: a lot more joint work between the folks at DHS and FBI, very openly and publicly working together around incidents, reports on what the bad guys are doing, advisories for this, that, and the other. I can't tell you how good that feels as an alum, to look back and see the momentum there.
As I mentioned, we released the security guidance to the public in 2001, which was also around the time a thing called Security-Enhanced Linux came out, I think in December of 2000, not from my group, but from the R&D folks down the hall. It was a race to see who could be more involved in the public.
The guy who was the chief there back then once said, "We were first, Tony." I said, yes, but we blew you away in terms of downloads; that SELinux stuff is only for the really serious practitioners, the mathematically minded folks. But the spirit of it back in the early 2000s was about openness and sharing, and how we translate all this into action.
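As an aside for readers, the security guides Tony describes were essentially configuration baselines, ancestors of today's benchmark catalogs. A minimal, hypothetical sketch of the idea in Python (the setting names and values below are invented for illustration, not taken from any actual guide):

```python
# Hypothetical hardening baseline: desired setting -> required value.
# Setting names here are illustrative only.
BASELINE = {
    "password_min_length": 14,
    "audit_logon_events": True,
    "smbv1_enabled": False,
}

def audit(actual: dict) -> list:
    """Compare a host's actual settings against the baseline.

    Returns a list of (setting, expected, found) deviations;
    a missing setting is reported with found=None.
    """
    findings = []
    for setting, expected in BASELINE.items():
        found = actual.get(setting)
        if found != expected:
            findings.append((setting, expected, found))
    return findings
```

The point of codifying guidance this way is exactly the shift Tony describes: a check anyone can run replaces a wizard's one-off judgment.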
Eric: It's 2022 right now; the adversary has clearly become more funded and more incentivized, and a lot more information is out there. You've talked about the government working better together. People have become much more knowledgeable; there are more wizards, and more different levels of wizardry. How are we doing, in your opinion? Is there anything you would've changed? What's working and what's not?
Capitalism in Action
Tony: It's a challenge, and everyone can find great examples of working together; there are lots of good stories. But some days, it feels like the problem is getting worse faster than we're getting better. I'm a bit of a hopeless optimist. You can't work defense for decades without being a hopeless optimist; you love that kind of challenge. There are great signs, great opportunities for working together, and great progress in a lot of ways. But for a long time, it didn't feel that optimistic.
I used to joke in my talks, and it wasn't quite a joke: if you want to see capitalism in action, look at the bad guys, not the good guys. For folks who follow criminality, you saw where the criminals were going because, classically, that's where the money is.
There's a quote from a friend that I use with permission, if you know the name: Shawn Henry at CrowdStrike. Shawn's a wonderful guy. He was, I think, essentially the first cyber executive at the FBI.
He was once quoted in the paper, and it was so good I called him up and said, did you really say this, Shawn? Can I use it with attribution? He said, of course you can. The quote, from long ago, goes something like this: "Anyone in organized crime who's not moving into this cyber stuff ought to be sued for malpractice." He said, "Of course I said it. Why would you run into a bank, draw a gun, and put your life at risk when you could do this over the wire?"
Eric: Couple keystrokes.
[24:34] The Rise of Criminality and Security Risks
Tony: Much less risk, much greater return. Then he said something like this, and I don't remember the exact words: even if you're not smart enough to do it yourself, you're certainly nasty enough to kidnap somebody and force them to do it for you. It's a reminder that this is not going away. This is where the money is, where our national treasure is, where your personal fortune is, or whatever information the bad guys want. This is going to happen no matter what.
Then you saw the rise of criminality; we're in the midst of it. You also see things like specialization. It's very Darwinian, only the strong survive. You see natural specialization: tool builders, reconnaissance, how do I monetize this? What we think of as economic optimization is happening right across the criminal enterprise.
Eric: That's exactly what's happening.
Tony: Then you look at that and think, wait a minute. I gave an admittedly dated example of how we, with good intentions, did not share potentially useful defensive information. All we did was cripple the defense, not the bad guys. We're still struggling with some of these issues across our own industry, around the sharing of incidents. A lot of things we've gotten over. In the early days, there was this embarrassment of "we've been hacked." Now, if you haven't been hacked, you're not paying attention.
There's a lot more acceptance of the problem and awareness that there's a common problem and therefore a common good to be had here.
Lists of Recommendations to Handle Security Risks
Tony: We have seen progress in that, but I'd say there's still a way to go in recognizing what might be the most useful things for us to do socially, so that we can deal with this problem at scale. These are classic community-wide problems. No company gets to solve this problem by themselves or for themselves, and the government can't solve it by itself or for itself.
It naturally leads us to these systemic, complicated problems where you don't get to change one thing. You have to change a lot of things in order to make any progress. To take us back to the topic of the Cyber Safety Review Board, that was one of the issues we were grappling with.
You have lists of recommendations, but a lot of these are systemic kinds of things. They don't neatly decompose into do A, then B, then C, then D. It's a lot of things, and a number of them are long-term efforts. By the way, we've been staring at some of these long-term things for decades, and they don't age well; someday, you have to get started. We have to see how we take some of these lessons and translate them into things we do short term, but also things we need to do long term.
Eric: Going back to the Cyber Safety Review Board, we've got 15 distinguished members of the board that picked Log4j as the first event. Would you call it an event or what would you call it? The first activity may be to review.
Tony: Yes, I forget what we called it exactly; I'd have to glance at it. A review of the event, we called it.
How Do We Get the World to Listen?
Eric: The first major event. From what I've heard, one of the nice things about Log4j is that it wasn't like Sunburst, where there were specific companies involved. It's all-encompassing; it hits everybody. We had a friend of the show on, Mr. Ford, who said, "Log4j is going to be the most cataclysmic cybersecurity event of our time." That was his prediction back in December of '21, when it first came out.
That was based on its prevalence in Java code; pretty much everybody was using Log4j. So, Log4j gets picked, the team goes out and does the research, engages a massive part of the industry, and comes back with an understanding of what happened and how it happened.
I think it breaks down into a couple of sections: the factual information, the findings and conclusions, and then the recommendations. There are 19 recommendations, and they're pretty good recommendations, in my opinion. So how do we get the world to act? Going back to the red-teaming problem: we understand what's happening now, and nobody does anything. How do we get the world to listen?
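For readers who weren't in the trenches that December: a large part of the early Log4j response was simply finding the library. A hypothetical sketch of that hunt in Python (the version floor and file-name pattern are simplifying assumptions for illustration; real scanners also unpack fat jars and wars, and remediation guidance should come from the Apache advisories, not this sketch):

```python
import re
from pathlib import Path

# Treat 2.17.1 as the "safe" floor here; an assumption for illustration.
PATCHED = (2, 17, 1)

def parse_version(jar_name: str):
    """Extract (major, minor, patch) from a name like log4j-core-2.14.1.jar."""
    m = re.fullmatch(r"log4j-core-(\d+)\.(\d+)\.(\d+)\.jar", jar_name)
    return tuple(map(int, m.groups())) if m else None

def find_vulnerable_jars(root: str):
    """Walk a directory tree; return (path, version) pairs for
    log4j-core jars older than PATCHED."""
    hits = []
    for jar in Path(root).rglob("log4j-core-*.jar"):
        version = parse_version(jar.name)
        if version is not None and version < PATCHED:
            hits.append((str(jar), version))
    return sorted(hits)
```

Even this toy version hints at why the event was so painful: the answer to "do we run Log4j?" lived in thousands of directory trees nobody had inventoried.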
Tony: Just to clarify one thing: the board doesn't pick the problem to be reviewed; it's a tasking given to us. I believe in this case it came from Director Jen Easterly. There was discussion about appropriate problems, but at the end of the day, it's not the board that chooses.
Eric: So CISA says, I want you to go out and I want you to help us with this?
Important Problems Around Security Risks
Tony: A choice is made, and there's a legal mechanism in the charter that allows that to happen; the board doesn't choose. There's a group of industry folks and a group of government folks, with support and infrastructure provided by CISA to help us with the research, scheduling, and all that kind of stuff. I will say, when Log4j came up, I thought, that's interesting, but I'm not sure it'll be that great an example.
It turned out to be a really great example from my perspective.
There are a couple of reasons for that. One is that it was so all-encompassing. It was a chance to look at a lot of things, both in their immediate sense and as symptoms of larger problems and challenges that the nation, or the whole economy, has around this. In hindsight, as we got toward the end, it was a really rich and interesting discussion; it let us talk about a lot of really important problems.
I think the choice was wise. It certainly wasn't my choice to do this one, but in hindsight it really felt right to me. Also, thank you, Rachael, for mentioning you'd looked at my blog. I deliberately wrote a prequel before we started, because I wanted to say: what do I think we're going to see? Not to prejudge or be biased, but to look at it with a historical eye. I think I said something like, I'll be startled if there's some technical result that we haven't thought about many times before. That's not to downplay the importance or the relevance of the problem.
Tony: My impression is we've seen a lot happen in this industry over decades, and really bright people have looked at it; it's hard to imagine some unique thing appearing. I think that turned out to be true. I also said we were going to see illustrations of really deep, systemic, universal problems, and that was certainly true as well. To say a word or two, the deliberations of the board are private to the board.
But I can share with you just what an amazing group of people it was: lively discussion, and great cooperation, by and large, from the people we asked for information. The rough model was the National Transportation Safety Board, which is long established. That's good, because it gives you a visual, an image to work toward.
It's also bad, because it doesn't fit a hundred percent in this cyber business. There's no wreckage to be reassembled. You're not going to find a violation of an accepted building code, or a weakness in the rivets, or whatever turns out to be the cause. In transportation, when you reconstruct an accident, you can either reinforce or get new information about all the mechanisms that manage risk; that element has evolved there over a long time, and it doesn't exist here in the same way.
Here we have something much more complicated: a gazillion variables, lots of behavior, no accepted standards for how a piece of software is written or how it's operationalized. It's a messy problem. That's the downside of leaning on an image like that, but the board did a great job of pulling it all together. I also said in the prequel that what would be really valuable is the chance to study something in detail and put it into context. We're all getting a little burned out on this cyber fatigue thing. Log4j, whatever.
[33:41] More Practiced and Thoughtful Eye to Look at Security Risks
Tony: There'll be another one next month, next week, or next quarter that will consume our attention, with a massive amount of information flowing in from threat analysts, the industry, and the government. We'll be flooded with it, people will scramble, and they'll be on to the next one. It's rare to have this collection of people, with this national attention, spend a little extra time digging into one of these.
So, I really enjoyed being part of that. That's why I agreed to be a part of this activity because I just thought it would be a rare opportunity to really look at something.
Not leisurely, but with a more practiced and thoughtful eye on what is really going on here. What's the mix of things we need to do now, and things we need to get started on that have longer tails? I'm sure every person on the committee would say we stand behind the work, we're really proud of what we did. Not all of us cared or knew equally about every issue and every recommendation in there, but there was a lot of great learning.
I felt I learned a lot from lots of different folks, contributed a little to some of these ideas, and helped bring them in. At the end of the day, it's work by committee, but with a lot of interaction, a lot of great discussions, and a lot of really good intentions. Everybody there is someone with a significant role in the industry who feels a great sense of ownership about trying to solve these problems.
Rachael: Black Hat's going on this week. There were some folks talking about the report and one of the things that had come out of it with the software bill of materials, which I thought was really interesting. That's come up in the past on the show too. Asset management, asset inventory. How did you guys get to that perspective?
Tony: The bill of materials is one of the things that are in there. Here's my perspective on some of that in the report. I can't go into any details about this, but I can tell you my philosophy that drives it.
At the end of the day, we are all making high-risk decisions at multiple points, whether we know it or not, including the decision to integrate pieces. Modern software and systems today aren't really created from scratch; they're mostly composed of pieces. That's what allows us to have the scale and a worldwide network.
Eric: And have Log4J everywhere. It's a piece, everybody used it.
Tony: This is an amazing time to be alive in terms of information sharing and infrastructure. But a lot of that is empowered by the reuse of things: bits of code, the use of protocols. We can standardize the way things talk to each other. At a code level, the decision to integrate someone else's code, whether it came from a commercial source, open source, or a partner, is a pretty risky decision. But we often make those decisions. Either we don't realize we're making that decision, or we're making it with really weak information.
The Real Questions About Security Risks
Tony: SBOM was really aimed at the problem conceptually. That is, what do I know about the security, the integrity, the quality of something I'm bringing in, which could have many downstream implications for risks later? SBOM is about transparency. What are the pieces that made up the piece of code that I'm integrating that will be reflected later in other decisions?
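As an editorial aside, here is a minimal sketch of the downstream risk decision an SBOM enables: matching a component inventory (as it might appear in a CycloneDX- or SPDX-style SBOM) against advisory data. The component list and the "known affected" entries below are hypothetical examples for illustration, not real advisory data.

```python
# A minimal sketch of using SBOM-style component data to surface risk.
# The inventory and advisory entries below are hypothetical examples.

# Components as they might appear in a CycloneDX/SPDX-style SBOM.
sbom_components = [
    {"name": "log4j-core", "version": "2.14.1"},
    {"name": "commons-text", "version": "1.10.0"},
    {"name": "jackson-databind", "version": "2.15.2"},
]

# Advisory data: component name -> versions known to be affected.
known_affected = {
    "log4j-core": {"2.14.1", "2.15.0"},  # e.g., Log4Shell-era releases
}

def flag_risky(components, advisories):
    """Return components whose exact version appears in an advisory."""
    return [
        c for c in components
        if c["version"] in advisories.get(c["name"], set())
    ]

flagged = flag_risky(sbom_components, known_affected)
print(flagged)  # -> [{'name': 'log4j-core', 'version': '2.14.1'}]
```

The point of the transparency Tony describes is exactly this: without the component list, there is nothing to match the advisory against.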
Other people are going to take other pieces of code and pull them together with hardware, with the business mission of the thing, with what kind of data it's going to hold, and all that stuff. I always try not to get hung up on whether it is SBOM or not.
The real question is whether that mechanism provides me with information to make these risk decisions down the road. Then go at it from the other direction. There's a tendency in the industry, and I'm sure this will shock both of you, to over-engineer things. I won't give any examples because people's feelings would be hurt here. When in doubt about building a standard or a programming interface or whatever, we tend to, and again this is a natural conservatism in security, try to solve every problem in every use case. That's my favorite thing: if you take that approach, you get gigantic, ponderous things that the vendors will never build, or will only build if there's enough money in it.
Eric: I'm not a developer, but I'm just not smart enough to take the 452 recommendations, or the guidance that NIST gives out in a 300-, 900-, or 3,000-page document, on the proper way to do things.
Rachael: That's what we're calling The Fog of More.
Empowering Defense Against Security Risks
Tony: Thanks for the reference. The Fog of More was a talk I gave some years ago, 2012 or 2014. I was trying to come up with a way to convey it. Here's the irony, and I'm speaking as an old guy in this. It struck me that the problem in the early days was that I didn't have any defensive tools or any visibility. What am I going to do? There's nothing to help me defend myself. The problem today is actually the opposite. We have too many. It's not a lack of tools or training or certifications or threat feeds or anything.
It's that the typical enterprise is overwhelmed. We're not empowering defense, we're overwhelming it. You can't make sense of it all. There are lots of reasons for this. Bad guys are good at being bad guys. They're very determined. There's a lot to be done. The technology's risky. When it was designed, we weren't sure what it was going to be used for.
Who would've thought, the internet we have today, you would never have designed it this way? If you ever thought it was going to be the economic backbone for the future of our nation, you would've built something radically different.
The original internet was not designed to be the economic backbone of the future of our nation. It just came out that way. Here we are trying to retrofit security as we know it today onto a really complicated environment. We have a different problem: we're overwhelming defenses with giant control catalogs and really complicated risk management frameworks. People are overwhelmed; that was the point I was trying to make with that talk. The pun is on The Fog of War, a term that goes back to Clausewitz.
How to Simplify the Problem of Security Risks
Tony: The title came from, I turned to my bookshelf and I saw Lifting the Fog of War. It was a well-known book in the DOD back in the day about the early days of what we then quietly called the information age, and the availability of all this commercial technology. Remember when war fighting shifted to something you watched on CNN? Instead of government-unique, special sensors telling us what's going on, it becomes open public knowledge. That was the point I was trying to make there.
If you don't mind a slight branch here, this really started to chew on me toward the end of my career. How do I help simplify this problem, as opposed to producing more information that just adds to the confusion? Even the things I mentioned: I'm very proud of our team's release of the NSA security guides to the public, but in some sense, that's more fog.
Do I trust the NSA guides, or the NIST checklist, or the CIS benchmarks, or the vendors' checklists? How does anyone sort through all that? That's one of the origin points of the work that I'm still involved with at The Center for Internet Security: the CIS Controls.
There's a bit of a comic book, superhero origin story of that. This is now a worldwide thing. I’d love to tell you, I had this grand vision of the future and what was important, but of course, none of that's true. I was just struggling with this. Once I released, or once we released that stuff to the public, that actually started my public speaking career, much to my surprise.
[42:12] Character Flaw Reveal
Tony: One does not go to NSA to speak in public, but we kept getting requests for interviews and that kind of thing. I found myself out there, proudly telling the story of why we did this, what we did, and how we were here to help. People started to ask me a question that I'd never really pondered. This is so embarrassing, but I'm going to share it with you guys and your audience anyway. They would say things like, that's great, thanks very much, Tony. What do I do first?
Eric: Where do I start?
Tony: Here's my character flaw revealed to you. See, I was never responsible for fixing these problems. "What do I do first?" is the question from someone who owns the problem and has to solve it, because you've got the boss, you've got a budget, you've got regulators looking over your shoulder. I'm going, well, we told you what to do. There's this stuff and the NSA stuff. And they're going, yes, but what do I do first?
I can't read thousands of pages of stuff and decide what applies to me. Wow. You know what? That's actually a great question. At some point I came back and, true story, I gathered five friends, and most of my friends are pretty extraordinary. These are not ordinary people; these are the tech leaders of the red team and the blue team, really smart people.
Here's what I told them: "No one leaves the room until we all agree on a small number of things that all of our friends should do." I would not allow them to solve world hunger or achieve peace in our time at this meeting, because that's what security people do.
The Core Foundation
Tony: You put us together and you'd have about 20 flaws and things to do. "I know five more." "Well, I'm going to one-up you." That's how you get these gigantic recommendation lists and so forth. No: a small number. To me, small is five to seven. Five to seven things; argue until we get to the five to seven.
Eric: To me, it's one to three. Does that say something about me? I think you missed some of your audience there if I was in it.
Tony: I couldn't get them to agree to five to seven, just to let you know; we came up with a list of 10. The list of 10 was on two pages. It was a letter that went out, and I sent it to friends at the Pentagon and the CIA, in the Air Force, and so on. All it said was: based on our experience, if you don't know where to start, start here. What's the core, what's the foundation? I was trying to be helpful in a low-key way, because I just wanted to answer this question: what do I do first? We sent it and never thought another word about it.
The team leader peeks in my door one day, a couple of months later. He goes, do you know the name Alan Paller? The late Alan Paller was the founder of the SANS Institute. He leans in my door and he goes, Tony, Alan got ahold of our list and wants to know if he can build a community service project around it.
A Starting Point
Tony: If you're familiar with the SANS Institute, they do nothing on a small scale. They're a big machine, and this was a two-page thing. I said, of course; it wasn't classified or anything, obviously. Let me talk to our lawyers to make sure that we can stay involved and be part of the team. It turned from a two-page list into a thing with classes and conferences and posters and all that kind of stuff. The whole theory of the case, the philosophy, is the important thing. Can I simplify? To simplify, you don't want to just be simple. You want to be simple with meaning.
The goal is not to give something trivial to people, but it's a starting point. How do I build, what do I get going on? So, you want it to be intellectually defendable and part of a bigger stream. No 10 things are going to solve world hunger or peace in our time or cybersecurity, but you want to get them on the right path.
That was the notion there, and it's what we have tried to preserve. It's really a bedrock philosophy for what we're doing here. You have to believe the philosophy, you have to believe the principle, and not everybody does. Wizards don't.
I grew up in a business where everybody's a special snowflake. It's the thing we learned in kindergarten or whatever: you're special. Yes, we are, but in cyberspace? My firstborn is an English major. She tells me my grammar is off.
What I say is, in cyberspace we all have more in common than we do that's different. We're essentially using roughly the same technology on the same network. We are interlocked in these complicated business relationships that we don't even understand and it's constantly changing.
The Nature of the Attacks and Security Risks
Tony: The bad news is we have this shared problem. One vulnerability in a Log4j suddenly ripples across a gazillion machines, and so that's bad, the shared problem.
But the shared problem is also a chance for shared action, for the shared opportunity to say, but some of these problems are not unique to you. They're in fact systemic. They affect everybody. That gives us the opportunity. If you're an optimist, like some of us, that is to focus your attention there.
I think every responsible analysis that we could do back in my day asked: what is the real root cause, the nature of the successful attacks that we have seen?
I think this is a classic 80/20 rule kind of thing. Most attacks we see are actually based on a relatively small number of the same things, over and over again. I've said that before. The goal is: can I identify a small number of things that give me good value, that provide the foundation for defense? That's really the tricky exercise here. If you'll pardon one more riff: we talk about "share" as the most important verb in cybersecurity. Share incidents, share this. I get it.
But I'm going to sound like a curmudgeon here. So, forgive me.
Translate Instead of Share
Tony: I have a talk I used to give, and the title alone will get you fired in the DC Beltway area: sharing is overrated in cyber defense. It's not because I'm a bad person or a curmudgeon, but people forget.
We don't share just for sharing’s sake. The point is to share so that people know how to take action. What should I do? I contend the real verb that matters is not share, it's translate. How do I translate millions of data points about badness into a relatively small number of positive, constructive things that you can do about it?
That's the theory of why you would take all this threat information in, subscribe to threat feeds, and read all these threat reports. But the average enterprise, in fact the majority of enterprises, doesn't have a threat analyst on staff. They can't read these threat reports. You can send them all the indicators of compromise and such; they can't do anything with them.
Eric: They don't know what to do with them.
Tony: The "translate" verb is the one that really matters. Can I pull together experts and translate millions of bad things into a smaller number of positive, constructive things? You have to stretch a little bit, but this is like public health. You can't absorb all the data about pandemics and the transmission vectors of disease. We have to translate that into behaviors: wash your hands, get your shots, avoid these kinds of encounters or locations.
Those are translations of fancy science and really smart people in the back room into behaviors that we can all understand.
People of Goodwill
Tony: Sometimes science changes, and therefore we translate it into different behaviors. Remember "wash your hands," and then it became: how long do I wash my hands? Is it mostly airborne or is it contact-driven? Do I have to wash my groceries? Again, it hasn't been a very efficient process, say, relative to the current pandemic. In fancy words, those are a translation of science into behaviors.
That's what we do in many other endeavors. Anyway, there's this notion of having grown up in big government agencies. Then I had the chance, which was unusual at the time at NSA, to work with lots of friends in the industry, make a lot of friends, and do a lot of cooperative voluntary work. But I really saw it in action, especially here at The Center for Internet Security: the power of volunteerism, the willingness of people to contribute to a common cause and create.
When I first spun out into the earlier nonprofit that we merged into CIS, my partner then was Dr. Jane Lute, who was number two at Homeland Security. We were talking about workforce issues. I said, "Jane, my experience in the industry is that there are so many great people of goodwill who will contribute their time. All they need is a little bit of direction: to point them in the right direction, to create a product, to set a deadline, to manage it.
If we could just provide that, it's a really powerful engine in and of itself. We don't have to be a big company; we can be a modest-size company that really helps provide this."
[52:07] Simple and Actionable Steps Against Security Risks
Tony: The tagline I use sometimes is if you really want to get the best value from free labor, it doesn't come for free. You need a professional team to organize it, bring it structure, publish the documents, manage the interaction with vendors, and all that kind of stuff.
Eric: You were talking about sharing, and then you said really, it's "translate," because sharing for sharing's sake doesn't help. I think there's a piece there that I'll call simplify. I'll use a great example from CISA. What's come out of DHS lately, with Chris Krebs and Jen Easterly and their teams, has been magnificent. Getting back to the basics: Shields Up, simplified. Making it simple is the most impactful thing for me. This isn't a CISA commercial, but maybe it is.
The four things you can do to keep yourself cyber safe, I think I've used that with companies. I've used that with government agencies, with people, my father, Rachel, and multifactor authentication. Come on. Are you using it yet? Nope. Not yet but simplify. What do you do?
Multifactor authentication; update your software, and we're talking patching basics, enterprise or individual, it doesn't matter, update; think before you click; and use a strong password, use a password manager. They're simple and actionable, and everybody can do them. If you look at the Log4j report from the Cyber Safety Review Board, and at a lot of the work you've done over your career, they're basic things that can fix big parts of the problem. But I'd love your thoughts.
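As an editorial aside, those basics are mechanically simple too. Most authenticator apps implement time-based one-time passwords (TOTP, RFC 6238): an HMAC over the current 30-second window, truncated to six digits. Here is a minimal sketch using only the Python standard library; the base32 secret below is the RFC's published test key, not a real credential.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """Compute an RFC 6238 TOTP code from a base32-encoded shared secret."""
    key = base64.b32decode(secret_b32)
    # Counter = number of whole time steps since the Unix epoch.
    now = time.time() if for_time is None else for_time
    counter = int(now // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation per RFC 4226: take 4 bytes at a digest-derived offset.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# RFC 6238 test key (base32 of "12345678901234567890"); at time 59
# this yields "287082", matching the RFC's published test vectors.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", for_time=59))
```

The server runs the same computation with the same shared secret, which is why a six-digit code that changes every 30 seconds is enough to prove possession of a second factor.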
The Enemy of the Good
Tony: I a hundred percent agree with you. Security is an area, and I mentioned that natural conservatism, where if you have a career like mine, a certain level of paranoia gets delivered with it, because you get to see the worst. You see how systems get exploited, and I get it. But you can either let that empower you or paralyze you. Too often, it paralyzes: there's nothing I can do, I'm helpless.
Eric: Or it's overwhelming. To your point, I don't know where to start.
Tony: If we don't present it well, and this is what I realized, many senior decision makers, in uniform with stars on their shoulders, will say things like: I heard everybody, I accept the risk, we're going to go. Well, in cyberspace, you know that's not a knowing acceptance of risk.

It's a frustrated acceptance of risk, because these IT wizards won't help me, so they just go, and we become irrelevant. You might be technically right, but you haven't helped the mission. That's where I've tried to stay on the right side of that equation. Wanting to help people doesn't mean you make things perfect.
In security, and I know I'm not the first to say this, perfection has been the enemy of the good for a long period of time. You're not going to stop every possible bad guy, but do those four things, do the top 10, whatever, and they'll move on, going for easier prey. Nobody wants to be the lonely antelope at the edge of the herd; that's a crazy place to be. Take care of that kind of stuff. You're not asking people to execute rocket science.
Hard Problems and Security Risks
Tony: I'm a big fan of CISA and what Jen Easterly and her team are doing there. There's a reason why our podcast at CIS, which you mentioned, is called Cybersecurity Where You Are. You have to meet people where they are, not ask them to come to the wizard's side.
How do I translate, and simplify in a way? It doesn't mean you're solving it. You could take a view that said, that's the trivial stuff. I still know 10 ways around you.
If you do that, then you're the lonely wizard standing on the street corner, telling people what they're doing wrong. But if you want to help people make progress, get on the right path, make it tougher for the bad guys, and gain more confidence in what they're doing, then you have to meet them where they are. Speak their language and address their problems in a specific way, now. That's what CISA does: simplify the behaviors you ask of people while you work the infrastructure.
There are some hard problems that have to be solved, and you see some of them in that CSRB report: improving the security of open source and improving the way we buy software. Those don't happen in the next 30 days; those have long lead times and need support. But you can't overwhelm people, and you can't wait for those to get finished before you get to work. You want to get people headed in the right direction, in a positive, constructive, valuable way.
Until the Right People Showed Up
Tony: It is valuable to turn on MFA. Does it stop every possible threat vector? No, but it's matured to the point where it's a relatively low pain point and manageable by average humans. There's tremendous value there. We need to keep moving down more of those paths. I'm a fan. I wouldn't have agreed to serve on the CSRB unless I was a big fan of the work they were doing.
Tony: Oh yes. But there are always some holdouts. That's okay.
Eric: I'm working on her. Get off the TikTok videos and use multifactor authentication, Rachael. Come on, it's not that difficult. Okay, Tony, we're about out of time, but I'd love to ask one question. You've had an illustrious career. What's one thing you wish you or your teams had done differently or sooner?
Tony: That is a tough question. I don't know if I have a good answer for you here. At every stage, I'm doing the best I can with what I got.
Eric: But with hindsight being 20/20, say: if we had released that data in 1983 instead of 2001, we could have stopped the adversary for all that time. I'm making that up.
The Right Circumstances
Tony: Here's my experience. There were some things that I learned. Number one, I'm a terrible estimator. I'd think we could get all this changed in the industry in two years, and two years turned into five to ten years. Even the ideas that, looking back, I hit pretty well early, the time wasn't right for. It wasn't until later that the right people showed up, or the right knowledge or the right opportunity appeared, or the market shifted in a different way, and then it was their time.
There's a saying that every idea was once an idea ahead of its time, or something like that. I'm not saying I knew these things, but when I look back, I realize that maybe if I'd done something differently, it still wasn't the right idea at the right time. When I look back at my career, I had one chance to sum it up when I retired from NSA, somewhere in my retirement remarks.
I never wrote the one paper that would change people's minds, and I never invented anything useful. I think, truthfully and humbly, everything I did was a function of the multiple people who happened to come into orbit at the right moment in time. Maybe that's what it took. I'm not smart or dynamic or clever enough to have maneuvered the world in the way that I would've liked. I don't know that, even with hindsight, I could have said I'd done any of that.
[60:13] An Ongoing Theme on Security Risks
Tony: Everything seemed to happen when it was supposed to, even when I was ready for it earlier, it turned out the right circumstances weren't in place. I'm at peace with what I did. I think I did some useful things in my time, and I hope there are a couple more yet to go.
I've made some amazing mistakes in my time that I know I can't undo, but I learned a lot from them, and they really shaped the rest of my worldview. So, I feel okay. I was supposed to do that, and here I am. If I ever made a mistake that I regret, it's that there were a couple of people I didn't pay enough attention to.
Eric: I think we've all made that one.
Tony: I've fixed a couple. I've reached back to people in my life; I'm old enough to want to do that if something really bugs me. A couple of people have reached back to me, by the way, about some misunderstanding we had years ago that I thought was a simple thing and they felt bad about. They've come back to me, and I really appreciate that. It's a very human, warm thing to do for people. At my stage in life, I appreciate that, even if maybe they took it worse than I did or whatever. So, I'm okay.
Eric: I think those are very real, fair answers. Rachael, timing: you can turn on multifactor authentication now, or you can wait until you have a banking problem or something else. Don't wait until it's too late; learn from examples.
Tony: I have stumbled upon an ongoing theme here.
Obsessed with Algorithms
Eric: We're just trying to help her. It's a big one. Ben and the dogs talk TikTok videos. She spends a lot of time still doing that.
Rachael: I love TikTok. I'm obsessed with the algorithm.
Eric: Great discussion today. Thank you so much for your time, your wisdom, and the stories.
Tony: Thanks for having me on here. This is the best interview/podcast to be on. They're just great conversations. Maybe, there'll be another occasion, or maybe you'll be guests of mine sometime in the podcast.
Rachael: To all of our listeners, thank you for joining us for another amazing podcast discussion. Stay safe until next week.
About Our Guest
Tony Sager is an SVP and Chief Evangelist for CIS. He leads the development of the CIS Critical Security Controls™, a worldwide consensus project to find and support technical best practices in cybersecurity. Sager champions the use of CIS Controls and other solutions gleaned from previous cyber-attacks to improve global cyber defense. He also nurtures CIS’s independent worldwide community of volunteers, encouraging them to make their enterprise, and the connected world, a safer place. In November 2018, he added strategy development and outreach for CIS to his responsibilities.
In addition to his duties for CIS, he is an active volunteer in numerous community service activities: the Board of Directors for the Cybercrime Support Network; and a member of the National Academy of Sciences Cyber Resilience Forum; Advisory Boards for several local schools and colleges; and service on numerous national-level study groups and advisory panels.
Sager retired from the National Security Agency (NSA) after 34 years as an Information Assurance professional. He started his career there in the Communications Security (COMSEC) Intern Program and worked as a mathematical cryptographer and a software vulnerability analyst. In 2001, Sager led the release of NSA security guidance to the public. He also expanded the NSA’s role in the development of open standards for security. Sager’s awards and commendations at NSA include the Presidential Rank Award at the Meritorious Level, twice, and the NSA Exceptional Civilian Service Award. The groups he led at NSA were also widely recognized for technical and mission excellence with awards from numerous industry sources, including the SANS Institute, SC Magazine, and Government Executive Magazine.
Listen and subscribe on your favorite platform