Cyber Conundrum: The Higher the Wall, the Higher the Ladder
This week we are joined by Dr. Herb Lin, Senior Research Scholar, CISAC, and Hank J. Holland Fellow, Hoover Institution at Stanford University, and author of the book Cyber Threats and Nuclear Weapons. Herb shares his deep expertise in cyber policy and security to shed light on key questions that should be on everyone’s mind, such as “Why are innovation and cybersecurity opposites?” and “Why are we always behind in cybersecurity?”
He also breaks down why complexity is the enemy of security, cyber war vs. nuclear war, the three roads to ruin, and the role of a Chief Luddite Officer. Prepare for your mind to be blown!
[01:08] Is There Positivity on Our Path Ahead Despite All the Cyber Conundrums?
Rachael: We have Dr. Herb Lin. He's a senior research scholar for cyber policy and security at the Center for International Security and Cooperation and the Hank J. Holland Fellow in cyber policy and security at the Hoover Institution, both at Stanford University.
I also want to mention, for those out there, go to Amazon and look up his book, Cyber Threats and Nuclear Weapons, which breaks down the cyber risk to the US nuclear enterprise. It guides readers through the little-understood elements of the risk profile that government decision-makers should be anticipating. For example, what might have happened if the Cuban Missile Crisis had taken place in the age of Twitter?
Herb: Glad to be here.
Rachael: Well, we were talking a little bit before we got on, and I'm just going to open with it, Eric. I always like to ask our guests: do you have positivity for the cyber path ahead, and will we get ahead of this threat? And I have to ask you, Herb, why are we always behind in cybersecurity? Why are we always chasing?
Herb: Well, it's an interesting story, so bear with me. One of the things that we know about computers, which have been around for a long time, is that we always want them to do more. We want our computers to be faster, we want them to be easier to use, we want them to have more functions. We want to do more. It's always doing more, and that's highly understandable.
The Cyber Conundrum We Face When We Want Computers to Do More
If you look at the computers that you're using now versus the computers that you were using 10 years ago, that's true. They do a lot more for you, and those things are absolutely useful. Imagine what would've happened to our economy if we hadn't had video conferencing during COVID. If COVID had struck 10 years earlier, which it could have, our technology really would not have been in very good shape.
We would've adapted, but it would've been a lot worse. Everybody would acknowledge that. So the point is, we do a lot of stuff better now with computers. There is no question about that. But here's the bug.
When you program a computer to do something more, you have to add more programming. So the systems that you're building become bigger, more complicated, more complex. There are more ins and outs to them, the number of lines of code grows, and so on. And there is a saying among computer security people that no one really disputes.
Complexity is the enemy of security: there are more things that can go wrong, more things that you can forget about. You can configure a system in more ways, and one of those ways is going to be wrong.
And so, given that there are more things that can go wrong, the adversary, the bad guy, has more ways to get in and more things to take advantage of. And so this quest that we're always on for more functionality inevitably leads to systems that are more complex and hence less secure.
Nobody Buys Security
Herb: And so we're pretty good against the security threats of five years ago, but it's not five years ago now; the security threat has evolved too. So what has happened, and this is true at any given time, is that we're always playing catch-up, because we're always wanting more.
Our investment in security is always insufficient. Why is it insufficient? Because nobody buys security; they buy functionality. They want their systems to do more. I know how to make a system perfectly secure: I turn it off. If it doesn't do anything, it's perfectly secure, but it's useless.
Rachael: Unplug it from the internet.
Herb: So that's not the hard part. The hard part is getting to the point where you can use the system and still be secure. But nobody cares about security, because you can't tell the difference between a system that is working perfectly well despite the fact that the bad guys are attacking it, which is good cybersecurity, and a system that isn't being attacked at all, which is no cybersecurity.
You can't tell the difference. But people can tell that you can format this document better, or that you can now do a web conference. They know they can do that. So that's where all the incentives are. People buy functionality. The problem, fundamentally, is that security is a tax. It only costs. It's never a profit center.
Eric: It's interesting, Herb. I was talking to someone new in the industry or thinking about joining the industry last week. And they were asking me the differences between information technology and cybersecurity.
Preventing Bad Things From Happening
Eric: And I started my career in IT. I said the exciting part of information technology was that I was always building something. I was solving problems. I was making things better, faster, more interconnected, more seamless. I was reducing costs and enabling capabilities. In cybersecurity, you have none of that. At best you can say you're defending to allow those activities to happen securely or safely. But everything you're doing is really on the defense.
We're not in the commercial world hacking back. We're not attacking the adversary. That perspective was interesting to them; they had never thought about it. But you're absolutely right, it's a tax. It's a supporting function of a supporting function. IT supports the business, whatever business you're in, and cybersecurity supports IT to allow it to support the business. And I think, by definition, you can't get ahead.
Herb: I agree with that entirely. It's not that you're trying to do good things, you're trying to prevent bad things from happening.
Rachael: So you're always the defender though, Herb. And Eric knows I love this question, but what about more on the offensive?
Herb: Well, some people think that going on the offensive is the right thing to do. I would be all in favor of it if I could figure out how it would help. And sometimes it does help. For example, and this is speculation because I don't know what's really happening here, it may be that the wave of cyber attacks on Ukraine that everybody, including me, expected at the beginning of this war hasn't materialized. It's happened some, but not in the way in which we had expected.
The Right Thing to Do to Combat Cyber Conundrum
Eric: Not to the scale we thought, especially getting outside of Ukrainian territory.
Herb: That's right. And so maybe somebody unknown has been attacking the Russian cyber attack infrastructure to suppress it. Maybe that's working. There's at least one documented instance of this, in the 2018 election, where US Cyber Command was said to have taken down the Internet Research Agency, the Russian troll farm, and certain things didn't happen then. It's always hard to make the inference: did they make those things not happen? Who knows?
Eric: And that's part of the problem.
Herb: That's right. But it's certainly possible.
And so sometimes using an attack as a defense, to suppress an incoming attack, is the right thing to do. But often it's not, and that's a case-by-case basis.
But by the way, the psychology of offense is different from the psychology of defense. The psychology of defense is what Eric said earlier: you're preventing bad things from happening. But if you're successful on the offense, you can make something happen. You can actually see something happen. At least somebody will say, "Hey, you did a great thing."
Eric: It's like car insurance you never use. You have no idea if your insurance company is going to be there when you need them, how good they are, how quickly they react, or anything else, but you keep paying them.
Herb: That's right.
Eric: This conversation reminds me of a meeting I was in with Dmitri Alperovitch, of Silverado Policy Accelerator and CrowdStrike fame.
[10:37] The Adversary Always Has a Taller Ladder
Eric: He's been a frequent guest on the show. He had a slide, this was probably 2010 or so, we were at NSA in the Friedman Auditorium, 400 people, a sold-out crowd. And he puts up this slide that basically shows a wall and an attacker with a ladder going over the wall. He said, the bigger the wall, the taller the ladder. The adversary will always have a taller ladder. I think that's it. It's a different perspective, but the adversary can always bring in a new tool or technique or you name it.
Herb: And over time, they will get in.
Eric: They will if they're persistent.
Herb: Well, if they have to do it and they have a limited amount of time to do it, then that's a problem for them. They may not be able to do it in time. And you can certainly imagine situations in which the timing matters. For example, let's say I'm trying to use a cyber attack to turn off some air defenses so that my bombers can get in. If I turn off the defenses after the bombers get there, that's not very helpful. The bomber pilots will get really mad at me. I have to do it before the bombers get there. That's a different question, but you still have these issues that eventually they'll get in, if that's the goal.
Rachael: That's frightening. And you can't outspend the problem either. I was reading one of the articles you were quoted in, Herb, on the financial sector. I think it was one of the large banks saying that they spend a billion dollars a year on security, up from $400 million just a few years ago.
Financial Services Institutions Are Better Off Than the Rest of Us
Rachael: Obviously, financial services operations are doing better because they are making these investments, but it's not fail-safe.
Herb: That's right. And the financial services institutions actually are better off than the rest of us in doing this because they actually have a metric for understanding what they're buying.
So they lose money, and they can make a trade-off. They will say: is it worth a million dollars to save $10 million? The answer is yes. Is it worth a million dollars to save $100,000? The answer is no. And they can measure that, because their bottom line is measured in dollars. But we don't have that elsewhere. How do you measure that for electric power or a water treatment plant?
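Herb's point about the banks' advantage can be sketched as a simple expected-loss comparison. This is a minimal illustration of the trade-off he describes, not a real risk model; the function name and dollar figures are purely hypothetical.

```python
def worth_investing(cost: float, expected_loss_prevented: float) -> bool:
    """A bank's rule of thumb: a security investment makes sense only
    when the dollars it is expected to save exceed the dollars spent."""
    return expected_loss_prevented > cost

# Is it worth a million to save $10 million? Yes.
print(worth_investing(1_000_000, 10_000_000))  # True

# Is it worth a million to save $100,000? No.
print(worth_investing(1_000_000, 100_000))     # False
```

The comparison only works because both sides are in dollars; for electric power or water treatment, the "expected loss prevented" has no such common unit, which is exactly Herb's point.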
Eric: And the reality is, it really doesn't get measured as a result. I would say that even if they were able to measure it, they don't have budgets directly tied to revenue the way a bank or a financial institution does, so they can't do much about it.
Eric: So even if you did measure it, which they don't, you've got that problem.
Herb: I think that's right.
Eric: It's a great world we live in. So Herb, I want to transition over to your book, which, full disclosure, I have not read yet. It's in my queue, and I've got a huge queue to get through, but you're right at the top. Cyber Threats and Nuclear Weapons: how do you compare them to one another?
I've always thought about it this way. With nuclear weapons, at least up until recently, we had the concept of mutually assured destruction. We still have it, I think, although you never know how far people are willing to take things. With cyber, you don't really have that same level of deterrence.
Cyber War vs. Nuclear War
Eric: Additionally, nuclear weapons are hard to come by. They're very expensive, and they're few and far between when compared to cyber attacks. You can go out and buy malware on the market and choose your targeting, your timing, and everything else. If you really do want to do catastrophic damage to a nation's economy or businesses or something else, you probably can. How do you think about them, in similar and in different ways? How should we think about them?
Herb: Well, you're asking about a comparison, essentially, between cyber war and nuclear war. Let me comment on that, because I do have some background in both domains. That's not the subject of my book; my book has to do with the cyber threats to the US nuclear enterprise.
But what you're asking is a fair question, so let me comment on it. As you correctly point out, avoidance of nuclear war up to this point has been based on the idea of deterrence.
And deterrence is fundamentally a psychological concept. It tries to influence the other guy to not do something.
Eric: Because it would be in their best interest to not do it.
Herb: That's right. The idea is if you attack me, I'm going to attack you. And the fear of being attacked in return keeps you from attacking me. That's fundamentally the idea.
Eric: We don't have that in cybersecurity.
Herb: Well, you certainly don't have it at every level. For example, right now you and I are talking over a computer link. My laptop is open; you probably have desktops or something like that open there. And what's happening is that we are being subjected to cyber attacks all the time, right now.
Can We Deter Cyber Attacks at a High Level?
Herb: There are people, or actually machines, attacking our machines right now. Now, mostly our internet service providers protect us. I have some protection on my computer and so on, and you do too. So mostly those things bounce off, but there's still a whole bunch of attacks. So we are clearly unable to deter cyber attacks at a low level.
Can we deter cyber attacks at a high level? Well, that's an interesting question. Despite all the predictions about catastrophe and cyber Pearl Harbors and so on, we haven't seen them. We haven't seen the electric grid go down for months at a time over the entire Eastern Seaboard. Well, I used to live in Washington, DC. My electric power went out about two days a year. That is not great reliability, and I'm pretty sure there were no cyber attacks involved in any of that.
Eric: I think it's Potomac Edison now, but I'm with you. And I think that a lot of DC people are still experiencing the love.
Herb: So the issue is that there are small-scale attacks and so on, but maybe doing it on a large scale is really hard. We don't know why there hasn't been a large-scale catastrophe yet. Some people will say, well, it's because of deterrence: we've threatened other people enough that they don't dare do it. Well, maybe that's right. But it's, of course, very hard to tell.
Herb: Your comment about being able to create large-scale destruction with cyber over a long period of time, that's an interesting question. It turns out that's pretty hard to do. You have to assume that there are a couple of key nodes that you can attack and keep down and so on. Maybe that's a much harder process than people think. I'd say the answer is that people actually don't know how catastrophic it could be.
Now, it's possible to spin out a variety of scenarios, and I can't disprove them, because every step could be a plausible thing. But whether or not it's ultimately something to spend a lot of time worrying about is a different question.
Eric: So how do you then look at protecting nuclear weapons from cyber attack, really going back to the topic of the book? And why would a nation-state want to attack our nuclear arsenal?
Herb: Well, you could imagine that the bad guys, nuclear-armed nations, people who think that we might use nuclear weapons against them, might want to use cyber attacks to inhibit or disable or disrupt or degrade our ability to use nuclear weapons against them. That would be the incentive. For example, if our nuclear weapons were arranged in such a way that an adversary could press a button and they would all go down, that would be a good thing for them. So you could imagine that that's the incentive.
But we do not arrange our nuclear weapons like that. You can't just turn them off. The common conception is that you press a button.
The Cyber Conundrum in Turning Off a Button
Herb: The president presses a button, and then all of these missiles get launched. So the cyber attack problem becomes: how do you turn off that button? How do you cut the connection to that button? But that's not how the process works at all.
The president has to issue an order, and that order has to be formulated properly and so on. Then it has to be transmitted to the forces, and all those sorts of things. There are many steps along the way, and there are many redundant paths to get the orders out. To suppress nuclear weapons coming from the United States, an adversary has to get all of those places, every one of them. If they can suppress four out of five channels, the fifth one still gets through. So that's the fundamental principle.
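The value of those redundant paths can be sketched with a toy probability calculation. This is an illustration of the principle only; the function name and the probabilities are assumptions for the sake of the example, and real channels are not independent in this clean way.

```python
def prob_all_channels_suppressed(p_suppress_one: float, n_channels: int) -> float:
    """If each of n redundant, independent channels is suppressed with
    probability p, the attacker succeeds only when ALL of them are down."""
    return p_suppress_one ** n_channels

# Even a 90% chance of taking down any single channel leaves the
# attacker with only about a 59% chance against five redundant channels.
print(prob_all_channels_suppressed(0.9, 5))
```

Each added channel multiplies the attacker's required luck, which is why the order still gets out if even one of the five paths survives.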
They could also try to attack the delivery systems. If an airplane is trying to deliver a bomb to some place, and you could shut the plane down, then even if the crew had gotten the order to go drop it, the plane doesn't take off, for example. So that's a possibility too. In principle, there are all these ways of using computers as a vector for degrading and disrupting the operation of our nuclear force, and you can see how that would be a good thing for an adversary.
[21:49] Fixing Cyber Conundrum Through Command and Control System
Herb: How do we protect them? Well, that's a very interesting question. One part of the system is what's called the command and control system. This is the stuff that gets the orders out from the president to the forces in the field. The present-day nuclear command and control system is one that is not connected to the internet. It's actually really old technology. In fact, it's so old that very few people know how to hack it. They don't know how to program it, they don't know how to get parts for it, and so on.
Eric: Protection in itself.
Herb: And this is security precisely because the technology is old.
Eric: Security through obscurity.
Herb: That's right. And security through obscurity actually does work. Don't count on it, but it's one layer of protection. So I don't know if you guys are old enough for this, I am, to remember the eight-inch floppy disks. You remember the soft floppy disks that could bend?
The Air Force has been known to buy parts, I don't know if for nuclear command and control, on eBay. So a lot of the technology is really old, which means that people can't find the manuals for the parts and so on. All that helps. Now, you can't sustain this forever, so you want to have a new system at some point, which you're going to assemble. The old system also, by the way, is one where you have point-to-point connections.
Point A to Point B Cyber Conundrum
Herb: How do you get a message from point A to point B? You build a system for that. Then you build another system to get from point C to point B. And then what happens if A wants to talk to D?
Eric: It's got to go through B and C.
Herb: So you have all of those problems. So the plan now is to put this all on an internet-like network. What I mean by that is an IP-based network. It's not the internet, but it's IP-based, which means it has the same protocols for communication and so on. That means all of the tools that people have built for hacking the internet could, in principle, run on this, because it uses the same protocols.
So now you're no longer relying on the inability of those tools to run on this separate network for nuclear command and control. Internet-like means it's not really on the internet, but it uses IP. You're relying only on keeping people out of it. That's all. You get fewer layers of protection there.
It's probably the right thing to do conceptually, because we understand IP, and it's been debugged for a long time and so on. That's probably the right thing to do, but it introduces dangers. Those are the kinds of dangers that you have to worry about. Then there are the dangers concerning the cybersecurity of the weapons platforms that you're building.
Issues on the Quality of the Software
Herb: All you have to do is look at the reports about the software in the F-35, which is one of our dual-capable airplanes that we're going to be using to deliver nuclear weapons. There are all kinds of interesting stories about how the computer would continually reboot and the thing couldn't take off and so on. Those aren't cybersecurity issues, but they point to some issues about the quality of the software. There are millions of lines of code in the F-35, and how are you going to deal with all that? So there are all kinds of issues like that, and cybersecurity is going to be a big deal.
Rachael: Herb, while we're talking theoretical, and I've mentioned this before, Eric will know: do we need to go a little bit back to the caveman era? Do we just need to unplug from the internet?
Eric: You want eight-inch floppies?
Rachael: Hey, if we need to. If they're not going to invest in the security they need, let's say the utilities, because they don't have the money or whatever, what about just going back old school, where you take a person to manually do everything?
Herb: Right now I'm in the market for a high-end refrigerator. Do you know how hard it is to get a refrigerator that doesn't have Wi-Fi capability?
Eric: Do you know how hard it is to get a refrigerator right now?
Herb: Yes. I understand.
Eric: Due to this chip situation, you might get one without Wi-Fi temporarily.
Herb: I don't want my refrigerator to have Wi-Fi. You can get electric toothbrushes with Bluetooth connectivity, just what I wanted. Why? There's a lot of uselessness in this. Now, the problem is that it can be justified at every point in the system.
[27:28] The Cyber Conundrum of Going Old School
Herb: Yes, Wi-Fi is good for this. But on an appliance, I don't want that capability, because I worry about security.
Eric: And it's often hard to disable.
Herb: That's right. So you talk about going back to old school. Well, the whole point of having technology is that it makes your operations better, and that's fine. So I'm an advocate of something which I think a lot of people believe is inherently right, though I'll give you the objection to it as well. You want a bells-and-whistles system that does all the things you want, and you also want a separate backup system that does the simple stuff reliably and well, and that people don't touch. That's what you want.
And you have to have both systems available from the top. You need a system that will do all the bells-and-whistles stuff, all the optimization and all the new functions and so on. And then you want something that just carries out the basic function, and they have to be separate. But the problem is that's expensive. You're talking about building two systems.
Eric: Redundancy in aircraft, you're talking weight.
Herb: That's right.
Eric: In many things, you're talking additional complexity if one breaks, and training. I'm with you.
Herb: That's right. And how do you deal with that? The answer is, the right thing to do is to be willing to make that investment. But if that adds 40% to the cost of upgrading the nuclear command and control system, is anybody going to do it? So that's the kind of problem we have.
Eric: I've been in meetings on commercial products, which is not my background, where you're talking about a fraction of a percent in cost increase to put some level of cybersecurity capability into a product. I won't go into details, but organizations don't feel they can do that and remain competitive.
Herb: That's right.
Eric: Wi-Fi would be interesting; even if 80% of buyers never connect it to their network, it's a selling feature. It's future capability. It gives the business the ability to pull out marketing content, what's in the refrigerator, how often it runs, support and service information, and the like. But cybersecurity? If you add a dollar to a refrigerator for security, that could be a deal-breaker for an organization.
Herb: That's right.
Eric: And a whole separate redundant network, or a command and control network, versus a nice-to-have: hey, I can see pictures of the food in my fridge or order something from the grocery store when my milk is low. You're right, it's 40% maybe, or 10%.
Herb: I didn't say it was politically feasible. But intellectually, it's clearly the right thing to do.
Rachael: You're involved in policy, and I'm fascinated by policy. Do we need to get to a place where things like this are mandated for businesses?
Eric: Wi-Fi refrigerators?
Herb: Well, here's the problem: there would be a revolt, a commercial revolt by vendors, if you said you cannot include certain functionality. That's ridiculous; you can't say that. Don't innovate? You can't tell people that. So what you have to do is create market incentives to pay attention to security as well. Right now all the incentives are on more features, more functionality.
There’s Never a Trade-off
Eric: It's the same old IT argument I was illustrating to that relatively new entrant into the industry. We're always building more, but you have to secure it. Or I guess you don't have to secure it. You probably should, though.
Herb: Well, that's the point.
Eric: You probably should.
Herb: You should, but you don't and mostly you'll get away with it.
Eric: And most people don't secure things adequately.
Herb: That's right.
Rachael: Well, I go back to the beginning of social platforms, and it was this great idea. No one really understood it, and then fast-forward 20 years later, and we're like, holy cow, they have all my personal information, and stop stalking me because I searched for a pair of Adidas sneakers. Is there a level of responsibility in innovation where maybe you need to take a step back, security by design, things like that? You can't innovate just for innovation's sake.
Herb: Sorry, security by design isn't even the issue to me. A lot of people say it is, and it's certainly better than what we have now, which is to build the system and tack on security afterwards. It's better than that.
Eric: The bolt on approach.
Herb: But even security by design doesn't solve the problem. Why not? Because the security people never get to argue with the people who are demanding functionality. What happens in practice is that the people who want functionality say, here's the functionality we want. And then they say to the IT people, do the best you can on security. There's never a trade-off.
Eric: If they say that.
Herb: That's right. Then they will start to try to bake it in.
The Cyber Conundrum in Innovation
Eric: The product manager for the refrigerator you may be buying probably said, "We must have Wi-Fi because our three nearest high-end competitors do. We're putting Wi-Fi in here." And I would be willing to bet at least a nickel that they never talked to anybody and said, "How do we secure this?"
Herb: That's right.
Eric: Because I've seen these devices come out.
Herb: That's right. And nobody asks whether or not the security cost is worth it. What you have to have is somebody in the security business, in the C-suite, who's arguing, who says the risks of this are too high. Now, of course, what this means is that the security person has to know something about the business, and the other C-suite people have to know something about security. How many times does that happen? So that's the problem.
Eric: Rachael, to your comment from a couple of weeks ago on the show: do we need to hire a business person who can bridge that gap? My argument was absolutely not, that's ridiculous. Everybody should be thinking about how they bridge that gap, in every role.
Herb: They should be. They absolutely should be. Whether or not they will is a different question. So hiring this one person may be better than nothing, but it's not the optimal way to do it.
Eric: So I'm going to ask the question Rachael hasn't asked you yet: how are innovation and cybersecurity opposites? Or why are they opposites, which is maybe the better question.
Herb: Because innovation demands complexity, and complexity means less security.
Eric: And we'll take that innovation, we'll take new capabilities and features, over security.
Putting a Thumb on the Scale
Herb: That's what Silicon Valley is built on. So how do you put a thumb on the scale on the other side? My answer to that, which unfortunately is very unpopular among my Silicon Valley friends, is that you have to impose some sort of liability on them: if there are security problems, you have to be responsible for them in some way.
We could argue about the nature of the liability: how much, whether it has limits, under what circumstances. But the fundamental point is that they have to be responsible. We have to find some way of holding the firms accountable for it in a way that doesn't impose outside regulation that says you must have this feature or you must not have that feature. That you can't sustain.
Eric: But I thought you were going down the regulatory or compliance route until you said that.
Herb: I know. In the end, the problem is that cybersecurity is so complex that direct regulation, the kind that says you must have 25-character passwords or whatever it is you're going to mandate, is not sustainable. You're always going to be behind. You need to be in the marketplace to be affecting this.
Eric: So what about regulatory controls, and I have no idea how you would even construct this or write it, that say you must do X amount regardless, not specifying technology, but to protect your customers? Or there's a liability on you; we'll stick with refrigerators. As a product manufacturer, there's a liability component if you put out a device that becomes compromised.
Innovation vs Security
Herb: If you include liability as part of a regulatory apparatus, then I'm willing to say fine.
Eric: It almost has to be on that liability side. I'm thinking of how we're seeing liability starting to get pushed to gun makers and things like that. Well, they didn't do anything wrong. I don't want to get too political here, but they're liable for the product they created and any harm that may come of it. You'd almost do the same thing with a refrigerator.
Herb: That's right, something along those lines. Now, as I say, we can debate how much liability, under what circumstances, the nature of it, and all that sort of stuff. And I am happy to engage in that debate, but first I want to establish the principle that they are, in fact, liable. They have the responsibility, and they need to take some responsibility in the marketplace for lack of attention to cybersecurity.
Eric: So, Rachael, would you agree that innovation outruns security every day of the week?
Rachael: Outruns security?
Eric: Well, it gets weighted more heavily, and people will choose innovation over security.
Rachael: Always. It's bigger, better, faster. To Herb's point, it's always going to win, and then we'll figure it out as we go. I think that's the prevailing thought: we don't want to stymie innovation, apparently. But sometimes I just wonder, maybe you have to go back to the Stone Age a little bit. I kind of like it. At my grandmother's house, where I'm staying, she has no call waiting. So if I want to call and talk to my mom, nobody is beeping in. It's kind of nice.
[38:25] Finding the Social Consensus to Combat Cyber Conundrum
Herb: There is an old saying, which I wish people would pay more attention to, that goes something like: there are three roads to ruin. Sex is the most fun, alcohol is the fastest, and technology is the most certain. People ought to internalize that.
Eric: Nice. And we've had alcohol and sex since the beginning of time. Well, I guess we've had technology if you think about things like fire and harnessing water and everything else. We've had that also. Rachael's writing down show notes right there. Anytime you mention sex and alcohol, it's guaranteed to be in the show.
Eric: So where do we go? What do we do?
Herb: My sense of it is that we have to find some social consensus that you have to hold the technology firms accountable for harms that were reasonably foreseeable. And then you can argue about what's reasonably foreseeable. I'm happy to get engaged in that argument. Other people, you and I will have different views on that. That's okay. But I want to establish the principle that under some circumstances the technology companies and the users of technologies ought to have some responsibility for providing a safer and more secure cyberspace.
Eric: I agree with you. And the older I get, the more time I spend in the industry, the more I see alternate mechanisms to deal with cybersecurity problems: diplomacy, for one. We had a show on cryptocurrency a couple of months ago, I guess, on the role of the SEC and the financial components of it.
A Multifaceted Societal Problem
Eric: What I find is that people who have not been in the industry very long, or really working the products, just want to make better and better products every time. But I do think this is a multifaceted societal problem. It's not just whether the defender's technology can catch up, or try to catch up, with the offensive side. It's got to be a whole-of-world problem.
Herb: Yes. I think that's right. What's interesting is that a lot of people talk about this. As they say it has to be a whole of government problem. No, it's a whole of society problem.
Eric: Well, exactly. And society, not just the American or the Chinese or the European society, the world society. It almost goes back to the mutually assured destruction concept, except it's now the whole world instead of Russia and the US or Russia and the US and China.
Herb: I'm not sure destruction is the idea, but the fact that we all have a stake in it. Everybody has a stake in it. That's absolutely right.
Eric: But will you ever get the whole of the world to come together? It was hard enough in the Cold War to get two superpowers to agree not to end the world. How do you do it is probably the better question. How do you get the whole of the world together to say we are doing this for global society?
Herb: That's right. And I think that's really tough. That's a great question.
Eric: Does it take a catastrophic event or does that not even matter?
Rachael: Well, look at what's going on in Ukraine and how the global response has come together. If you characterize that as a catastrophic event, then it could show some interesting indicators.
We Need People Who are Technology Skeptic
Eric: And we got most of the world, I guess.
Rachael: One would hate for it to come to that. You don't want it to come to that, but as we know with cyber, typically the response or the investment comes after there's been a significant event, unfortunately.
Eric: Herb, thoughts as we are wrapping up here?
Herb: Every cybersecurity study that I've been a part of has wound up at this point of saying, well, we need a really catastrophic event to catalyze action. I'm no longer even convinced of that. I used to believe it. And to me, if we haven't seen it already, I'm very pessimistic about people actually taking any action. I think this idea of more and better technology is way too powerful.
Eric: I agree with you. For the last decade plus, people just buy technology because they feel it's their job. It helps them get a new job, whether it's effective or not. We just deployed whatever the latest cybersecurity widget is, the high speed, low drag, very exciting widget. I can now go to a bigger company and say, "Hey, I just did this here, I'll do it for you and make more money." The incentives are all misaligned.
Herb: I'm not sure I actually believe what I'm about to say, but you could see some benefits to it. Maybe there ought to be in the C-suite a Chief Luddite Officer. Somebody whose job is to say, no, we shouldn't get that, we shouldn't adopt this technology, and to have a good and structured argument as to why we shouldn't go down that path. That's not an original concept, but you get the idea. You need people who are technology skeptics.
Being Skeptical About Technology
Herb: I have a PhD in physics from MIT, and people say to me, "How can you have a PhD in physics from MIT and still be skeptical about technology?" And I look at them and I say, wait a minute, how can you have a PhD in physics from MIT and not be skeptical about technology?
Eric: I'm the same way. Pick your technology, it can be used for good or for harm. Something as simple as water or fire, which we've talked about on the show. So I'm with you, Herb. I think you can understand it better if you're more well educated on the topics.
Herb: And you can understand how to screw up.
Rachael: Exactly. I like that, Chief Luddite Officer.
Eric: We're bringing the eight-inch floppy back? You might as well bring back the five-and-a-quarter and three-and-a-half inch while you're at it, and don't forget the Zip drives.
Rachael: We'd love to have you back on again, for sure because there's so much to unpack here. I look forward to reading your book. I've already ordered it on Amazon. And I think after we read it, we should have Herb back on and have some even more in depth conversations on Cyber Threats and Nuclear Weapons.
Eric: Herb, how are you thinking about the wifi refrigerator? Where are you going there? How far have you gone? I don't need a brand name, but how are you thinking through that process at this point?
Herb: My first concern is that I want to be able to get it without the wifi. The second is I want to be able to turn it off. That's easy. But I haven't given up yet on not having the functionality in there at all.
The Most Important Source of Problems Is Solutions
Eric: So I just hooked up an electric vehicle charger at my house that requires wifi so they can do meter monitoring and I get discounts on my electric bill. And I was like, oh, I don't want to do this. So I hooked it up on a separate network.
Herb: So that it doesn't affect the rest of your house. Yes, all those things are good things to do, but at least you are thinking about it.
Eric: But it's got to be on a password-protected network. It's got to be on this type of network. It gets closer and closer to my home devices and networking system every time. And they're just forcing me in a direction that Rachael would not want me to go in her new role.
Herb: Somebody once said that, I can't remember who it was, the most important source of problems is solutions.
Eric: I like that. I may use that in my next meeting when the CFO is banging on my head. Anyway, thank you for the time today. This has been fascinating.
Rachael: Thank you so much, Herb, for joining us. This has been a wonderful Monday morning conversation. To all of our listeners out there, thank you so much for joining us yet again. As always, don't forget to smash that subscribe button. Then you get a fresh, very fresh, piping hot episode delivered to your email inbox every Tuesday. So until next time.
About Our Guest
Dr. Herb Lin is a senior research scholar for cyber policy and security at the Center for International Security and Cooperation and Hank J. Holland Fellow in Cyber Policy and Security at the Hoover Institution, both at Stanford University. His research interests relate broadly to policy-related dimensions of cybersecurity and cyberspace. He is particularly interested in the use of offensive operations in cyberspace as instruments of national policy and in the security dimensions of information warfare and influence operations on national security.
In addition to his positions at Stanford University, he is Chief Scientist, Emeritus for the Computer Science and Telecommunications Board, National Research Council (NRC) of the National Academies, where he served from 1990 through 2014 as study director of major projects on public policy and information technology. He is also Adjunct Senior Research Scholar and Senior Fellow in Cybersecurity (not in residence) at the Saltzman Institute of War and Peace Studies in the School of International and Public Affairs at Columbia University, and a member of the Science and Security Board of the Bulletin of the Atomic Scientists.
In 2016, he served on President Obama’s Commission on Enhancing National Cybersecurity. Prior to his NRC service, he was a professional staff member and staff scientist for the House Armed Services Committee (1986-1990). His portfolio included defense policy and arms control issues. He received his doctorate in physics from MIT.