

2021 Insights And Predictions, Part 2


We continue our review of 2020's top government cybersecurity trends, such as dealing with insider threats and baked-in AI bias. Mike Gruss, Executive Editor of Defense News and C4ISRNET, and Phil Goldstein, Senior Editor for FedTech and StateTech, share their 2021 insights and predictions in cybersecurity.

Episode Table of Contents

  • [01:38] Successful Cybersecurity Strategies
  • [07:52] 2021 Insights and Predictions About the Not-So-Well-Intentioned AI
  • [14:39] 2021 Insights and Predictions on How Government Takes the Lead
  • [20:48] 2021 Insights and Predictions on Technology and Security
  • [27:01] Rapid Fire Questions With Mike and Phil
  • About Our Guests

Successful Cybersecurity Strategies

Carolyn: We're going to pick up where we left off last episode with our 2020 review of cybersecurity trends, along with our 2021 insights and predictions, with Mike Gruss, Executive Editor of Defense News and C4ISRNET, and Phil Goldstein, Senior Editor for FedTech and StateTech.

Carolyn: Where we were headed last week was insider threats, basically. We talked about how the mission is always going to win, no matter how many security controls we put in place. And we talked about this with the other predictions: people are really ingenious. They're going to find a way to do what they need to do.

Carolyn: Dr. Cunningham has written this insight, and she said something that I really liked. She said that, "Successful cybersecurity strategies will stop trying to use technology as a unilateral force to control human behavior." Instead, "We have to better understand these behaviors to find ways to mitigate their risk to organizations and organizational assets."

Carolyn: Phil, I'm going to toss this one to you first. What do you think about that?

Phil: I think that's very true. This comes down to government agencies building surveys that ask questions of users. Not just about their behaviors and how they interact with applications and security tools, but why. What motivates people to take shortcuts, use shadow IT, or not comply fully with security protocols? Then pivot from that to determine how solutions can be architected.

Phil: Solutions that either solve the problem users say they're experiencing or build in some way to accommodate that behavior. Now, this contrasts with the desire that we've heard among agencies, especially within DOD and the intelligence community, to move to a zero-trust model.

2021 Insights and Predictions on Movement to Zero-Trust Model

Phil: And require authentication all the time for access to the network and access to resources. There may be some organizations that can say, "You know what? If you want to get access, you have to play by these rules. No exceptions." That's probably more likely to fly.

Phil: Mike can chime in and correct me if I'm wrong, but it's probably more likely to fly in a military organization than a civilian one, where you have a tradition of chain of command and following orders. But if you want users to not work around security measures, you need to understand why they would want to. Then re-engineer around that deeper understanding.

Mike: You're right, Phil. It goes back to what we talked about last week: the demands of telework, the demands of the pandemic, or the demands of a leader who maybe doesn't have quite the same understanding. That's where the issue will be.

Eric: Phil, do you see a difference in behavior between state and/or federal employees? Contractors, agents?

Phil: State and local IT officials and IT leaders say that training and awareness is a huge part of what they do, particularly in terms of security. They tend to not have as many resources as federal agencies, from a budgeting standpoint, to apply to training. You are probably more likely to see gaps or lapses. That's one of the reasons why local governments, for example, have perennially, for the past several years, been huge targets for ransomware attacks. You're just more likely to get somebody to click on a link or go to some malicious website than you might be in a federal context.

Next-Generation Endpoint Security Tools

Phil: Although this is changing. Based upon conversations I've had, not every state has those kinds of next-generation endpoint security tools to protect users when they're working remotely from home. But some state governments are starting to put those kinds of tools in place. We should be focused on getting IT leaders to understand that they need to meet their users where they are.

Eric: Dr. Cunningham, who's a behavioral scientist, not an IT professional, sums it up well in her recommendation: we can learn more about what motivates behavior and how people ultimately choose to behave. Because we're really talking about people, we're talking about humans. If you crank the security up too much, they're going to go around it and get the job done some other way.

Eric: If you don't have enough security, well, they'll get the job done. But it may not matter. Part of what you're saying, Phil, is that at the state level they have less security, so users may need to go off-road less. Still, they may be less secure in some cases. They just don't have the talent or the infrastructure or the budget.

Carolyn: Having context around it is so important. Doing that behavioral baseline so we understand why people are trying to do what they're trying to do. Instead of just locking it down.

Eric: Most people go to work every day intending to do a good job. They're trying to get their work done. Whether it's writing their article, or making sure their team's okay, or creating marketing material for the organization. They go to work and they try to get it done. But they may have to go to Dropbox or outside some corporate system to get it done.

2021 Insights and Predictions About the Not-So-Well-Intentioned AI

Eric: Yet they don't necessarily understand or care that that could create a risk for the organization. They're just trying to do their job, and they're well-intentioned. Most people are well-intentioned in this world. I'm going to live with that, at least.

Carolyn: Let's talk about maybe the not-so-well-intentioned AI. Machine learning-based AI, I've heard a lot about this lately. There are so many baked-in biases in AI, so how can it be objective?

Eric: Mike, I think you were talking about zero-trust being ubiquitous, everywhere. It's the new buzzword, it's everything. AI and machine learning have been the buzzwords for the past half decade, at least. I don't know that I've seen a lot of AI in cybersecurity. I'm starting to see it on the photography side. On the machine learning side, I think we're seeing more and more. Humans are still involved, and you can still fake the system out.

Mike: One of the Air Force's intelligence officers spoke on this last week. I'm going to just read a paragraph from one of my colleague's stories; it was from Mark Pomerleau. This is talking about AI bias, and he said, "In a national security context, the consequences could be dire. For example, what if the military builds an intelligence algorithm that's unintentionally biased toward Russian intrusion methods versus Chinese or Iranian ones?"

Mike: "What if the military builds an algorithm to locate ballistic missiles, but developers only use North Korean imagery data to train it? Will it be able to accurately locate ballistic missiles originating from other adversaries?" The story goes on and talks about biases even in customer service, such as against someone who's learned English as a second language.

Male Voices Versus Female Voices

Mike: Or male voices versus female voices. There’s a lot of opportunity just in thinking, "Hey, this is how I think. This is the way it must be done." Obviously for years, we've talked about diversity of thought in the office making for a better workplace. What we're going to see here is if there’s a bias in AI or people only thinking about problems the same way.

Mike: That's where we're going to run into problems. We've only seen this intrusion look like "ABC" and we haven't really gained that word. That's how we see it, but that's not actually what's happening. You're going to have to turn things around and look at it from a different perspective.

Carolyn: Even in facial recognition software, how it has a hard time with darker skin tones. Did you know Kodak knew about that? We're talking decades ago. They didn't do anything about it until, I think it was a furniture company, started saying that their customers were unable to distinguish the different wood species in photos. That's when Kodak started to address it. Talk about crazy biases that have been baked in for so long.

Phil: It's very clear to the data science community that they need to do a better job of addressing bias in machine learning models. At the AWS re:Invent conference last week, Amazon said it was going to launch this thing called SageMaker Clarify.

Phil: This basically gives you insight into your data and your models. It's designed to analyze data for bias before you start data prep. So that whoever's working with the data can spot these kinds of problems before you even start building your models. It's a big problem.

8.2 Million Predictions For 20,000 People

Phil: There's been a ton of research on this that I was reading about. In terms of, it depends upon who you have building the models. There was this research from Columbia that tasked 400 AI engineers to create algorithms. That has made 8.2 million predictions about 20,000 people.

Phil: It's the study that was put into the Navigating the Broader Impacts of AI Research. This was at the 2020 NeurIPS Machine Learning Conference. They basically found that, for example, male programmers, prediction errors, were more likely to be correlated with each other. Basically, it indicates that the more [13:00 inaudible] your data science team is, the more likely a given prediction error is going to recur.

Phil: Building more diversity into your machine learning team reduces the probability that you're going to have biases that compound. Ultimately, you can't turn all of this over to the algorithm. You need to look at this from another perspective. Bring that human intelligence and human judgment into your understanding of the algorithm. The risks that you're creating with it, the biases.

Phil: Maybe this involves having some kind of a red team analysis of your machine learning model to call out the bias. Then say, "Hey, we're people who are not as closely connected with building this model. We're coming at it from a somewhat outside perspective." We can tell the people who are building the model, "Hey, you need to watch out for this. Because you didn't catch this when you were initially building the model."

Phil: So, it's a big problem. Machine learning has become a greater part of how the government does its work and analyzes data. It's going to become more critical to address.

2021 Insights and Predictions on How Government Takes the Lead

Eric: Diversity is good in the workplace, but it's also good in the algorithms.

Carolyn: Are we going to see the government take the lead? Require the diversity that we're talking about as these algorithms are being built. I'm talking about basic diversity. People from different genders, different ethnicities, involved in this. Is that going to be a thing?

Mike: I'm the curmudgeonly old journalist who's a skeptic about everything. The government will do it when they're forced to do it. When either Congress says "do it" or when they run into a problem. Hopefully not on the battlefield, but somewhere else where there's a mistake and there has to be a correction.

Carolyn: Is that your 2021 prediction? That government's going to do it when they have to do it?

Mike: No. I wrote all these down. I have good predictions for 2021.

Carolyn: Let's go to your predictions, Mike and Phil. Give us your top predictions for 2021.

Mike: We touched on some of this last week and some today. We're going to see, one, a continued emphasis within the military and the intelligence community on information warfare. We've thought of cyber operations, and to a certain extent cybersecurity, as kind of a separate domain. We're going to see that tied in more broadly to information warfare, and maybe have that integration across the services.

Carolyn: We've always had information warfare, but cyber has put it at warp speed now.

Eric: We've also always had the biggest budget and the means to convey the information we wanted to convey. Now, it's cheap and easy. Any adversary can do it.

Greater Level of Thoughtfulness on Classifications

Mike: We'll just see an increased emphasis on that. We're seeing generals run these commands now. That shows the level of importance the military is putting on it, and I think it will continue to escalate. The next one I wanted to bring up: we'll start to see, at least I'm keeping my fingers crossed on this, maybe a greater level of thoughtfulness when it comes to classification.

Mike: It's been so hard to work on classified material at home during the pandemic. There are folks who still need to come in to do classified work, there’re certain aspects you can't escape. I'm not sure technology necessarily is the answer. Maybe the easier step is to make sure only the material that needs to be classified is classified.

Mike: Maybe people will be thinking more about that, rather than have someone come into an office and maybe be at risk unnecessarily. Especially as everyone expects, even though a vaccine's on the way, that things could still be pretty ugly with the pandemic over the next couple of months. There will be a greater emphasis there on what, and what parts of anything, need to be classified.

Carolyn: I actually love that one. Because even where we work, it can be an issue. Honestly, a lot of times people just go the easy route and just leave whatever the default classification is. Rather than think about it and think, "Does this really need this classification level? Or, can this just be public?"

Eric: But it's safe to over-classify. Plus, a lot of times, even if the data is marked appropriately, it's on a classified network.

Taking Unclass From a JWICS

Eric: It's really hard to take unclass from a JWICS or a secret-level network and share it. You've got to print it, all the markings are there, it's unclassed, then you scan it. I hope they go this direction. But I'll tell you, the CMMC team, who I'm a big fan of, is putting out CUI classification markers now. Not that data shouldn't be shared, but the number of markers they're putting out is just going to compound the problem. There are a dozen-plus categories, 20 categories, so it gets complicated.

Mike: It gets into what we talked about earlier. If things are too complicated, will they just not get done? Or will people resort to something else? That's a conversation we're going to be having more in 2021. Then, we talked about Cozy Bear; I think we're going to see more international alliances lead to faster attribution.

Mike: We've seen that, obviously. The United States isn't the only one who's facing these problems and isn't the only one with solutions. We're going to see a lot more on this, kind of, countries who are of the same mindset. Some greater cooperation there.

Eric: Do you think we'll do something if we get some level of basic or relatively assured attribution? Do you think we'll start doing more?

Mike: Yes. Will we start doing more publicly? That's a different question.

Eric: Not graded, right? Get the job done.

Mike: My last one, and this is my counterintuitive prediction for '21: we'll see tech fatigue, because we're spending all day on Zoom and all day on networks or somewhere else. We're just going to see folks who kind of walk away from this.

2021 Insights and Predictions on Technology and Security

Mike: Say, "I don't want to do this. I don't want to do these meetings anymore. You're going to need to come to the office." Or "You're going to need to do this." There’s a chance to have this emphasis on technology and security right now kind of slip away because folks just get tired of it. They're like, "we can just do it in person."

Carolyn: I have my phone on permanent "Do not disturb" because I've just had so much coming at me.

Mike: That's what I think could happen next year. Particularly from, maybe it's even a year out, but from a budgeting perspective, or from even a line of effort. As a government official might say, I wonder how long technology and cybersecurity can stay in the spotlight when people are just getting [21:25 inaudible]

Eric: I had a friend this weekend who's been in the industry for decades. They're contemplating just going into nursing. It was like, "Whoa. Wait a minute. That's a left turn, right?" You've been in this industry a couple of decades, you're nearing retirement, and you're going to go to nursing school? "I'm just tired of it all." COVID has worn them down. We'll see different manifestations, if you will, of what this looks like. But it's tough. Zoom's tough.

Carolyn: All right, Phil. Give us your 2021 insights and predictions.

Phil: Zero-trust security is going to become more in vogue on the civilian side of the house. I don't know how many "deployments" there will be, quote unquote. You can't really deploy zero-trust. It's more of a paradigm shift that you make and move to in terms of your policies and authentication and all of that.

We Need to Hop on This Bandwagon

Phil: I just think that it's going to become more popular. DOD and the intelligence community tend to lead on these things. Once civilian CIOs and CSOs take a look at how that's working, they're going to say, "Oh. We need to hop on this bandwagon, too." You'll probably see a few agencies, by the end of the year, move in that direction.

Phil: There's just going to be a lot more discussion about it, about best practices. Everybody's going to have their own sort of flavor of zero-trust. I do think that you're going to see movement there. We're starting to reach an inflection point and I think that the pandemic accelerated that.

Phil: It’s something that's kind of flown under the radar a little bit but it's one of these wonderful government acronyms. QSMOs, the Quality Service Management Offices, which are for developing shared services within the federal government. One of those is a SOC as the service. You're going to see greater uptake of that shared service.

Carolyn: You think the government will do SOC as a service?

Phil: It's a shared service for the government, an internal shared service. There are shared services for HR and payroll, for example. This is the security one, which I think is being run out of DHS. Those are two big ones. You're also probably going to see a greater emphasis on cloud security, since for much of the year you're going to continue to have a lot of people working remotely, using cloud tools for collaboration and file sharing.

Phil: I know that FedRAMP is in the midst of sort of rejiggering some of their classifications. For what low, medium, and high means in response to some guidance from NIST.

Cloud Security Will Be a Hot Topic

Phil: Cloud security is going to be a hot topic.

Eric: Do you think it'll come more from the CSPs, or more from the vendors that are in the space, the security vendors? Or the customers? Will Amazon and Microsoft provide more? They will, but where do you think that big thrust will come from?

Phil: It will probably come more from the customers and the security vendors than the CSPs. The CSPs will sort of continue to add security into their platforms as kind of a matter of course. They will make announcements at their big conferences about all the wonderful enhancements that they're making. To Azure and AWS and Google Cloud and Oracle Cloud and so forth.

Phil: But you're going to see sort of the pressure for it come from the federal customer. That's probably going to be in concert with the security vendors that they're working with.

Mike: If I can just piggyback on Phil, I know you asked that, but I think security operations centers as a service are not far off for the government. We've seen this move toward enterprise IT as a service. In the military, it's really only at the experimental phase right now, but to me, that's a natural outgrowth of what comes next.

Eric: Even this CDM Group F program really goes to that in a great way. We're seeing it. I'd like to see it move a little faster. Let the government govern and give them secure capability.

Carolyn: We're going to end it there. Another great hour with you guys. Thank you so much. I want to end this episode with something new Eric and I decided to try for 2021.

Rapid Fire Questions With Mike and Phil

Carolyn: We want to end with asking our guests some rapid-fire questions. So, I'm going to start with you, Mike. I'm going to give you just some quick questions. I just want first, top-of-mind answers. Just one or two words. What are you reading right now?

Mike: What I'm starting is a book called Sunny Days. It's about the history of Sesame Street and the Children's Television Workshop.

Carolyn: Another one for my list. All right. Do you have a cybersecurity must-read?

Mike: A book on cybersecurity that's a must-read?

Carolyn: It doesn't have to be a book. It could be, for example, your articles.


Mike: Yes, of course. You should be subscribing to C4ISRNET's cybersecurity newsletter, which comes out every month.

Eric: I'm a subscriber.

Mike: Good. But the other thing I would add is Garrett Graff. He had a really excellent profile of General Paul Nakasone, who is the head of Cyber Command. That's worth a read just to get a little bit of insight into what's happening from the cyber perspective. From my own brands, I would say you should follow [28:36 inaudible] They're doing a great job covering cybersecurity and cyber operations for C4ISRNET.

Carolyn: All right. What have you binge-watched and loved?

Mike: I'm in the middle of season two of The Mandalorian right now.

Carolyn: The last episode was so good. What is your guilty pleasure, other than Mandalorian?

Mike: Jelly beans. I can't help myself. It’s literally one of those foods that I cannot stop and I just eat until I get sick. And then I stop.

Carolyn: If you had a magic wand, what is the one thing that you would do or change?

Biggest Cybersecurity Impact

Mike: This is such a terrible answer. One is the political arguments that start, "If so-and-so did X, can you imagine blah, blah, blah?" Right now, I would just get rid of all of those, just drives me batty, the "if" hypotheticals.

Carolyn: Last one. Biggest cybersecurity impact in the last 12 months, good or bad.

Mike: I guess I would say the outreach from the CMMC office. The way they've talked with industry has been, and this is a bit of hyperbole, almost revolutionary. I don't think we've seen the government have the same openness to what industry was doing before. If that provides a playbook for what happens in the future, to me, that would be pretty interesting.

Eric: Katie Arrington. Big.

Eric: Okay, Phil. We have three minutes.

Carolyn: All right. Phil, okay, you ready? Same one. What are you reading right now?

Phil: I’m currently reading this wonderful book by John Dickerson called The Hardest Job in the World. It's about the history of the American presidency and what it means to be a president. Kind of the job qualifications that we as voters should maybe look more at when we're going through campaigns. Evaluating what actually is needed on the job. It's a really fascinating history.

Carolyn: Cybersecurity must-read.

Phil: Obviously, C4ISRNET is a great one. But I would say KrebsOnSecurity is a great blog and Security Boulevard has some great stuff. There's no shortage of really great commentary that's out there.

Carolyn: All right. What have you binge-watched and loved?

Phil: My fiance and I are currently in the midst of watching The Flight Attendant. It is on HBO Max, based on a novel of the same name. Really entertaining. Great escapism in these dark times.

Cloud Security or Endpoint Security

Carolyn: Good one to add to my list. What is your guilty pleasure?

Phil: It used to be, and still is to a certain extent, a cheeseburger. I have, in the last couple of months, really gotten on the Beyond Burger train. To try and cut down my intake of red meat. Beyond Burgers are phenomenal as long as you season them, put some hickory-smoked cheddar on it. It tastes just as good as a regular burger.

Carolyn: Totally agree. My friend said if you add bacon, you can't tell the difference. What's one thing you would do if you had a magic wand? What is the one thing you would change or do?

Phil: Expunge misinformation from the internet. Or make people unable to take in and believe misinformation.

Carolyn: Last question. Biggest cybersecurity impact in the last 12 months.

Mike: Can I change my answer? I want a coronavirus vaccine that tastes like a milkshake.

Eric: Done.

Carolyn: Biggest cybersecurity impact in the last 12 months, Phil.

Phil: The pandemic. It's shifted everybody who can work remotely to remote work, and all the different cybersecurity considerations that come out of that, whether it's cloud security or endpoint security. It's really changed the way a good portion of us work and operate in the world. That obviously has security implications.

Carolyn: Thank you guys, thanks to our listeners for joining us and we will talk to you next week.

To The Point Cybersecurity was recently named one of the top 30 Federal IT influencers for 2019 and 2020 because of our fantastic guests. We are always looking for great thought leaders to interview. Please email me with guests you would like to have on the podcast.

About Our Guests

Phil Goldstein is Senior Editor for FedTech and StateTech. @philgoldstein

Mike Gruss is Executive Editor of Defense News and C4ISRNET. @mikegruss

Listen and subscribe on your favorite platform