[0:42] Evolving SEC Cyber Regulations
Rachael: Please welcome back to the podcast Rich Etri. He is the Chief Innovation Officer at ECI, and he has more than two decades of experience managing IT within the financial services industry. Rich, welcome back.
Rich: Excited to be here, Rachael, and great to be here with Audra as well. This is such a relevant topic. I feel like I spend half my days talking about SEC cybersecurity and the other half talking about AI, so it comes almost naturally these days. I go to bed thinking about it, I wake up hearing about it. It's like a stalker. God, these topics are so relevant these days.
Rachael: I mean, it's wonderful to see the evolution. I think the last podcast was in April of '22, and at the time you were the only person I'd met who read the whole 250-page proposal from the SEC on their proposed cybersecurity guidelines. Now here we are, fast forward about a year and a half later, and they're moving forward. And I'd be really curious, since you have that lens from the proposal to now: what have you seen evolve? What are the differences, if any, between the proposal that went through the comment period and what we're seeing today?
Rich: I don't think there were any kind of fundamental changes. I think what happened was there was a deeper clarification around some of the broader statements that the SEC was proposing. One of the areas people were hoping for some clarity around, where there really wasn't any,
Navigating the New Reporting Rule within SEC Cyber Regulations
Rich: and that I think we're going to have to live with, is the reporting rule. We were happy to see a reporting rule in there, because before there was one it was a completely gray area. Now they give a little guidance about when to report. But I still think it's a bit of a concern for organizations, and there's still some gray area there that will have to be sorted out over time. Fundamentally, though, the premise of the rule hasn't changed.
And I think in general, this is really what the industry needs. I hate to say it, but sometimes if you're not forced to do it, especially in the smaller organizations, you just don't. They're not going to move forward and do it. And a lot of this rule, to me, is really best practices. As we talked about the last time I was on the podcast, there was nothing in there that made me think, oh my God, why are they asking you to do that? That's crazy, only really advanced firms and large organizations could do it. These are things you should be doing anyway, things that I've done in my previous roles as CIO and CTO and things I recommend to our clients every day. So I think it's good that we finally put a stake in the ground and are starting to drive this forward, because the threat landscape only gets worse. There are new vulnerabilities every day.
Challenges and Strategies in Adapting to SEC Cyber Regulations for Varied Businesses
Rich: Threat actors are basically companies now. They operate like businesses, they generate revenue, and they have costs. This is a full-blown business, and if you don't see that and treat it as such, you're going to get compromised, and it could be the downfall of your organization.
Audra: So, a question for you about the adoption of the SEC requirements. What do you think are going to be the biggest challenges that businesses will face? Because businesses are all different sizes and have different kinds of budgets to be able to apply to these kinds of changes.
Rich: I think when you read through the document, the SEC talks about your cybersecurity program quite a bit. They use the word program throughout, and they also mention right-sizing your cybersecurity program to your organization. So I think the expectation isn't that you have a CISO if you're a 40-person firm, but they do expect you to be doing certain things to be able to manage that risk. Building out the program is what creates a lot of angst for people, and it's something that is probably a little unfamiliar.
We were talking about that briefly before the podcast. There's been a lot of purchasing of software and cybersecurity tools over the last several years, and a lot of companies are making a lot of money off of that. But no one has really brought together all of the components of what a good cybersecurity program is, so that you can manage it, report on it, and have controls and governance around it.
[5:30] Core Components of SEC Cyber Regulations Compliance
Rich: And when you begin to think about that, that is where firms start asking: okay, how do I do that? How do I bring that together? Because when the SEC comes in, you're going to want to be able to report on it. You're going to want to be able to say, hey, here's my program, here's how I align with industry standards, here's how we score against those, here's our risk register, here's how we're managing vendors, and all those components that make up a good program. And that, I think, is the daunting part that gets people a little concerned.
Audra: So what core components would you expect a robust program to bring together? You're saying no one's doing it, so what are we missing?
Rich: Well, it's not that no one's doing it. Large organizations are doing it, and the clients I advise are doing it. But it really all starts with good IT hygiene. I've always said this to our clients, and when I've managed things myself, that cybersecurity really starts there. If you don't have good IT hygiene, I don't care what kind of tool you have, you're going to have a lot of risk.
Good IT hygiene means you have policies, procedures, and controls around your environment: documenting how you manage it, how you manage change in your environment, how you manage permissions, technology, and devices. Making sure that documentation is accurate is the biggest piece, and that's the foundation everyone needs to start from. One thing I've seen a lot over the years is people go and buy these boilerplate policies and procedures, and that's great.
Aligning Policies and Procedures with SEC Cyber Regulations
Rich: Everyone needs a starting point, and a lot of organizations are similar. But what the regulators look for is whether you're actually abiding by those policies, procedures, and controls. So if you say you're patching monthly, show me the report. Are you patching monthly? Oh, you're not; you're only patching quarterly. You have to make sure that those policies and procedures really match what you're doing. The control is the ability to report on that and show evidence of those policies being implemented properly.
And I think it starts there, making sure those are complete and really aligned with your IT environment. The other problem I often see is that as change happens in the environment, you implement new systems, you bring on new businesses, and you don't update those policies and procedures, so they end up out of date as well. Making sure those stay aligned, and that you have a process to update them on a regular basis, is going to be the foundation for building out that program. And it's something you don't really need technology for; it's really about the people and process piece.
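Rich's "show me the report" test can be sketched in code. A minimal illustration (the data and function names are hypothetical, not any specific compliance tool): given the patch dates a report shows for a host, check whether the stated monthly cadence actually held.

```python
from datetime import date

def max_gap_days(patch_dates):
    """Longest gap, in days, between consecutive patch dates."""
    ordered = sorted(patch_dates)
    return max((later - earlier).days for earlier, later in zip(ordered, ordered[1:]))

def meets_cadence(patch_dates, stated_days=31):
    """True if no gap between patches exceeded the stated cadence."""
    return max_gap_days(patch_dates) <= stated_days

# Hypothetical evidence: the policy says "monthly" but the report shows quarterly.
evidence = [date(2023, 1, 10), date(2023, 4, 11), date(2023, 7, 12)]
```

Here `meets_cadence(evidence)` comes back `False`: the evidence contradicts the written policy, which is exactly the mismatch an examiner would dig into.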
Audra: So in terms of the big changes around the new regulations. Particularly with reporting incidents, even if they take place outside of the country. How are these companies going to be able to do that? Because sometimes companies aren't fully joined up.
Reporting Challenges Under SEC Cyber Regulations
Rich: They're not. And I think the reporting piece in general is going to be a challenge. Larger organizations that are dealing with threats and risks every day are going to have to come up with a good framework to bring those to the forefront, and have someone make a decision on whether or not they need to be reported. Say you have a US employee who's traveling to London, their laptop is hacked, and they had investor data on there.
If you're not sure whether it was leaked or not, that's probably something you're going to have to report to the SEC. You might say, well, the laptop was encrypted, I don't think they got at the investor data, we're not going to report it. That's where that gray area comes into play. Firms are going to say, oh, it wasn't in the US, and they're going to begin to make decisions around what needs to be reported and what doesn't.
I think one thing people need to be conscious of is that when the SEC does come in for examinations, they are going to want to look at the cyber incidents you had. They're going to want you to be able to pull reporting from your platforms so they can see what kinds of alerts you have. And if you didn't classify something or report something, they're going to dig into it a little further and say, hey, we saw you had an incident here, tell us a little more about this.
Compliance Challenges in the World of SEC Cyber Regulations
Rich: And they'll ask questions like, hey, what was on that endpoint? How do you know nothing got breached? You may not get fined, but I think it's going to be a shot across the bow that you might want to rethink your policies. I don't think they're going to be happy with you just saying you don't have any incidents.
So part of this program is that you know how to monitor your environment. They're very clear about that. They even talk about managed detection and response, extended detection and response on endpoints, and having IDS/IPS. So they're going to want to look at the data from those platforms. I think this is where it becomes a little more serious. Prior to this, they just issued basic guidance: hey, you should be doing these things.
And they would come in and ask if you were doing them, and you'd say you were, and they'd say, okay, you are. Now they're saying: look, you should be doing these, you say you're doing these, show me you're doing them right. And I think that "show me" piece is where the rubber's going to meet the road. But I do think this reporting piece is still going to be difficult.
I think it's going to get refined over the next several years. I don't think this is the end of this rule; it's really just a starting point. Look, fundamentally, I think the rule is great, as I mentioned. I'm not sold that it's comprehensive enough. However, I've always operated under "perfect is the enemy of good."
[11:38] Materiality in SEC Cyber Regulations
Rich: We don't really have much right now. If we could just get to good, our corporate environments would be that much better. And I think there's a lot of criticism around the rule because it's not perfect. And I think my view is I'm just happy that we're going to try and get to good.
Rachael: Exactly. I would love to dig in a little on what classifies as material, because that seems to be somewhat of a gray area as well, Rich. Let me know if you agree, but in everything I was reading, it's not crisply defined, and I don't know that it could be, given the landscape. It seems to leave a door open for the SEC to dig a little more, maybe. And how would a company have any rebuttal or ability to refute? How do you have a conversation about what's material or not if there's no clear definition?
Rich: I think there's some context in there, particularly around financials. So in the past, wire fraud was a big issue, and it's amazing how often it still happens today. That was something where you could take the hit on your balance sheet or at the operating company; you wouldn't report it or even tell investors. And I think now that's clear.
Their attitude is: we don't care if you could financially handle it, we don't care if you have insurance. If it has anything to do with finances, whether it's wires, misappropriation of funds, or a business email compromise that resulted in any kind of monetary loss, you have to report it.
The Gray Areas
Rich: So there's no gray area there. And that was one where I think they were really looking to close the gap, because a lot of organizations were suffering from that and weren't reporting it. Their view was, well, I could afford it, I paid for it.
It doesn't impact investors, I don't need to share that with anyone. And the SEC is saying, no, you do. I think where it starts to get a little gray is if you do get ransomware. It encrypts a server, you detect it, remediate it, restore the data, and close the gap on whatever the issue was that allowed those threat actors in, and there was no impact on your business. Do you need to report that? Some lawyers I've spoken to feel like you do.
I think you've just got to look at it case by case and have a really good internal panel and outside counsel to think through the spirit of the rule and whether or not you should report it. My view would be to always err on the side of reporting: what's the harm? Incidents like those happen from time to time, and if you can articulate how you've closed the gap, sure, they might come in and do an audit, but that's only going to put you in a better place. Right?
Rich: But I think that's the area where it gets a little gray. Look, if you get malware on your computer but your antivirus or EDR tool catches it and quarantines it, there's no need to report that. That happens every day in probably most organizations: someone clicks on a phishing link, they enter some credentials.
Clarity Amid the Complexity of SEC Cyber Regulations
Rich: But you get alerted on it and you reset their password. I don't think things like that need to be reported. It's also about the breadth of the issue. Say your organization is heavily phished, 30% of the employees get the email, and 20% of them click on the link. You've had a bunch of credentials compromised. That, to me, is probably something you're going to have to report, versus if it was one employee.
Rachael: That makes sense.
Rich: But I think it's hard. That's the area where, and I think you hit on it, I don't know if we'll ever land on: hey, here are the 20 things, and if they happen you need to report. My hope is that the SEC is a little, I won't say lenient, but more focused on giving guidance as they go through these examinations: you know what, I think you should have reported this; I can understand why you didn't; but going forward, we expect that.
And then making some of those use cases public: hey, we've been seeing these types of trends, and we expect firms to be reporting these now. That's where they need to be a little more forward-thinking in how they engage with businesses. I don't mean that as criticism; it's just that they see things and trends across businesses, and they should share them and provide guidance to the broader market.
Fine Lines and Disclosure Times
Audra: So the way you're describing it, it sounds a bit like when we got involved in the implementation of GDPR, and the whole question of when you report a privacy breach and when you don't. Except with GDPR, there were definitive guidelines around fines. Do you think they'll work their way up to fining the people who aren't reporting appropriately?
Rich: I think so, though it's probably not going to come right out of the gate unless it's blatant. So if it was one of the scenarios I mentioned, where you wired a million dollars to a bank account and didn't report it, they're going to fine you. They tend not to say, hey, here's the starting point. Europe and the UK are definitely more prescriptive when it comes to things like fines. I don't think the SEC will say, this is what we're going to fine you.
They tend to look at the organization and its size, and then align the fine accordingly. If you look at some of the fines they issued for using text messaging for business purposes, the JP Morgans of the world and the big banks got hit really hard, right? For some smaller organizations I know that have been fined, it's been more relative to their balance sheet.
Rachael: And I have a question, sorry, jumping in here too. I love to talk about corner cases, and I'm wondering about your perspective on the disclosure rules, the four days or so. I've been reading articles saying the FBI had some concerns about reporting something so quickly.
[18:13] Balancing SEC Cyber Regulations and Investigative Realities
Rachael: And then, does that open the door to handing attackers a vulnerabilities roadmap for a company that is going through an attack? I mean, what's your perspective on that? Is that a corner case, or is that a big concern?
Rich: Not to be the person in the middle, but I don't know if it's a corner case. It's one of the areas I have a bit of concern about, which the FBI expressed as well, because it depends on the type of issue. First off, depending on the type of breach, it could take a while to figure out what the root cause is, and everyone is going to be very reluctant to report something before they've gotten all the information.
Having spent my career on the other side of the fence, I've been involved in incidents before, and you want to know everything before you go to the regulators. My hope is that over time that rule evolves a bit, and maybe we can classify certain types of incidents and create a reporting framework around them.
I think if it's a significant incident and a firm took longer than four days but still reported it, they're not going to get dinged for missing the four days; they would get dinged if they didn't report it at all. But I think you could report it and say there's going to be a follow-up report in several days. So maybe there's an initial SAR-type report that you file, and then you have two weeks to follow up with a final report and analysis.
Analyzing Unaddressed Aspects of SEC Cyber Regulations
Rich: The other piece the FBI is concerned about is a zero-day vulnerability that's being exploited in the wild: do we want to be reporting those and making them public before the software companies have time to close the gap and patch them? I think that is a bit of an edge case.
But we're seeing more of those today, and I think that needs to be accounted for going forward. As I mentioned earlier, perfect is the enemy of good, and this is one of the items that keeps this rule on the good side rather than closer to perfect: these cases need to be defined a little more clearly, because zero days happen quite a bit.
There were a couple that came out this week. You don't always hear about them as much anymore because they happen so frequently. But giving the software vendors time to close the gaps before a vulnerability is made broadly known really helps businesses. We need a little bit of time to be able to close those gaps.
Audra: So jumping into the final regulations that they came out with. Forbes did a recent opinion piece that highlighted five considerations that were overlooked by the SEC's final regulations from the potential for insider trading to board involvement in cybersecurity. Do you think it's a concern that they haven't regulated around these areas?
Rich: Well, I mean look, they have regulated around the board, right? They are making the board accountable for cybersecurity. And that's why I really think it's important that you build out that program that we talked about earlier.
Board Accountability and Expertise in SEC Cyber Regulations
Rich: Because you're going to want to report to your board on a consistent basis. And that's another benefit of developing the right program: when you have the right reporting framework for the security framework you're using, whether it's NIST, ISO, or CIS, the board sees consistency and can provide guidance, and you can show change over time. My posture was X and now it's Y, and that allows them to make decisions and investments.
I think one of the issues was that the SEC originally wanted someone with cybersecurity expertise on the board, and they got pressure and pulled that back. There were a lot of big companies saying, look, we have board members we respect; they'll get cybersecurity training, but we're not going to punt them because they're not cybersecurity experts. And I think that made sense. I do think board members need certifications, though, and cybersecurity is one of them. Requiring some sort of cybersecurity certification to sit on a board, I think, is fair.
And I think that rule is going to evolve over time. Directors of public boards, or independent directors of private companies, will have to have some sort of certification, maybe from one of the training bodies like the NACD. They'll probably put something out, and directors will have to take it.
The Role of SEC Cyber Regulations in Shaping Board Governance
Rich: But I do think the board needs to be accountable, because if they're not, then it doesn't drive that tone from the top, and tone from the top is big. Anytime you want to get something implemented, you make the board accountable. Back in 2009, we had the mortgage crisis, with private funds investing in securities tied to mortgages and derivatives. When they implemented controls around risk, they made the board accountable, and right away everything changed. It was: okay, we want to look at these types of reports, we want to see this on a monthly basis, we want to see this on a quarterly basis. When you're the one who can actually get dinged personally, it becomes a different environment.
And I do think they've done a good job laying the foundation for that here, but they need to build on it and make sure there are proper certifications and accountability. Where the rubber meets the road is when an organization has issues and it's clear the board wasn't really looking at managing that cyber risk. The SEC needs to go after them, or else it's going to set a precedent.
Audra: So at least they're working on ways to drive behaviors, which I think is positive. One of the other things the SEC regs included was a proposal around generative AI: requiring firms to monitor for and eliminate conflicts of interest when employees use generative AI tools. Do you think this proposal is adequate in addressing the threat of AI within public finance companies, and is it even feasible?
[25:01] Exploring the Uncharted Waters of AI in SEC Cyber Regulations
Rich: No. I mean, it's not even close to covering what needs to be thought about around generative AI. As I mentioned at the beginning of the podcast, I've been living cybersecurity, the SEC, and AI for probably the last several months. AI presents a whole different set of risks depending on the organization, and you really need to understand how you're going to be using these tools, what business they're aligned with, and how to put its own cyber framework around it. One of the things I've been working on with clients is a secure, compliant way to use ChatGPT.
You can go into ChatGPT and ask questions about code, but how do you know that code isn't malicious? How do you know the links and information it's providing aren't bringing you to a malicious site? There are things like prompt injection and other attacks. Threat actors are very smart. They've already figured out that there's so much excitement around these large language models, they're part of that world, and they know how to manipulate them.
So you really need to look at it carefully. I think the SEC had something the other day, I believe in one of the private fund publications, saying that as they do examinations now, they're looking at how firms are using AI. They're not going in and saying, hey, we're going to fine you, but they are starting to try to understand how firms are using it, so they can begin to put some guardrails around it.
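One concrete way to put a guardrail around Rich's "how do you know that link isn't malicious" question is to screen URLs in a model's answer against an allowlist before anyone follows them. A minimal sketch, with an illustrative allowlist and function names that are not any real product:

```python
import re
from urllib.parse import urlparse

# Hypothetical allowlist; a real deployment would manage this centrally.
ALLOWED_DOMAINS = {"docs.python.org", "learn.microsoft.com"}

URL_RE = re.compile(r"https?://\S+")

def screen_links(model_response):
    """Split URLs found in an LLM response into allowed and flagged lists."""
    allowed, flagged = [], []
    for url in URL_RE.findall(model_response):
        host = urlparse(url).hostname or ""
        (allowed if host in ALLOWED_DOMAINS else flagged).append(url)
    return allowed, flagged

answer = "See https://docs.python.org/3/ and http://evil.example.net/payload"
ok, suspicious = screen_links(answer)
```

An allowlist is deliberately conservative: anything not explicitly trusted gets flagged for review, which matches the compliance-first posture Rich describes.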
A Call for Comprehensive Guidance
Rich: And they're looking at things like: if you're a trading organization and you're using AI to generate trade ideas, do you have a kill switch? What guardrails are you putting around those models to make sure they don't stray one way or the other, and that if you have to kill it, you can shut it down quickly? And is there surveillance?
So one of the things I've been working on with a tool we're developing is providing surveillance around how people are using AI and ChatGPT, so compliance officers can see: hey, we're entering this type of data, here are the kinds of questions we're asking the tool, here are the answers we're getting.
You can then modify those to prevent things and enhance it a little. Those were a few things they mentioned, which I think are foundational to what they'll begin to discuss going forward. But that rule, I thought, was a little premature. There's so much around AI and its use cases that I don't think it scratched the surface in terms of giving people good guidance.
It probably only created more reservations about using it. So you have organizations now that really want to use these tools, and they are powerful. Look, I would want to be enabling them across my organization as well, but then they're like, whoa, the SEC is saying we've got to do these things; let's hold off before we start leveraging these tools. Regulation can be a bit of a fire hose on innovation sometimes.
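At its core, the surveillance Rich describes is an append-only audit log of prompts and answers that a compliance officer can query. A minimal stand-in sketch in Python (the file format, paths, and names are illustrative, not ECI's tool):

```python
import json
import os
import tempfile
from datetime import datetime, timezone

def log_ai_interaction(logfile, user, prompt, answer):
    """Append one prompt/response pair as a JSON line for compliance review."""
    record = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "prompt": prompt,
        "answer": answer,
    }
    with open(logfile, "a") as f:
        f.write(json.dumps(record) + "\n")

def interactions_for(logfile, user):
    """All logged interactions for one user, e.g. for a CCO or an examiner."""
    with open(logfile) as f:
        records = [json.loads(line) for line in f]
    return [r for r in records if r["user"] == user]

# Demo with a throwaway file.
path = os.path.join(tempfile.mkdtemp(), "ai_audit.jsonl")
log_ai_interaction(path, "analyst1", "Summarize our exposure to fund X", "...")
log_ai_interaction(path, "analyst2", "Draft a client email about fees", "...")
```

JSON Lines keeps every interaction independently parseable, so an examiner can pull one user's history without replaying the whole log.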
Safeguarding AI Models and Data in Compliance with SEC Cyber Regulations
Audra: A question, though, in terms of your experience implementing AI across your businesses. Are you considering how to protect the actual machine learning models and techniques behind this, and the data sets you're feeding them with? Are you making considerations on that, or could you make recommendations to people who want to do something similar?
Rich: Yes. It really started when we began to engage a lot more around generative AI. The first thing I started with was: what do you need to do? I call it getting organized. How do you get yourself organized first? What are the use cases you're going to use this for? Create a steering committee and really clearly define the use of it, because what often happens, especially with technology like this, is that everyone's got ideas and you end up trying to boil the ocean.
Finance has an idea, accounting has an idea, and so do marketing, sales, and trading. Everyone's got ideas. Boil it down to that one use case, then define the acceptable use of AI for that use case. Then define what the underlying data is, how you're going to store that data, where it's stored today, and who has access to it.
We all want to skip to the cool stuff. Microsoft's coming out with Copilot soon, and we're really excited about that. I think that's going to change the way people work forever.
The Exciting and Challenging Path to AI Implementation
Rich: Now, I think it fundamentally changes things. But if you don't do those things, and there are several more you probably need to think about, you're going to end up in a bad spot: you're going to have data loss, and you're going to get the wrong results. When you don't understand your data, who has access to it, and how it's updated, when you can't answer those questions, you have risk in using AI against that data. When you can answer those questions and develop those use cases, I think you can end up in a really good spot.
And then you have an example of how you've developed one use case, and you can expand that into other areas going forward, maybe finding things like: hey, we want to fine-tune it here; we didn't realize that when we're using this type of investor data, we need to store it here, or encrypt it, or do those types of things. Data residency is really important with a lot of these AI tools and how you deploy them. You need to align with the whole regulatory environment; even though it doesn't specifically call out AI, it calls out the underlying data itself. Those are things people tend to overlook as they dive into this.
I always end up being the negative one. They're like, oh, I thought we could just start using it. But I really do want you to use it; there's just a way to go about using it. It's challenging, because, look, it's exciting stuff.
[31:48] Understanding the Essentials of AI Use Under SEC Cyber Regulations
Audra: There's also the whole thing that the individuals using it probably don't necessarily understand the methodologies behind it. Everyone's just like, oh, I'll use it for everything; I want to do all these different things with it. And on a personal basis, lots of people are using it in really entertaining ways, but you just sit there asking, what are you actually trying to get out of it? So it is that key use case: how do you really want to use this?
Because there is also the whole thing of how you want to use it. What are you training it with? Because I always believe in the garbage in garbage out principle. So if it's not clear how you want to use it, or what you're training it with, you may not find the answers that you want.
Rich: Well, that's a really good point, and it's one of the things we try to articulate as well: if you don't have your house in order from a data perspective, it's that garbage in, garbage out thing. You also bring up another good point. The SEC did mention in that private funds memo that they want you to be able to show the lineage of how you got to your answer with AI. So it's not just, hey, it made this decision and we traded based on it. It needs to be: it made this decision, here's what went into it, here's the decision tree. You need to be able to at least prove your work.
Building Trust in AI
Rich: I always go back to when my kids come home with a 98 on their math test but they just wrote the answers down and didn't show the work. Right? You've got to show your work, and I think that's a really good point to bring up.
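The "show your work" lineage Rich describes can be sketched as a record that travels with each AI-assisted decision, capturing the model, inputs, and intermediate signals behind the output. The model name, fields, and scores below are hypothetical, purely for illustration:

```python
import json
from dataclasses import asdict, dataclass, field

@dataclass
class DecisionRecord:
    """Lineage for one AI-assisted decision: model, inputs, evidence, output."""
    model: str
    inputs: dict
    signals: list = field(default_factory=list)  # intermediate steps/evidence
    decision: str = ""

    def explain(self):
        """Serialize the full lineage so the 'work' can be shown later."""
        return json.dumps(asdict(self), indent=2)

# Illustrative record for a hypothetical trade-idea workflow.
rec = DecisionRecord(model="internal-llm-v3", inputs={"ticker": "XYZ", "window": "30d"})
rec.signals.append("momentum score 0.82 from price series")
rec.signals.append("sentiment score 0.61 from news feed")
rec.decision = "flag for analyst review"
```

The point is that `rec.explain()` reproduces the whole chain, not just the final call, which is what an examiner asking "how did you get to this answer?" would want to see.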
Audra: It also makes it much more traceable, and therefore you can be accountable, because it's traceable what you've actually done, how you've done it, what you've fed it with, and what you're querying that data with. So that's brilliant.
Rich: As an investor, if you have a company where you're investing your 401(k) or your retirement funds, I'd want them to have that in place for AI. And look, I'm someone who loves AI and has a certain level of trust in it. Again, it's how it's applied. So I want to make sure you're using it correctly, and I have to make sure my investment is being protected properly. I think that's where the SEC is going to start: making sure investors are protected as firms begin to use these tools.
Rachael: It's an exciting time, but also, I guess, a scary time. Who knows where this whole AI thing's going to go? Does it morph and evolve and become more deeply entrenched in our day-to-day lives?
Rich: Everyone has their concerns. I try to tell people to think about it as similar to Excel. When you go back and read the articles from when Excel first came out, people were like, this thing is garbage. It's never going to take hold. No one's ever going to use it.
SEC Cyber Regulations and the Growing Impact of AI
Rich: Can you imagine? I actually had the article, it was in one of those computer magazines. I used to keep it in my old office. Now that I work from home, no one really has offices anymore. But it's cool to look at, and now you think about it: can you even get a job without Excel? In all seriousness, I don't know if our economy could function without Excel.
And I think some of these AI tools, especially some of the things you have with Copilot, are basically going to become the new Excel. So I don't think it replaces jobs, but the people who know how to use these tools are going to be the ones who excel in their roles. And I think that security is going to play a huge, huge role in that.
Audra: And I also think there has to be a human review piece of it, because my concern with the use of AI is that people are so overly trusting of the technology that they don't check it, and that's insanity. AI is only as good as what you've done with it. We did have a situation where we were discussing what our AI policy should be, and someone asked ChatGPT to write them an AI policy. And then they just sent it out. And I was like, did you even check the content of it? Well, it should be good enough. And that's where I just go, oh my goodness.
SEC Cyber Regulations and the Perils of Blindly Trusting AI
Rich: One of the things we've worked on in our platform is hallucinations. These models are designed to answer the question no matter what. If they don't have the data, or the data's incomplete or even inaccurate, they don't care; they're going to interpolate part of that response and fill in the gaps. So when you're using it for business purposes, you need to know how accurate the answer is. Is it a one or is it a five? Because if you're going to use it for sending something out to clients, or even using it internally, there's a lot of risk there.
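The "is it a one or is it a five" idea can be sketched as a simple release gate: AI-generated content only goes out when its confidence score clears a threshold appropriate to the audience, and client-facing content additionally requires human review. The scoring scale and function are assumptions made for illustration, not a description of any real product's API.

```python
# 1 = likely hallucinated, 5 = well-grounded (illustrative scale)
CONFIDENCE_SCALE = range(1, 6)

def release_decision(confidence: int, human_reviewed: bool, audience: str) -> bool:
    """Decide whether an AI answer may be released to the given audience."""
    if confidence not in CONFIDENCE_SCALE:
        raise ValueError("confidence must be 1-5")
    if audience == "client":
        # Client-facing content needs both high confidence and a human check
        return confidence >= 4 and human_reviewed
    # Internal use tolerates lower confidence, but not likely hallucinations
    return confidence >= 2

print(release_decision(5, True, "client"))     # True
print(release_decision(5, False, "client"))    # False: no human review
print(release_decision(2, False, "internal"))  # True
```

The thresholds themselves are arbitrary here; the design point is that the accuracy grade is attached to the answer and checked before anything leaves the firm.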
And you're right, people just use it like Google, so they think whatever comes back, like whatever comes up on the internet, must be true. I was reading an article a couple of weeks ago, and look, I'm an old man. When the internet first came out, do you remember all of those false stories circulating, about Mr. Rogers and Chuck Norris and all of that? People were saying this is what's happening with ChatGPT, because people just believe anything that's coming back from it. They're so excited to use it that they're not questioning the answers.
Audra: Exactly. So that's where my paranoia about AI lies: there has to be an expert human review of what's actually coming out of it before you push it out.
Rachael: But for some people, it's a lot of extra work.
Audra: Why worry about it? It's good enough. Exactly. What could go wrong? Sounds compelling.
[38:24] Unraveling the Complex Web of SEC Cyber Regulations
Rich: Tell that to the lawyer who created, what was it, several briefs using ChatGPT, and it made up false cases. People need to understand how this stuff works. Or, on the flip side, the Samsung engineers who entered information about their chips into the model to see what it would come back with. And now that chip data is in the model forever, so you can go in there and ask it questions about Samsung chip information. So I don't think people fully understand how these tools work, and why it's important to be a little prudent around them and to actually read the things they produce. It saved you from having to write it in the first place; you can take the 15 minutes to read it. Right?
Audra: So considering that you're living AI and cybersecurity on a regular basis, what keeps you up at night about that combination?
Rich: I think what you just talked about, but particularly with code. I've seen trends there. Look, my son is a computer science major in college, and when he was doing some homework I wouldn't let him use any of the tools, because I want him to learn from the bottom up. But he would ask me questions, and I would go on ChatGPT and see the code it spits out, and it's very good. But as I mentioned, from time to time there's a lot of risk in using that code, because you don't know the repositories it came from and how it brought it all together.
Safeguarding Your Organization from Code Risks in the Age of AI
Rich: There have been a lot of cases where organizations have used it and there are links to libraries that are malicious. So there's a lot to be concerned about there. The code piece is concerning to me because I think people, in the rush to use it, are not bringing the code together properly and could be creating unnecessary risks for their organization.
Some of the things we're starting to talk to clients about, and this was even before AI was really around, is putting malware-scanning tools around your coding, your app dev environment, and DevOps, and integrating it more closely so you can scan your source code repositories. And now I'm like, look, if you're going to use Copilot for GitHub or ChatGPT and those tools, you need to layer this in.
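One lightweight version of the control described here is checking that AI-generated code only pulls in vetted dependencies, so a hallucinated or typosquatted package name gets flagged before it reaches a repository. This is a minimal sketch, not a real scanner: the allowlist and the generated snippet (including the misspelled package) are made up for illustration.

```python
import ast

# Illustrative allowlist of packages vetted by the security team
APPROVED_PACKAGES = {"numpy", "pandas", "requests"}

def unapproved_imports(source: str) -> set:
    """Return top-level imported package names not on the allowlist."""
    tree = ast.parse(source)
    found = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            found |= {alias.name.split(".")[0] for alias in node.names}
        elif isinstance(node, ast.ImportFrom) and node.module:
            found.add(node.module.split(".")[0])
    return found - APPROVED_PACKAGES

# Hypothetical AI-generated snippet with a misspelled (typosquat-style) import
generated = "import pandas\nfrom reqeusts_helper import get\n"
print(sorted(unapproved_imports(generated)))  # ['reqeusts_helper']
```

In practice this would sit alongside, not replace, the malware and source-scanning tools mentioned above; the sketch only shows where such a check plugs into the pipeline.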
And tying it back to the SEC, they're going to be looking at, not necessarily AI, but as I mentioned earlier, they want you to right-size your program to your business. So if you're using algorithms to execute trades and you're using tools like ChatGPT and Copilot, they're going to expect you to have some controls around that, and they're going to expect to see that you're looking for malware and things like that.
That, to me, is going to become the de facto standard. The other thing about AI that generally concerns me is I don't want to see people get lazy, right? The tools are good for work optimization, but we still need to challenge things and be critical about the content we produce, and, similar to what you were saying, not just take what these models feed back as gold.
A Year of Evolution and Implementation
Rich: The first really strong use case is what large language models truly excel at: looking at large amounts of unstructured data. If you have a million documents with legal terms in them, and you can train that model to go in and say, hey, show me where I have NDAs where I can't do this around this date, that's great, because being able to sift through a large number of documents and find those answers is powerful. But using them in a more pervasive way, I think, is a little too early right now.
Rachael: Agreed. Well, I would love to come back in a year, Rich, and see how everything has played out since our conversation. I have a feeling that in a year this is going to be a really, really interesting conversation to pick up.
Rich: Yes, well, look, even the SEC piece will be worth picking up, because it's going to be codified next month. So next year will be a year where it starts to be included in examinations, and we'll start to see where the rubber meets the road. So look, I'm excited. As I mentioned earlier, it's about time some of these organizations get their act together from a cyber perspective.
And look, insurance is getting more and more challenging these days. Premiums are going up on cyber insurance, and I think this is actually going to help firms. I was talking with a client last week whom we advised and who implemented the program the way we talked about, and they're like, Rich, my cyber insurance went down.
Deciphering SEC Cyber Regulations
Rich: I'm like, see? I think people don't understand there's a cause and effect. And for the ones that don't have cyber insurance, you're going to have to get it. But I think spending around cyber has always been something that has held organizations back. And that's where the board involvement also helps a little bit, because, in the end, they're the ones who control the purse strings. So I think this will give IT departments that have struggled with getting funding a little more of what they need to get their environment up to snuff.
Rachael: Absolutely. So it's an exciting time. I love technology and I love evolution. Just keeps things interesting. Well, Rich, thanks so much for joining us again, this has been so much fun, and thanks for breaking down these SEC guidelines because it's a 250-page proposal. That's a lot to have to go through. So thanks for reading it and digesting it for our listeners.
Rich: Anytime. It'll put most people to sleep.
Audra: It'll help on those nights you can't sleep because you're worried about how people are using AI. Just scoop up the proposal.
Rich: People using AI to mimic your voice, all that stuff. It's a lot.
Rachael: To lose sleep over, right?
Rich: Maybe I could join the next podcast with my AI replica.
Rachael: That would be so much fun. Yes, absolutely.
Audra: And then we can ask your AI replica questions to see whether you agree with the answers.
Rich: Probably not.
Rachael: Well to all of our listeners out there, thanks again for joining us for another awesome conversation. And again, thank you Rich for joining us.
About Our Guest
Rich Itri is the Chief Innovation Officer at ECI. He has over 22 years of IT executive experience, spending his entire career managing IT within the financial services industry. Prior to joining ECI, Rich was Managing Director and Chief Technology Officer for PJT Partners, a boutique investment bank; Principal and Chief Information Officer for Sky Road; and held Chief Information Officer positions at Arrowhawk Capital Partners and Arbalet Capital Partners. Over the years, Rich has developed and managed innovative, business-aligned platforms that drive revenue and operational efficiencies. Rich holds positions on several advisory boards and volunteers his time to help non-profits leverage technology.