Now, we've certainly talked about gap analysis up to this point, but we're going to spend a little bit more time going into it. So if you'll remember the idea of current state versus desired state: I know where I am, I know where I want to be. Now, how do we get there?
So ultimately, once again, this goes back to the idea that you're not gonna walk into an organization as a CISO with no policies and no controls in place; you're gonna come into an existing environment. But it's gonna be your job, first and foremost, to meet with the business leaders and other senior managers, learn the objectives of the business, and then set about figuring out what your objectives should be and how we're gonna use those objectives to reach our long-term goals.
So, once again, that phrase: current state versus desired state. Gap analysis, gap analysis, gap analysis.
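To make the current-state-versus-desired-state idea concrete, here's a minimal sketch in Python. The control names and maturity numbers are made up for illustration; the point is just that a gap analysis is a comparison of where you are against where you want to be.

```python
# Hypothetical gap analysis sketch: compare current-state control
# maturity against the desired state and list the gaps to close.
# Control names and levels are illustrative, not from any standard.

current_state = {"access control": 2, "patch management": 1, "logging": 3}
desired_state = {"access control": 3, "patch management": 3, "logging": 3}

def gap_analysis(current, desired):
    """Return controls whose current maturity falls short of the target."""
    return {name: desired[name] - current.get(name, 0)
            for name in desired
            if current.get(name, 0) < desired[name]}

print(gap_analysis(current_state, desired_state))
# → {'access control': 1, 'patch management': 2}
```

Anything already at its desired level (logging, here) drops out; what remains is your gap, which is exactly what your roadmap of objectives should close.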
Oh, and speaking of gap analysis, one of the ideas that they come back to again and again is that we use tools like the Capability Maturity Model Integration for gap analysis. So if I say CMMI, I want you to immediately think gap analysis.
So what we have in this particular slide is the Software Engineering Institute. This comes to us from Carnegie Mellon, and they gave us the CMMI, the Capability Maturity Model Integration.
And so the idea is you get audited, and you're assigned, or indicated as operating at, one of these levels: initial, repeatable, defined, managed, optimized.
What that tells you is that you're being evaluated on the maturity of your processes; it's a capability maturity model. And often this is for software development departments or organizations. So they're not evaluating your code.
Just like when we talk about audit: it doesn't look at the product, it looks at the process. So ultimately, the idea here is the more mature your process, the better your product will be.
And particularly with government agencies, I'm expected to have a CMMI of a certain level; usually it's level three. Now, if I'm doing software development for NASA, we want to go ahead and get that level five. But ultimately, most companies want a three because that's what their customers require. All right, so ultimately we're getting evaluated on our people, our processes, and our technologies, and what you're going to see is at level one, initial, you have almost no controls.
It is a very lax environment: you don't take security seriously, there's no policy, everything's chaotic. And nobody's shooting for level one. "Yay, we got audited and we're chaotic," right? Nobody's looking for that.
But as you move all the way up to optimized, where you have lots of controls in place, comprehensively implemented, things are automated to get rid of that human element, and you're always looking to continually improve, well, that's a lot of effort that takes a lot of time and a lot of money.
So what you're gonna want to do is strike that balance. Now, I would know the one, three, and five here in red. Not necessarily the people and processes, but I would certainly have the idea of how security controls are implemented at each of the levels. You don't need to make a flash card with them, but get the idea and really feel comfortable with the continuum here: we've got nothing at one, all the way up to continuous improvement at five. And then I would have just a sense of two, three, and four. Okay, so that is our CMMI.
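As a quick study aid, the five levels just described can be written down as a simple lookup. The parenthetical comments summarize the lecture's characterization of each level, not official CMMI wording.

```python
# The five CMMI maturity levels from the discussion above,
# mapped from the audited level number to its name.

CMMI_LEVELS = {
    1: "initial",     # chaotic, almost no controls, no policy
    2: "repeatable",
    3: "defined",     # the level most customers (e.g., government) require
    4: "managed",
    5: "optimized",   # automated, continuous improvement
}

print(CMMI_LEVELS[3])
# → defined
```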
And ultimately, in order to determine whether we're meeting those objectives, some terms come up once again: KPI, KRI, and KGI. We'll spend more time on these with risk monitoring, but I just want to kind of throw those terms out now. KPI: key performance indicators.
I'd like you to have the term operational efficiency. Okay? KPIs are all about operational efficiency.
Now, KRIs are triggers. What am I looking for to indicate a risk is about to materialize? That's a key risk indicator. Worried about thunderstorms? I look outside and it's dark and windy: a thunderstorm is probably gonna happen.
And then key goal indicators. KGIs are after the fact: did you meet your goals or not?
When we talk about key performance indicators, let's say that my objective is to patch 100% of my systems by the end of the second quarter. It's the end of the first quarter and I only have 10% patched. That's an indication that I'm not going to meet my objective. That addresses the performance and says, whoa, we may have a problem.
Okay, key risk indicators, then key goal indicators. So whereas a KPI is more about operations (you've got an objective; are you gonna meet it?), risks can often impact your ability to meet performance indicators. So, for instance, let's go back to that idea about patching systems. It's the middle of the first quarter and you're only 10% complete, so you're behind schedule. Your KPI will say, hey, you're 30% behind schedule.
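A KPI like "percent behind schedule" can be sketched as a comparison of actual progress against expected linear progress toward the target. The numbers below are illustrative and won't match the transcript's figures exactly; they just show the mechanics.

```python
# Illustrative KPI for the patching example: target 100% of systems
# patched by the end of Q2 (month 6). Numbers are made up.

def patch_kpi(actual_pct, months_elapsed, months_total=6, target_pct=100.0):
    """Percentage points behind (positive) or ahead (negative) of a
    linear schedule toward the target."""
    expected = target_pct * months_elapsed / months_total
    return expected - actual_pct

# End of Q1 (3 of 6 months elapsed) with only 10% patched:
print(patch_kpi(10.0, 3))
# → 40.0 (percentage points behind schedule)
```

The indicator doesn't say *why* you're behind; it just measures performance against the objective, which is exactly the operational-efficiency role of a KPI.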
Your KRI, though, comes before that. Maybe you get an indication that your patch management strategy is no longer effective, or that there's a staffing problem with the technical team, or whatever; that might tell you that risks are about to materialize. You're getting a key risk indicator that tells you you may not hit your key performance indicator. So sometimes those can work together, but at any rate, you want to have those terms: KRI, KPI, KGI.
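To round out the KRI side of that picture, a key risk indicator can be sketched as a leading metric checked against a threshold: it fires before the KPI shows you've missed the objective. The metric names and threshold values here are hypothetical.

```python
# Hedged sketch of key risk indicators (KRIs) as threshold triggers.
# Metric names and limits are hypothetical, chosen to echo the lecture's
# examples (patch process failing, technical-team staffing trouble).

kri_thresholds = {"failed_patch_jobs": 5, "open_tech_vacancies": 2}

def fired_kris(metrics, thresholds):
    """Return the KRIs whose current value meets or exceeds its threshold."""
    return [name for name, limit in thresholds.items()
            if metrics.get(name, 0) >= limit]

print(fired_kris({"failed_patch_jobs": 7, "open_tech_vacancies": 1},
                 kri_thresholds))
# → ['failed_patch_jobs']
```

Here the failed-patch-jobs KRI fires early, warning that the patching KPI (and ultimately the KGI, the after-the-fact goal check) is at risk.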