CISM

Course
Time
8 hours 39 minutes
Difficulty
Intermediate
CEU/CPE
9

Video Transcription

00:02
So as we move forward and talk a little bit more about monitoring and controlling risks, one of the important elements that we're going to watch for are key risk indicators.
00:13
If you remember when we worked with the risk register, we determined there are many fields that you can have on your risk register, lots of information to collect. But something that's incredibly beneficial is to document and monitor triggers, and triggers, or key risk indicators,
00:32
are indications that a risk event is likely.
00:35
It doesn't necessarily mean that a KRI causes the risk event.
00:39
But it does mean, hey, it looks like that risk you were concerned about is going to materialize. So it often is
00:50
a good trigger or a good alert to tell me, okay, go ahead and implement your mitigation strategies, right? So, for instance, I'm worried about being over budget on my project,
01:00
and we've determined that if the project goes more than 15% over budget, the project's going to get canceled. Well,
01:11
if halfway through we're 10% over budget,
01:14
when I hit that marker of being 10% over budget, I want to know, because I want an early warning system before I get to the point where the project's canceled. Right?
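To put that budget trigger in concrete terms, here's a minimal sketch. The 10% warning and 15% cancellation thresholds come from the example above; the function name and figures are just assumptions for illustration.

```python
# Rough sketch of a budget KRI check; thresholds follow the example above,
# everything else (names, amounts) is assumed for illustration.

def check_budget_kri(actual_spend, planned_spend, warn_pct=10, cancel_pct=15):
    """Return a status message based on how far spend has drifted from plan."""
    variance_pct = (actual_spend - planned_spend) / planned_spend * 100
    if variance_pct >= cancel_pct:
        return f"CRITICAL: {variance_pct:.1f}% over budget - cancellation threshold reached"
    if variance_pct >= warn_pct:
        return f"KRI TRIGGERED: {variance_pct:.1f}% over budget - start mitigation now"
    return f"OK: {variance_pct:.1f}% variance, within tolerance"

# Halfway through the project we have spent 110,000 against a 100,000 plan.
print(check_budget_kri(actual_spend=110_000, planned_spend=100_000))
```

The point of the sketch is simply that the KRI fires well before the cancellation threshold, which is what gives you room to act.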
01:25
Just like, if you're gonna have a picnic
01:26
and you've got
01:29
1000 people invited and you get out of bed in the morning and dark clouds are everywhere and you see thunder and lightning.
01:36
Well, you've got a chance to make a decision. Hey, let's move this picnic inside, right, because you know what the warning signs are.
01:44
So we call those key risk indicators, and ultimately we're going to determine the key risk indicators back in the risk identification phase, when we're developing the risk register, because those risk indicators are going to be most essential
02:02
in the areas where we have high exposure. Right? So I want to know early on, particularly for risks where I have very little tolerance for loss.
02:13
You've got to let me know early on so that I can right the ship, so to speak.
02:19
All right. So,
02:23
these will help me identify the biggest risks. So I'm going to associate KRIs with those risk events that would have the highest probability and highest impact.
02:34
Maybe I'm concerned about turnover on a project. We've got key staff in place, and the success of this project is going to depend on keeping this team working together.
02:45
So we've determined our greatest threat on this project to be staff turnover,
02:49
right? So I might set a KRI to give me some indication that staff turnover is becoming likely.
02:55
It's often the case that the more people call in sick
03:00
and attendance starts to decline,
03:02
that's usually a good indication that people are dissatisfied with their jobs, and so it would likely follow, perhaps, that we're going to see that turnover.
03:10
So what I might say is, when attendance on a weekly basis drops below 95% of staff,
03:20
that would be something I want to be notified of. I want to examine whether there are any special influences here, or whether this is something that's indicative. And if so, that gives me enough time to meet with my staff and determine,
03:32
hey, how can we make this environment better?
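If it helps to picture how that turnover risk and its trigger might sit in a risk register, here's a minimal sketch assuming a simple dictionary layout; the field names and the 95% threshold are illustrative, not a prescribed format.

```python
# Minimal sketch of a risk register entry with its KRI trigger; field names
# and values are illustrative only.

turnover_risk = {
    "risk_id": "R-07",
    "description": "Loss of key project staff (turnover)",
    "probability": "Medium",
    "impact": "High",
    "kri": {
        "indicator": "Weekly staff attendance rate",
        "threshold": 0.95,      # trigger when attendance drops below 95%
        "direction": "below",
    },
    "response": "Meet with staff, investigate causes, apply retention measures",
}

def kri_triggered(entry, observed_value):
    """True when the observed indicator crosses the documented threshold."""
    kri = entry["kri"]
    if kri["direction"] == "below":
        return observed_value < kri["threshold"]
    return observed_value > kri["threshold"]

# This week attendance dipped to 93%, so the KRI fires and we act on the response.
if kri_triggered(turnover_risk, observed_value=0.93):
    print(f"{turnover_risk['risk_id']}: {turnover_risk['response']}")
```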
03:38
Ultimately, we're going to use these key risk indicators to have our attention drawn toward those risk events that are likeliest and the ones that have the biggest impact,
03:50
and ideally we're going to implement them in such a way that we can make changes quickly and efficiently.
03:58
So when I monitor, usually what I'm looking for is: what are my KRIs? Are we there? What alarms are sounding, so to speak?
04:08
Okay. They're also very helpful after the fact, when you're looking backwards at how a project was managed. You can go back and say, hey, you know, we were over budget, and if you look, you can see,
04:24
you know, little points in the past where you could have been told, hey, this was coming. So sometimes, yeah, after the fact is a little bit late. But when you're looking backwards and you do the examination, the lessons learned, this will help you more for the projects moving forward. Right?
04:40
So you get information. You know, unfortunately, when we talk about learning, we often learn the hard way.
04:46
But as long as we learn, right?
04:48
This
04:50
adoption of KRIs will help us with trend analysis. It will help us communicate transparently with our stakeholders,
05:01
and ultimately, if I can identify early that risks are materializing,
05:08
I can make corrections, and that's going to make it much more likely that I reach my objectives, right? So managing risk, monitoring risk, we've got to stay on top of things. And that's what our KRIs are about.
05:23
There are lots of different types of KRIs you can use. When we talk about information security, you know, all of a sudden I'm seeing a high number of software scans or port scanning on my network.
05:33
Something's up with that.
05:35
Or I scan the network and I find 10% more unauthorized devices on the network this week than last week.
05:46
It's taking longer than expected to deploy security patches. Or I'm finding a higher number of unpatched systems than normal. Right? You see how important this is to identify early on?
06:01
And when we look at the various risks in our risk register, we're going to document these triggers. We're going to determine these triggers, or KRIs, and I kind of use those words interchangeably. But we're going to document those, we're going to document how often we're going to scan, and then, ideally, okay,
06:19
when we see these risk indicators,
06:21
what's our contingency plan? What do we move into so that we can be prepared for this risk event that's happening?
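As one way to picture tying those documented triggers to contingency plans, here's a hedged sketch; the specific indicators, thresholds, and contingency actions below are assumptions made up for illustration, not a prescribed set.

```python
# Illustrative mapping of documented KRIs to contingency plans; indicators,
# thresholds, and actions are assumptions, not a standard.

SECURITY_KRIS = [
    {"indicator": "port_scans_per_day",         "threshold": 500,
     "contingency": "Review IDS alerts, tighten firewall rules"},
    {"indicator": "unauthorized_device_growth", "threshold": 0.10,
     "contingency": "Run NAC sweep, quarantine unknown devices"},
    {"indicator": "avg_patch_deploy_days",      "threshold": 14,
     "contingency": "Escalate patch backlog to change board"},
]

def evaluate_kris(observations):
    """Compare this week's observations against each documented KRI threshold."""
    for kri in SECURITY_KRIS:
        value = observations.get(kri["indicator"])
        if value is not None and value > kri["threshold"]:
            print(f"KRI '{kri['indicator']}' = {value}: {kri['contingency']}")

evaluate_kris({
    "port_scans_per_day": 820,
    "unauthorized_device_growth": 0.10,   # exactly at threshold, no alert
    "avg_patch_deploy_days": 21,
})
```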
06:29
So in order for our KRIs to really be meaningful and relevant within our organization,
06:35
you know, they have to be
06:38
not proprietary exactly, but unique to our organization. Right? So,
06:46
we want to make sure that we take into consideration our risk culture. We want to make sure that we take into account our organizational view on risk, what our tolerance thresholds are, and certainly areas dealing with physical, legal, or regulatory compliance.
07:02
If we're headed in the direction of falling
07:08
out of compliance, we want to know that, of course, very quickly, so that we can shift back into compliance.
07:14
And as a matter of fact, I've mentioned several times, you know, we want to know this ahead of time. Well, you'll notice that timing is one of those optimization ideas for KRIs. We've got to have this information in a timely fashion so that we can make our changes at the appropriate point, in order to make
07:33
a significant impact.
07:36
All right, sensitivity. We want to make sure that as we're managing these, we allow for normal tolerance, right? Like there's a normal threshold of tolerance. We don't necessarily want the alarms to sound
07:49
every single time. You know, for sensitivity, I think about the smoke detector in my house.
07:57
Every time I cook a piece of bacon,
07:59
the smoke detector goes off. It's too sensitive, right? I can't have someone yelling "the sky is falling" every time there's a vulnerability scan directed at my network. In the same respect, we have to make sure that anything indicating a more serious or more determined attack does get notified.
08:18
So when we talk about these ideas of sensitivity,
08:20
it's about finding that range of
08:24
what's a level where we really do need to be
08:28
notified
08:30
and being notified in such a way, through timing, that we can respond appropriately.
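One simple way to tune that sensitivity, purely as an assumed example, is to require the indicator to stay outside its normal tolerance for several consecutive readings before anyone gets paged; the class and thresholds below are illustrative only.

```python
# Sketch of a sensitivity control: only alert when the indicator stays above
# its tolerance for several consecutive readings. Thresholds are assumptions.

from collections import deque

class KriMonitor:
    def __init__(self, threshold, consecutive_breaches=3):
        self.threshold = threshold
        self.recent = deque(maxlen=consecutive_breaches)

    def observe(self, value):
        """Record a reading; return True only for a sustained breach."""
        self.recent.append(value > self.threshold)
        return len(self.recent) == self.recent.maxlen and all(self.recent)

# A single spike (one vulnerability scan) doesn't wake anyone up;
# three breaches in a row does.
monitor = KriMonitor(threshold=500)
for scans in [900, 120, 610, 700, 850]:
    if monitor.observe(scans):
        print(f"Sustained breach at {scans} scans/day - notify the risk owner")
```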
08:35
Frequency: how often do we monitor for these KRIs? Once a month? Once a year? Do we do it only as
08:43
the result of an event? You know, how frequently we monitor really has to do with a risk event's visibility.
08:56
Okay. And when I talk about the visibility of a risk event,
09:00
there are a lot of negative things that can happen that are invisible to me, meaning they happen and I never know about it.
09:07
You could get a million pieces of malware infecting your system and not even know when it happens. Now, chances are good you'll know afterwards, right? But, you know, I could get a rootkit on my system and not even know it happened.
09:22
I could get a virus or a logic bomb.
09:26
So because these things that are harmful
09:30
can materialize and yet not be detected easily, I've got to monitor on a very regular basis. Right? Think how often we scan for malware on our systems.
09:41
Often we do it every night, right? Even, you know, ideally we've got a live scan, but we may do a more thorough scan every single night. Why?
09:52
Because
09:52
you have to do it regularly, because you don't know if it's going to happen or not. Now,
09:58
processor utilization being over 99%? When that happens, I know.
10:03
So that's a very visible event.
10:05
So it's not like I have to scan every second for processor utilization being over 95%.
10:11
When it happens, you'll know,
10:13
right? And I would tie that to something like a distributed denial of service attack. When you get DDoSed, you know it. You can't do anything.
10:22
But when I get a malware infection,
10:26
I may not know for days and days, and weeks and weeks. So I don't have to scan every day to determine if I've been DDoSed,
10:33
but I do have to scan every day to determine if I have malware. I hope that makes sense. And then corrective action: ideally, again, in my risk register these KRIs simply alert us, hey, go into your contingency plan, because this risk event is about to materialize. So hopefully that's documented as well.
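Finally, as an illustrative sketch of matching monitoring frequency to visibility, low-visibility risks get scheduled scans while highly visible ones can simply raise an alert when they occur; the categories and frequencies below are assumptions, not a standard.

```python
# Sketch tying monitoring frequency to a risk's visibility; the schedule
# values are assumptions for illustration.

MONITORING_PLAN = {
    # Low-visibility risks: you won't notice them unless you go looking.
    "malware_infection":     {"visibility": "low",  "check": "scheduled",    "frequency": "nightly full scan"},
    "unauthorized_devices":  {"visibility": "low",  "check": "scheduled",    "frequency": "weekly network sweep"},
    # High-visibility risks: the event announces itself when it happens.
    "cpu_utilization_99pct": {"visibility": "high", "check": "event-driven", "frequency": "alert on occurrence"},
    "ddos_attack":           {"visibility": "high", "check": "event-driven", "frequency": "alert on occurrence"},
}

for risk, plan in MONITORING_PLAN.items():
    print(f"{risk}: {plan['check']} ({plan['frequency']})")
```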

Up Next

CISM

Cybrary's Certified Information Security Manager (CISM) course is a great fit for IT professionals looking to move up in their organization and advance their careers and/or current CISMs looking to learn about the latest trends in the IT industry.

Instructed By

Kelly Handerhan
Senior Instructor