The next framework that we want to take a look at: we're gonna shift our focus from the international standard, ISO 27005, to a more US-specific one from NIST, the National Institute of Standards and Technology. Specifically, we're gonna start by looking at SP 800-39.
So here, Managing Information Security Risk: Organization, Mission, and Information System View. This is a very holistic approach to managing risk, a very comprehensive process. The steps here: we start with framing the risk, which is getting the context, just like we saw with ISO 27005. The first thing we do is figure out our context, and that's what framing the risk is. Again, different terms, but doing the same thing.
Then we assess the risks, we respond to them, and then we continue to monitor. If you look at the diagram for NIST 800-39, what you have is framing in the middle, and the reason it's in the middle, instead of a 1-2-3-4 kind of illustration, is that all of these processes can feed back into one another. So, for instance, when I monitor, that feedback needs to go into my response, because if my response wasn't sufficient, well, then I need to do something else, and I may need to go back to the assessment process. I really like this diagram, because risk management is not a linear 1-2-3-4-5. It gives you the idea that, of the four processes, we start with framing, and then we assess, we respond, and we monitor.
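To make that flow concrete, here's a minimal sketch of the frame-assess-respond-monitor cycle in Python. All of the function names, risk scores, and the 0-to-1 tolerance scale are made up for illustration; they are not from the NIST document itself.

```python
# Illustrative sketch of the NIST SP 800-39 cycle: frame, assess,
# respond, monitor. Framing supplies the context every other
# process relies on; monitoring results feed back into the others.
# All names, scores, and thresholds here are hypothetical.

def frame_risk():
    # Establish context: assumptions, constraints, risk tolerance.
    return {"tolerance": 0.3}

def assess_risk(frame):
    # Score each identified risk from 0.0 (low) to 1.0 (high).
    return {"malware": 0.6, "flood": 0.2}

def respond_to_risk(frame, risks):
    # Treat anything above our tolerance; accept the rest.
    return {name: ("mitigate" if score > frame["tolerance"] else "accept")
            for name, score in risks.items()}

def monitor_risk(responses):
    # In practice: verify the responses worked and feed results back
    # into assessment (or even re-framing). Here, just report them.
    return responses

frame = frame_risk()
responses = respond_to_risk(frame, assess_risk(frame))
print(monitor_risk(responses))  # {'malware': 'mitigate', 'flood': 'accept'}
```

The point of the sketch is the feedback: nothing stops you from calling `frame_risk` again when monitoring shows the environment has changed.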
So once again, if we talk about framing risk, what we're trying to do here is to get a risk management strategy for our organization. We're looking to figure out, based on who we are as a company, what our environment is, who our stakeholders are, what types of risks we're subjected to.
We want to figure out how to develop a strategy that's gonna help us exist within our risk threshold. So we've got to talk about things like risk assumptions. How does that work for our stakeholders? What sort of assumptions are there? Do we assume that we're protected based on having a $50,000 firewall? Are we looking at our risks and making an assumption that the upcoming election won't change our environment? I worked at the State Department for a while, and I gotta tell you, when administrations changed, we went from going 100 miles an hour this way to 100 miles an hour that way. So our risk strategy had to reflect that we're in a very volatile environment and that it may need to change.
So the bottom line is, every organization has its own assumptions about risk, about consequences, about impacts, about how likely certain risks are. And even bigger than that: what's the basis for these assumptions? Why do we assume this? Is it experience? Is it that we're using outside sources? Whatever it may be, we've got assumptions related to risk.
Then we have to think about constraints, and a constraint is a limit, right? Are there things that keep us from properly assessing risks or properly responding to risks? Maybe we come into this environment with a software development project, and we're developing software that has to run on Windows 7, as opposed to later versions. There's nothing we can do about that. Or if we're providing this software for a client, they have a certain budget to work with, or a certain time constraint. That's all part of our context.
And then risk tolerance. We don't set the risk tolerance of the organization ourselves; remember, that comes from senior management, the board of directors, maybe a steering committee. Their job is to determine things like risk appetite and risk tolerance. Risk appetite is the amount of risk we're willing to accept as a company, and that could be qualitative or quantitative. We can say, you know what, we can't accept any potential for loss greater than $1.5 million, or we can just say we're a very risk-conservative organization. So risk appetite comes from the board of directors. Now, in addition to that, we have risk tolerance, as we mentioned earlier. I may have a very conservative approach to risk, but in this one particular area, because the payoff is so high, I may be willing to have a higher risk tolerance. Usually that risk tolerance is something outside of the standard risk appetite, for a specific project, perhaps, or in addressing a specific risk. We have to know that information. So framing is all of this: collecting the information that's gonna help me best respond to risk.
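As a rough illustration of appetite versus tolerance, here's how a $1.5 million appetite cap with a higher per-project tolerance might be checked in code. The project names and the $2 million exception are hypothetical examples, not figures from any standard.

```python
# Hypothetical check of a potential loss against the board-set risk
# appetite, with a higher approved tolerance for one high-payoff project.

RISK_APPETITE = 1_500_000  # organization-wide cap on potential loss (USD)

# Exceptions approved for specific projects or risks (made-up example).
PROJECT_TOLERANCE = {"new-market-launch": 2_000_000}

def within_tolerance(project: str, potential_loss: int) -> bool:
    # A specific project may carry its own, higher tolerance;
    # otherwise the organization-wide appetite applies.
    limit = PROJECT_TOLERANCE.get(project, RISK_APPETITE)
    return potential_loss <= limit

print(within_tolerance("payroll-upgrade", 1_600_000))    # False: exceeds appetite
print(within_tolerance("new-market-launch", 1_600_000))  # True: higher tolerance
```

The design mirrors the lecture's point: tolerance is an exception that sits outside the standard appetite, and it has to be recorded somewhere, not assumed.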
Now, some other terms here. Factors: there are certain factors that are gonna influence risk, like the market, politics, the economy. We've got to think about what threat sources are out there and what threat events could materialize. But really, none of that matters if we don't have any vulnerabilities, right? Because if we have no vulnerabilities, we have no risk. Problem is, this isn't a perfect world; we're gonna have some vulnerabilities, so we need to identify those. Predisposing or pre-existing conditions may make us more vulnerable to risk. Maybe we were compromised in the past and did a half-hearted job of repairing those systems to return them to operations. Well, right there we're bringing weak systems into the current environment, so that's a kind of pre-existing condition.
Now, some terms I do want you to know: volatility, velocity, proximity, and visibility. First, volatility. That's a very changing environment. Like I said, when you work in the political realm, a change in the winds will change your approach. We're doing something in one direction, and all of a sudden the market changes, the political winds change, regulations change. If we're in an environment where a change in just one factor can affect everything we do, that's a very volatile environment, right? You don't have that consistency of "this is the way we do it; it's the way we always do it." It's the way we do it for this week, and we'll see about next week.
Velocity is how quickly the risk event comes on. For hurricanes, you often have weeks of notice, right? Hurricane Lance is developing, and it looks like it's gonna hit the coastline in two weeks. That's not a risk event with a high velocity. But by the time the tornado siren is going off, the tornado has come and gone, right? It's a very quick, very fast risk event; it has a high velocity. So velocity is how far it is from the trigger that indicates the event is about to happen to the point where it actually does happen. Then proximity. You can also talk about how long it is between the risk event and the loss: how close is the loss to the risk event? Well, with the tornado, it's immediate, but with malware, you may get infected and not know it for weeks to come, or even longer. So that would be a further proximity.
And then, last but not least, visibility is really important. Just like we said with malware: you may get infected, and it's weeks before you know. Well, that's not very visible. Not all malware that infects your system wipes things out and keeps your computer from booting, right? There are logic bombs that sit on your system, dormant and very sneaky, for a long time. So for risk events that aren't very visible, we need a strategy to actively monitor on a regular basis. These are all factors that have an impact on risk, and when we're identifying risks, we've got to think about them.
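One way to put these four factors to work is to rate each identified risk on all of them and flag the low-visibility ones for active monitoring. The 1-to-5 scale and the ratings below are invented for illustration; they aren't defined anywhere in NIST 800-39.

```python
from dataclasses import dataclass

# Rate each factor from 1 (low) to 5 (high); all values are illustrative.

@dataclass
class RiskFactors:
    name: str
    volatility: int  # how much the surrounding environment shifts
    velocity: int    # how fast the event comes on once triggered
    proximity: int   # how closely the loss follows the event (5 = immediate)
    visibility: int  # how obvious the event is when it occurs

risks = [
    RiskFactors("tornado", volatility=2, velocity=5, proximity=5, visibility=5),
    RiskFactors("hurricane", volatility=2, velocity=1, proximity=4, visibility=5),
    RiskFactors("dormant logic bomb", volatility=3, velocity=4, proximity=2, visibility=1),
]

# Risk events that aren't very visible call for active, regular monitoring.
needs_active_monitoring = [r.name for r in risks if r.visibility <= 2]
print(needs_active_monitoring)  # ['dormant logic bomb']
```

Notice how the hurricane and the tornado differ mainly on velocity, while the logic bomb stands out on visibility, which is exactly what drives the monitoring strategy.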