Hello and welcome to Episode 11 of the Effective CISO series — Competency 11, Risk. Just a quick reminder: we will be skipping next week due to the security conferences that are occurring, and we will have our final session, on leadership, Competency 12, on August 15th. It's been an awesome, awesome journey with you all, and I'm looking forward to finishing this off really well. Ed, really excited — take it away.
All right. Next week, with Black Hat and DEF CON, I hope people take the time to enjoy the conferences. My observation is this: I have been going to these conferences since the eighties — I used to attend the old NCSC conference in Baltimore on the waterfront in the late eighties — so I've been doing it forever, and the time you spend preparing for a conference is pretty vital. I find that if you go to a conference with a plan, it's a much more effective use of your time than just going and saying, "I'm going to check out the vendors and pick some talks." It's much more effective to spend the time in advance, before you go, listing out what your priorities might be: the types of companies, or the specific companies, you'd like to visit, and which talks are going to make sense. I always do that with my little team at TAG Cyber — we divvy up the work across the areas that make sense for each of us. It's almost like the difference between walking into a hardware store just to walk up and down the aisles and look at stuff, versus knowing you're going to build a birdcage: you know exactly what you need, and it's a very different experience in a hardware store when you're building something than when you're not. So keep that in mind. I have so many colleagues and friends who just sort of breezily wander up and down the aisles, and that always strikes me as a spectacular waste of time. So I hope you'll do some prep time; I think far too few people prepare for conferences — they just go. I hope that's useful.
Now, we're at Competency 11, and this is an interesting one — our last two are the most challenging. Risk is Competency 11, and the question that should immediately come to mind is: do I mean risk as a competency where you're a risk taker, or risk as a competency where you're a risk avoider? And I get that the business-school answer is, "Well, I'm a risk manager, a risk optimizer" — baloney. You're one or the other. We all know there are not three types of people here; there are two. There are people who, like me, are always an hour early for everything, because they've thought through every possible scenario that could cause them to be late. My family jokes that I'm always way early for everything, but I do that because I factor the worst case into everything I do, and so on. You all know that person.
But then you also know that person who does the opposite. One of my first bosses at Bell Labs was someone I thought was fantastic — one of my favorite bosses. His name is Tom Curtis; he's retired now. What a wonderful manager. But man, if that dude was going to travel, he'd be the guy rushing through the door of the airplane — flying somewhere with his tie untied, bag still kind of disheveled — just as they're closing the door and the airplane's about to push off, with a smile ear to ear, delighting in the fact that he didn't waste one second waiting in some stupid airport. He optimized his morning, had a nice breakfast, and so on. You know that person, too. There are not too many people in between — you're one or the other — and I know some of you will say, "Well, usually this or that," but we all have our tendencies.
I think you can be an effective CISO either way, but I really believe the former is a little better suited to the job. If you're a crazy risk taker, then I recommend you become an inventor, or go into sales or marketing — become someone who can parlay that kind of risk into advancing the business. But I really do think the chief information security officer is generally someone who is doing everything he or she can to avoid risk. I'm saying this because we all know that person; most of my friends who do this job are that type. But not always — there are different tribes. My friend Gary McGraw, some time ago, while he was over at Cigital — I think they're Synopsys now — came up with this concept of different tribes of CISOs. We talked about that at the beginning of this course; I showed you some pictures, and we talked about the fact that there are different tribes. I think there are also different tribes of risk.
So there's no question that you can point to some people in this role — and you are in the top position, or one of the top positions, managing information security risk for a corporation — who have this tendency of being, in some sense, a risk taker: willing to be that person who hops onto the airplane 30 seconds before the door closes. Gosh, I would sooner die than do that; I feel like if I'm only 30 minutes early for something, I'm late. So I think for the most part there are these tribes, but I'd say: be reflective about your own. If you are a risk taker, you may have to compensate a little bit, because your job in the CISO position is, yes, to manage risk, and yes, to accept some risk — but not to like it. I think you're not supposed to like risk. You're supposed to be categorizing risk in the debit column of the ledger, not the asset column. And if you work in financial services, you know that in many cases risk is considered an asset.
So we'll start with that. Now, we always write our little sentence here to codify our belief around risk, and there are a couple of key words in it that I think we need to focus on. The first is that the effective CISO understands that risk is the primary driver in prioritizing safeguards. The second important word is balanced — risk has to be properly balanced — and the other is cost: balancing against cost constraints and the needs of the business. Really, the way this all works is that when you try to understand risk — the potential scenarios that could be negative for a business or an organization — it becomes your goal to prioritize how you handle them. Because, let's face it, you're going to have cost constraints. The business has to run; you can't just shut down the business to shut down risk. That's an absurd scenario; it makes no sense. So prioritizing is what this is all about, and also tailoring those safeguards to the business. Let me give you an example.
I do a lot of consulting around CISO team design. I probably have more clients than you'd imagine who contact me when they're putting together a new team — or they're a mid-sized business that's getting bigger and it's their first team, or it's a first-time CISO, or someone who for whatever reason needs to make some strategic decisions about security. What I always find is that they go to generic sources, like a course someone might be teaching on setting up a CISO organization, and it lays out a sort of formula for what kinds of functions make up a team. And I'm sort of guilty of that at TAG Cyber — I have my 50 areas that I list in my research that I think are important for a CISO. But the really, really successful CISO starts with the business and works backwards. I had a client I was talking to this morning — I think I can describe this without giving away who they are, even close.
They're in a business that involves satellites. They were talking to me about all this IT security stuff and how they work it in, and I was listening. Then I just asked: what is the scenario that has you the most freaked out? And the scenario that had them the most freaked out was something that didn't even come anywhere near IT security. It was someone hijacking mission control and negatively affecting the trajectory or the orbit of one of their satellites — in an act of war, an act of vandalism, an act of sabotage. That was the scenario they were most concerned with, and all the IT security controls they were putting into their LAN were, in some sense, orthogonal to that problem, because mission control was separated from the IT part of the business. So when I asked them whether they were sure it was separated — a lot of quiet. And everyone listening to my voice gets the point here: their risk is in the assumptions they were making — that nobody could, quote-unquote, get to the control system for the satellites. The risk is that maybe they're wrong; maybe there are ways in that they don't know about. That's what they should be focusing on — not the conventional IT security controls on the local area network that are so comfortable and familiar, consistent with the compliance documents, and meet the expectations of anybody who asks what they're doing.
The correct way to manage risk is to start with the business — start with what you do. What are the things that could happen that keep you up at night, really? And which things don't? You've got to figure that out. There's a big difference, for example, between a negative consequence that is bad for your business, your shareholders, your stakeholders, and your employees but stays contained there, versus a negative consequence that does all the things I just said but could also kill a bunch of people. A plane crashing is the kind of thing that has consequences far beyond the organization of the airline. Versus, let's say, you print comic books for a living — there are not too many scenarios there involving loss of life. So the risks are different, the safeguards are going to be prioritized differently, the budget available for security is going to be different — the needs of the business are just different for different missions. A really effective CISO does not just stamp out NIST-compliant programs that include all the things you learned at SANS or Cybrary. Cybrary has amazing courses you can go through, but they're not going to have a course on how you control satellites — maybe Leif does have one; I'm guessing not. If that's the business you're in, then that's where your risk is, and you're not going to get that from a course, from a book, or from a template. You're going to derive your understanding of that risk from the business. Do you follow? It's extremely important.
Now, before we get into the habits, thoughts, and principles that I've pulled from the last 30 years of doing this for CISOs, there's a little crib sheet I use whenever I'm making a decision about safeguards. I want to show it to you; we'll go through it quickly and then get back to the risk discussion. This crib sheet has been helpful — I used to carry it around in my pocket, and when I was making a decision I would always ask which of these eight lines I was following. Let's take a look. We're going to start at the dot in the middle: a unity, or origin, which is an abstraction of wherever you are. I make no absolute or even relative judgment about your security by putting you at that dot; you just are where you are. You're some mid-tier bank and you do what you do on security, or you're DISA and you do what you do on security, or a little company, or you have an amazing program — you're a Verizon or an IBM with great, great security, and it's really awesome. Whatever — you're somewhere.
Then a decision has to be made about security. It strikes me that you can look at two variables — I know you could look at n variables, and any decision has a whole vector of consequences — but let's just look at two. When I make a decision about security: am I making security better or worse? I know that sounds almost too simple, but it's true. And we all know there are scenarios where you make it worse. Say you decide to allow a third party to come into the enterprise using source-based authentication at your gateway, and you just trust that there's not going to be IP address spoofing. You say, "I've got no choice. I'll still do IAM authentication, so they're not going to get into my systems, but they're going to get through my gateway if they just advertise a source IP address that's within range." You've just decreased security in your enterprise when you decide to do that, period. But there may be a zillion reasons why you do it — it might reduce your spend by some spectacular amount; that's why people outsource. So you see the little arrow from the dot down to the bottom left, to the little circle that says "remove security system and accept risk": I'm decreasing security, and the spend went down, which is good, but security went down too. It's still sort of rational — it's not like I have two things pulling in incompatible directions.
Now let's go to the opposite end of that line — instead of the bottom left, the top right. That's the one we're all comfortable and familiar with: you introduce a security system you buy from some vendor, some new control. It costs money — you buy it, license it, install it, manage it — but you have more security. So I'm increasing security and increasing my spend; in this context it's a perfectly rational decision. Along that line — the line with slope one, essentially — the decisions make sense. If I take security out and I save money, I'm cool with that; if I add security and I spend money, I'm cool with that. Both of those scenarios make perfectly good sense to me.
On the other hand, look at the top left, where I'm making a decision that decreases security and also increases my spend. You can see how that would be a spectacularly unpopular and unreasonable decision. I'll let you think through the different scenarios where that happens — but it does happen. You might have, for example, a severe insider problem: insiders are hollowing you out and committing sabotage, and your solution is to do more security training and teach people more about the security systems you've got in place so that they're better operated. Well, if you have a sabotage problem, you're training the saboteurs to understand your security better: decreased security, increased spend. I'm making that scenario up — you could make up ten that are far more appropriate — but you get the idea. Then the bottom right is also an unusual one, but certainly a good one: the case where I'm increasing security and reducing spend. Every vendor at Black Hat next week will tell you they're in that category: "Buy our tool and you can get rid of that loser thing you have from our competitors — a bunch of jerks. Our thing is cheaper and better." That's the innovation best case.
So on this chart, everything to the right of the y-axis — the positive x values — looks like a pretty good decision, and everything on the left — the negative x values — strikes me as not so good. As I said, I carried this little crib sheet around, and when making a decision I would always ask myself: where am I on it? This is not deep methodology; there's nothing mathematical here, nothing terribly foundational. But I found it a useful crib sheet, because I can't tell you how many times I would be about to make a decision — because I was busy, or it seemed like the right thing, or I was being talked into it — and then, boom, I'd consult my crib sheet and conclude: wait a minute, I'm moving down and to the left. That doesn't seem like such a good decision. Do we really want to do this?
Sometimes you have no choice. If you look at the diagonal that cuts through the origin from the bottom left up to the top right, you could argue that anything below that line seems somewhat rational, and anything above it — I don't know. There are a lot of interesting ways you can look at this: everything below the x-axis is a reduction in spend, everything above it is an increase in spend, and so on — you get the point. So you might want to consider this. I thought I'd show it to you because it's really not from any book; it's something I wrote down and liked, and I've been showing it to students for 20 years, just sharing it, and for some percentage of them it may be a useful management tool.
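The crib sheet is easy to mechanize. Here is a minimal sketch that places a decision on the quadrants described above — scoring each decision by its change in security posture and change in spend. The function name, sign conventions, and quadrant labels are my own illustrative choices, not part of the original chart.

```python
def classify_decision(delta_security: float, delta_spend: float) -> str:
    """Place a security decision on the crib-sheet quadrants.

    delta_security > 0 means the decision improves security posture;
    delta_spend > 0 means it increases spend. Labels are illustrative.
    """
    if delta_security > 0 and delta_spend <= 0:
        return "innovation best case: more security, less spend"
    if delta_security > 0 and delta_spend > 0:
        return "rational buy: more security, more spend"
    if delta_security <= 0 and delta_spend <= 0:
        return "accepted risk: less security, less spend"
    return "worst quadrant: less security, more spend"


# Example: outsourcing that weakens gateway controls but cuts cost —
# the bottom-left arrow on the chart.
print(classify_decision(-1.0, -2.0))
```

The point of the exercise is not the code — it is that before committing to a decision, you ask which quadrant you are moving into.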
Now let's spend a little time on risk principles — we've got about 40 minutes left — and I'm going to take you through a bunch of principles that I think a cybersecurity manager, executive, or CISO needs to understand. They don't necessarily have to come naturally to you. And let's face it: this is different from a CRO at a bank. CROs in banks have a different job; that's not what we're talking about here. Risk comes in many different flavors, and information risk is different from financial risk. Yes, I understand that when you're measuring information risk, in many cases the units will be dollars, as we'll see in a minute — but they are different. A CRO is a position aligned with a fundamentally different purpose and mission than the information risk issues a CISO needs to deal with. So keep that in mind as we go through this: I'm not training CROs here, nor could I ever consider myself qualified even to talk to a CRO, much less to train you to be one. But what I am qualified to share with you, from a lifetime of working as a CISO, is the set of principles that have helped me — and some of my friends and peers — get through some pretty tough situations, including recognizing when things presented to you may just flat out be nonsense. You should be able to smell when something is nonsense.
Which brings me to the first principle here: Monte Carlo estimates, built with Excel spreadsheets and little scripts that you write in Excel, aren't really considered a best practice — in fact, there are many risk platforms that are better — but I would say it's the most common practice. You get Excel with your Office 365 subscription, and it's really easy to code little math expressions that take your estimate of the likelihood of a particular thing happening and bounce it off some range of potential loss to the business: "There is between a 10 and 25 percent chance that we will lose between 10 and 15 million dollars in the next six months as a result of some sort of credential problem." People make those statements, and I've got to tell you — I put a picture of dice up here because that estimate is really only as good as the guesses that you make. There are a lot of interesting things we can say about this; I'll share one in a minute. But the bottom line is: when somebody provides you some sort of risk statement that's based on estimates made in order to feed a very large number of Monte Carlo simulations, be careful. The people doing this are going to remind you that they ran a million of them — a lot of Monte Carlo simulations — but those simulations are only going to be as good as the garbage that went into the mill.
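To make that concrete, here is a minimal sketch of the kind of spreadsheet-style Monte Carlo being described — in Python rather than Excel. Every input here (the 10–25 percent incident likelihood, the $10M–$15M loss range, the uniform distributions, the function name) is my own illustrative guess, which is exactly the point: the output is only as good as these inputs.

```python
import random


def simulate_losses(trials=1_000_000, p_low=0.10, p_high=0.25,
                    loss_low=10e6, loss_high=15e6, seed=7):
    """Naive Monte Carlo over a guessed incident probability and loss range.

    Each trial draws an incident probability uniformly from [p_low, p_high],
    flips a weighted coin for whether the incident occurs, and if it does,
    draws a loss uniformly from [loss_low, loss_high]. Returns the mean
    loss per period. All the ranges are guesses, not measurements.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        p = rng.uniform(p_low, p_high)       # guessed likelihood of incident
        if rng.random() < p:                 # did the incident happen?
            total += rng.uniform(loss_low, loss_high)
    return total / trials


# Roughly $2.2M expected six-month loss, given these made-up inputs.
print(f"expected six-month loss: ${simulate_losses(trials=100_000):,.0f}")
```

Run it a million times and the confidence interval tightens beautifully — around numbers that were invented in the first place.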
Let's take an example. Say I have a population of integers that represent, whatever — let's say they're loss values. Imagine a bag of numbers. That population has a low, a high, and a median; take the middle of those numbers and call it the median. There always is one. Now suppose I close my eyes and pull out five jelly beans — five numbers — at random. If you do the math (I'll explain what the math looks like in a minute), it turns out there's essentially a 94 percent chance — 93.75 — that the population median will live somewhere between the highest and lowest values in that random sample. Think about that. Take a bag of jelly beans with numbers written on them — the bag can be however big you want — and I reach in and pull five out. We know there's a median in that bag: if it's a bunch of numbers from one to a million, I can take all the values and calculate the median; it's basically the one in the middle. It turns out that if I pull five jelly beans out and line them up, there's a 93.75 percent chance that the lowest number and the highest number will land on either side of that median. How is that?
Well, think about it. In any population, picking one element at random is like flipping a coin: heads means I picked a number above the median, tails means below it. So one coin flip simulates whether the first jelly bean you pulled from the bag is above or below the median — with a sample of one, you have a 50-50 chance of being above or below. Now suppose I pick two. If I want the median to live between those two numbers, I have to draw one number that's higher than the median and one that's lower. What are the chances of that? Do the math — it's coin flips. In any random sample of five, the only way the median fails to fall between your lowest and highest values is if all five numbers live above the median or all five live below it — the equivalent of flipping heads five times in a row or tails five times in a row. Each of those happens with probability (1/2)^5 = 1/32, so the chance of missing is 2/32, or 6.25 percent — leaving a 93.75 percent chance that the median is bracketed. Isn't that interesting? This book down here on the left, How to Measure Anything in Cybersecurity Risk, is an excellent book — you'll see Stu McClure there, from BlackBerry, and my good friend Dan Geer from In-Q-Tel. It's a really cool concept: you can play these games with the numbers, and you can have people speaking very confidently and backing it all up. But what it comes down to is this.
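The 93.75 percent claim is easy to check empirically. This is my own sketch — the population, sample count, and function name are illustrative choices, not anything from the book:

```python
import random
import statistics


def rule_of_five_hit_rate(population, samples=100_000, k=5, seed=42):
    """Estimate how often the population median falls between the min and
    max of a random sample of k elements.

    Theory: the only misses are samples entirely above or entirely below
    the median, so the hit rate is 1 - 2*(1/2)**k = 93.75% for k = 5.
    """
    rng = random.Random(seed)
    med = statistics.median(population)
    hits = 0
    for _ in range(samples):
        draw = rng.sample(population, k)
        if min(draw) <= med <= max(draw):
            hits += 1
    return hits / samples


pop = list(range(1, 1_000_001))  # "a bag of numbers from one to a million"
print(rule_of_five_hit_rate(pop, samples=20_000))  # ≈ 0.9375, within noise
```

The simulation lands right around 0.9375 for essentially any population, which is what makes the coin-flip argument so satisfying.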
One of my favorite books in graduate school was How to Lie with Statistics, and I still think it all comes down to what it is you're actually talking about. You can dazzle with numbers, you can dazzle with percentages, you can dazzle with statistics — but the question is: did you even make reasonable judgments in the first place? Those values sitting in the bag — are they reasonable? I get that maybe there are a lot of them, and you did everything you could to come up with reasonable ones. But these Monte Carlo simulations — where you spin the dial a million times using the probabilities you assigned at the beginning, let it run, take into account the probabilities you coded into the numbers, and get some sort of graph — then lead you to the conclusion that, overall, our cybersecurity is pretty good, and here's why, with all those Monte Carlos to support it. Like I said, the effective CISO recognizes that those estimates are only as good as the underlying guesses. And don't fool anybody: these are guesses, not measurements, and you know darn well it's a bunch of guesses. I just did a Charlie Ciso cartoon last week where I had Mary and Charlie talking. Charlie says, "How'd you come up with these incredible numbers?" and Mary says, "Well, I just hit random divide in Excel on a bunch of made-up numbers." Everybody laughed, because they know that's so true of the way people do risk. Now, number two.
You can do all these numbers, come up with estimates, come up with all sorts of things — but I still believe, in 2020 (and we'll see whether artificial intelligence changes this), that experienced human judgment can improve risk-related decisions. It's one thing to do the gearhead stuff. Look, I sat on the board of a large bank with numbers put in front of me, and I get that this data-driven approach is not only recommended but in many cases demanded among executive teams. I'm a data-driven guy; we're all kind of gearheads here. I'm not avoiding the math. I'm saying that human judgment makes a big difference. Here's an example.
I love this picture. Fifty years ago we landed on the Moon, and this book by Gene Kranz — Failure Is Not an Option — you should definitely go buy; it's a great book. "Failure is not an option" really is a phrase that I think has become synonymous with NASA at that time. The reason I bring this up is that there was a scenario that occurred literally 50 years ago, almost to the day, a couple of weeks ago. This panel here is what Neil Armstrong and Buzz Aldrin were staring at as they guided the lunar module to the surface of the Moon. This was their iPhone screen, so to speak — it's probably about the size of a couple of iPhones. As they were descending to the Moon, it started blinking an unusual alarm — the 1201 and 1202 alarms — which they hadn't known about. Some of you know that my earliest job, when I was in grad school, was writing flight software for the Space Shuttle — inertial measurement and guidance software — with my friend Phil Laplante, and I remember there was some old Apollo software that had found its way into some of the real-time executive programs being built then. At any rate, those things started blinking the 1201 and 1202 alarms, which basically say the computer is overloaded: it's taking in too much data, too much telemetry, and it just can't handle it. As they're coming down, really close, Neil Armstrong says, "I need a reading on that alarm," and they went to one of the engineers: what do we do?
And again, this is experienced human judgment. The read was basically: it looks like we're still getting inertial guidance data; it seems like things are working. The 1201 and 1202 had not been a problem during any of the preparation. So they decided — the human decided — take the risk. Go. We're going to land. Knowing full well that it could strand two astronauts on the Moon; two lives could have been lost. A human being made that decision, and it turned out to have been the right one. My guess is, if that had been programmed today and that alarm had popped up, they probably would not have landed. I'll let you decide whether that was right or wrong, and whether, if you were the programmer, you would have made the same call. But that's the decision that was made.
And you know something — here's a good one for you. Like I said: are you the kind of person who is late or early? I'm the person who's always early, thinking of every contingency. I drive my family crazy with contingencies for everything; I have so many contingencies it drives even me crazy sometimes. But let me ask you a question. I watched an interview with Buzz Aldrin — he's a New Jersey guy — where someone asked, "What were you guys thinking about when you landed on the Moon?" Everybody's sort of laughing along: it must have been an amazing feeling to land, this incredible sense of relief. And Buzz just looked at them like, what are you talking about? The instant they landed on the Moon, they had to prepare for an emergency takeoff, just in case. That's what they did when they landed. They weren't sure if something was damaged; they weren't sure whether they'd have to haul out of there quickly. That was the procedure: not to sit there and relax, but to think about what could have gone wrong and whether they had to get out of there fast. Incredible. I love that story because it's so indicative of the kind of thinking I recommend for all of you. That's my advice: think that through; be that type of person.
I just paused and looked at the chat — I see some people talking about frameworks like FAIR and so on. Absolutely. Keep in mind, I'm not here to train you on risk frameworks, so I'm not going to cover FAIR, but it is a wonderful framework. There's no question that, from a risk perspective, there are some really good conceptual and engineering models you should pay attention to, and a lot of vendors use FAIR. I'm thinking of this more as a leadership course; I try really hard not to make it a computer security course, and I think I'm about 95 percent successful there. When we were getting ready to begin, Leif and I were talking, and I said that for this course I wanted the pictures to be of things like the command screen in the lunar module instead of the things we see every day. So I hope that's okay — I'm not going to cover FAIR, but it is a good model. At any rate, I love this picture, and I think it's a very useful story, because it illustrates the way you should be thinking about risk.
Let's go to the next one: you should not be doing reckless gambling or speculation. I like talking about this one after the scenario we just went through. You're sitting in the CISO role and somebody says, "Hey, we need to launch the service, but none of our audit systems are working. We won't be collecting logs or telemetry, and we won't have any eyes or ears into whether or not we're under attack. Should we launch?" Well, that's a risk management decision, but it shouldn't be reckless gambling. Reckless gambling means it's just a flip of the coin, with no means of controlling it — I can't affect or influence the 50-50 probability that exists when I flip a coin.
So risk management is not that. And this is a great book to read. This is one of my personal heroes, Bernard Baruch. I love his story. I've read
two of his biographies, and loved his time spent helping President Wilson and President Roosevelt. A wonderful, amazing man.
And I work just a few blocks from Baruch College,
where I'm setting up a little internship program to bring some of the students over. But Bernard Baruch was a speculator,
a financial speculator; that was his propensity. He was the guy who was a risk taker.
And yet during wartime
he adopted a very different stance. He became a risk manager. So you can be a speculative type of person, as he was. He became a millionaire very quickly, at a time when being a millionaire
probably meant more like being a billionaire today.
But he's a really interesting man, and you know, if you're like me and you read computer science stuff all the time,
what a wonderful break
to have a book like this. Now I remember,
a year after I was married (I've been married now for 34 years),
um, I had this book,
and my poor wife puts up with my weirdness, but we went somewhere on a trip and I lost this book, the one that I'm showing here, the one I was reading 34 years ago.
I was so upset. And then I went into a used bookstore in New Hampshire.
I was at a security workshop in 1990, I think it was,
and I found the book. It's not an easy book to find. I'm guessing maybe now, with Amazon, it would be; there was no Amazon then. But I remember how happy I was when I found
this book by Bernard Baruch, written in the sixties or something. He's somebody to look at if you're a risk manager, if you do what we do. These are the kinds of people you should learn from. An amazing guy.
And risk managers must understand and manage worst-case scenarios. You know, I've already traveled through some of this,
but my kids love these kinds of books, like the Worst-Case Scenario Survival Handbook: you're sitting on a plane and the pilot, you know, has a heart attack, what do you do?
But I've got to tell you, that's the kind of game you should play with other CISOs. You should be thinking worst-case scenario. And here, of course, is Hurricane Katrina's impact
on New Orleans. And I think you had way too many city managers
who had not thought through the worst-case scenarios. Listen, let me say this. I want you to hear it, and I want you to memorize this:
when you are the CISO, it is not your job to manage the average scenario. It's your job to account for the worst.
That is the job.
Let others in the business say the chances are we'll be okay.
It's not your job to say, well, yeah, I understand we're going to optimize around some sort of bell curve. Maybe we're at the top of the bell curve, the most likely case. Okay, optimize your resources. But there's also a scenario where things can go very wrong and be very, very bad.
And the job is to figure out how to manage that, how to prioritize, how to account for that sort of thing. And I love Lane's line in the chat, "Chief Impossible Scenario Officer." That's really good; I'm actually going to have to steal that.
That's a really good point.
But, you know, that's kind of the idea here. It is impossible, in a sense,
to just have all the solutions for something like Katrina.
But that's the job. The job is: what are you going to do if something really wrong happens? I know back in telecommunications,
um, you know, I spent a large part of my career in telecom.
Most of the telecom companies would have disaster response procedures that would start by flying resources out to the affected place. Like if there's a hurricane or
a catastrophe or some natural disaster, you fly resources there. That's chapter one. Paragraph one, sentence one, word one of chapter one: fly out.
Well, what would you do on 9/11?
On 9/11, all airplanes were grounded.
So did you think that through? Obviously they didn't take the worst-case scenario into account. The worst case is that the plan won't work, because you can't fly out, because there are no planes.
So you get the point.
Having that tendency, a willingness to really consider the worst cases,
that is your job. Even if the most likely case isn't going to travel anywhere near that boundary condition.
You know, you don't expect a 9/11 kind of thing every week, but you should be planning for it. That mindset should be part of the equation.
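One way to make the "average versus worst case" distinction concrete is to compare how two contingency plans look under an expected-value lens versus a worst-case (minimax) lens. The scenarios, probabilities, and costs below are entirely invented for illustration:

```python
# Three scenarios and their assumed yearly probabilities (invented numbers).
probs = [0.90, 0.09, 0.01]  # routine incident, major breach, 9/11-scale disruption

# Relative cost of each plan under each scenario (also invented).
plans = {
    "optimize the average": [1, 5, 500],   # cheap day-to-day, ruinous in the tail
    "account for the worst": [8, 10, 40],  # costlier day-to-day, bounded tail
}

for name, costs in plans.items():
    expected = sum(p * c for p, c in zip(probs, costs))  # expected-value lens
    worst = max(costs)                                   # minimax lens
    print(f"{name:>22}: expected cost {expected:.2f}, worst case {worst}")
```

With these made-up numbers, the expected-value view favors the first plan (6.35 versus 8.50), but the worst-case view the CISO is paid to hold strongly favors the second (40 versus 500). That gap is the whole argument of this slide.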
On to the next one.
"Risk management starts with organizational culture." I think many of you who do this sort of thing would agree. You might even suggest that this should be the
first bullet here, that the organizational culture really does dictate whether or not you get a good response. And that means trusting
the decisions that are made
from a risk perspective. That's what it means. It means trusting those decisions. And look, here's a great book. I hope you've read this. It's Michael Lewis,
one of my favorite authors.
The Fifth Risk came out recently, and the story is basically that
he went and visited a bunch of agencies to read the transition plans they had created for the Trump administration,
and it gave him an opportunity to really dig into how some of the organizations
that support the United States operate. And here's an example, one from the Department of Commerce, in fact,
which manages NOAA,
which takes this picture that shows a hurricane brewing,
you know, off the southeast coast of the United States.
Um, and it's a lovely book. It's an easy, short read. I think some of the chapters were printed in The Atlantic, and Michael Lewis is such a great writer, whether you like him or not. And I know there's some politics involved in some of this; it's sort of critical of Trump.
But step aside from that. This really is not a political book. It's a celebration
of the civilian agencies in the U.S. government
and what an amazing job they do managing risk. And it's all about culture. If you work at NOAA, the organization that takes pictures like this one, you know what the culture is like there. It's a culture of data. It's a culture of accurate interpretation of trends. And it's a culture of trusting your eyeballs. Like, look what's happening in this picture. I mean, it's pretty obvious you've got an issue here. If you
play the time forward,
then the eye of that hurricane is off to the right somewhere.
And we all know that if an organizational culture is flippant,
it might say, ah, who knows what's going to happen. But if you follow the data, and if you study trends, and if you do it long enough, you know exactly what's going to happen. And that's part of the culture of an organization. Not everything is as obvious as this. But if you see trouble brewing with human-resources-related
issues in one of the business units,
and something doesn't seem right, and you know that things are not right and they're heading in a direction that looks like it's going to explode on the coast of your company,
do you take action or not? Well, tell me about the organizational culture, and I'll tell you whether you took action.
And that's the point.
So to do this properly, you really do have to have a culture that understands and trusts
the guidance that comes from the people who are managing risk. I think that's absolutely essential.
Look at this one: "Risk transfer has always been an important management option."
Well, you know what that means today. It means this, right?
Look here on the left: The Invisible Bankers, written in the eighties. It's an explanation of the insurance industry. I may have shown it to you earlier in the course.
I think it's a must-read. Andrew Tobias, the financial writer,
who is still active today, wrote a really cool book called Getting By on $100,000 a Year. That was a joke.
It was like, who could ever say,
you know, "getting by" on $100,000 a year? In the eighties, it would be like saying "getting by" on a million dollars a year today.
Well, you know, with inflation now it is tough to get by on that for a lot of families. If Mom and Dad both work, making 50K each, with a couple of kids,
they're probably struggling on 100,000. He wrote that book, but The Invisible Bankers was a description of the insurance industry that I think is still spectacularly good.
He goes back and explains the history, and I put up this picture. This is a carved tablet from the Code of Hammurabi, which included certain codes around insurance for sailors who were venturing off and would, you know, pay a little extra
in case something went wrong, so that their loved ones would be taken care of. Insurance. That's where insurance comes from: these basic decisions around
how to account for risk, how to manage it, and in this case, how to transfer it.
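To put rough numbers on what "transfer" means, here's the back-of-the-envelope arithmetic a team might sketch when weighing retaining a cyber risk against insuring it. Every figure below is invented for illustration; real premiums, deductibles, and policy exclusions are far messier:

```python
# Invented inputs: chance of a reportable breach in a year, loss if it
# happens, and a hypothetical cyber-insurance quote.
p_breach = 0.05
loss = 2_000_000
premium = 110_000
deductible = 250_000

# Retain the risk: expected annual cost is probability times loss.
retain_expected = p_breach * loss                    # 100,000
retain_worst = loss                                  # 2,000,000 in a bad year

# Transfer the risk: pay the premium every year, the deductible on a breach.
transfer_expected = premium + p_breach * deductible  # 122,500
transfer_worst = premium + deductible                # 360,000 in a bad year

print(f"retain:   expected ${retain_expected:,.0f}, worst year ${retain_worst:,.0f}")
print(f"transfer: expected ${transfer_expected:,.0f}, worst year ${transfer_worst:,.0f}")
```

Note that on expected cost alone, retaining looks cheaper, because insurers generally price above expected loss. What you're buying with a transfer is the bounded worst year, which connects back to the worst-case framing earlier in this lesson.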
So first, cybersecurity teams, you need to understand what risk transferral means.
You need to understand how it's done, and also the business relationships and business decisions around it. Like, I've said this before and I'll say it again:
as long as CISOs don't have to pay for insurance, they're always going to be for it,
right? If you have an insurance group that buys your insurance,
hey, buy away, buy as much as you want, I'm all for it. You know, nobody's going to be against,
um, buying cyber insurance
if you don't have to pay for it.
But as soon as it starts hitting the
budgets of security teams, maybe then, at that point, you might see some differences
in whether, you know, it's just a given that you're going to go ahead and pay the money
to an insurance company
to transfer risk over to them. That may not be the obvious given it sort of is today. So take some time to understand risk transferral. In the context of cybersecurity, that means insurance, and that means reading books like The Invisible Bankers, so you have more than just a
surface understanding of what's going on. Look, the reason I show you these books
is because I think you have to travel around an issue to understand it.
You can't just bull's-eye something, get a briefing on it, and be an expert.
You have to travel the issue. You have to nibble around the edges. You have to look at the adjacent issues. You have to read some background. You have to reflect
on its history. Then you understand something, maybe 5%. That's why it freaks me out when anybody considers himself an expert in anything. I've made a life of trying to understand cybersecurity, and I'm at the point now where I'm willing to say I think I have some expertise. But anytime I give a talk, the first
thing I always want to say is:
God, there are probably 20 people here who know more about this than me. Because it's so hard to really become an expert in anything.
It's an extremely difficult thing to do.
I'm a non-expert in everything I can think of except cybersecurity, and I've spent four decades getting there. With insurance, you're not going to be an expert, but you can travel it. You can learn enough so that you can make some valid decisions.
Last one here:
risk can lurk
in unexpected places. Right?
Um, and how you handle that,
I think, is interesting. It can be reflective of your integrity, your judgment, your
style. This is one of my favorite examples, from some time ago. This is right off the T-Mobile website, um, you know, a few years ago.
T-Mobile wrote a letter to consumers headed "T-Mobile CEO on Experian's Data Breach." Now I want to read that title to you again:
"T-Mobile CEO
on Experian's Data Breach."
It says in the second paragraph, "We have been notified by Experian,
a vendor that processes our credit applications, that they have experienced a data breach."
And then toward the bottom: "Obviously I am incredibly angry about this data breach and we will institute a thorough review of our relationship with Experian." Now go back and think about this. Say you're a T-Mobile customer.
Do you know anything about Experian? Do you care who they are? Frankly, do you give a *** who they are? You signed up with T-Mobile. If they want to do credit applications with a third party, then go for it. If they want to do credit applications internally, that's just a business decision. If they had said, "We've been notified by our T-Mobile business unit
that does credit applications that they experienced a data breach.
I'm really annoyed that this business unit had the breach. That's not really T-Mobile, it's just a business unit," you'd say that was ludicrous.
So the risk lurked in a third party. I totally get it here. I mean,
I can understand if you feel his pain here. I get it.
But that's not the way this works, right? I mean, if your third party gets hacked and your customers are hit as a result, that's your data breach. This should be "T-Mobile CEO on Our
Data Breach," not Experian's. I think it's interesting: when the risk lurks in an unexpected place, like a third party,
it's an interesting kind of reveal
when some sort of decision is made, either announcing something about the breach, or how you manage it, or how you respond to it, or how you,
in some sense, you know, deal with the impact of that breach. I think it reveals quite a bit about the executives, about the culture, about the team, about everything. So
that's something you should be looking for: scenarios like this. Try to understand whether or not they
are instructive in the way this stuff goes on. An interesting footnote here at the bottom of this letter, which I've cut off:
um, and again, I'm not picking on T-Mobile. There may be some people here who work for T-Mobile, and I spent my life in the same industry as you, and I take no pleasure
in problems; every company in telecom has had issues. So this has nothing to do with
T-Mobile specifically. But, uh, as you always see in these kinds of breaches, some identity protection service was offered, and when you clicked on it,
it said the service is brought to you by, guess who:
Experian. So I guess I thought it was just
an interesting way
of handling this sort of thing. And the picture of the guy up here, a creative guy.
And I guess he's going to try to merge with Sprint now. A creative person.
But I thought in this case, maybe not one of his better days or better decisions in handling this third-party risk.
Let's go to our case study now. I had some fun writing this one.
So here's what the case study for Lesson 11 really amounts to:
you know, our hero is talking to somebody, Jeffrey,
who works for a small company in Philadelphia, and he says that they design and create compounds. They make chemicals. I made them organic because,
you know, it just seemed like that would be maybe more consistent with the story here. So they make these organic compounds.
And he says, you've got an interesting business. We make these aromatics and you sell them, blah blah blah. Nice two-factor authentication,
data stored securely and encrypted. It was all pretty standard stuff.
Well, wouldn't you know it, one of their chemists last year was combining a bunch of things. I had him, or her, combining a bunch of acids, presumably doing some testing on something or other. Just because you're developing organics doesn't mean you wouldn't be using
some dangerous other chemicals.
This person was combining things, and they made this
crazy explosive that blew the lab to bits. Literally blew the place up.
And going back and reviewing what happened, they realized
they had created something interesting, like a bomb. Somebody called the Department of Homeland Security and the DoD, and they said they'd come take a look at this, and they reproduced the
explosion. And the military folks were actually quite interested in this and said, wow, this is really interesting. We could make this a cool bomb.
And the security guy protests a little bit and says, hey, you know what, we've got some pretty serious security problems here.
We're going to be developing weapons,
you know, bombs. That's a little different than aromatics for perfume.
And, you know, he was really pushing back here. And what happens is,
um, you know, they did in fact start to see some very unusual activity coming in.
Um, and the guy tells his bosses, we need to really dig into this security problem. And the boss gets furious and says, you know, I'm going to bring a third party in. You just step away from this investigation with the FBI; you're hurting our business.
What's going on here is a new line of business, and you're creating nothing but problems.
And at that point, well, you know this story. To make it short: what is the issue here? What would the CISO do? You've all been in these scenarios where you smell trouble brewing in an area where the business is very excited to see new revenue.
You've got this guy, or gal, running an aromatics company, and suddenly he's got DoD contracts that look like they could be worth more than the whole company,
and the security person is freaking out. Hey, there are some problems here. There are break-ins. There are issues. There's this, there's that.
This does not look like a good situation.
So the CEO responds by saying, we're getting rid of you; we're bringing in an MSSP. So
this is an interesting one, because this is a case where you see a senior leader deciding not to worry so much about risk
and just forging forward because there's revenue. So it's a good one for you to sit down with your team and try to go through. What would you do? Have you ever been in a situation like this? Would you stay in the job?
Was the CEO being reasonable?
You know, do you think senior management has a good understanding of risk? And is this the kind of thing where an external MSSP would maybe do better? Maybe you're going to get the same sort of judgment from an MSSP, even though my observation
is that an awful lot of MSSPs sell,
um, a one-size-fits-all solution without really digging into the underlying risk.
They manage devices, they collect telemetry, they notify you of alarms, and whether you are a chemical company or a comic book company, it's going to look the same.
So that's why I made this an MSSP, because I do know that most MSSPs are not going to be providing risk-based services.
So I hope this is a useful one for you. And, you know, just to sort of close here at the top of the hour,
I hope that my thoughts here
on risk are important for you,
because you really do have to decide, you know: are you a risk taker or a risk avoider?
Even if you're a risk taker, that's fine. You can still be very effective in the role. But recognize that the job, the CISO job, is probably the management activity
most conducive to somebody who is extremely nervous about worst-case scenarios, who tries to avoid risk as much as possible, and who, while trusting the data, also factors in their human judgment based on experience.
So those are my thoughts on risk. I'm monitoring the chat a little more this week. I see, you know, a couple of things here.
Um, I think we're in good shape. Posting links to the course materials? Absolutely, we can do that.
And thanks to those of you who are confirming some points. As we said, we're going to skip next week, but make sure you come back in two weeks, because that's going to be the week we talk about leadership.
And I think that is the most important competency of the twelve.
If you want to pick one, that's the one that is most essential. You have to have
leadership capability to be a CISO. You cannot just be a manager; you have to also be a leader. So everybody enjoy Black Hat next week, and DEF CON,
um, and we will see you in two weeks. Everybody have a wonderful couple of weeks, and we'll talk to you then. Thanks.