Time
1 hour
Difficulty
Advanced
CEU/CPE
1

Video Transcription

00:02
Hi, I'm Leif Jackson. Welcome to the fifth of our series of 12
00:07
competencies of the effective CISO, by Ed Amoroso.
00:11
Competency five: Discretion and Trust. I hope everyone had a happy Father's Day, and I look forward to today's session. So take it away, Ed.
00:19
Okay. Hi, everybody. We're up to our fifth lecture.
00:25
They certainly go quickly. So, uh, I'm looking forward to this one. This is a hodgepodge. Like, I've been
00:32
teaching forever, and
00:35
Some lectures are cohesive, like there's ones
00:39
where when I look at the shape of the lecture, the shape of the message, it's clear. Like, if I were here talking to you about Kerberos or something, it has a beginning, a middle, and an end, and usually at the end people feel very, very satisfied that there was this topic that we covered. But when I lecture on this
00:59
this, um,
01:00
capability or competency here, this attribute of the CISO, for which I use the term discretion,
01:06
I find it's all over the map. So I want to give you the preliminary
01:11
kind of summary here that
01:15
you're gonna see me jumping around a little bit. But maybe it's a good metaphor for the topic. There is a
01:21
personality kind of trait or attribute that I believe
01:26
is essential in a good CISO,
01:30
and it's what my cousins in Brooklyn would say, you know,
01:34
being able to keep a secret. I remember hearing a story about one of my aunts. My uncle had been out doing something,
01:47
and one of the neighborhood folks came and asked my aunt, you know, uh,
01:55
something about my uncle. And she said nothing. My uncle came back and said somebody was looking for me, and I remember she said, don't worry, I told him nothing. Meaning, you know, in her mind she was doing this great thing. She figured he was in trouble.
02:08
Kind of the ethos was that you keep a secret. I know it sounds so silly, so
02:14
prototypically my family.
02:15
But in this CISO business, there's a level of discretion that's not in any of the compliance requirements or frameworks.
02:24
It's not something you're gonna learn at SANS. It's not gonna be something you pick up in a lab.
02:29
It's gonna be something that you pick up by doing the job, and you should know about it. And that's what I want to share with you today,
02:37
kind of my belief that this is an important competency. It's something you should be aware of.
02:42
And as you plan out your career, it's something you should think through. But let's read as we always do,
02:49
the sentence here. It says: Effective CISOs exhibit and maintain a high level of discretion and trust in dealing with sensitive information regarding threats, investigations, and ongoing initiatives. So if you're the kind of person who likes to talk a lot and brag and blab and so on and so forth, then you gotta knock it off. Because this is a profession
03:08
where the ability to show great discretion
03:12
is important. And like I said, it jumps all over the place. It goes from
03:17
kind of Mafia things to public key cryptography to government information sharing to old early Orange Book stuff, all of the above,
03:29
kind of building this collage that I hope that at the end of our hour here.
03:34
you'll understand it. We do have a little case study that touches on this, and we also have a guest
03:38
who, um,
03:40
I'm gonna be speaking more generally with. Um, she's one of the finest
03:46
CISOs and executives and security experts that I know.
03:51
So she'll certainly broaden the discussion with me. When we get to that portion, I'll leave the last
03:57
20 minutes or so
03:59
to hear from her. But for the next 35 or 40 minutes or so,
04:03
I want to give you some information and some insight into an attribute that I really do believe
04:11
is about as important as your technical ability. Well, let's start with something that's very unusual.
04:16
This is a picture of the Ravenite Social Club in Lower Manhattan, Little Italy. Um, this is where the Gambino crime family, and John Gotti in particular,
04:29
used to socialize. It's funny, it's a shoe store now,
04:31
and I work a few blocks from this building. I was walking by there recently, and I said, oh my gosh, this is great, and I took a picture. And I remember somebody looking at me, thinking that I was probably FBI or something, taking a picture of the
04:44
old Ravenite. At any rate, the reason I bring this up is that the way
04:47
security was done in those days, back in the seventies, at a place like AT&T, then called the Bell System,
04:55
is that they were engaged with law enforcement to catch these people, and they caught them using telephony.
05:01
So it's less about hacking.
05:03
But the quote-unquote phone company's role is all documented, like in Sammy the Bull Gravano's book there, which, by the way, is an amazing book. If you're the type of person who
05:15
is willing to read a trashy book occasionally, that's a good one. Peter Maas is an interesting writer. We invited him to Bell Labs many years ago to give talks. I met him
05:27
and thought it was just a really wonderful writer. But he wrote this book about
05:32
life in the mob. But at any rate, the reason I bring it up
05:36
is that these investigators at AT&T were, you know, going after some people where, if you didn't show discretion, you could be killed.
05:46
So I have to ask you, I want you to take a moment and think:
05:53
Would you be comfortable in that kind of arrangement? Would you be comfortable in an arrangement where
05:59
you're working an investigation at work,
06:01
chasing some hacker or something?
06:04
And if you don't show proper discretion,
06:08
your safety could be put at risk?
06:12
Think about that a minute. How would you feel about something like that? And how would that affect and influence your behavior? Most CISOs, including myself,
06:19
have had their run-ins. Like, I've had death threats personally. I've testified in court against some pretty scary people who went to jail, and when they got out of jail would send me little snarky notes. Um,
06:33
so there is a reason why when you start doing the sort of job at a more senior level,
06:41
you need to look at this picture and recognize that your antecedents, your predecessors rather, operated in an environment
06:54
where in many cases there was great danger. I'll bet none of you have ever considered, even for a microsecond,
07:00
as IT security folks, you know, deploying, you know, endpoint security,
07:05
this idea that there could be personal risk. But I think it's an important metaphor. It's something you need to recognize, and something that no one will ever tell you unless they've been in the job, as I have. If you've done it
07:20
and you dealt with this issue, then you've felt kind of that fear and unrest that comes from this. Let me just share with you how they caught
07:30
John Gotti. Upstairs from,
07:32
it might have been the Ravenite or one of these clubs,
07:36
there was an old lady who lived upstairs.
07:39
Um, and the Mafia would be wandering around the streets down in Lower Manhattan,
07:46
and the AT&T
07:50
operatives, the security teams there,
07:54
and the FBI
07:56
would do surveillance, and they would see them kind of occasionally going into this
08:01
doorway.
08:03
They weren't quite sure why they were going in. There was nothing in there,
08:07
and they'd come out 15 or 20 minutes later
08:09
and they realized, after they did a lookup, that there was a little Italian lady who lived upstairs. And
08:18
what they did was they got a court-ordered wiretap on her phone,
08:18
and that's how they caught these guys with telephony, with wiretaps with a combination of
08:24
surveillance
08:26
and discretion,
08:28
an old fashioned sort of police work.
08:31
Uh, and that's how they caught these people.
08:33
And if you were doing security
08:35
in telecommunications back in the seventies, this is the kind of thing that you would likely be involved in.
08:41
Again, we don't see too much of this now. And if you were doing a similar sort of thing
08:46
with hackers on the Internet, I doubt that you'd be staring at a scary face like this guy, Sammy the Bull, worrying about your personal
08:54
safety.
08:56
But it was much more personal then. And if you go back and look at Kevin Poulsen's writings back in the 1990s when he was hacking,
09:05
um,
09:05
remember the hack, the radio program, you know, where he won a Porsche? What he did is he broke into the Pac Bell switch,
09:13
and redirected, I think I might have shared the story with you at one point, redirected the 800 number and made calls.
09:22
But what Kevin Poulsen was also doing back in that era, and the reason he understood the
09:28
Pac Bell system, is that he was climbing through the window into a Pac Bell building
09:33
and literally leaving notes for the security team on their desk. Now, admittedly, as a hacker, he's not Sammy the Bull, like this guy with a bunch of murders under his belt.
09:45
But I just want you to understand that when I say discretion is important here, I'm not telling you don't gossip, you know, behave well and keep secrets. Not that.
09:56
it's that when you're doing security, you're dealing with malicious activity.
10:01
You're dealing in some cases with very serious criminal activity, nation-state activity. These people mean business.
10:07
So if you're gonna do this as an executive,
10:11
you need to develop a keen understanding that there's a time when you need to learn to exercise considerable discretion.
10:20
Let's go to the next one.
10:28
Now, if you share a generation with me, these books, this Rainbow Series, will look very familiar. Um,
10:35
in my opinion, if there were
10:41
kind of foundational documents for our discipline, these are them. So in the 1970s, um, the federal government realized that computers were gonna be the way they kept secrets,
10:54
and
10:54
they hired MITRE. Um,
10:58
and I guess it's funded MITRE.
11:01
MITRE's an FFRDC, a federally funded research and development center for the United States.
11:05
They funded MITRE to go in and figure out how they could
11:09
protect documents and do confidentiality and protect secrets on computers.
11:15
And this whole Rainbow Series popped up, a bunch of different documents that laid out
11:20
guidelines for what you do. And the reason I think this is an important thing to take a look at is it was all about confidentiality. 100% about that. It was all about keeping secrets. How do we keep and maintain secrets in government
11:35
The obsession then was secrecy, confidentiality, prevention of the disclosure threat. That is our root. That's where we come from.
11:45
And, you know, it's funny the way this works. The root of most of these documents is the military security model, which is all about
11:54
clearances for people and classifications for documents.
11:58
Now, I don't think there's a person on this call right now who works outside of government,
12:05
who would consider, you know, their workplace to include some rich kind of structure around clearances, classifications, and so on. You tend to have more need-to-know. And then there are level orientations, like there might be
12:20
the CEO, and the CEO's direct reports have access to certain things, like earnings information in advance of a call, or, you know, certain types of
12:31
critical business documents that they would have access to that others wouldn't.
12:35
So there is some sort of a level hierarchy, and then there are the need-to-know categories that are kind of based on project. So if you work on this project, you see these documents; if you work on that project, you see others. We have some of that.
12:48
And what does this mean? It means that in all of these environments, government or commercial,
12:54
you have to learn to be quiet, right? You can't be working on a finance team
13:00
doing cybersecurity and blabbing about earnings. You could bring the whole corporation down, you know, and there could be legal proceedings if you went off and talked about something that you had access to. And as a security team, as the security executive, what you will learn
13:18
is you basically have access to everything.
13:20
So what does that mean? It means you'd better learn very quickly
13:24
how to compartmentalize and how to respect the level hierarchy that exists. And these documents are where most of that was invented. So at MITRE, they recognized that
13:35
when you had a secret clearance, you could read secret stuff,
13:37
but you couldn't read top secret stuff.
13:41
So they went, huh, you can't read up.
13:43
They coined this motto: no read up.
13:46
And then they noticed. You know, if I have secret information, I could put it in a secret safe,
13:50
but I can't put it in an unclassified safe, so I can't write down: no write down.
13:58
And they coined that motto and put it together into this model that guided the way they protected secrets in the military for about 25 years.
14:07
So again,
14:09
secrecy and protection of these confidential
14:13
documents and confidential data is so at the root of this job,
14:20
and it's something that is not easily laid out. How do you teach someone that when they go to a board meeting and they hear something that really ought not to be repeated,
14:30
they should not repeat it? Or maybe you do have to repeat it. That's why I call it discretion. I don't say don't ever talk about sensitive stuff. I say you have to exercise good, solid discretion, based on the situation, based on ethics, based on the corporate sort of ethos
14:50
in situations like this. So what that means is that, you know, where in the previous case I was sharing that
14:58
you wanna have a good sense of discretion because there, in the old days, your safety could very much be on the line, you could have,
15:07
you know, your life put at risk if you're not very quiet and careful about what you talk about or don't talk about, in this case with the Orange Book it's very different. You know, this is more about respecting
15:18
the information classifications in an organization and making absolutely certain that you're following a set of procedures that make good sense. Now, I've got a few questions here that I want you to ask yourself
15:33
in the context of
15:35
kind of your own way you go about things. So we'll go through these. With each question, I want you to just
15:41
take a moment and ask yourself
15:45
whether these are things that you do. So when you're
15:48
in a security role and you're passing along some sort of information, I don't mean silly information, like, you know, hey, did you hear Joe or Mary got promoted. That's not what I'm talking about. I mean something that's substantive. Like, for example,
16:03
that, you know, the data center outage
16:07
that happened last week may have been malicious
16:11
and that there could very well have been an externally initiated hack
16:17
that caused the data center to be down for two hours or something. Say someone on your team tells you that.
16:22
Well, when you decide that you're gonna go repeat that,
16:26
do you habitually ask yourself who really needs to know this?
16:33
Like, I don't want to share that data just to show my power.
16:37
This urge to sort of deal in information is one of the great criticisms we have when we share with government, particularly a problem in a political setting,
16:45
that a lot of times it's not, who really needs to know this information, but, who can I impress
16:51
by sharing this information? If you say, I'm gonna share this because the person who hears this will be impressed that I'm important enough to know it, then you are in the wrong role. That is not the way to operate as a chief information security officer, a CISO, in contrast to someone
17:07
who must always ask this question
17:10
before I share this,
17:11
does this person really need to know this?
17:15
So that's number one
17:18
Second:
17:19
do I need to write this down? It's kind of funny. We watched all this stuff with Robert Mueller and,
17:25
um, Jim Comey, two people that I've known for some time. When I left AT&T,
17:30
um, I started doing some board consulting,
17:34
and, um,
17:37
I went and gave a board talk, and the CISO who was there asked if I would share the stage with someone,
17:44
a former government guy who was gonna talk threat before I talked. I said, of course. It went pretty well, and we were gonna actually tag team and maybe do some more, but then he got busy, because the person was Robert Mueller, who was on the front end of that presentation. So we didn't continue or start a business together or anything. But at any rate,
18:02
there's this idea of, do I need a record of what's going on here? And I put "ask a lawyer," because a lot of times you would. But that's also a discretion kind of issue. I often
18:14
would have this habit of writing down things that I thought would later be something important.
18:22
So just take a moment and ask yourself, do you do that?
18:30
How about this question:
18:30
before you share something you're about to
18:34
share with some individual,
18:37
whether it's an email or, uh,
18:40
something you just verbalize, do you routinely ask yourself? Email is the most important
18:45
kind of medium for business communications, so here's an important one as you exercise discretion as a CISO:
18:52
when you go to tap something out in an email, do you ask yourself, would this be okay
18:57
if this were published in The New York Times? Yes or no? Now I think this is good practice for anyone
19:03
set aside being a CISO. But it is super important for security executives to ask themselves this question
19:11
every time, all the time that they're sharing information and we're gonna hear from our guest a little bit. But I have a feeling she'll probably
19:19
have a couple of things to say about this. She's been at this so long, I think she's probably seen people make this mistake
19:26
all the time, tapping out emails that really should never have been codified.
19:33
Have I respected the principle of least privilege in sharing?
19:37
So here's a case where there is a foundational design principle that's so useful and important for our industry
19:44
that you can apply to your human interactions. Least privilege means
19:48
maybe I've gotta share something with you, but I don't want to tell you everything I know. If, for example, in the data center case, let's say I suspect
20:00
there might have been something malicious in what caused the data center outage,
20:03
and I have to go talk to
20:07
one of my teams about deploying something to, say, support an incident response.
20:15
Well, maybe I have no choice but to tell them that we suspect it may have been malicious, but do I have to tell them who the rumor is it might have been, what the reason or the motivation was? Probably not.
20:27
In fact, that's probably the
20:30
most overused piece of information, particularly with boards, that I've ever seen. This idea of somebody asking, who did this? And then you spend a lot of time gabbing and,
20:40
you know, dropping names of potential hacking groups, and you may not really know. But you're doing that just to serve the board's curiosity and
20:48
sort of voyeurism, wanting to know, who would ever come after us, why are they coming after us? And you may have no idea. But the point is, this principle of least privilege, I think, is spectacularly important
21:00
as you go about building out
21:02
a personal program of how you share information with the team during cybersecurity incidents.
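Applied to incident handling, least privilege in sharing amounts to filtering what each audience sees down to what it needs in order to act. A hypothetical sketch, where every field name and role name is invented purely for illustration:

```python
# Illustrative need-to-know filter for incident details.
# All field and role names here are hypothetical, not from any real framework.
INCIDENT = {
    "summary": "Data center outage, possibly malicious",
    "action_needed": "Deploy monitoring agent to the affected segment",
    "suspected_actor": "unconfirmed rumor, do not circulate",
}

NEED_TO_KNOW = {
    "deployment_team": {"summary", "action_needed"},  # no attribution rumors
    "general_counsel": {"summary", "action_needed", "suspected_actor"},
}

def share_with(role: str) -> dict:
    """Return only the incident fields this role needs to do its job."""
    allowed = NEED_TO_KNOW.get(role, set())
    return {k: v for k, v in INCIDENT.items() if k in allowed}

# The team doing the deployment never sees the attribution rumor.
assert "suspected_actor" not in share_with("deployment_team")
```

The point of the sketch is the default: a role not in the table gets nothing, rather than everything.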
21:07
and then this is one I really do want you to think about. I'm even gonna pause here and let you think about this for a minute. I want you to reflect.
21:15
If I were to come up to your peers
21:18
and give them your name and say,
21:22
Is this somebody that I can trust with this piece of information? What would they say?
21:29
Would they say, absolutely, you know, she's amazing, you can tell her anything
21:34
and it's gonna stay there? Or would they go,
21:38
I don't know,
21:40
I'm not sure that's somebody I'd necessarily want to share this information with? Maybe in a prior engagement you went and shared with somebody something that you should not have shared, you know, something that they had no need to know,
21:52
but that you were sharing just in a boastful manner, or to
21:56
wave your tail feathers, saying, look what I know.
22:00
Um,
22:00
if you are that person,
22:03
then you really do need to think that through, because that is not a good attribute for any chief information security officer.
22:15
This, um, nice woman here, um,
22:19
is the wife of one of the most spectacularly important computer scientists who've ever lived that you've never heard of. And that's his picture there.
22:30
She's holding his picture. That's James Ellis.
22:33
And you may not know this, but Whit Diffie and Marty Hellman originally put together and reported on the basics of public key cryptography in the literature in 1976, their landmark paper on public key cryptography,
22:53
and then later Rivest, Shamir, and Adleman implemented much of it, codified it in their
23:00
algorithm for generating public and private keys,
23:04
and Diffie and Hellman later codified much of the protocol infrastructure
23:11
in the concept of Diffie-Hellman key exchange, where two entities that are
23:17
unknown to each other in advance, other than agreeing that they're using a similar scheme,
23:22
um, can share a secret without a third party. These are just amazing contributions by these people in the 1970s, and one of my great
23:33
sort of personal prides in this business is that I did get a chance to get to know Whit Diffie and spend some very, very quality time with him,
23:41
learning from him.
23:45
I brought him to spend time in my lab.
23:48
We interviewed him, he keynoted a conference, and he's just a wonderful, wonderful man.
23:53
Um, you know, we live in an era where there aren't too many heroes.
23:57
Whit Diffie was one of mine.
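The key exchange described a moment ago, two parties who have agreed only on the scheme ending up with a shared secret, can be sketched with toy-sized numbers. Real deployments use enormous primes and random private values; everything here is deliberately tiny and fixed, purely for illustration:

```python
# Toy Diffie-Hellman key exchange. The numbers are illustrative only;
# real parameters are thousands of bits, and private values are random.
p, g = 23, 5  # small public prime modulus and generator

a = 6   # Alice's private value
b = 15  # Bob's private value

A = pow(g, a, p)  # Alice sends g^a mod p over the open channel
B = pow(g, b, p)  # Bob sends g^b mod p over the open channel

# Each side combines its own private value with the other's public value.
shared_alice = pow(B, a, p)  # (g^b)^a mod p
shared_bob = pow(A, b, p)    # (g^a)^b mod p
assert shared_alice == shared_bob  # both arrive at g^(ab) mod p
```

An eavesdropper sees only p, g, A, and B; recovering the shared value from those is the discrete logarithm problem.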
24:02
But, um, after Whit Diffie published and was basically heralded, with Marty Hellman and others, as having introduced these amazing inventions,
24:14
it became known much later
24:17
that the man in the picture right there, James Ellis, had actually stumbled onto the basics of public key cryptography long before
24:25
Diffie and Hellman had done their work, as much as a decade earlier. Maybe less than a decade; I want to say maybe six years.
24:33
James Ellis had been working in various places, in the U.K. Post Office as a technologist and mathematician,
24:40
and then he went back to GCHQ, which is where he had originally been. When he came back to the job, his boss, Fred Williamson, had to find him something to do and said, why don't you go work on the
24:52
key exchange, you know, secret sharing, the issue of sharing secrets without a
24:59
trusted intermediary.
25:00
It would be sort of like
25:03
me asking you to go
25:06
solve whether NP-complete problems actually have a shortcut. It was a famous unsolved problem, almost kind of a sarcastic joke,
25:18
to go work on that problem. So James Ellis
25:21
had actually been reading some papers from Bell Laboratories from World War Two,
25:26
where an engineer we don't know who it waas had proposed that
25:30
if you introduce him lying noise over a phone wire
25:34
and you remember what the noise is,
25:37
then that noise can travel with the phone signal
25:41
and be subtracted off on the other end
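That masking idea can be sketched in a few lines: add noise you remember to the signal, send the sum, and subtract the noise at the far end. The sample values here are made up purely for illustration:

```python
import random

signal = [0.2, -0.5, 1.0, 0.3]                   # the "phone signal"
noise = [random.uniform(-1, 1) for _ in signal]  # noise the sender remembers

masked = [s + n for s, n in zip(signal, noise)]     # what travels the wire
recovered = [m - n for m, n in zip(masked, noise)]  # subtract at the far end

# The receiver who knows the noise gets the signal back exactly.
assert all(abs(r - s) < 1e-9 for r, s in zip(recovered, signal))
```

Anyone tapping the wire hears only the masked sum; only the party holding the noise can undo it, which is the asymmetry that caught Ellis's attention.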
25:44
And it seems like such a trite, obvious thing, but something about that
25:48
sort of caught James Ellis's fancy. And he read that paper over and over and over again
25:55
and came up with the concept of a public
25:57
and private key
26:00
where you encrypt with the public key, decrypt with the private key, or vice versa, and get all the beauty,
26:07
mathematical elegance that comes with that spectacularly wonderful idea.
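That encrypt-with-one-key, decrypt-with-the-other symmetry can be sketched with the classic textbook-sized RSA numbers, far too small to be secure and used here purely for illustration (the modular inverse via a negative exponent to pow needs Python 3.8+):

```python
# Toy RSA keypair. Real keys use primes hundreds of digits long.
p, q = 61, 53             # small primes, textbook example values
n = p * q                 # 3233, the public modulus
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent, coprime with phi
d = pow(e, -1, phi)       # private exponent: modular inverse of e mod phi

msg = 42
ciphertext = pow(msg, e, n)          # encrypt with the public key
assert pow(ciphertext, d, n) == msg  # decrypt with the private key

signature = pow(msg, d, n)           # vice versa: "sign" with the private key
assert pow(signature, e, n) == msg   # anyone can verify with the public key
```

The same keypair supports both directions: confidentiality one way, authenticity the other.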
26:12
And he showed it to his boss, a two-page paper. And if you're interested, it's a paper called
26:18
"Non-Secret Encryption" by James Ellis. If you Google that, you can read it. It's a great read. I make my grad students at NYU and Stevens read it every semester. Um,
26:33
But he wrote this paper, and it was
26:36
just spectacular. Everybody looked at it and said, oh my God, this is incredible. The problem was, there were two problems. One, it was classified; he couldn't talk about it. And two, computers in 1968 or '69 were like, they would get tired multiplying,
26:52
and he had these complex mathematical operations that would have had to have been implemented. He didn't know what they were. He wasn't the mathematician to come up with the scheme.
27:03
So some number of years later, Clifford Cocks was a young man who had just finished his mathematics
27:11
training at Oxford or Cambridge, essentially fresh,
27:15
joined GCHQ, and happened to be having tea with,
27:18
um, Ellis and/or Williamson. And they told him about the scheme.
27:23
And Clifford Cocks said, my God, I know just what to do there. It turned out he'd just been doing his graduate work in
27:32
prime number theory. Go figure. And we all know that the essence of much of this public key cryptography
27:40
is around the idea that when we take two prime numbers and multiply them together, it is a spectacularly difficult problem to recover those primes from the product. As far as anyone knows, the only way to solve it is to divide by two, divide by three, divide by five, to essentially do brute force.
28:00
Problems that live in that category,
28:07
um,
28:07
with no known shortcuts, are beautiful for cryptography, because the only shortcut is to be given one of the primes to divide by, which is essentially giving you one of the keys.
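That divide-by-two, divide-by-three search is easy to sketch. This toy trial-division factorer is not how serious factoring is attempted; it just shows the asymmetry being described: multiplying two primes is instant, while recovering them is a long march of divisions.

```python
def factor_by_trial_division(n: int):
    """Brute force: try dividing by 2, 3, 4, ... until a factor drops out."""
    d = 2
    while d * d <= n:       # only need to search up to sqrt(n)
        if n % d == 0:
            return d, n // d  # smallest prime factor and its cofactor
        d += 1
    return n, 1             # n itself is prime

p, q = 104729, 1299709      # two known primes, toy-sized for a quick run
product = p * q             # multiplying them is one machine instruction...
assert factor_by_trial_division(product) == (p, q)  # ...undoing it is a search
```

Even these toy primes force roughly a hundred thousand trial divisions; at real key sizes the search is hopeless, which is exactly what makes the scheme work.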
28:18
So it turned out that Clifford Cocks, and by the way, if you Google Clifford Cocks, it's better to Google James Ellis, because he does not spell his name C-O-X; he spells it a different way. And I've had my graduate students Google it with me.
28:33
I had the bursar call me up and say, would you please stop
28:38
asking your students to
28:41
Google that word. But Clifford Cocks was also part of GCHQ, so it was all classified. You get the point.
28:49
So these are guys who implemented basically the Diffie-Hellman and RSA public key cryptography years before
29:00
Diffie and Hellman reported on this.
29:02
But it was classified. So Diffie and Hellman reported their stuff, won a Turing Award, became world famous. Everyone and all the scholars heard of these folks. None of you had heard of James Ellis,
29:14
and
29:15
it probably was very tempting for James Ellis to go blab,
29:19
or Clifford Cocks. You know, why not go out and say, hey, listen, it was us?
29:23
And actually, there was one time when it was blabbed a little bit.
29:29
Bobby Inman was the director of the National Security Agency in 1979,
29:33
and Inman was speaking at the big computer conference of the time, which, ironically, I think was called the Computer Conference. We were very imaginative in naming conferences then.
29:45
And, um,
29:47
somebody asked him a question. Now keep in mind, it's 1979, so subtract back,
29:52
you know, 40 years,
29:55
four decades. I want you to go back. So for those of you under 40,
29:59
before you were born.
30:00
And we all think that we just now have all these controversies around government surveillance, whatever. Back in 1979, the hand goes up,
30:07
and somebody says, Director Inman,
30:10
will the work of Diffie and Hellman make it more difficult for NSA to spy on citizens? Take a moment. Think about that one.
30:22
So the question had been, you know,
30:25
will it be harder for NSA to spy on us?
30:29
So Inman,
30:32
he pauses, and he laughs, and he says, oh, well, public key cryptography shouldn't be a problem; we've known about it for well over 10 years.
30:41
And right there in the front row of that audience is Whit Diffie, going, 10 years? I just invented it four or five years ago.
30:48
So Diffie starts nosing around, asking around NSA and other places, the big community doing
30:56
crypto in the seventies. Eventually somebody says, James Ellis. So Diffie gets on a plane with his wife,
31:03
flies over to the UK, and he makes friends with James Ellis, who, by the way, never said a word about it. The closest thing Ellis ever said was something to the effect of, you did more with it than we ever would have.
31:22
That's what he said: you Americans did more with it than
31:26
we would. So that's the closest he ever came to an admission. But I've got to tell you, this dude right there, that's the patron saint
31:34
of showing discretion.
31:37
Never said a word, kept his mouth shut, because it was classified;
31:41
you didn't talk about classified information. As the years progressed, clean through the eighties
31:48
into the early nineties.
31:49
um, there was a lobbying effort at GCHQ, where a lot of people were saying, come on, for God's sake, you know, let's give these guys credit where credit's due,
31:56
and finally
32:00
in the mid-nineties, I don't know, somebody who was blocking the whole thing retired or died or quit or something,
32:07
And they made a big decision. All right, we're gonna finally give this guy credit.
32:10
Now, they were gonna do a website, right, mid-nineties, it was just brand new technology. What happens? James Ellis dies before he gets any credit.
32:20
So there's his widow holding his picture, obviously, many years later.
32:25
And I gotta tell you, I think, for CISOs, this guy is somebody that we should probably name an award after. Like, I've long thought that once a year we should give an award to the CISO
32:37
who exercises the best discretion in his or her dealings in our industry. And I would call it the James Ellis Award, because this is a guy who
32:46
knows how to keep his mouth shut. He probably would have been somebody that
32:52
John Gotti would have hired. Just kidding. Let's go to the next one. So here's the opposite end of the spectrum.
32:59
This is way over on the other end. And look, I am not a guy who criticizes Greenwald,
33:07
you know, and the sort of libertarians who believe that information always must be free. My dad is a libertarian.
33:16
He never saw anything that Ralph Nader ever did that he didn't love, and Glenn Greenwald comes from that, as does Assange and all these other guys.
33:25
Um,
33:27
These are people who are well-meaning. I don't think there's anything evil here. There are a lot of people who probably don't like Glenn Greenwald, and probably a lot of people who revere him. But as a CISO, you're going to have to determine
33:38
where you stand on all of this. And I'm just telling you that there are things that you can't be sharing with reporters.
33:45
You could be the most liberal-minded person on the planet, who could
33:51
have a T-shirt that says all information must be free, you know, information needs to be out there.
33:58
Um, secrets are bad. You can have that ethos. It's perfectly fine.
34:02
All were willing to admit that I have some tendencies in a direction myself, but when it comes to work,
34:08
you can't do it that way. You know, it's this idea of
34:13
a free press that makes, I think, the United States such a wonderful place to live.
34:20
But you can't be sharing with a free press things that are company secrets. Companies have the right
34:25
to secrecy, and everybody would agree with that, right? You have a fight with your spouse?
34:30
Does everybody have a right to know the details of your quarrel with your spouse? Of course not. It's utterly absurd to even suggest that.
34:39
So all of us, everyone on this call,
34:42
would agree
34:44
that there are cases where information
34:46
should reside with its owner and that nobody else has any right to it. But there are times when
34:52
you do have to go share. There's this idea, like the
34:59
Daniel Ellsberg case, you know, or all of these; you could argue Assange, you could argue Bradley Manning or whatever,
35:07
that, you know, these were whistleblowers, and that they were sharing information because there was something going on that's ethically inappropriate.
35:15
And I got to tell you as a security person,
35:19
man, you're gonna come up against these things all the time. I don't mean ethically inappropriate in the sense that you work for a company that has bad ethics; if you do, then you ought to quit,
35:31
but you will hit on situations occasionally where there's an investigation going on,
35:37
and as part of the investigation,
35:39
you might hear that
35:42
some sort of surveillance is gonna be directed at an individual by your team, and you'll have to decide whether that's appropriate. And God help you if you have no policies, and God help you if you have a poor relationship with your lawyers,
35:54
because that's where these kinds of subtle discussions must occur, and they must
36:00
be
36:01
predetermined. You don't want to be making this stuff up while it's going on. That's not a good thing,
36:07
but by the same token, there will be some determination that you're gonna have to make. So look, this is important. It's a good book; I think it's something you probably want to read.
36:16
Whether you like Glenn Greenwald or not, he is an important
36:20
person.
36:22
The whole Snowden sort of case is something that I think you should have an opinion about.
36:27
But what I would like you to accept, regardless of where you live on
36:30
the spectrum of handling information, is that in every organization, you gotta learn to keep some portion of the organizational information quiet. Period.
36:43
Now, this is something that a lot of you don't manage.
36:47
It's probably something the company calls Records and Information Management, and that's how you handle the records in the company: the documents, your info.
36:58
Um, I've been amazed in my consulting work at how many companies, large banks and others,
37:04
have somebody sitting in a cubicle, you know, tapping away in obscurity, seven levels down from the CISO,
37:12
not even in the CISO's organization, controlling the RIM policies. And you may not call it that; they may call it something else. But
37:20
I see that all the time.
37:25
And that is a really important policy, because it dictates things like,
37:30
um,
37:32
how long do you keep documents? What should people keep on their laptops?
37:37
I'll bet you there are people listening to this call right now
37:43
who have
37:44
so much stuff stored on their laptop that's unnecessary that it's not funny.
37:51
Like, I bet, if you're a manager, you might have old performance reviews of people not doing work in your company anymore,
37:57
because you know what you do: you snap an image of the last laptop and you store it as a subdirectory on the new one. And then you keep nesting that algorithm to the point where
38:07
you probably could go find documents that you had 11 years ago,
38:12
um, on your laptop.
38:14
And the question is, what's the policy for that? If it's there, it could be stolen.
38:20
There could be Social Security numbers in there, if you're in the US; there could be credentials in there, there could be passwords in there, or whatever.
38:28
So who manages that policy? It's a spectacularly important thing from a discretion perspective, and it's probably not well codified. If you made the determination that you're gonna ask 10,000 people
38:43
in your company to see if they could just get rid of 100 gig of stuff that they don't need,
38:47
um, that's a petabyte. And if you think 100 gig is too much, then 10 gig; everybody get rid of 10 gig of stuff,
38:54
you know, even one gig. Do the math: across a whole company, one gig each is terabytes. So
39:00
if you have 100,000 employees or a million employees,
39:02
you can do the math. So, along with the discretionary component here, there's this important issue of
39:10
saving, sharing, and managing: you know, how you handle data.
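That back-of-the-envelope storage math can be sketched in a few lines of Python; the headcounts and per-person figures here are just the illustrative numbers from the talk, not data from any real company:

```python
# Rough cleanup math: if every employee deletes a fixed amount of
# unneeded data, how much does the whole company free up?

def total_freed_gb(employees: int, gb_per_person: float) -> float:
    """Aggregate gigabytes freed across the workforce."""
    return employees * gb_per_person

# 10,000 people each clearing 100 GB frees a petabyte.
assert total_freed_gb(10_000, 100) == 1_000_000  # GB, i.e. 1 PB

# Even 1 GB per person adds up fast at large-company scale.
for headcount in (100_000, 1_000_000):
    tb = total_freed_gb(headcount, 1) / 1_000  # GB -> TB (decimal units)
    print(f"{headcount:>9,} employees x 1 GB each = {tb:,.0f} TB")
```

The point of the arithmetic is simply that even a modest per-person retention limit changes the company's exposure by orders of magnitude.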
39:15
Now, in the context of cybersecurity, inevitably information sharing is the topic that pops up. And I just wanted to use Dick Clarke's picture here, because I think he's the father of information sharing in a cyber context. I've known him for many years; he's a very good friend of mine. Um,
39:35
and a political guy, you know. So there are some people who hate him, people who love him, and people in the middle.
39:40
You know, I'm a person who respects him. I think he's got spectacular insight, and,
39:45
you know, whether you agree with his politics or not, who cares. But on cyber, he was one of the very few who
39:51
really dug in. It was almost like his punishment,
39:54
like, he was in the Bush administration after having worked for Clinton, and I remember
40:00
he got, quote unquote, sort of relegated to cyber from his post doing, quote unquote, real terrorism. I remember
40:09
Rush Limbaugh
40:10
saying one time on his radio program, talking about Dick Clarke; he was actually interviewing Dick Cheney,
40:16
and they made a joke about it.
40:21
One of them said something like, what about Dick Clarke?
40:22
And the answer was, oh, he's doing cyber. Isn't that, like, PC viruses? And they both had a good laugh about that.
40:30
And I remember being pretty annoyed, thinking, no, it's more than just PC viruses. And Dick dug into it. That's when I got to know him; he came and spent time with me, and I spent a lot of time taking him through everything I knew.
40:40
He went and visited all the tech companies and got to know them, did the math, took notes, listened,
40:49
absorbed it all, and came to the conclusion that information sharing was one of the most important things that we should be doing.
40:54
And if you go back and look at PDD-63, which he essentially drafted with his team, Paul Kurtz and others, it
41:04
is a spectacularly important document, and I believe it's the beginnings
41:08
of modern information sharing as we know it. So take a look. And I like this book, The Fifth Domain; it's kind of about cyber.
41:16
Um, he interviewed me for this, and I told him, a while ago, that I thought the next wave was going to be not cyber but information manipulation,
41:27
and I remember them sort of laughing. And then 2016 happened.
41:31
By the way, circle of trust: that's Robert De Niro from Meet the Parents, guys. That's really what it comes down to with information sharing. Now let's do a case study real quick, and then we'll get to my guest.
41:43
Um, in this case study, you see our hero. Emily is talking to her friend, who says that she had an issue: namely, the government had come to her and said that there was somebody working for her who was under FBI surveillance. Would she be willing
41:59
to share back some information about this co-worker? And she decides, in the case study, no, I'm not going to do that, because they said, we don't want you talking to your supervisor; just share it directly back to us.
42:12
And she said, I'm not comfortable with that. They said, fair enough, and they went off. And then after they went off,
42:16
this individual starts acting a little weird,
42:21
and she thinks this person has changed
42:23
work habits. Something's not right.
42:27
And now she's having second thoughts.
42:29
And in this case study,
42:30
I sort of ask each of you, what would you do? You know, and again, that's an important sort of concept here: when government comes to you, there's no playbook for what you do. You have to use your discretion. And what I want you to do, as you always do each week, is go back and dig into this little case study
42:50
and see what you think. You know, was the FBI being reasonable, asking her to deal with them and not talk to her supervisor?
42:58
Um,
42:59
What do you think of her original response, where she said no? But then, after this person starts coming in early and acting differently, is that enough to be suspicious? And what would be your advice to Karen? Should she
43:12
go back to the FBI and say, yes, I do see something unusual here, or not? Does she have that obligation?
43:17
So that's an important one for you as a group to spend some time on. I hope you do.
43:22
Finally, we get to the highlight of our discussion here. I want to introduce my good friend Jennifer Bayuk, who I've known forever
43:30
and who I think is one of the finest
43:36
in our business. Jennifer has been in the CISO role, and she's been in a variety of senior executive roles. She knows more about cyber risk management than any human being on the planet.
43:47
I think she knows as much about cybersecurity as anybody I know. Um,
43:52
so first, I wanna welcome Jen to our discussion. Jennifer, thanks for making some time here.
44:02
Jen's on Mute
44:05
Leif, can you hear me? Okay, I'm back. Okay, go ahead, Jennifer; welcome.
44:10
Thank you. Thank you.
44:12
That was a very interesting talk on effective
44:15
behaviors. It gave me a lot of ideas,
44:20
um,
44:22
looking back at my career, and good examples, uh,
44:27
you know, for each one of them, you know, just
44:30
really resonating.
44:31
What did you think? I mean, I know that's not the usual sort of discussion. Usually, when you and I are doing this sort of thing, we're talking about
44:39
risk or firewalls or compliance. And the essence of this course is kind of guiding the folks on the call, who are all, you know, people who do what we do but have aspirations to advance their careers.
44:52
Um, people do come to you, I know they do; I come to you for advice.
44:59
What do you generally tell people when they think, I might like to get promoted up into that big job? What's usually your reaction, other than first trying to talk them out of it? After that doesn't work, what's your
45:12
usual guidance to people interested in that job?
45:15
Well, um, I
45:22
ask them how many people
45:23
they really know who are really good engineers and would be willing to leave whatever they were doing and come work for them. And
45:34
they have to understand that there is a possibility, when people change seats, that the staff you're inheriting may have to be refreshed, and that if you don't have a Rolodex of people you can call for really important trusted positions, like head of identity and access management,
45:53
then you will be doing that job in addition to the CISO job.
45:58
But you really have to be careful that you either have all the skill sets that are needed to run the CISO function, and I'm sure, by
46:07
following this course, they'll pick up the list,
46:10
or you have to know where you can find them, if you don't have them yourself. It's kind of a brutal position, isn't it? You and I both have friends and know people who
46:21
have come and gone, sometimes for things that they deserve, a lot of times for things they don't. Does it seem, in your observation, that maybe it is a little bit more,
46:30
a little bit more of a tough environment than you might find with the typical IT executive?
46:37
Absolutely. It's one of those jobs, like the CIO's, where if everything's going smoothly, it's great, and if one thing goes wrong, then it's your fault and all hell is breaking loose. So, um,
46:50
back, uh, in the 1990s,
46:53
there was a statistic that CIOs changed jobs every 18 months.
46:58
Um, even now it's one of the
47:01
higher-turnover executive management jobs, and the CISO is down there in the two-year range now as well.
47:09
And for a variety of reasons. As you said, it's not always that things were going badly; it could be that there was so much opportunity that, if you are a really good CISO, you have mobility, you know, to live in the city that you want to, or work in the industry that you'd like to work in.
47:29
But a lot of the time
47:30
it's, uh,
47:31
because, as you said, the stress is really hard. You're responsible for everything, it's very hard to keep on top, and the grass is always greener somewhere else. Jen, take us through your own sort of personal journey. You know,
47:51
in your career, it's been
47:52
an interesting one,
47:55
and I think you've probably seen this position from as many vantage points as anybody I know. Take us along your journey a little bit, and some of the things that you've learned, and maybe some things that stick out along the way.
48:13
Well, I'm reminded of, uh, Dan Geer's remark on hybrid vigor, where
48:19
people came to the job of
48:22
computer security, back in the days when we still called it computer security, from a variety of fields, because there was no
48:29
master's degree in,
48:31
uh, computer security; there was barely a master's degree in computer science when I started. Luckily, you had such a great program at Stevens, where I was able to take a class in security,
48:44
uh, when I was in computer science. But
48:46
there were people from law that got involved in, uh,
48:52
computer security, people from biology, where Dan Geer came from, just because he was using the computers and understood the importance of it, people from,
49:01
uh, project management and, of course, from software systems. But they all brought their own
49:09
motivation and their own priorities, and they all started to read the Orange Book and started to develop a set of skills and practices that seemed to work for most people, which allowed more standards to be written and more community to gel.
49:27
I was
49:29
thinking of the first couple of National Computer Security Conferences, which came right after the National Computer Conference you're talking about;
49:38
they came along a couple of years after the computer ones, but the first ones were called National Computer Security.
49:45
And there, really, uh, the government and organizations like MITRE,
49:49
uh, and the universities, like Carnegie Mellon, which has been funded for the Computer Emergency Response Team,
49:57
were able to bring
49:59
these practices and groups together to
50:01
try to understand what was working and
50:06
share knowledge. So we went to a lot of those.
50:09
What happened somewhere around the turn of the century is that
50:15
so many vendors started to participate, and there are so many conferences that,
50:20
uh, have computer security as a theme, that the government decided they didn't need to do that fostering and shepherding anymore, and kind of took a side role.
50:30
And
50:31
now we're kind of just left with a lot of competing views. And I think it's gonna be another couple of decades before the job function of a CISO gets even as structured as the job function of the CIO is now, and that changes a lot as well.
50:50
You asked about my personal journey. I came from, uh,
50:53
well, uh, the same place as you did: I was in,
50:58
uh, expert systems and software development, and I went into security architecture because that's where the bigger problems were to be solved after AT&T broke up,
51:08
and
51:09
I became the CISO because I just happened to be the highest-ranking person in computer security after 9/11, when the Electronic Crimes Task
51:17
Force came to all the financial services companies and said, who is your chief information security officer? At the time, maybe two or three companies had one, but there was recognition that this was a high-level function that had to be addressed immediately because of terrorism.
51:34
You know, visibility from Dick Clarke really gave the position much more prominence.
51:39
So it's not that
51:40
companies decided, we need a chief information security officer. It came in from the outside, and I think we have to be very cognizant of that: a lot of times, if you have a first CISO being hired in a company, it's because they're being forced to create a position at a high level that they didn't think was necessary before.
52:00
That's another thing that makes it hard.
52:04
This was at Bear Stearns that that position
52:07
was created?
52:09
And where did you head after that?
52:15
Oh, um, when Bear Stearns died... As I always say, I didn't leave Bear Stearns; Bear Stearns left me.
52:22
I decided I didn't want to stay in a very big company. I wanted to see what was going on. So,
52:30
there were a lot of different voices in, um,
52:34
information security, I guess it was starting to be called then, and, you know, now it's all cybersecurity. But I see all that as a continuum.
52:42
And
52:43
I did consulting at big banks and, uh,
52:46
um, larger companies, pharmaceuticals and things. I went into academia and created, ah, a master's degree in systems security engineering for Stevens Institute of Technology. I did a lot of research, government research and my own research, and wrote books. And
53:05
generally, um, I just got to see what the spectrum of the, uh,
53:09
uh, field was,
53:12
so that I could make an informed choice coming back in.
53:15
When I did go back in, I went into enterprise risk management, because, uh, the chief risk officers were just starting to realize that they needed dedicated help in cybersecurity as well. So there is a path even beyond CISO, for those of you who are looking to become CISOs now.
53:31
Um, the
53:36
most critical aspect of the CISO's job, from the point of view of the C-suite today, is that risk management function. And
53:45
are we
53:45
provisioning the right defenses? We always know we're provisioning defenses, but are we really identifying the assets to be protected and understanding our risk profile, so that if we do get attacked, at least we're not surprised, because we knew that risk was there and there was a high probability that
54:05
something would occur, given our knowledge of our current state?
54:09
At least if you're not a surprised CISO, you can stay in a good position.
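The asset-first view Jen describes is often captured in a simple risk register. Here's a minimal sketch; the asset names, threats, and the 1-to-5 likelihood/impact scale are invented for illustration, not taken from the talk or from any particular framework:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    asset: str        # what we're protecting
    threat: str       # what could happen to it
    likelihood: int   # 1 (rare) .. 5 (expected)
    impact: int       # 1 (minor) .. 5 (severe)

    @property
    def score(self) -> int:
        # Classic qualitative scoring: likelihood x impact.
        return self.likelihood * self.impact

# A toy register: writing these down up front is what keeps the CISO
# from being surprised when an incident actually occurs.
register = [
    Risk("customer PII database", "exfiltration", likelihood=3, impact=5),
    Risk("build pipeline", "supply-chain tampering", likelihood=2, impact=4),
    Risk("employee laptops", "theft of stale local data", likelihood=4, impact=3),
]

# Review the highest-scoring risks first.
for r in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{r.score:>2}  {r.asset}: {r.threat}")
```

Even a toy register like this makes the point: the value is less in the arithmetic than in having enumerated the assets and threats before the incident happens.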
54:15
So, um, from risk management, which I, uh,
54:19
spent another
54:21
four or five years doing,
54:22
I came back into consulting and started my own
54:25
company, called FrameCyber, that has risk management software that risk managers can use,
54:32
and I still am consulting with that software.
54:37
I actually ended up using the tools and techniques from my consulting practice to build software.
54:45
And,
54:45
I think, um,
54:49
having gone through that stage, you know, being in IT, and then being in a position of responsibility in security, doing a lot of research, coming back through
55:00
to consulting again (I do a lot of expert witness consulting as well), I really get to see things from a bird's-eye view, and how
55:08
tools and techniques and practices are being judged in the eyes of the law. Um, I would say that
55:17
you know,
55:20
all of those experiences,
55:22
brought back into any kind of
55:25
cybersecurity management role, are helpful.
55:30
Jen, is risk management something that you
55:36
just sit down and learn? Or is it something that requires a period of apprenticeship? I've heard you many times talk about the importance of this; it's very inspiring to me, because you're so capable in this area.
55:49
It always kind of gives me a feeling of inadequacy, because I
55:52
kind of feel like I only know some of the basics.
55:55
What advice would you have for somebody who says, yes, Jen, risk management is spectacularly important? What should they go do? I know you organize a conference, for example, and I know you've written a number of books and perhaps, um, some courses you'd recommend. What would be your guidance for someone who buys into the idea
56:15
but doesn't know exactly
56:15
where to get started?
56:20
I would say,
56:21
first, read a couple of very important books, like Against the Gods: The Remarkable Story of Risk, which talks about how risk
56:30
is managed by the risk management community that really grew up on the finance side of the house,
56:37
and all of the principles and practices that we use in cybersecurity are adapted from a discipline that they call operational risk.
56:47
So if you, you know, take a 101 course in operational risk, you'll have the background that you need to approach risk management in cybersecurity.
56:58
There are a couple of conferences that are dedicated to cybersecurity risk management issues,
57:02
Metricon and SIRAcon,
57:05
um, which are
57:07
very,
57:09
I'd just say,
57:12
unusual, in that it's a very small niche field; as you said, not a lot of people know a lot about it,
57:16
yet there are so many opinions on what the right way to do it is. So when you go to one of these conferences, you will see some company get up there, present their cybersecurity risk management program, and say, this is working for us, everybody should do it this way, and there will be 20 different opinions on what that is.
57:35
So remember, if you're going into risk management, research is part of your job:
57:38
not only historical research, but
57:42
finding out how other people are adapting
57:45
risk management principles
57:49
to security, because you're really
57:52
making it up at this stage as you go along. Every company is different, every set of risks is different, your information classification will be different. As you pointed out earlier, information classification is key,
58:05
and there is not a standard way to do that. So,
58:09
um, it is, uh, it's, um,
58:12
I find it very interesting and challenging, and that's why I'm spending a lot of time in that area right now. I think it's the greenfield
58:21
area of cybersecurity.
58:22
The platform that you've built, FrameCyber: take a moment to share what it does, and maybe give people a little information about where they can get in touch with you,
58:32
and perhaps your website, if they might want to learn more about it.
58:37
Sure. FrameCyber is one word, the word 'frame' and the word 'cyber'; it's taken from the word 'framework,'
58:44
and framecyber.com is where you can see more about it.
58:49
What I've done with FrameCyber is I've taken the major information sources that people need for cybersecurity risk management and put them all in one
58:59
easily assimilated,
59:00
um,
59:01
piece of software,
59:04
so that you don't have to go to eight different systems to go looking for your events, looking for your issues and your metrics and the things that people need when they're going to make a risk management decision.
59:15
A lot of companies have bits and pieces of these models in other systems; they may have, ah, a GRC system in Archer, and they may have an issue management system in their environment. But when they do risk management, they have to get data feeds from a bunch of different places.
59:34
So I use FrameCyber not just for my consulting, and for sharing with other consultants and companies, but for teaching,
59:40
um, at the graduate level in cybersecurity, so people can really understand and see at a glance
59:46
what you need in front of you if you're going to make a cybersecurity risk management decision.
59:52
That's it, in a nutshell.
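The aggregation idea Jen describes (pulling the events, issues, and metrics a risk decision needs out of several separate systems and into one view) can be sketched like this; the system contents, asset names, and field names are invented examples, not FrameCyber's actual data model:

```python
# Toy stand-ins for three separate systems a risk manager would
# otherwise have to query one by one (e.g. SIEM, issue tracker,
# metrics store). All records here are fabricated examples.
events  = [{"asset": "payments-api", "event": "failed-login spike"}]
issues  = [{"asset": "payments-api", "issue": "unpatched TLS library"}]
metrics = [{"asset": "payments-api", "metric": "mean patch age (days)", "value": 47}]

def risk_view(asset: str) -> dict:
    """Collect everything known about one asset from each feed."""
    return {
        "asset": asset,
        "events":  [e for e in events if e["asset"] == asset],
        "issues":  [i for i in issues if i["asset"] == asset],
        "metrics": [m for m in metrics if m["asset"] == asset],
    }

view = risk_view("payments-api")
print(f"{view['asset']}: {len(view['events'])} event(s), "
      f"{len(view['issues'])} issue(s), {len(view['metrics'])} metric(s)")
```

The design point is the join key: once every feed is tagged with the asset it concerns, one lookup puts the whole picture in front of the decision-maker instead of eight separate queries.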
59:54
Can I assume that your website would be the right place to go if they have some interest?
60:00
Sure, there's a contact field, and, you know, I'm happy to answer email. You can get to me at jen@framecyber.com. Well, that is wonderful. Jen, I want to thank you for spending some time and sharing your wisdom with the team. Um,
60:17
your, uh,
60:20
your career has been just exemplary, and I appreciate you sharing with the folks here and those beyond the call. I think we have our next lecture on Thursday.
60:30
So, uh, look forward to, uh, you know, an hour with you then. And
60:35
Leif and Jen, I'll go ahead and close it out then. So thanks, everyone; we'll see you on the next call.

CISO Competency - Discretion

This is the fifth course in Ed Amoroso's Twelve Competencies of the Effective CISO, which focuses on the CISO competency of Discretion and Trust. The CISO must be able to exhibit and manage high levels of discretion and trust in dealing with sensitive information regarding threats, investigations, and ongoing initiatives.

Instructed By

Ed Amoroso
CEO, CSO, CISO of TAG Cyber
Instructor