Enterprise Security Leadership: Understanding Supply Chain Security

Video Activity

Time
58 minutes
Difficulty
Advanced
CEU/CPE
1
Video Transcription
00:00
This course is powered by Cybrary for Teams. Security leaders encounter new workforce challenges daily. Cybrary for Teams helps organizations build a cybersecurity-enabled workforce to tackle new challenges, handle security incidents, and prevent data breaches. If you'd like to learn more and see how other security leaders like yourself
00:19
are utilizing Cybrary for Teams,
00:21
you can schedule a free demo at the link below, or search Business in the navigation bar.
00:27
Okay, thanks. Um,
00:29
so yeah, as Tatiana suggests, we're gonna do some supply chain today. Now, I'm gonna take an angle here that's different from what you usually get with supply chain issues.
00:42
And I want to preface a few things here. First off, this is a leadership course. So one of the things that I hope you've sort of gathered from our sessions (some of you went through a little journey earlier in the year where we did six, and this is our third session together)
01:02
is that I really would like you to have an opinion about
01:03
things. It's one thing,
01:06
when you're an individual contributor, to kind of keep your mouth shut about opinions,
01:11
but at some point, when you're a leader, you need to
01:15
have some beliefs.
01:18
And there's a problem
01:21
when your beliefs conflict with the way the corporation may want you to behave.
01:27
So, for example, you might be somebody who just doesn't like smoking. If you work for Philip Morris, that wouldn't be so good; if you love smoking, it's a good place to work, you know? And I'm not making a value judgment. I'm saying there are things you're going to believe in
01:45
that have to come up. If you work for a defense firm
01:48
and you're an extreme pacifist, then you've got an issue, and the reverse is also true.
01:53
So
01:56
I'm gonna make some comments here, when we get through the supply chain material, that I'm pretty sure are things you may not agree with. I don't think they will rise to the level of offensive.
02:09
But it's opinion, in particular about, say, the U.S. and China.
02:15
I'm not asking you to adopt my belief here. I'm merely trying to
02:23
get you to have one. Okay, so I'm gonna take you through the issues. I'll show you some things that are just flat-out computer science facts, and then we'll talk a little bit about how that translates to policy.
02:37
As you move up in the management chain
02:39
in cybersecurity, you're going to find yourself exposed to policy and, sadly, to some politics. I've spent more time in Washington than I ever would have expected when I first started doing
02:53
what was then called computer security. I never would have believed it. And I've had to adopt some beliefs. So
03:00
so that's my preface here. I'm not trying to sway you one way or another, but I hope you have an opinion. I would rather deal with someone who has a well-formed opinion that I vehemently disagree with
03:10
than someone who has no opinion. Don't be that person. Um,
03:15
collect the facts, process what you think is right,
03:20
weave in your personal beliefs, and then have an opinion on things. And for me, the sort of executive summary
03:30
is that I think most of the supply chain policy that we have in most countries is wrong, because it doesn't address the core issue. There's an old joke
03:39
that I remember. My dad likes this joke, where
03:44
this guy is looking for his money on
03:47
Eighth Avenue and 45th Street,
03:50
and somebody walks by and sees the guy looking around. "What's the matter?" "I lost my money." The person says, "Oh, here, I'll help you,"
03:57
gets down, and starts looking. And he says, "Where exactly do you think you lost it? Which part?" And the guy goes, "Oh, I actually was four blocks that way when I lost my money."
04:06
He says, "Well, why in the world would you be looking for it here?" And he says, "Because the light's better here."
04:12
And that's how I feel much of supply chain security is being done.
04:16
There are things we can say
04:19
that are suggestive of managing risk.
04:23
And as I hope to show you,
04:25
I think a lot of it is misplaced. Not that there's no risk in the areas people look at, but you guys are all experienced cybersecurity experts and executives, and you know that if there's a big issue and then a little issue,
04:39
maybe you're forced to work the little issue. But if someone asks you, here's a big problem and a little problem, what should we do? I hope you go after the big problem, right? I mean, I know that there are scenarios where you can't always do that.
04:49
And I think much of what we talk about in supply chain, particularly on a national level,
04:55
just from the perspective of computing and hacking, of risk and offense and defense, doesn't look at the right things most of the time, because there's a lack of understanding. I'll try and help you with that. I think for a lot of you this will be,
05:10
um, a lot of tactics you already know.
05:14
I'm gonna walk you through some very basic source code. You'll have no problem; for some of you it's old hat.
05:19
But let's go through it. I'll take you through some things. Now, by the way, this picture here, Mr. Sasser, is kind of funny. Back when I was running the program at AT&T, they were doing a TV commercial for their security,
05:33
and they had this big Trojan horse. And actually, I saw the thing; it was in my office for a little while. Very small,
05:41
maybe about 3 feet high, a little thing made out of wood. Um, but they were doing the commercial; they had somebody checking a Trojan horse in at a business at the front desk,
05:51
and I was watching. And I said, hey, why don't you put "Mr. Sasser" on it? And the Sasser worm was timely, 2003. And the producers, the people from one of these big advertising agencies they were using, they thought
06:09
they actually kind of liked that name; it has a nice ring to it.
06:12
So this was like my Easter egg in a TV commercial running on all these TV stations, and I think there might have been me and three other people on the planet who got the inside joke here:
06:24
that "Sasser" was the Sasser worm. It was the beginning and the end of my career writing ad copy for TV commercials. Pretty funny.
06:33
Now, first off, I think you already know this. This is from my friends over at Momentum Partners. I love their work; I participate in what they do.
06:43
um
06:44
You know, and I do this for a living; keeping track of all these vendors is massive, right? So the first issue we all know is that when you're doing supply chain, just the practical issue of who you're getting your security tools from, and everything else, is massive. And these are just security tools.
07:03
But you know, all your IT infrastructure, your operating systems,
07:10
endpoint tools, the applications:
07:13
it's massive, right? So the idea
07:15
that you could somehow keep track of all this is really quite daunting.
07:21
Doesn't mean you shouldn't try. So I think for teams that try to do portfolio management,
07:28
that's noble work, and I don't think you should give up.
07:31
So when you see this, this is not a roadmap to give up. It's just a roadmap to recommend that perhaps you and others might be trying this work with not enough labor, so you'd better get a consultant to help you, because you really do have to be able to make sense of this. You can't
07:50
just sort of pass it off or outsource the decision-making.
07:56
So the dimension of complexity and the size of the supplier base, we're not going to really spend time on here. But I just want to acknowledge that even if there was a perfect means for establishing the integrity of a given system or application or tool,
08:13
maybe you could push it through this little car wash and the thing turns green if it's fine, red if there's some Trojan.
08:20
God only knows there's, you know, a million reasons why that would be useful, but it's almost impossible, as I'll show you.
08:28
Um, even if you had that, it's very daunting to keep track of all these different vendors. When you walk into RSA and you see the two expo halls, you just sigh and think, how are we going to keep track of all that? That is a problem. But it's not really the problem we're gonna be addressing in the next hour.
08:46
Here is the problem, and I think it makes sense
08:52
to take you through
08:54
what I think was a finding by Ken Thompson, my old...
09:01
I hesitate to call him my colleague, because he was like the most famous Bell Labs engineer in the world. I was a little pup sitting over in the corner who used to just find excuses to walk down that hallway, hoping a droplet of Ken Thompson's breath would hit me
09:18
and make me smarter. There's a guy who helped invent Unix: Ritchie and Thompson and a bunch of others, Brian Kernighan. The greatest moment in my entire career was Brian Kernighan saying, "Ed, that's a good idea." That's it. That was like...
09:31
when I walked out of the office, I was this high off the ground, no longer touching the ground as I walked. So anyway, here's what Ken Thompson kind of taught us. He taught that you can put a Trojan in software and you'll never find it.
09:45
And it was embodied in his Turing Award lecture, which is "Reflections on Trusting Trust." If you have not read it, you should; it's a really wonderful paper. I make all my graduate students read it every year; it's the first paper they read in a course I teach. But here's what he showed. He said, look, to make
10:03
a malicious insertion invisible... Let's say you're buying from some company
10:07
that comes from Elbonia. That's the Dilbert country, the made-up country Scott Adams created, so it's safe to use; I'm not offending anybody by using Elbonia as the example, but everybody knows it's the U.S. and China who are fighting about this stuff. But let's say it's Elbonia.
10:22
So most code has a little login snippet that looks like this, right? You print "get your name," and then there's a function that gets the input and assigns it to name.
10:33
Then there will be some sort of "enter your password" thing; again, a little routine that grabs whatever you've typed to standard input. And then a function, okay, that checks the name and password; if it's okay, you're in. It's like the minimal, economical view of what a login
10:54
program does. Every application, every operating system, every device will have this little piece of
10:58
code somewhere, in some form. You recognize this? So let's look at how we can goof around with this. The first thing we're just gonna, you know, point out, for those of you who are developers: this is source code, right? So if your kids, or if you,
11:15
like, programming, one of these things... Python's my favorite language, the language I taught my
11:20
son when he was in eighth grade. He and I wrote a book called From Gates to Apps that, maybe, if you have teenagers, you may want to download from Amazon; it's a pretty good book. It takes you through the basics of
11:31
computing, from the hardware on up. But source code is what it's written in, and you know that it goes through this thing called a compiler. I mean, maybe
11:43
some people may not know that, I don't know. Some managers may not have come up through the computing ranks and don't realize source code goes through a compiler.
11:54
What comes out is object code. And for people like me, I would say this is clean source, clean compiler, clean object, meaning I'm presuming there's no malicious insertion, there's no Trojan horse. We tend to use the phrase Trojan when it's a malicious insertion,
12:13
and we would call it
12:13
an Easter egg if it's some little fun thing that you put in there, like a little surprise: you click on this, drag that, and then your program turns into, like, a pinball machine. Microsoft Word and Excel had these some number of years ago, where you could do these things in a
12:31
click: you type the word blue, turn it blue, bold it, click on it, drag it down, click on the next something, and then your Microsoft Word was a pinball machine. That's a Trojan horse, but you call it an Easter egg because presumably it didn't have some malicious point. But when none of that's there, we say it's all clean. And again, just completing the picture:
12:52
your object code goes through an assembler and somewhere becomes machine code,
12:54
and it all looks like this. Here's the Python code, maybe, that you wrote,
13:01
um, here is the object code,
13:03
and here's the machine code. So this is the way it's supposed to work. And when you buy something from a company,
13:09
you ask them about this and say, hey, you know, what's the deal? And one thing you might do is something called a binary equivalence test. So with Huawei, the British government
13:22
has an evaluation center, called something like the Huawei Cyber Security Evaluation Centre. Very impressive
13:28
that the UK government has been
13:31
basically playing man coverage, running point on Huawei, for almost 10 years. And last year they published a report. I couldn't believe it; I think three people in the world read it. I read the whole thing. I was fascinated by what they had found,
13:48
and one of the things they found is that Huawei had trouble passing binary equivalence testing. Here's what that means.
13:52
You start with the source code
13:56
and you build out to machine code, and you can actually run simple things like checksums or hashes on the output.
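A minimal sketch of such a binary equivalence check, assuming you already have the two independently built binaries in hand as bytes (the function names are illustrative, not any evaluation center's actual tooling):

```python
# Sketch of a binary equivalence test: hash two independently built
# binaries; if the builds are reproducible, the hashes must match.
import hashlib

def sha256_of(data: bytes) -> str:
    # Compute a SHA-256 digest of the binary's bytes.
    return hashlib.sha256(data).hexdigest()

def binaries_equivalent(build_a: bytes, build_b: bytes) -> bool:
    # build_a / build_b stand for the machine-code files produced from
    # the same source in two separate build settings.
    return sha256_of(build_a) == sha256_of(build_b)
```

In practice this only works if the build is reproducible (pinned compiler versions, fixed timestamps and paths), which is exactly the kind of thing that can make the test fail for benign reasons as well as malicious ones.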
14:03
And then in a separate setting, you've got source code again,
14:09
and you run it, presumably to produce the same machine code, because the binaries should be equivalent. And for whatever reason, they couldn't seem to pass that. I didn't really know
14:20
the answer. By the way, Jared is asking: it's "Reflections on Trusting Trust," for those looking for it. It's really easy to find. Ken Thompson
14:31
is the person here. I know there are questions; I'm trying to lecture and also follow the chat
14:37
and the questions. So, and the author is Ken Thompson,
14:43
so hopefully that will be something you look at; it explains all of this. But the binary equivalence really should hold, because if this is working
14:50
the way it's supposed to... now, obviously the compiler could be different, the object code different; all kinds of things could be different. The essence of computing is translation. When people ask you what this computing thing is all about: it really is about taking abstract concepts down to the lowest possible level,
15:07
sort of computing interactions at the physical level, to cause things to happen,
15:11
you know? So we go from these broad system designs to things that happen in silicon. It's maybe the most spectacular scale challenge that I think exists, even beyond some of the things you see in physics. And in computing we have this very wide range of translation. Now
15:30
let's look at how you can goof around with this. Here's what Ken Thompson showed us.
15:33
He said, let's say you're a bad developer; you're, you know, untrustworthy,
15:39
and you work in DevOps or something. And here's what the code is supposed to look like, and what you do instead is you put this thing in here. So here's what we had: if okay(name, password), then permit. You see that? Now watch what we're gonna put in.
15:56
I'm gonna put: if okay(name, password), or
15:58
if the password is just Cybrary_123, then permit.
16:03
So we've changed the source code. The conditional now has a second part; it's an or. So there are two ways you can get into this system. One is, you know, a good, valid
16:15
identification and proper authentication, in this case one factor, a password, but it could be whatever. Or,
16:22
if you just know the secret password, which is Cybrary_123. Doesn't matter what your name is; you could put Mickey Mouse or whatever. Doesn't matter.
16:32
All that matters is that the password is Cybrary_123, capital C.
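That trapdoor can be sketched in Python like this (the credential store and function names are illustrative, not the slide's exact code; the hardcoded password in the or-clause is the point):

```python
# Clean login check plus a trapdoor: the "or" clause is the backdoor.
VALID = {"alice": "hunter2"}  # hypothetical credential store

def ok(name, password):
    # The legitimate identification-and-authentication check.
    return VALID.get(name) == password

def login_check(name, password):
    # Trapdoor: ANY name gets in if the password is "Cybrary_123".
    if ok(name, password) or password == "Cybrary_123":
        return "permit"
    return "deny"
```

Note that the backdoor never touches the credential store at all, which is why no amount of auditing the user database would reveal it.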
16:37
So, what would you call this? You'd call this a trapdoor, right? And if you're a developer, you do this stuff all the time, right? If you're writing code, you're constantly logging in and logging in, like, oh man, so you put this in to save yourself trouble, with the belief that you're gonna eventually take it out.
16:53
It's a terrible idea. The reason being, it gives a very comfortable excuse should you be caught doing this. You can always say: oh my gosh, I can't believe I left that thing in there. If caught by someone,
17:08
"I put that in as a shorthand during my development. Gosh, I'm so sorry."
17:14
Welcome to perfect impunity, right? You're not going to be
17:18
blamed. But you can see that if I do this,
17:22
I might get caught, right? I mean, it's a source code change. Like, look,
17:27
this thing, this back door, it's right there. And how would I describe it? It's dirty source code. And yeah, a clean compiler will produce dirty object code, I get that. But look, human review of the source code, or somebody scanning it,
17:45
or, I guess, testing... testing is not going to really catch this. Like, would you ever,
17:48
unless by some spectacular accident,
17:52
type Cybrary_123? Now, if you were a developer
17:57
that was doing this just as a shortcut,
18:03
you might put the word password as the password.
18:06
That way, code testing would pick it up, because it's one of the first things you check for:
18:11
password, you know, ABC123. That way the test would pick it up. But scanning would pick up that something's weird here,
18:18
because a good code scanning tool will look at the branch structure and the logical structure and suggest some changes; it might find it. But certainly human review: anybody looking at this is gonna go, you know, there's a problem.
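A toy illustration of the kind of branch-structure check a scanner might apply, deliberately simplistic and not any real tool's rule: flag conditionals where an "or" compares a variable against a hardcoded string literal.

```python
# Toy source scan: flag "or <var> == '<literal>'" in branch conditions,
# the kind of suspicious branch structure a real scanner keys on.
import re

SUSPICIOUS = re.compile(r'\bor\s+\w+\s*==\s*["\'][^"\']+["\']')

def scan(source: str) -> list[int]:
    # Return 1-based line numbers whose branch logic compares a
    # variable to a hardcoded literal after an "or".
    return [i for i, line in enumerate(source.splitlines(), 1)
            if SUSPICIOUS.search(line)]
```

A real static analyzer does far more (dataflow, taint tracking), but even this crude pattern would flag the trapdoor line, which is the speaker's point about scanning and human review.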
18:33
And that's why governments have said, well, we need to go in and review the source code. And companies like Huawei and others, including the American company Cisco,
18:41
say: here, here's our source code.
18:42
You could look at it
18:44
and you can look and see if we put something like this in here. And legislators think that sounds great, and managers think it sounds great, and executives think it sounds great. I've been sitting behind a microphone in Washington where people have said
18:57
that very thing, and a bunch of senators and congressmen say: wow, that's wonderful. My gosh, they're opening up the covers, letting anybody see their source code.
19:06
Well, let me show you why that's fundamentally flawed, and really indicative of someone who just does not understand computing.
19:12
Let's talk about this compiler. Let me show you what the compiler does, and I can show you how it will make that source code go away. Remember, we had dirty source code, clean compiler, dirty object code. You do the review on the dirty source code; you get it. Well, what if we did this? Here's how a compiler works, at the highest level.
19:32
It really is
19:33
constantly churning through your source code to translate it. As I said, translation is the essence of computing. So there's this first lexical analysis phase, embodied by this "get line of code" in a repeat loop:
19:49
grab the line of code. And then the code generation phase in the compiler, call it "translate": translate the line of code. So I scoop up some of the code, I translate it; I scoop up some more of the code, I translate it. It goes on and on until you're done.
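That cartoon loop might be sketched like this (the "assembly" strings are placeholders, not real compiler output):

```python
# Cartoon view of a compiler: repeatedly grab a line of source and
# translate it until the source is exhausted. Real compilers build a
# structure and traverse it; this is the lecture's simplified loop.
def translate(line: str) -> str:
    # Stand-in code generation: emit one fake assembly line per source line.
    return f"; asm for: {line.strip()}"

def compile_source(source: str) -> list[str]:
    output = []
    for line in source.splitlines():    # lexical phase: get line of code
        output.append(translate(line))  # code generation: translate it
    return output
```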
20:06
That's like a cartoonish
20:08
view of how a compiler works, but you get the idea. And frankly, having worked on compilers before, this is kind of right: there's this big lexical analysis phase where we build a structure based on the code, and then you traverse the structure, generating
20:26
the assembly language,
20:29
in code generation. It kind of does work this way, but not so much as a loop; that may be more how an assembler works. But nevertheless. So let's give ourselves a little room here on the screen to work.
20:40
Yeah, here's what I'd like to do:
20:42
I'm gonna have it so that every time I grab a line of code,
20:48
I'm going to check to see if, by some miracle,
20:52
it's this line of code. See this one here? Right here: if okay(name, password), then permit.
20:59
Uh, you'll be compiling all kinds of stuff. But if you ever happen
21:04
to be translating this line,
21:07
then here's what I'd like the software to do:
21:11
if I see that the line of code is this thing I just showed you,
21:15
Then I don't want you to translate that.
21:18
I want you to translate this thing. I want you to put into the assembly code
21:25
the Trojan. You follow? I've programmed the compiler to look for something in a piece of source code that's written in a way that I know
21:34
will exist, because I wrote it.
21:37
And then in the compiler, which I also wrote,
21:40
I'm going to be looking for that.
21:41
and when I see it, I translate the Trojan.
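The substitution trick can be sketched as a toy in Python (the target line, the Trojan output, and the fake assembly strings are all illustrative stand-ins, not Thompson's actual construction, which also hides the trick from the compiler's own source):

```python
# Sketch of the Thompson trick: a "dirty" compiler that recognizes one
# specific line of clean source and emits Trojaned output for it.
TARGET = "if ok(name, password): permit()"
TROJAN = "; asm: permit if ok(name, password) OR password == 'Cybrary_123'"

def translate(line: str) -> str:
    # Normal stand-in code generation for every other line.
    return f"; asm for: {line.strip()}"

def dirty_compile(source: str) -> list[str]:
    output = []
    for line in source.splitlines():
        if line.strip() == TARGET:      # the line the attacker knows will exist
            output.append(TROJAN)       # translate the Trojan instead
        else:
            output.append(translate(line))
    return output
```

The source a reviewer sees contains only the clean TARGET line; the backdoor lives in the compiler and appears only in its output.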
21:47
So what that means is: this is clean source code. It's the compiler now that's dirty.
21:55
Do you follow? The compiler looks for good, translates to bad. So what we have now is clean source code, dirty compiler, dirty object code. The assembler could be clean; its output is gonna be dirty. The machine code is gonna be Trojaned. There are Trojans in the object code here.
22:11
Now, if I come along and I say, hey, let's do human review of the source code, what am I gonna find? I'm gonna find absolutely nothing.
22:18
It is not there. It's not there: it's the compiler
22:23
that's generating the Trojan horse.
22:26
Do you follow? This is kind of profound, right? And by the way, you can move this down into the assembler. You can move it down into the machine. And if you read Ken Thompson's paper,
22:41
there are some very interesting kinds of
22:45
tricks that you can do with Unix
22:48
to really make the thing disappear, even in the compiler, meaning you don't have to push it down; I can make it go away. It's really one of my favorite papers, Ken Thompson's. But even if you don't read the paper, this is the essence of it. But now, let's say you're a legislator,
23:07
and somebody comes to you and says: okay, the way to detect whether or not there are Trojans
23:15
in this Acme manufacturing code from Al Bonaya
23:21
is I'm gonna have our crack engineers go in and review the source code. Pour through it carefully,
23:26
reading every line of code, looking to see if the's al bony in engineers
23:32
have placed some sort of a Trojan horse surreptitiously
23:36
to do, you know, their dirty work.
23:38
And that sounds great. It sounds great in a hearing room in Washington. Wow, what could be better? I'm lifting the hood up to take a look.
23:47
But you can see this is a bunch of BS, unless you're looking at the compiler, or unless you're looking at the assembler, looking at the entire build chain, or even looking at the hardware.
23:56
It's not gonna work. Now, in a minute we're going to talk about what
24:00
nation states would do here.
24:03
But I want you to recognize that when policy is set based on things like this, it's wrong.
24:11
Okay? It doesn't matter whether you're for, against, or neutral on the way the U.S. is handling things like Huawei.
24:21
But you should know,
24:22
as an expert
24:25
that if the implementation
24:27
involves source code review, it doesn't work,
24:30
because it's trivial
24:33
for a capable development team to sidestep it.
24:37
So let's get to... again, you could say, I could make the assembler dirty too.
24:42
Let's get to what it is that governments are so worried about. Okay, so there's this tremendous fear
24:52
that if I were to buy dirty equipment
24:56
and embedded it... I'm using US government terminology here: it's CONUS, the continental United States, and OCONUS, outside the continental United States. Okay? So whatever you get... the idea is that I'm really focused here on what the U.S. would be saying.
25:15
Pick your favorite country: Elbonia, China, Russia, Israel.
25:19
Japan, they'll each have their own, uh, you know, boundary around what they would consider to be their country.
25:27
Um,
25:29
And what we have here really is a situation where
25:33
the first thing they're worried about, you know, with getting something dirty, is eavesdropping. So I'll have this piece of equipment that I buy.
25:44
Let's say it's a router, a broadband router or something, and I allow my ISP to buy that.
25:49
And the ISP then puts broad racks of these broadband routers in Iowa someplace,
25:56
and a bunch of the citizens in Iowa
26:00
who are making use of these routers as part of their ISP service,
26:03
would have their communications filtered through that. And because there's a Trojan,
26:08
what would happen is, the code that's in the Trojan... and I showed you how you can make that code appear so it would look like good code. But what would happen is, when I translate the good code, it would produce machine code that has listening programs in there. Okay? It wouldn't be in the source, but it would be:
26:29
I grab, like,
26:30
capture the packets,
26:33
I buffer them,
26:36
I store them over here, and then somehow I have to scoop them up and get them out, somehow, to send them to a URL, through your reverse proxy or whatever: get through your whole gauntlet without you noticing it.
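The steps he lists (capture, buffer, store, exfiltrate) can be sketched conceptually like this; there's no real networking here, just the shape of the work the Trojan would have to hide:

```python
# Conceptual sketch of the eavesdropping workload a Trojan would carry:
# capture traffic, filter it, buffer it, then try to ship it out.
def exfiltrate(buffer):
    # Stand-in for the hard part: getting the stored data past the
    # ISP's monitoring and reverse proxy without being noticed.
    return list(buffer)

def eavesdrop(packets, matches):
    buffer = []
    for pkt in packets:            # capture packets off the data path
        if matches(pkt):           # filter for the traffic of interest
            buffer.append(pkt)     # buffer and store it somewhere
    return exfiltrate(buffer)      # then somehow get it out unnoticed
```

Even in this toy form, the point stands: every stage consumes resources and generates observable activity on equipment the operator is watching.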
26:49
Okay, so the equipment is sitting inline,
26:52
and every piece of telecom equipment I've ever seen, and I've been my whole life in telecom,
26:57
has both a data channel
27:00
and a control channel. I say every piece; older stuff didn't, but in modern ones you have a control plane and a data plane.
27:07
So what would happen is
27:08
the data plane is where the eavesdropping happens.
27:15
And then I have to do all that work, which is embedded in the translator, which produces object code that has all the stuff you didn't see; the source code wouldn't show it, but in the object code there's lots of code to do this. It has to
27:30
pull from the control plane
27:32
what you're looking for. So, am I gonna buffer everything? That'd be like, you know, FBI agents are calling into Iowa with the keywords they've noticed; I have to embed that into the Trojan. Wow. Then buffer it, collect it, store it, and then, through the control plane,
27:52
find some way to get it out.
27:55
Okay? Because the control plane is where you do the processing. Now, maybe you could find a way out through the data plane; I don't know, you design the Trojan.
28:03
The idea that the ISP wouldn't notice this strikes me, first of all, as somewhat preposterous.
28:10
But maybe you've got a really dumb ISP. I mean, I know the bigger ones better; I know Verizon and T-Mobile and Sprint and AT&T. Good luck; they're not dummies. These people have been doing networking for 100 years, some for 130 years. So the idea that you'd have this data scooped up
28:30
and passed out
28:30
at minimum, if you've got a reverse proxy, it's not gonna let it go out to, you know, uncategorized sites, which is where you're gonna be trying to go.
28:41
So the whole idea that you can put this in place:
28:45
it has this Trojan, which I wouldn't notice; I wouldn't see the processing, I wouldn't see the buffering, and I wouldn't see a lot of that. And then it gets scooped up and goes out over a network...
28:59
like, these packets are not gonna be little things, to eavesdrop on communications. You know, when you do a Zoom session and you store the conversation of the Zoom session, that's a pretty big file, right? So you're saying that I'm gonna
29:15
collect that, process it, be able to pluck it out of all the broadband communication,
29:19
and then ship it out, and nobody at the ISP is gonna notice this?
29:26
Really? But whatever. I'm not saying it's impossible.
29:30
I'm just saying that that's what you're saying; those are the implications. That's how you're going to get this data. Now, my claim, which I'll show you in a minute,
29:40
is: if I want your data, there are way better ways to get it. You know, APT: how about a phishing attack, dropping malware, and then I traverse and steal and go out your back door?
29:52
That's, uh, something I could show an undergraduate, maybe a sophomore undergraduate at NYU or Stevens; I could show them how to do that.
30:03
But this thing is way harder, right? And it's way easier to get caught.
30:10
And if a company like Huawei got caught... like, let's say that broadband provider in Iowa, you know, noticed this: Huawei would be out of business the next minute, right? I mean, it would be all over the Internet, everyone would be going crazy, and then everyone would just shut them down. So: gigantic risk,
30:27
really hard to do.
30:30
And there are way easier ways to do it. So when people go, hey, is that possible? I go, yeah, of course it is. So, you've got a problem, you've got to fix it? I go, okay, well, there are big problems, and then there are little problems that are more unlikely to be an issue.
30:47
So would it make more sense to focus on the big problem, which really is enterprise security, the thing that I've devoted my life to, that a lot of you do for a living:
30:57
protecting enterprise from cyber attacks,
31:00
That's the issue. Not: can you eavesdrop with a Trojan that I embed through a, you know, compromised translator, which then has to do all these things without getting noticed? And if you do get noticed by the network operator, then you go out of business. You're really going to do it that way? I just...
31:21
I just don't see it.
31:22
If I'm on the offense, I wouldn't do it that way.
31:26
It would be dumb. It's just not a reasonable offensive measure, because the risk is too high, it's too hard to do, and there are way easier ways to do it. And look at the other two threats here.
31:40
Integrity: that's where you're modifying. So somehow I'm going in and man-in-the-middling this thing, you know, where I'm changing what you see, modifying it.
31:49
That's even harder, I'd guess. Like, with encryption? Good luck.
31:55
I guess you could do a similar thing. The one thing I would say is maybe a little bit higher risk would be blocking.
32:02
So I do get that there could be a kill button,
32:06
and then all these broadband routers break. That's reasonable to me.
32:10
Now, again, that's the nuke option,
32:14
because if they all break, I'm not gonna buy any more from you
32:17
and, you know, you're out of business. But maybe World War Three has broken out. So that one, I'll buy that one; that one I believe. That one I think, yes: if there was an armed conflict,
32:29
and you're gonna be going into mortal combat with the Elbonians, and you bought Elbonian broadband routers, then they're gonna push a button and have all these Elbonian pieces of equipment die.
32:39
And you took that risk on. So, yes, that one
32:45
I'll give you. But that's not what we talk about in hearing rooms.
32:50
It's always eavesdropping. So
32:53
so I think it's reasonable: if you really, really do believe that buying equipment
33:00
from a country that you don't particularly care for is evil, just make sure that you have the threat intensities carefully worked out in your head.
33:10
And in this case, I would totally agree
33:15
that a denial-of-service attack is possible. But it's the nuke option. The other two,
33:22
I don't know. It just seems really hard to do, with a lot of risk, and there are easier ways to get at what you're trying to do, which I'll show you. That said,
33:34
if you're really determined... because we're talking supply chain here, not APT; that's our topic here. Let's look at what a business would normally do when they buy from a vendor.
33:44
And the supply chain team says: all right, we've got to do something, to have a supply chain program, to reduce our risk. I don't wanna look like a loser with the board, so you've got to give me five things we're doing. This is what, in my own consulting practice, I see enterprise teams do all the time,
34:02
and these are good things to do. But I want to make sure you understand
34:07
what kind of risk is actually being dealt with. So the first thing everybody does is they say, I need to be able to go to the vendor and say, I want
34:16
information about your product documentation, your software development process; talk to you a little bit, have you answer some questions. We've all seen those questionnaires that result in some sort of a metric from 1 to 10 or 1 to 100 that gives you some figure of merit around the quality of your SDP.
34:37
Perfectly reasonable thing to do. This is not a bad idea. Is this going to catch a Trojan in the translation process? Of course not. I mean, much like with the Huawei UK evaluation work:
34:51
They found through some of this that they didn't think the documentation of Huawei was very good.
34:55
That's fine. That's a quality issue. That's like when you're going to buy a car and you didn't like the brochure, you don't like the handbook: don't buy the darn car. But that's not a malicious insertion. That's just you making a normal buyer decision about the quality of the thing you're buying. This is the first thing anybody would do.
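As a sketch of what that first step produces, a questionnaire that collapses into a single 1-to-10 figure of merit might look like the following. The questions and weights here are invented for illustration; they are not from any real assessment standard.

```python
# Hypothetical vendor SDP questionnaire scorer. The questions and
# weights below are assumptions made up for illustration, not a
# real supply chain assessment framework.
WEIGHTED_QUESTIONS = {
    "documented_sdlc": 3,        # vendor has a written development process
    "code_review_required": 2,   # peer review required before merge
    "dependency_tracking": 2,    # SBOM / third-party component inventory
    "vuln_disclosure_policy": 2, # published way to report security bugs
    "security_training": 1,      # developers receive security training
}

def sdp_score(answers: dict) -> float:
    """Collapse yes/no answers into a single 1-to-10 figure of merit."""
    total = sum(WEIGHTED_QUESTIONS.values())
    earned = sum(w for q, w in WEIGHTED_QUESTIONS.items() if answers.get(q))
    # Scale to 1..10 so a vendor that answers "no" to everything still gets 1.
    return round(1 + 9 * earned / total, 1)
```

Note what this does and does not measure: a vendor with a spotless score can still ship a malicious insertion, exactly as discussed above; the score is a quality signal, not a Trojan detector.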
35:15
Second thing is, you probably have a little look at the stuff. You want to inspect the hardware and software,
35:21
you know,
35:22
lift the hood up, take a look around, even if you don't know what the hell you're looking for. You look around, maybe run some tools, maybe you've got some experts who'll take a look at this. You might ask for source code. You're not gonna get it, unless you're, say, DISA or the DoD, or you're GCHQ. Then, yes, you could probably
35:43
use your muscle to demand from companies like Microsoft or others.
35:47
we want to see the source code, and they'll give it to you, because you're spending billions of dollars. You have the right to see the source code.
35:54
You won't know what the translation tools look like. I guess you could ask for that; if you're NSA or GCHQ, then you do. Anybody else, you're gonna go read the
36:05
million lines of compiler code, I guess. But most enterprises would never do that.
36:13
Third thing you do is we usually recommend to our clients
36:16
that in the supply chain contract you write this down. If maybe you're doing supply chain,
36:23
write this one down, because if you're not doing this, you should. You should have a clause in the requirements that basically tells any vendor:
36:31
if somebody snitches and it turns out you do have Trojans in here, then there's gonna be consequences, usually financial. Meaning, you know, you're going to give me back 25% of the list price that I spent on my license.
36:45
And see what they do. They might cross that out; that will tell you something. Or maybe they might go, yeah, fine. Because the only time we ever find out about Trojan horses (like with Check Point, they were doing it, and I mentioned earlier Microsoft was doing it) is when somebody snitches.
37:00
And then there's always that impunity where you say, well, we're just doing it for support. Remember Check Point firewalls?
37:07
You could call them in distress and say you have a problem. They would then ask for IP addressing, and boom, they'd get into your system, and you'd go, wow, you could get in. I've had some personal experience with this. My college roommate's parents, terribly, died in a plane crash,
37:23
and he called me some weeks after and said, You do security. How do I get into
37:30
my dad's Quicken? It was Quicken then; now you would be using QuickBooks. And I said, call the company, they'll get in. He was like, what? And it turned out they did, for exactly that reason. When people pass away and they have all of their books and all their financials in Quicken, or now QuickBooks,
37:46
there is a master key where they can get in and unlock it. It's a Trojan horse,
37:52
but it's in the contract. If you read the fine print, it says it's there, so that's fine. But when it doesn't exist in the fine print, when the terms and conditions don't reference it, that's not acceptable, and that's what you should put in your contracts. You should say, I want to know everything that's in there,
38:08
like with Excel? Um, 10, 15 years ago, you GOTO column a row 50 Highlight the column, going the about tab. Drag it down, click at the bottom and it turned Excel into a flight simulator program that fluto, a big Holy Grail at all the developers names. Now,
38:29
is that malicious? No. Is it a Trojan? Yes, it's an Easter egg. But you should be able to decide at purchase time whether you want that or not. You're using Excel to do your Sarbanes-Oxley financial reporting. Do you really want Easter egg code that you don't know about in there? No.
38:49
But if you like it, no problem. You should just have the option
38:52
to say, yeah, leave it in, who cares? And you just accept it at config and install time, when you're building the golden images of all your Excel deployments across the enterprise.
39:05
So there's this idea that there be some requirement; we usually counsel that people do that. Fourth thing is you definitely monitor the community, see who snitched; there's usually somebody somewhere who's going to snitch. And then finally, yes, you should always be doing this proxy-based egress filtering, right, where you're watching.
39:24
It could be with the URL filtering that you get from Zscaler and from Symantec and from,
39:30
you know, the old
39:31
Websense and Blue Coat. These are, you know, older brands, but the capability is the same: you dictate what you do when there's outbound transport to a URL you never heard of.
39:45
What you should be doing is interrogating that, having the user confirm that it's all right, and then it's allowed to proceed. You call that a speed bump.
39:54
Um,
39:55
you could block it. A lot of companies do. Banks usually block,
39:59
but there's a lot of options. If you just allow that to proceed, then that's a problem. So you don't want any kind of equipment in the infrastructure to be able to beacon out to uncategorized sites that, you know, in some sense are questionable. So
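The speed-bump policy described above can be sketched as a simple decision function. The hostnames, the category feed, and the policy split below are assumptions for illustration; they are not how any particular proxy product actually behaves.

```python
# Minimal sketch of the egress "speed bump" policy. The categories
# and hostnames are invented for illustration.
from enum import Enum

class Action(Enum):
    ALLOW = "allow"
    BLOCK = "block"
    SPEED_BUMP = "ask the user to confirm, log it, then allow"

# Pretend feed from a URL-categorization service (Zscaler, Symantec, etc.)
KNOWN_CATEGORIES = {
    "news.example.com": "news",
    "updates.vendor.example": "software-updates",
}
BLOCKED_CATEGORIES = {"malware", "phishing"}

def egress_policy(host: str, strict: bool = False) -> Action:
    """Decide what to do with an outbound connection to `host`.

    strict=True models the bank-style posture mentioned above:
    block anything uncategorized instead of speed-bumping it.
    """
    category = KNOWN_CATEGORIES.get(host)
    if category in BLOCKED_CATEGORIES:
        return Action.BLOCK
    if category is None:                     # never-before-seen destination
        return Action.BLOCK if strict else Action.SPEED_BUMP
    return Action.ALLOW
```

The point of the speed bump is exactly what the talk describes: beaconing to an uncategorized site never just sails through silently; somebody has to confirm it, and it gets logged.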
40:17
all right, so this is normal enterprise
40:21
risk mitigation. Now,
40:23
if you are an intelligence agency, it's a little different.
40:28
As you know. Let's look at what they do.
40:31
So first thing, any intelligence agency, and we're talking spy agencies now, like our Albanian spy agency. Here's what they would do:
40:39
first thing is they would, in fact, social engineer the vendor along the lines of what I said, saying that they're under distress;
40:47
they're begging for access assistance from the vendor
40:52
to see if they may give it. And if they do, you've proven that it exists. So kind of rule one is: if you're going to put malicious insertions into the code, you limit the number of people who know about it, because somebody might get social engineered into giving up what happened.
41:13
so most agencies would do this. Is this illegal? I don't know. It might be,
41:17
You know, social engineering is not an acceptable practice. Lying is not acceptable. I'm not for this. I'm just sharing that if you work in some agency in the intelligence community, you're probably going to do this stuff. And I guess they probably have the authority
41:35
to do whatever it is to make the public and society safe. I don't know. But this certainly would work if you had something there and someone knew about it. Second thing is binary equivalence, which I believe is very powerful. And what you'd like to do with binary equivalence is buy a product in multiple instantiations
41:54
with exactly the same PO, same purchase order, same config, everything the same.
42:00
Just in one case, you're buying it as the National Reconnaissance Office; in the other, you're buying it as, you know, Harry's kindergarten for preschoolers or something. You know that in those two cases,
42:13
if you're buying the same piece of equipment,
42:15
if they're different, you'd want to know why. Why are the binaries different in these cases? I bought the same thing.
42:22
Why did you give me something different when you thought I was a government buyer than you gave me when you thought I was a commercial buyer? And again, Huawei failed this, but I don't think for malicious reasons. It just struck me, in reading the report from the UK,
42:37
that the build environment just seemed to me to be somewhat rickety. It has to be deterministic to go from source code to object code in two different instances. For that to be perfect, the translation path can't change, right? It has to be deterministic, predictable,
42:57
repeatable. So this would be a second thing. And if you see something different, you do something.
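The binary-equivalence check itself is simple in principle: hash the two purchased images and compare. A minimal sketch follows; the file paths are hypothetical, and a real check would first normalize benign non-determinism (embedded build timestamps, signatures) before comparing, which is exactly why a rickety, non-deterministic build environment makes this test so hard to apply.

```python
# Sketch of the binary-equivalence check: hash two firmware images
# from two purchases of the "same" product and flag any difference.
# Paths are hypothetical; real images would need normalization of
# benign non-determinism (timestamps, signatures) before comparing.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream the file through SHA-256 so large images fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def binaries_equivalent(image_a: Path, image_b: Path) -> bool:
    """True only if both images are bit-for-bit identical."""
    return sha256_of(image_a) == sha256_of(image_b)
```

A mismatch doesn't prove a Trojan, and a match doesn't rule one out (a plant could insert the same code in both builds); it just tells you the vendor treated the two buyers differently, which is the question the agency wants answered.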
43:00
Again, you know, snitches. And I see some people were making some New York/New Jersey jokes. Yes, I am from New Jersey; I went to college in Hoboken. So I guess if anybody's gonna be dealing with snitches, it's probably me. But
43:15
so, yeah, former Microsoft employees were the ones who talked about the Trojans in Word and Excel and so on. So you definitely would try to get information from people who know. You know, governments do SIGINT,
43:29
right? So a lawful intercept to see if there's any chatter about something like this, or embedding developers right in; this is like putting, you know, active agents into an environment. So when an intelligence agency says
43:46
publicly
43:47
that they believe that something has Trojans,
43:52
you have to think
43:53
carefully before you disagree with what they're saying.
43:58
The reason being
44:00
that they may be doing all of this stuff. Like if they literally have a plant in the environment, like a spy or something, who's saying, yes, they're putting Trojans in this code, I work here and I see it,
44:14
then they'd probably be pretty confident that that's going on. You're not gonna pick it up
44:20
through binary equivalents. You're not gonna pick it up through source code review,
44:23
so I get that. So this is where
44:27
it's essential that a citizenry have confidence in its intelligence community and listen to it
44:35
and trust that when they say something, they mean it. The intelligence community I sort of grew up with, in a kind of pre, you know, Donald Trump era,
44:44
didn't talk much. The only time they ever squawked was when there seemed to be something really important to say.
44:52
Now it seems like there's all kinds of crazy stuff going on all the time. I get it; I'm not blaming the intelligence community. I think they're getting a lot of eggs thrown at them.
45:00
So I don't know. This is something you'll have to make a decision about. If your government
45:07
tells you that you shouldn't be buying Oracle products because they're bugged
45:12
or if your government tells you you shouldn't be buying TikTok
45:16
because it's bugged,
45:17
or if your government says you shouldn't be using, you know, Apple iPhones because they're bugged, you have to decide whether to accept that, because they may be doing all this stuff. I don't know. Do you trust them? Do you not trust them? You see how complicated this is?
45:36
This is not a slam dunk,
45:37
But in a minute I'm going to get to what I think is the real issue
45:42
around this and why I think more often than not, this would not be going on. So I'm gonna try and make that case for you
45:50
now. So again, US national policy right now is that we don't allow Huawei.
45:57
Um, the irony here
45:59
is that I don't know of one Tier 1 carrier in the US that was using Huawei. I'm not aware of any. Now, I know there's some small
46:08
broadband providers and small local ISPs, very small ones, one in Portland, that were kind of using some Huawei equipment,
46:17
but I'm not really sure what the impact was of telling telcos they couldn't do this.
46:23
But I'm just pointing out that this is and has been current policy. Now, here's what I found from my good friend Lior from Cybereason.
46:34
It's his team that did some research that I looked at very carefully. I blogged about it. I was pretty excited when I saw this; I should say excited, maybe, maybe more horrified.
46:44
But a bunch of telecommunication firms, all from outside the U. S.
46:50
were subject to
46:52
that classic APT, meaning
46:54
somebody got phished. Some jerk clicked on something; I shouldn't say jerk, because it might have been a great phish. They may have had bad
47:02
controls there supporting the employee decision making, I don't know,
47:07
but somebody probably, you know, did some kind of a click. The malware got onto the machine, traversed laterally around, and then found metadata that pointed back to
47:22
essentially wiretap data. Now here's what that means.
47:25
If
47:28
you work at a telco, in any country,
47:30
there are lawful intercept requirements for every service provider. That means if law enforcement comes to you and says, look, here's Joe Smith. Joe is a criminal. I have a judge here saying this is a bad person. We're interested; we're gonna arrest this guy. But we need more evidence.
47:49
You, you know, Mr. SP or Ms. SP,
47:53
We want all of the communications for this individual. And here is the legal paperwork. Every country will have something different
48:01
for how that paperwork is provided, and the lawyers would look at it, see that it's reasonable, and then tell someone like me: go push a button. And all that lawful intercept would then happen. You'd be capturing that traffic, storing it, and delivering it back to the government.
48:16
What happened with this attack, this thing that was called Soft Cell,
48:22
is that the APT attack resulted in scooping up all of that data. A lot of collected data.
48:29
I'm not sure it was all law enforcement related, but they found a ton of metadata about phone calls, about communications, and scooped it up.
48:37
exactly what we're worried about with Huawei and with these Trojan insertions
48:44
But done in the classic manner,
48:46
you know, with impunity. Who knows who did it? Who sent the phish? It's all anonymous. There's no equipment where I can link it to the equipment manufacturer. Honestly,
49:00
is that how you're going to do this, with your logo on it? I'm gonna drop listening code
49:07
into a piece of equipment I sold you, that keeps my livelihood going, with my logo on it,
49:13
and I've got to somehow dance around all your detection,
49:15
and you're an ISP, you're good at detecting things on a network; it's probably the only thing you're good at.
49:22
I'm gonna do that versus
49:23
a classic APT that nobody has a solution for? I phish you anonymously, drop malware, and then beacon out. Give me a break. That's so much easier. Why wouldn't you do that?
49:35
Right? So, I blogged about this and I called it the Huawei policy Trojan. Here's what I told you I was getting into,
49:43
some things. I don't consider this offensive,
49:45
but I want you to hear it. I'm going to read to you
49:49
the first sentence from my blog: imagine standing in front of a barn with no doors,
49:55
watching all the horses casually wander outside into the fields. So the doors are open; there's no doors, and horses are coming in and out of the barn. You're standing there watching it, and the barn owner then walks over to you.
50:08
And he points to some loose sideboards on the side of the barn.
50:13
And he says, You know what? I'm worried about those loose sideboards. Somebody can pull them back and steal my horses.
50:19
And you're standing there watching the horses wander in and out of the barn, no doors,
50:24
and the owner is looking at some boards on the side, saying they look loose, somebody could pull them back and the horses will get out. Now, if you ask me under oath:
50:37
Ed, yes or no: is there a risk of the loose sideboards being pulled back and horses being stolen? How could I say no? The answer is yes, that's a risk. And then if they say, I have no further questions, your honor,
50:50
then I've just given the impression that there's this big risk
50:53
of that happening.
50:55
But if then the defense attorney comes up and goes, well, let me ask another question:
51:00
is that likely to be the way you'd do it? And I'd say no. They'd say, why? Because the door's wide open. That's APT; APT is wide open. It's easy, it's not attributable, you could do it all day long, it costs nothing, you don't get caught, and you get the same payload.
51:16
Pulling back the sideboards is akin to these Trojans that everybody seems to be so concerned with.
51:24
And if you put me up on the witness stand as, you know, Joe Telecom Security Guy, which would not be the first time that's happened to me, and you sit there and the attorney says, hey, Ed,
51:37
if a company like Huawei put Trojans in their code, is that a way that
51:44
data could be exfiltrated, yes or no? And I would say yes.
51:49
They say no further questions, your honor.
51:51
And that's where we are in our public debate,
51:53
because nobody's then coming up afterwards and saying, okay, now I have a follow-up question. Hey, uh,
51:59
Dr. Amoroso, let me ask you a question.
52:02
Is that the way it's likely to happen? And I would say no
52:06
And they would say, why is that? Because there are way easier means of attacking and getting that data,
52:15
way easier. You just scoop it up with the APT. Why would you? Blah, blah, blah, all the things I've been telling you.
52:22
So
52:23
I've been out yakking about this for a while, particularly with government folks,
52:29
and usually I can win them over to this idea. They usually like the idea, even ones that are very anti-Huawei. I'm very much on board with the idea that there could be a DoS attack: denial of service, not distributed, but a denial of service through a kill button.
52:46
There's no question
52:49
that that would be an easy thing to put in, and probably a very likely thing to put in from anybody, including, say, us domestic equipment going out. It's probably something that exists in a lot of things today.
53:01
The kill button:
53:02
that I agree with.
53:05
But this other thing, it usually blows their mind
53:09
that I'm suggesting that, you know, this clear, obvious case (they buy bugged equipment, it sniffs and it listens to everything going on, and oh my God, oh my God, we can't buy from them) is a lot of BS. But then they bring up, inevitably, this thing TikTok.
53:29
it's the next thing
53:30
They go, yeah, but what about that? Now, let me give you a little bit of what I think would be some sensible thinking around TikTok.
53:38
Recognize that social media works this way.
53:43
That's how recommender engines work.
53:45
They steal your data.
53:49
They take your data. Whatever verb you want: use your data, steal your data, take your data, I don't care.
53:55
But they're doing it because they want to keep you on TikTok.
53:59
They're looking at all this stuff because, like my teenagers, you know: if you're a typical teen, then you waste about an hour each day on TikTok. That's true. And why do they waste an hour? Because the TikTok algorithms are good at recommending things that kids are gonna like.
54:15
So this idea that there's a national security threat to this
54:21
is just absolutely preposterous to me, because first off, TikTok is this moronic, silly, ridiculous, sophomoric stuff that kids are doing,
54:32
and
54:34
90-something percent of the users are a bunch of kids. And when I see the statistics that it's like 50% adults, I think that's nonsense. It's the adult who is the account owner,
54:45
because the kid probably isn't old enough to do it. In my family, and I have a big extended family,
54:52
a big family of my own.
54:52
There's nobody over
54:55
30
54:57
that I know is using this. And there's nobody under 30 that I know who is not using this. There's a bunch of kids doing silly stuff. This is social media behaving the way social media works.
55:07
And if you're gonna really have an issue with this, then you've gotta have an issue with all social media.
55:14
So I'm not saying it's good
55:16
that you're giving your data away. My friend Shoshana,
55:21
oh gosh, I can't seem to remember her last name, a Harvard professor, wrote this beautiful book, a real fat one, called Surveillance Capitalism, that you should read. Surveillance capitalism may be just a piece of the title. But if you type surveillance capitalism and you type Shoshana, you'll see the book.
55:40
It's really fat, but it's amazing,
55:43
and she makes the case that we just basically have given away our data,
55:46
you know, to Google and all these social media companies, Facebook, Twitter, in exchange for some convenience. That's social media. So when you're criticizing TikTok,
55:58
it is a valid criticism, but you're criticizing social media, not, like, China.
56:04
So when you move the thing over to the U. S,
56:07
it's gonna work the same way.
56:09
And my understanding is that Oracle, or whoever is sort of negotiating purchase of this,
56:15
could still make that data available to the
56:17
government or somebody, if demanded. I don't know; maybe they'll swear they won't. I doubt a government cares. If they really want data on your kids, it's easier to get it than rigging TikTok,
56:30
you know, to do that. But again, TikTok is social media, and it's doing what social media does. It's not a big secret; this is the way all social media works.
56:40
So that's the opinion there. You wanna criticize TikTok? Criticize social media.
56:47
So, look, I promised you that this would be a little bit edgy, and I hope it makes you think a little bit. I see from the chat there's a lot of really interesting comments. Let's see, hold on, one in particular: that TikTok has already demonstrated it'll censor for China. Yeah, I agree that censorship is a different thing. Totally different. It's censored.
57:07
So if you're outraged at that, then don't use TikTok.
57:09
I totally agree; you should have a social conscience. You don't like it? Don't use it. If our government says we're not going to allow it because they censor, well, we allow a lot of other business in China. Do we tell our banks, you can't go into China because they censor? That's pretty ridiculous. So, Patrick, you're right: make a personal decision.
57:30
But we're being such hypocrites
57:31
to allow big business in, but then we don't want our kids on TikTok because they censor. Really?
57:37
Um, again, I'm an American. I'm against that.
57:42
But it just doesn't make any sense to me. If you're mad at them, then deal with that. That's what diplomacy is about. I wish we would have a little better diplomacy. So good point there, Patrick, that you brought that up. Any other things in here I didn't touch?
57:57
I love you guys calling me a New Jersey snitch. That's so funny. Well, listen, I think we got through most of it here. I'm gonna give you back the rest of your day.
58:06
Hopefully, we'll see you all next week. Same time. And I hope you're enjoying the course. I'm really, really glad to see such a big group of you coming back every week.
58:15
That's very heartwarming. I hope you're listening to the recording, Tatiana; thanks for setting this up. We'll see you all next week.
Enterprise Security Leadership: Understanding Supply Chain Security

In this session, Ed Amoroso covers 5 areas of focus to understand and reduce supply chain risks: review processes, inspect hardware/software, specify vendor requirements, monitor for vendor issues, and proxy comms to unknown sources.
