Time
5 hours 54 minutes
Difficulty
Intermediate
CEU/CPE
6

Video Description

This lesson covers the tenets of secure architecture and design, which are:
• How much security is enough?
• Defense in depth
• Fail-safe
• Economy of mechanism (the K.I.S.S. principle)
• Completeness of design
• Least common mechanism
• Open design
• Consider the weakest link
• Redundancy
• Psychological acceptability
• Separation of duties (SOD)
• Mandatory vacations
• Job rotation
• Least privilege
• Need to know
• Dual control

Video Transcription

00:04
All right, so let's look at the tenets of secure architecture and design. And these principles are going to apply regardless of the type of system that we're working with, whether it's a software-based system, an operating system,
00:17
or an application, or maybe it's a hardware system or an infrastructure system. It doesn't really matter; these tenets of security are always going to apply.
00:27
All right, so when we look at the first element, we look at how much security is enough. And when you ask that question, most people will say, oh, you can never have enough security. And that's actually not true at all. You can have too much security, when security interferes with the purpose and the function of the business.
00:47
And an important idea to understand is that security will always cost you something. Security isn't free.
00:53
And what I trade off for security is either money, after I buy security products, or performance; I lose performance when I add security. A lot of times users don't appreciate the additional security measures, and sometimes we have problems with user acceptability.
01:10
We could have backwards compatibility issues, resource availability issues; there could be all sorts of problems.
01:17
So the bottom line is, when we answer the question of how much security is enough, the answer to that revolves around risk mitigation. Well, really, risk analysis: figuring out, given the potential for losses,
01:30
how much security is necessary, and finding that proper balance between cost and benefit. And as a matter of fact, when we look at a cost-benefit analysis,
01:42
most of the decisions we make in this world are based on cost-benefit analysis. What are the benefits to me?
01:49
How much is it going to cost me? What's the trade-off? Is there more value than there is cost? And if so, we generally make that choice.
01:57
All right, so how much security is enough? That's driven by risk analysis. Next issue: defense in depth.
02:05
What one mechanism
02:07
will protect your house or your home from a burglar?
02:12
Now, if you think about that, I'm going to give you one mechanism to protect your house. You know, sometimes I will hear people say door locks. Well, locks can easily be picked. Well, how about a guard dog? Well, I have a pug,
02:23
so I can tell you the truth: absolutely worthless in the event of a robbery. But actually, not really worthless. He does provide deterrence, because he'll bark a lot.
02:34
Ah, and he might be part of a more comprehensive defense system. But in and of himself, the pug prevents no robberies. Well, what about an alarm system? No, you've got minutes before the police show up. I hear people joke and say a Smith and Wesson gun is going to be my defense mechanism.
02:53
Guns can backfire. Guns can be wrested away.
02:57
Um, so the point I'm trying to make is there is no one mechanism that's going to protect your home in the event of a disaster or in the event of an attack. So what do we do instead? We have defense in depth. I have a fence. I have motion detector lighting. I have the attack pug. I have, um,
03:16
ah, locked doors. I have locked windows. So the idea is we don't rely on a single mechanism. We look for security to come in layers, one mechanism on top of the other, on top of the next,
03:30
and that's defense in depth. Sometimes you'll hear it referred to as a layered defense.
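As a rough sketch in code (not from the lesson), defense in depth can be pictured as a request that must pass every layer of control, so defeating any single layer isn't enough. The layer names and rules below are hypothetical illustrations.

```python
# Hypothetical layered defense: every layer must allow the request.

def firewall_ok(request):
    # Layer 1: block known-bad source addresses.
    return request.get("source_ip") not in {"10.0.0.66"}

def authenticated(request):
    # Layer 2: the request must carry a valid token.
    return request.get("token") == "valid-token"

def authorized(request):
    # Layer 3: only certain roles are permitted.
    return request.get("role") in {"admin", "operator"}

LAYERS = [firewall_ok, authenticated, authorized]

def admit(request):
    """Admit the request only if every layer of defense allows it."""
    return all(layer(request) for layer in LAYERS)
```

Picking a lock (stealing a token) still leaves the other layers standing, which is the point of the house analogy above.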
03:36
All right, the next element: fail-safe. A system should fail in such a manner that it protects itself. You know, ah, if you've ever seen the Microsoft blue screen of death, and I'm sure we've all seen that,
03:49
that's the system failing in such a way that no further compromise can happen. Because when you do get that blue screen,
03:55
what else can you do? Can you copy? Can you open up files? Can you export or import? You can't do anything. So that system is responding to a security vulnerability in such a way that no further breach can happen.
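A minimal sketch of the fail-safe idea, with hypothetical names: if the authorization check itself breaks, the system denies access rather than granting it (fail closed, never open).

```python
def check_permission(user, permissions_db):
    # May raise KeyError (or worse) if the entry is missing or malformed.
    return permissions_db[user]["can_read"]

def read_allowed(user, permissions_db):
    """Fail safe: any error during the check results in denial."""
    try:
        return bool(check_permission(user, permissions_db))
    except Exception:
        # The check failed, so the system fails in a way that
        # protects itself: access is denied, not granted.
        return False
```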
04:12
Um, economy of mechanism: keep it simple. Rather than having, you know, a network design that's so elaborate and complex, it's much easier to secure a simple design. I'd rather have to protect two doors than 30,
04:30
right? So we keep our design logical and straightforward. We keep it simple, because a simple design is easier to protect.
04:38
Keep it simple. Completeness of design: we make sure that we provide security all the way throughout the life cycle of the software, and we make sure that within any system, there is an equality of security throughout.
04:55
You know, from a physical security standpoint, if you've ever been to a building that has a security guard up front, swipe card access, all those things, and then you walk around back and the loading dock is open, that's what we're trying to avoid here. From a software perspective, we want a good, complete security design.
05:13
Least common mechanism means take advantage of what's already out there, so you don't necessarily have to reinvent the wheel again and again. Open design:
05:24
there are two schools of thought with the design of software, and with the design of most systems. Do I publish and make this information known? Do I publish the code of my operating system or my protocol? Do I make the details public, or do I hide them? Do I keep them to myself?
05:44
So, of course, the first was open architecture, the second closed architecture.
05:48
Sometimes when we see closed architecture, we think about the phrase security through obscurity. And what that means is, I think, because you can't see it, you can't compromise it. That's like me saying I've hidden my house so that you can't see it, so you can't compromise it,
06:06
and that's not true at all. So as a general rule, we prefer open architecture, because that allows for peer review. Now, I'll tell you, that doesn't guarantee peer review will happen. And if you're familiar with the issues with OpenSSL, it really wasn't being reviewed properly. So just making software open
06:26
does not make it more secure, but it can make it more secure if the review happens. Well,
06:32
um, consider the weakest link.
06:35
And if you were to think about the weakest link in your organization, what could you tell me? Where is your weakest link?
06:44
And if you thought of the answer, people or employees or internal users, you're absolutely right. So when we are looking to design a system, we have to consider that we protect users. And not all of these errors and these compromises are malicious. Not all users are malicious, but it doesn't take a malicious user
07:03
to delete a key file
07:05
or to destroy the integrity of information. So we always want to think about our users and restricting the damage that users can do.
07:15
Redundancy? Absolutely. We want to avoid a single point of failure, so we want to design a redundant system that can withstand, um, one element perhaps not working. We want to be able to have backup mechanisms. Redundancy goes back to the phrase
07:34
Don't keep all your eggs in one basket.
07:38
Okay, so more than one, we don't want a single point of failure, basically.
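One rough way to sketch redundancy in code (a hypothetical illustration, not from the lesson): try each backup mechanism in turn, so that one failing component doesn't take the whole service down.

```python
def fetch_with_failover(backends, key):
    """Return the first successful result; raise only if every backend fails."""
    errors = []
    for backend in backends:
        try:
            return backend(key)
        except Exception as exc:
            # Remember the failure and fall through to the next backend.
            errors.append(exc)
    raise RuntimeError(f"all {len(backends)} backends failed: {errors}")
```

With a primary and a backup configured, losing the primary alone is no longer a single point of failure.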
07:44
Ah, psychological acceptability, meaning your security is not so intrusive that users don't want to participate. Because if security is so complicated that a user doesn't want to do it, that user will find a way to bypass it. And I would so much rather see users on my side
08:03
than users trying to find a mechanism
08:05
to bypass the security I put in place. Also, remember, security is here to support the business function. If users are having such a difficult time doing the things that they need to do in order to perform their business function, then I'm probably not meeting the goals of
08:24
my organization.
08:28
All right, separation of duties. Ah, separation of duties is going to make sure that no one individual has too much power within a system or within an environment. So rather than having a single network administrator, what we prefer to do is have multiple network administrators,
08:46
each assigned certain roles and certain permissions and certain rights.
08:50
If you work in an organization, we don't want the same person that prints the paychecks to sign the paychecks. That's a conflict of interest, so separation of duties should help us with that.
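The paycheck example can be sketched in code, as a hypothetical illustration: the system itself refuses to let the person who printed a check also sign it.

```python
def print_check(check, employee):
    # Record which employee printed the check.
    check["printed_by"] = employee
    return check

def sign_check(check, employee):
    """Enforce separation of duties: the printer may not also sign."""
    if check.get("printed_by") == employee:
        raise PermissionError("separation of duties: printer cannot sign")
    check["signed_by"] = employee
    return check
```

No single individual can complete the sensitive workflow alone; a second person is structurally required.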
09:01
Other things, and these aren't necessarily essential to software design, but some other ideas for security: things like mandatory vacation and job rotation, making sure that we have detective mechanisms in place so that no database administrator, for instance,
09:22
is the only one that touches their system. We allow other people to come in for investigative purposes and detective purposes
09:30
and make sure that nothing fraudulent is happening there. Least privilege and need to know: we have to follow those principles absolutely, positively, as much as possible. The principle of least privilege says, I'm going to give you just the bare minimum of rights
09:50
and activities that you need to do your job.
09:52
Nothing more, nothing less. So when we talk about least privilege, that's about action: what can you do on the network? When we talk about need to know, that's about data: that's about what you can access.
10:05
So you don't get to access the sales folder, because you're not a salesperson; that's need to know. You don't get to change the system date and time;
10:13
that's the principle of least privilege.
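The distinction drawn above, least privilege governs actions while need to know governs data, can be sketched as two separate checks. The roles and tables here are hypothetical.

```python
# Least privilege: the minimum set of ACTIONS each role may perform.
ALLOWED_ACTIONS = {
    "helpdesk": {"reset_password"},
    "admin": {"reset_password", "set_system_time"},
}

# Need to know: the minimum set of DATA each role may access.
ALLOWED_DATA = {
    "sales": {"sales_folder"},
    "admin": {"sales_folder", "hr_folder"},
}

def can_do(role, action):
    """Least-privilege check: may this role perform this action?"""
    return action in ALLOWED_ACTIONS.get(role, set())

def can_access(role, resource):
    """Need-to-know check: may this role read this data?"""
    return resource in ALLOWED_DATA.get(role, set())
```

A role absent from a table gets nothing by default, which is the "bare minimum, nothing more" stance the lesson describes.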
10:16
All right. And then the last element: dual control. That's another good idea. And what dual control revolves around is the idea that there are some activities that are so, um,
10:28
potentially harmful that we don't allow a single individual on the network to do those activities. For instance, recovering keys for my users. Well, if I can recover your private key, then I have your private key,
10:46
and the whole purpose of a private key is to be bound to your identity.
10:50
So that kind of takes us away from true authenticity. So what we might do is, rather than letting me, Kelly Handerhan, recover your key, we may require two network administrators to be present before the key can be recovered, and we refer to that as dual control.
11:05
Sometimes you see the old war movies where the madman is getting ready to launch the bomb, and he goes down into the control room, and there's a key,
11:13
ah, that needs to be turned on each side of the room. And since they're on different sides of the room, no one individual can turn both keys at the same time, and that's a good example of dual control.
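The key-recovery example can be sketched as a dual-control gate, with hypothetical names: the sensitive operation proceeds only when two distinct administrators have approved it.

```python
def recover_key(user, approvers, key_escrow):
    """Release a user's escrowed key only under dual control.

    Requires two DISTINCT administrator approvals, like the two
    launch keys on opposite sides of the control room.
    """
    if len(set(approvers)) < 2:
        raise PermissionError("dual control: two distinct approvers required")
    return key_escrow[user]
```

Note that the same administrator approving twice does not count; the check is on distinct individuals, not on the number of approvals.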
11:26
So you can get a feel for how all of these elements come together, and we have to consider them for any type of system design that we're doing. When we talk about secure
11:35
ah, system engineering, what we're looking to incorporate into our design are all of these elements. So I hope this is helpful. In our next section, we're going to move through and talk a little bit more about system architecture as a whole.

Up Next

ISC2 Certified Secure Software Life-cycle Professional (CSSLP)

This course helps professionals in the industry build their credentials to advance within their organizations, allowing them to learn valuable managerial skills as well as how to apply best practices to keep their organizations' systems running well.

Instructed By

Instructor Profile Image
Kelly Handerhan
Senior Instructor