CRISC

Course
Time
5 hours 20 minutes
Difficulty
Advanced
CEU/CPE
7

Video Description

This lesson covers enterprise risks in databases, which consist of:
- Code injection
- Scripting
- Aggregation
- Inference
- Entity, semantic, and referential integrity

There are also enterprise risks in utilities, which consist of:
- Power
- HVAC
- Humidity
- EMI
- RFI

The unit also covers enterprise risk in network components and users. Finally, the instructor offers a brief summary of the key points of the entire module:
- Risk assessment overview
- Risk assessment techniques and tools
- Evaluating current controls
- Risk and control analysis
- Risk analysis methods
- Enterprise architecture risks

Video Transcription

00:04
All right, now, continuing our assessment of risks within the enterprise, one of the risks that we have to think about is the potential for harm to our databases. You know the old joke: why rob the bank? Because that's where the money is. The same idea applies to databases. Why would you target a database? That's where the data is.
00:24
And, of course, as an attacker, that's what's appealing to me. Whether I'm after credit card information,
00:29
proprietary company information, whatever it may be, it's stored in the database. So when we think about databases, there are several different attacks we want to look at: code injection and scripting (those two can kind of come together), aggregation, and inference. So when we talk about code injection:
00:47
garbage in, garbage out.
00:50
And the way people input information into a database is through the use of forms. So, for instance, if I ask you to fill out a customer satisfaction survey on the Internet, where you input your information is called a database form, and the information that you input in the form goes to the back-end database.
01:08
Well, if I don't do some sort of analysis of what the general public
01:12
inputs, then I'll take anything the public inputs and dump it into my database, and I can wind up with corrupt information, I can wind up with inconsistent information. But much, much worse is code injection: if you can input it, it can go to the back-end database and get processed.
01:33
And you do not have to be a SQL expert to know that the command DROP TABLE is
01:38
not gonna be very helpful from a database perspective, right? So what do we do? Input validation, which means I'm only going to allow you to input the bare minimum. Think about when you go to a website and you make a purchase.
01:52
They ask how many items you want. They don't give you a text box where you type in "one" or "I would like one item, please." Why? That's way too much freedom for me as an unknown entity. So what do they do? They give me a drop-down. Hey, there's not a lot of damage I can do by clicking the number one, right?
02:12
So ultimately, that's input validation. I am forcing you to enter
02:16
information in the proper format. As a matter of fact, if you've ever studied security architecture and design, one of the things they talk about a lot is security models, and security models are concepts on which systems are designed.
02:34
And there's one that's my personal favorite, called the Clark-Wilson security model. And the Clark-Wilson security model essentially says you can enforce well-formed transactions through the use of the access triple. That's kind of it in a nutshell. What does that mean? What that means is: keep users out of your stuff
02:53
or they'll break it.
02:54
Access for users, who are untrusted, goes through an interface that is trusted,
03:01
before they access your back-end data, which is precious to you. Keep users out of your stuff; they will break it.
03:08
That's why we don't give users the password to our databases and say, "Go on in there and enter whatever you want," right? They get a front-end application that's very tightly controlled. Other things that are important: not just to give them drop-downs wherever possible, but also to regulate field size.
03:27
There's not a whole lot of damage I can do in two characters.
03:30
You give me 500 characters, and we've got all kinds of problems that could be caused.
03:36
We also want to scan the input for things like data definition language, SQL keywords. Like I said,
03:42
nobody's last name is "Johnny DROP TABLE," right? We shouldn't have brackets in our names. So those things that are very common from a code injection standpoint, we've gotta scan for; we've gotta check for those.
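The two defenses above can be sketched in a few lines of code. This is a minimal illustration with an invented SQLite table: a parameterized query makes the driver treat user input strictly as data (so "Johnny DROP TABLE" can never run as SQL), and a whitelist check mirrors the drop-down idea of only accepting known-good values.

```python
import sqlite3

# Toy in-memory customer table; the table and names are invented for this example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (last_name TEXT)")
conn.execute("INSERT INTO customers VALUES ('Smith')")

def lookup_safe(name):
    # Parameterized query: the ? placeholder means the input is bound as
    # data, never concatenated into the SQL text, so it cannot inject commands.
    return conn.execute(
        "SELECT last_name FROM customers WHERE last_name = ?", (name,)
    ).fetchall()

def validate_quantity(raw, allowed=range(1, 10)):
    # Whitelist validation, like the drop-down: accept only a small set of
    # known-good values and reject everything else.
    qty = int(raw)  # non-numeric input ("one item, please") raises ValueError here
    if qty not in allowed:
        raise ValueError("quantity out of range")
    return qty

print(lookup_safe("Smith"))         # finds the stored row
print(lookup_safe("x' OR '1'='1"))  # injection attempt matches nothing
```

Note that the injection attempt simply returns no rows: the quote characters are part of the search value, not SQL syntax.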
03:54
Aggregation and inference: aggregation is the collection of information.
04:00
Once that information is collected, I can then make an inference. So the idea is, if I walk into your office and I see your Rolodex (because really, a Rolodex is a database in a very pure and simple form),
04:14
I see a whole lot of customers. Some have gold stars, some have
04:19
brown stars, black stars, whatever. I may make an inference about the value of those customers based on the color of the stars, for instance. And with databases, one of the big problems that we would think about
04:36
is that sometimes you have to allow users access to information, of course,
04:41
and that information may not seem significant. But if I give you enough low-level, seemingly insignificant information, you might be able to pull it together and make a high-level assumption. I'll give you an example.
04:55
I went out to lunch with some friends of mine, and one of my friends kind of scooted in toward me, so I knew she had some good gossip for me, so I perked up my ears. I just want to go on record as saying I am not a gossip,
05:06
but I will listen. I've always got time to listen.
05:11
So she scoots in and she says, "Hey, did you know Holly was pregnant?"
05:15
And I said, "No, that's great news. When did she tell you?"
05:18
And the answer totally cracked me up, because she said, "Oh, she didn't tell me."
05:24
Well, how do you know?
05:26
Well,
05:27
she went out to dinner with us last Friday night,
05:30
and she didn't drink a thing.
05:31
Okay,
05:32
"That's not all. She went to the doctor last week."
05:36
"That's not all that significant." "That's not all. Karen saw her in the bathroom Tuesday morning, and she was as sick as she could be."
05:46
So what we have is an aggregation attack. All of these seemingly unrelated pieces of information: a good gossip watches, and will pull that information together,
05:58
analyzing all the seemingly unimportant bits of information. When you pull it together, you can come to a conclusion about something at a much higher level of classification. We'll see instances like this in the military: if I give out too much unclassified information, put too much out there,
06:16
someone might be able to pull that together
06:18
and infer something that I actually want to protect.
06:21
The interesting thing about all this is that my friend Holly was pregnant. So what that shows you is: sometimes aggregation and inference work.
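The gossip story above can be sketched as a toy aggregation-and-inference model. The observations and weights below are entirely illustrative: no single fact is significant on its own, but an aggregator that collects enough of them can cross a confidence threshold and infer a protected fact.

```python
# Each low-level observation carries a small, made-up significance weight;
# none of them alone reveals anything.
observations = {
    "skipped drinks at dinner": 0.3,
    "doctor visit last week": 0.2,
    "sick Tuesday morning": 0.4,
}

def infer(facts, threshold=0.7):
    # Aggregation: combine the individually harmless signals into one score.
    score = sum(facts.values())
    # Inference: conclude something no single fact discloses by itself.
    return score >= threshold

print(infer({"doctor visit last week": 0.2}))  # one fact alone: no inference
print(infer(observations))                     # all three together cross the threshold
```

The numbers are arbitrary; the point is the structure: the risk comes from the combination, which is why classification schemes worry about the aggregate sensitivity of many low-level records.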
06:30
Now, my friend Holly, if she really didn't want this information disclosed, she had two choices.
06:38
The first choice is she could have just said none of your business.
06:41
Okay: "How come you're not drinking tonight, Holly?" "None of your business."
06:46
"Uh, where have you been? It's kind of late for you to be getting in." "None of your business."
06:49
The problem is, when I say "it's none of your business" to a gossip, that's like game on, right? That means: oh, this is interesting, I'm gonna do a little more digging.
07:00
just like in the military when we label something top secret.
07:03
Oh, that tells me it is of a certain value. Just the label itself indicates: this is noteworthy, this is important.
07:12
So, honestly, what Holly should have done is just lied.
07:16
I'm not encouraging lying. As a matter of fact, we're not gonna call it lying; we're gonna call it polyinstantiation. Right off the bat, that sounds much better, right? Much better than lying. Polyinstantiation: multiple instances. Because if Holly had just said, "Yeah, I'm not drinking because I'm the designated driver," nobody would have given it a second thought later,
07:35
and rather than saying she came back from the doctor, say "I came back from the dentist" or "I had an appointment," right? So that's the idea with polyinstantiation, and we use it to protect information in our databases. Let's say I work on a naval base, and I see a ship that's getting prepared; they're loading it up,
07:51
and I'm aware that it's gonna be leaving shortly.
07:57
Well, if I look at that information and I see the location of the ship is top secret,
08:03
right away. That tells me this is significant. Something meaningful is happening here.
08:07
But if instead, as someone without a clearance, I log onto the system and I'm given a view that says the ship is bringing food off the coast of Africa,
08:18
that's not particularly interesting to me. It satisfies my curiosity; case closed.
08:24
But someone with top-secret clearance could log on to that database and see that it's actually bringing munitions somewhere in the Middle East.
08:31
So what we have is multiple instances, and that's one of the ways that we can protect the sanctity of the information in our database.
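The ship example can be sketched as a tiny polyinstantiation model. This is only an illustration, not a real multilevel-secure database: the same record key holds two different instances at two clearance levels (the ship name and cargo strings are invented), and each reader sees the highest instance at or below their own clearance.

```python
# Ordering of clearance levels, lowest to highest.
LEVELS = {"unclassified": 0, "top_secret": 1}

# Two instances of the "same" record, keyed by (ship, level):
# an unclassified cover story and the top-secret truth.
ship_manifest = {
    ("USS Example", "unclassified"): "food relief, coast of Africa",
    ("USS Example", "top_secret"): "munitions, Middle East",
}

def read_manifest(ship, clearance):
    # Return the highest-level instance this reader is cleared to see.
    best = None
    for (name, level), cargo in ship_manifest.items():
        if name == ship and LEVELS[level] <= LEVELS[clearance]:
            if best is None or LEVELS[level] > LEVELS[best[0]]:
                best = (level, cargo)
    return best[1] if best else None

print(read_manifest("USS Example", "unclassified"))  # sees the cover story
print(read_manifest("USS Example", "top_secret"))    # sees the real cargo
```

The key point is that the low-clearance reader gets a plausible answer rather than a "top secret" denial, so the label itself leaks nothing.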
08:39
There are lots of other things that we can do; these are just a couple of ideas that we want to think about with databases. One other idea, and it's actually not on the slide, but I'll mention it to you:
08:50
anything that I store in a database...
08:54
Let me back up on that: personal information that I've stored in a database, I have an obligation to protect, right? Personally identifiable information I've got to keep safe. It's appealing to attackers.
09:07
so some ideas around that
09:09
if you don't need it, don't store it
09:13
and
09:13
minimize it. So, for instance, I may ask for your credit card or your Social Security number. I'm a bank, and I want your Social Security number when we set up your account.
09:24
If I store that, I gotta protect it.
09:26
But if I just store the last four digits, I can use that as authentication information, and that doesn't have the same value to an attacker as your full Social Security number. We refer to that as data minimization. So yeah, that's not on the slide, and quite honestly, that's not even a testable idea.
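Data minimization as described above is simple to apply at the point of storage. A minimal sketch (the field names and record layout are invented for the example): keep only the last four digits of the SSN and never persist the full value.

```python
def minimize_ssn(ssn: str) -> str:
    # Normalize "123-45-6789" style input down to bare digits,
    # then keep only the last four for later authentication checks.
    digits = "".join(ch for ch in ssn if ch.isdigit())
    if len(digits) != 9:
        raise ValueError("expected a 9-digit SSN")
    return digits[-4:]

# The stored record never contains the full SSN; if you don't need it,
# don't store it.
record = {"name": "Jane Doe", "ssn_last4": minimize_ssn("123-45-6789")}
print(record)
```

A breach of this record exposes far less than a breach of a table holding full Social Security numbers, which is exactly the point of minimizing.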
09:43
But just when we're thinking about risks and we're thinking about strategies to protect our databases,
09:48
those are some ideas. Okay, now we're gonna move on to utilities. Don't forget the world of utilities, because,
09:56
first of all, risk management isn't just an IT facet, but second of all, utilities play a huge part in providing clean power and a clean environment to our computing systems. So: issues with power, too much power, not enough power. What I have over here are some problems with power,
10:16
short term and long term.
10:18
So when we talk about a big increase in power: if it's momentary, that's a spike; if it's sustained, that's a surge. A sag is a temporary dip.
10:26
A brownout is a much more long-term drop, and a fault is a momentary loss of power. I always feel like a fault is when you're at work and the power flickers just long enough to let you think maybe we'll get out of work for the day.
10:39
A blackout is obviously much longer.
10:43
HVAC systems: heating, ventilation, and air conditioning. We've got to think about these. We want positive pressurization in our rooms so that air flows out and contaminants flow out.
10:52
Humidity in a data center is very important; you want it somewhere around 50%. Temperature in your data center or server room is also very important.
11:03
We want protection against electromagnetic interference and radio frequency interference as well. So utilities, again, can present a risk. Next: the network.
11:15
You know, the network infrastructure itself, the equipment that we use: there are many types of attacks that are specifically directed toward switches and routers, proxy servers, network services. Let me just tell you, we could spend a week talking about network components and their vulnerabilities and configurations.
11:33
From a CRISC standpoint,
11:35
just some common basics. Okay, for any device, you always want to make sure that if a service isn't required, it's removed.
11:46
Right? So, for instance, maybe I have a device that supports both IP version 4 and IP version 6. Well, if we don't use IPv6 on our network, we may want to consider removing it from our devices. Now, please understand: when I talk about getting rid of unnecessary services and unnecessary protocols,
12:05
By no means
12:07
would I ever advocate doing that without going through the change control process. Because certainly in a Microsoft environment, you may not feel like you need IPv6 for your day-to-day, but so many services rely on it being installed. When you pull it off, all of a sudden: hey, why is
12:26
BranchCache not working? Or why is this element or that one not working?
12:30
It comes back to IP version 6. The importance of configuration management and change control cannot be overstated. So, as a general rule, when you find elements that are not being used on your routers, switches, servers, whatever,
12:46
start the process to examine whether or not they can be removed.
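One way to make that examination routine is a simple baseline comparison. This is a hedged sketch, not a real hardening tool: the approved service names are invented for the example, and anything flagged is only a candidate for removal, which still goes through change control as described above.

```python
# Invented approved baseline for a class of device; a real one comes from
# your configuration-management records.
APPROVED_BASELINE = {"ssh", "https", "snmpv3"}

def flag_unneeded(enabled_services):
    # Anything enabled on the device but absent from the baseline is a
    # candidate for removal via the change control process, not an
    # automatic delete.
    return sorted(set(enabled_services) - APPROVED_BASELINE)

# Simulated inventory pulled from a device:
print(flag_unneeded(["ssh", "https", "telnet", "ipv6-forwarding"]))
```

Running checks like this regularly turns "find unused elements" from an occasional audit into a repeatable control.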
12:50
Change default settings. For a long time with certain devices (I believe Cisco's default administrative account was "admin" for a while, and the password was either "password" or "cisco" or something very generic), of course you've got to get into the box to set up the initial configuration.
13:11
But I am stunned at the number of people that don't change those default settings.
13:16
You gotta change the default settings. Most of the time with your usual Linksys or Netgear router, which a lot of small companies use, you can access that device at http://192.168.1.1. Worth a try. Go ahead and give "administrator" a try for the username and "password" a try for the password, and very often that will work.
13:35
We gotta change those default settings.
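An assessor can check for exactly this with a short default-credential audit. The sketch below is illustrative: the credential pairs are common factory defaults of the kind just described (not a complete list), and the login function is a stand-in you would replace with a wrapper around the device's real login, e.g. its web interface at 192.168.1.1.

```python
# Well-known factory default pairs of the kind mentioned above;
# a real audit list would be much longer.
COMMON_DEFAULTS = [
    ("admin", "admin"),
    ("admin", "password"),
    ("administrator", "password"),
    ("cisco", "cisco"),
]

def uses_default_credentials(check_login):
    # check_login(user, password) -> bool is supplied by the caller and
    # performs one authentication attempt against the device.
    return any(check_login(user, pw) for user, pw in COMMON_DEFAULTS)

# Simulated device still on factory settings:
factory_device = lambda u, p: (u, p) == ("admin", "password")
print(uses_default_credentials(factory_device))
```

A device that passes this check has at least cleared the lowest bar; the finding otherwise goes straight into the risk register.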
13:37
Physical security: don't forget to lock your devices up. I was out at a shopping center the other day. I think it was a photo store; I was getting some pictures of my kids done, something like that. At any rate,
13:52
I went to the restroom, and I kid you not, in the restroom on a shelf was their router, with the connections running into it, a little Linksys or Netgear router or something like that. And that's not really what we would consider good security.
14:07
I could have walked out of there with the router. I could have reconfigured it, done all these things.
14:11
And
14:13
again, it's not that I would necessarily even be targeting this little photo shop in the mall,
14:20
but systems can be commandeered to launch downstream attacks.
14:24
Think about the information that was going through that router. So I give my credit card to make a payment for the photos or whatever; that information's going somewhere, probably through that unsecured router sitting in the bathroom. So we've got to make those just good common-sense decisions. Some cable types are more secure than others.
14:45
Twisted pair is very easy to tap into; it's very susceptible to interference.
14:50
Fiber's expensive, though the cost of fiber is coming down significantly, so that may be a more viable option.
14:58
So many vulnerabilities come from the network. Absolutely, we've gotta take care of our devices.
15:07
now,
15:07
users,
15:09
'Nuff said, let's move on. Just kidding: users present a tremendous threat to the organization. I was reading an article, and roughly what it says is that 20% of your users will do the right thing because it's the right thing to do.
15:26
20% of them are just
15:28
absolutely beyond reproach, regardless.
15:31
20% of users are going to see if they can find a way to get away with a little something,
15:37
and the rest fall somewhere on a continuum.
15:41
So the idea is, for most people, the majority of folks, if there's enough pressure,
15:48
and they get an opportunity and they have that ability to rationalize
15:52
many users that you wouldn't necessarily think would be involved in fraudulent activity can be persuaded.
16:00
you know, think about pressure,
16:03
Man, I'm three months behind on my mortgage. All of a sudden, I have access to a system, and maybe I could squirrel away just a couple of dollars from all these users.
16:14
A thousand users aren't gonna miss $3 from their bank accounts; they've got that money to spare. There's your rationalization. So those elements can come together, and fraud could be anything from falsifying time sheets to salami attacks, which is what I was just talking about: taking a little bit of money, a little bit at a time.
16:33
A term you may not be familiar with is "data diddling."
16:36
And I don't make these terms up, by the way. Let's say you come through the drive-thru at Taco Bell, and I tell you this is gonna cost $5,
16:47
and I ring it up as 50 cents and pocket the other $4.50. That's data diddling. The benefit of that is, at the end of the night, my till is gonna come up right,
16:56
so it can be difficult to track.
17:00
Hell hath no fury like a disgruntled employee, and I have story after story after story; I'm sure many of you have your own stories. We have to have a good, solid process and set of procedures in place
17:17
to make sure that if there is a termination of an employee, it's handled gracefully, professionally, and, not least, securely. So users can certainly present threats to us. So that was the risk assessment element; we talked about assigning values to our risks.
17:36
And the idea behind that is that once we know the value of a risk,
17:41
ideally a dollar value, it will tell us the potential for loss, and that will guide us in our decisions for risk mitigation, which is gonna be the next section. So we've done risk identification, enumerating the risks, and
17:56
risk assessment, figuring out a value for them. Our next step is gonna be to respond to them.

Up Next

CRISC

Archived Certified in Risk and Information Systems Control is for IT and business professionals who develop and maintain information system controls, and whose job revolves around security operations and compliance.

Instructed By

Instructor Profile Image
Kelly Handerhan
Senior Instructor