Time
1 hour 22 minutes
Difficulty
Beginner
CEU/CPE
1

Video Transcription

00:02
So let's dive into ICS/SCADA fundamentals: what is IT vs. ICS?
00:08
The learning objectives for this portion of the course are: what is IT vs. ICS technology, defining what an ICS system actually is, and covering some of the general ICS threats and general ICS risks.
00:26
So, IT vs. ICS. How can they be that different? Let me show you.
00:33
Let's define what an ICS system is. It's an all-encompassing kind of definition: a collection of personnel, hardware, software, and policies involved in the operation of an industrial process that can affect or influence its safe, secure, and reliable operation.
00:52
Now, I want to stress the words safe, secure, and reliable, because that's very important. In IT you're not so concerned, most of the time, with safety in terms of loss of life or limb. But when you're dealing with an ICS system,
01:10
that can involve things that can unfortunately cause a lot of harm
01:15
and
01:15
We also have a similar problem in the ICS world that we have in many other fields: a lot of different names for the same types of things. I'll explain some of these right now. A PLC is a programmable logic controller;
01:33
ICS is industrial control system; production, industrial
01:38
control, distribution: all of these are kind of wrapped into that. And what you're doing is using technology to move things and produce things.
01:49
So we're used to your typical IT system, and that's what you're viewing this course on right now. Typically it's confidentiality, integrity,
01:59
and availability. Now flip it upside down and think of availability, integrity, and confidentiality. The reason being, when you turn on the faucet to get water out, you want clean water and you want the water to actually come out. That's where availability comes into play.
02:17
Integrity: you want to make sure that any command that is sent over
02:22
an ICS network or production network is actually a true command and has not been modified or mangled, because that's very important in an OT or ICS environment (you'll notice now two new terms). Everything has to be very interoperable when you're dealing with IT networks and systems.
02:42
There's error control, for example, in the TCP protocol.
02:46
But there's no real error protection and control in an ICS protocol, so you have to be very careful what you do. The ICS world can also be thought of like this:
03:00
think of a puzzle that's made out of metal pieces that have to fit in tightly and exactly, and when you put it together, you can't even see the seams.
03:08
And that's a huge difference between ICS and IT.
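To make that concrete, here is a minimal sketch (not from the course) of what a classic ICS protocol command can look like on the wire, using the Modbus/TCP "write single register" function. The unit ID and register values below are invented for illustration; the point is that the frame carries no authentication or integrity field at all.

```python
import struct

def modbus_write_single_register(transaction_id, unit_id, register, value):
    """Build a raw Modbus/TCP request frame. Note what is MISSING:
    no credentials, no session token, no cryptographic integrity check."""
    # PDU: function code 0x06 (write single register) + register address + value
    pdu = struct.pack(">BHH", 0x06, register, value)
    # MBAP header: transaction id, protocol id (0 = Modbus), length, unit id
    mbap = struct.pack(">HHHB", transaction_id, 0, len(pdu) + 1, unit_id)
    return mbap + pdu

# Hypothetical command: write 9999 into register 100 on unit 17.
frame = modbus_write_single_register(1, 17, 100, 9999)
print(frame.hex())  # 00010000000611060064270f -- 12 fully predictable bytes
```

Anyone who can reach the device's TCP port can construct a frame like this and have it accepted, which is exactly the "no error protection, no security" property described above.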
03:13
Now, why is ICS important? Well,
03:16
think of the different things that ICS runs. You can have factory production, nuclear power plants, water, and everything in between. The medications that we may or may not take are actually produced using control systems
03:36
on assembly lines, so it's quite important that we get the actual medications that were intended.
03:43
So,
03:45
there are certain trends with ICS and production systems. As with all things, money rules the world: there's a lot of business driving a lot of decisions, which unfortunately can cause security risks.
04:00
Now think about some of the security risks. If you outsource part of your security, or outsource part of the maintenance of your system, the vendor is not going to have the same vested interest in your organization that your own team would if they actually worked for your organization.
04:16
There might be changing regulations, which may make things better or might actually weaken the infrastructure.
04:24
A lot of companies are very, very concerned with return on investment: are they going to put the money into securing these things if they don't actually see value in it?
04:34
Sometimes when you're dealing with third parties that have access to your control systems, they may not be using best practices for security, but, written into the contract, they may require
04:46
something, for example, like a remote management connection so that they can see what's going on in the environment. And they might be using something very weak, like VNC, in a very old version that doesn't encrypt or doesn't even require a login.
05:01
These are very concerning things, so you can definitely think of it as business first,
05:08
safety second, and security third.
05:11
Again, one of the major elements with production networks is that these things can actually be dangerous, so you have to think of the safety aspect.
05:21
Now, back in the day, the technologies that were used were not well known unless you were a process engineer, an electrical engineer, or in certain other types of roles that would work in a production environment.
05:34
So a lot of folks in the industrial world thought, hey, nobody really knows about these protocols, and because they're so niche, they're not really at risk. Unfortunately, that is not the case anymore.
05:49
The knowledge about ICS protocols is now more and more widely known, because they're actually
05:56
used much more widely nowadays.
05:59
Now, that puts us in a very bad spot, because when these protocols were actually invented and engineered, their designers didn't think about security. Some of them are very, very old,
06:11
and this poses a huge problem, because it's always easier to secure systems and protocols when you think of security first, versus trying to bolt things on top afterwards.
06:24
In addition to that, because some of the risks are not quite understood by operational technology engineers or by the business itself, there is typically very limited risk management done on these systems.
06:38
Another issue is poor auditing due to legacy devices: the devices might not be capable of logging anything. So if somebody tries to log in in an unauthorized manner, a log might never be created, and you might never know.
06:58
If you're on a security team, that means you'll never get an alert in your security systems.
07:03
And this is very perplexing, because if a system happens to be broken into, you don't have the logs to audit to see if anything has been done.
07:15
There's also a problem with auditors themselves. If they come from the IT world, they might not know how to audit the ICS world in an appropriate manner.
07:27
Now, the reason why this is so concerning is because of the threats that are currently facing production networks, ICS networks. I put down "cyber as a weapon" for a reason: in 2014 the German government, its secret service one might say, the BND, came out with a report
07:46
referencing a German steel mill which had been attacked.
07:50
The attackers showed a very high level of sophistication in the business network and a high level of sophistication in the ICS production network.
08:00
And what happened was, there was most likely a mistake. It seemed almost as if, most likely, a nation state
08:07
got into both sides of the network, and there was a pressurization issue in the steel mill. Steel mills are kind of dangerous, and when the over-pressurization occurred, the safety systems, which were supposed to cool things down and shut things off, no longer functioned.
08:26
And so that steel mill actually
08:30
over-pressurized and blew up. It killed at least three people and horribly injured many others. This is not a good thing.
08:39
We also have to deal with hacktivists. Now, when I used to lecture for something called CPNI, the Centre for the Protection of National Infrastructure, which is part of GCHQ in the United Kingdom, I had an opportunity to lecture at all of the English nuclear facilities and the last remaining
09:00
coal facilities for energy production.
09:03
And at the coal facility, the one that I visited last, they had very, very tall fences, and I asked, why do you have such humongous fences? They said, well, it's because of the risk of protesters actually overtaking the plant.
09:18
Now, if you are, say, an activist or a person protesting something, it takes a lot of effort to get to a physical location. But if you could do it remotely, from anywhere in the world at any time, then hey, that's way easier.
09:37
So a lot of these facilities have to worry about the hacktivist threat.
09:41
Another issue is that we're connecting a whole bunch of things without understanding the risk, without understanding that we are introducing new technology into legacy systems. What happens then is that there are a lot of people in these environments who are completely untrained,
09:58
or not trained up as they should be for security.
10:03
And so when something happens, unfortunately, they just don't know it happened. Or, if they do know something happened, it usually means that something has failed and brought the production network down. That's a very bad scenario to be in.
10:20
Now, there's ever-increasing malware that can affect ICS systems. In the past 10 years we've seen something called Stuxnet, and out of Stuxnet there was a family, with Flame.
10:33
There was something called Shamoon, which I handled myself, which did actually affect two production facilities in Saudi Arabia.
10:41
And it's moved on to something a bit more recent called Triton. There have also been issues in Ukraine, where electricity has been shut off using attacks against the ICS infrastructure, as there have been in Israel,
10:56
and in the country of Georgia, where there were actually attacks against the satellite infrastructure that belonged to Georgia. These are also control systems, so this is very, very concerning.
11:09
Now, we also have to worry about accidents and human errors. These are things that exist everywhere, but how you react to them is very important before, say, there's an over-pressurization issue and a steel mill blows up. That is not the situation we want to be in.
11:31
The reason why availability is so important is because, like all of us and all computer systems, we rely on electricity and running water, and a blackout can be a very serious matter. You need electricity to produce water, and you need water to produce electricity.
11:50
So in order to have those two major elements of the modern world, we have to make sure that availability is paramount when we're dealing with ICS systems.
12:00
Now, one of the other differences is that with an IT network, you can have an occasional failure; it's tolerated.
12:07
It might be a bit of a hassle if you can't get to your email, but in an ICS environment, that could mean a blackout.
12:15
In an IT network, loss of data, well, that is a pain. You might have to redo a lot of work, but
12:22
in the ICS environment,
12:24
we could have a loss of life
12:28
In an IT environment, I'm sure everybody's heard the words, "Have you turned it on and off again?" You can't really do that with a nuclear power plant, which, by the way, in many cases runs Windows systems on the control network. So that's very problematic, and I don't think anyone ever wants to hear the words,
12:46
"Have you turned your nuclear power plant on and off again?"
12:50
Now, in an IT network, when you're a regular business, typically organizations don't have very high physical security.
13:00
But in an ICS environment, because of the criticality, they typically have very tight physical security anywhere near what's called the control room, where the engineers and operators are actually controlling these ICS environments.
13:16
With the IT network, we're building faster and faster networks, faster and faster Internet.
13:22
However, ICS environments, especially older ones, have very low to modest throughput, and that also places limitations on things like an IDS or an IPS.
13:39
What can happen is, if too much network traffic is going over an ICS network, it can flood that network and halt legitimate commands being sent over it.
13:54
Now, with an IT network you can usually test in the field. However, you need to test things out before you put them into an ICS environment. That's very important, again, because of availability and the fact that loss of life and safety can be an issue.
14:11
For an IT network, the big focus is your central server, and that could be something like Active Directory or SAP to keep the business running.
14:20
But in ICS there are a lot of field devices. Think of a water pump from your water company: they're not going to put a water pump in a residential area, because it's very loud. So sometimes you've got these edge devices, and you have to be aware of that.
14:37
Now, on an IT network, again, safety is usually not an issue. In an ICS environment, you have to think about all the different types of hazards that can come into play if something is modified, tainted, manipulated, or halted.
14:56
Now, in an IT network, you usually have minimal separation between the different internal networks: you might have a printer network, a regular business network, an email network. But you can't quite do it in that manner with an ICS network.
15:13
What you usually want, in an ideal situation (though it doesn't always happen),
15:18
is that the business network is completely isolated from the ICS network. Many times that is also called an air gap, although that's not the exact definition of an air gap.
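As an illustration only (every subnet, address, and port here is invented), that separation is often approximated with a deny-by-default firewall sitting between the business and ICS networks, permitting a single, narrowly scoped path:

```shell
# Hypothetical Linux firewall between a business LAN (10.0.0.0/24)
# and an ICS network (192.168.10.0/24); all numbers are made up.
iptables -P FORWARD DROP   # deny everything crossing the firewall by default

# Let replies to already-approved sessions come back through.
iptables -A FORWARD -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT

# Allow exactly one jump host to reach one engineering workstation over SSH;
# nothing else from the business side can touch the ICS side.
iptables -A FORWARD -s 10.0.0.5 -d 192.168.10.20 -p tcp --dport 22 -j ACCEPT
```

A true air gap would have no forwarding path at all; this is the weaker "isolated but still manageable" compromise more commonly seen in practice.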
15:31
In an IT network,
15:33
you can have delays and variations, and that's okay. But any sort of high delay in an ICS network means that availability and integrity can be highly affected. You want to get a command from A to B as quickly as possible.
15:50
That also means that if you start incorporating encryption
15:54
in an ICS network, many times you'll cause what's called latency, and there will be a delay in those commands. That's one thing you definitely do not want.
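To illustrate the trade-off (this is a toy sketch, not a real ICS stack; the key and command bytes are invented), adding even a simple integrity tag such as an HMAC makes every message bigger and adds compute on both ends, which is where the extra latency comes from:

```python
import hmac
import hashlib

KEY = b"shared-secret"  # hypothetical pre-shared key

def protect(command: bytes) -> bytes:
    # Append an HMAC-SHA256 tag so the receiver can detect tampering.
    return command + hmac.new(KEY, command, hashlib.sha256).digest()

def verify(frame: bytes) -> bool:
    command, tag = frame[:-32], frame[-32:]
    expected = hmac.new(KEY, command, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

frame = protect(b"OPEN VALVE 3")
print(len(b"OPEN VALVE 3"), "->", len(frame))   # 12 -> 44 bytes on the wire
print(verify(frame))                            # True
print(verify(b"SHUT VALVE 3" + frame[12:]))     # False: altered command detected
```

Here a 12-byte command grows to 44 bytes, plus one hash computation per message on both sender and receiver. On a low-throughput legacy ICS link, that per-command overhead is exactly why operators are wary of bolting cryptography onto time-critical traffic.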

Up Next

ICS/SCADA Fundamentals

The ICS SCADA Fundamentals course is taught by world-renowned cybersecurity expert Chris Kubecka and will introduce students to basic critical infrastructure concepts.

Instructed By

Chris Kubecka
Founder and CEO of HypaSec
Instructor