Welcome, everyone, to another episode in the SSCP Exam Prep series. I'm your host, Peter. This is lesson four in the second domain.
So far in the second domain, we've taken a look at the code of ethics, which describes the standard of behavior for all SSCP practitioners. We've looked at the CIA triad, which is the basic foundational aspect of cybersecurity. We've looked at how to build system architectures and frameworks for secure systems, and how to control them using managerial, technical, and operational controls.
We've looked at system security plans, which are detailed documents describing cybersecurity in systems.
We've looked at secure development and acquisition life cycles, and how to build secure systems. We've looked at secure development and system vulnerabilities, and how using secure development reduces system vulnerabilities. Today, in this lesson, we'll be taking a look at data and how to manage it: how to keep it secure, what to do with it, and how to dispose of it securely.
Let's get started.
Data is a driving force. It is what keeps organizations alive and going. All business decisions are based on the data an organization has, and for that very reason it's very important for the SSCP practitioner to understand all aspects of data, as well as how to protect it.
The best definition of data management is the development, execution, and supervision of plans, policies, programs, and practices that control, protect, deliver, and enhance the value of data and information assets.
It is very important for the SSCP practitioner to be able to engage in data management activities in order to ensure the confidentiality, integrity, and availability of that data.
Different aspects that are very important for secure information storage are things like database size, performance, and application compatibility.
There are a lot of different ways to manage and control data. One of them is known as data scrubbing, which is really a security control to protect the confidentiality of the data. When testing, it is important to have realistic data being used in the testing environment, but at the same time you don't want to have sensitive, private production data being used in a non-controlled environment.
The way around this is to use data scrubbing, which is overwriting sensitive data values with meaningless ones. For example, with this 12-digit number, if you take the eight digits in the middle and just X them out, you still have a number you can use for testing purposes. But those who are testing cannot actually see what the whole number is, and therefore the confidentiality of the private data is maintained.
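The masking step just described can be sketched in a few lines of Python. This is a minimal illustration, not a production scrubbing tool; the function name and the choice to keep two digits on each end are my own, matching the "X out the middle eight of twelve" example above.

```python
# Minimal data-scrubbing sketch: overwrite the sensitive middle digits
# of a number with X's so testers see realistically shaped data
# without seeing the real value.
def scrub(number: str, keep: int = 2) -> str:
    """Keep the first and last `keep` digits; X out everything between."""
    middle = "X" * (len(number) - 2 * keep)
    return number[:keep] + middle + number[-keep:]

print(scrub("123456789012"))  # -> 12XXXXXXXX12
```

The scrubbed value keeps the original length and format, so test code that validates field sizes still works, while the sensitive middle digits are gone.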
A way to reduce data, to make it smaller, is known as data deduplication, which is a process that scans the entire collection of information looking for similar chunks that can be consolidated.
Let's take a closer look at an example of this.
On the left, in this box, we have a chunk of original data. As you can see, it's made up of different blocks, all of them labeled A, B, C, and D.
After this original chunk of data is deduped, it becomes a single block of A, B, C, D, simply because with deduplicated data, only the original pieces of data are kept.
If there are similar chunks, they can be consolidated. So, for example, if you look at the block on the left in the original data pile, we see A, B, C, D; that is the top row.
If we go down to the next row, we see D. Well, we already saw D, so the D in the second row becomes a pointer which points to the original D in the first row. Therefore, the D in the second row is really not needed anymore, as long as the pointer exists from the spot where the D in the second row was to the D in the first row.
The next block is C; same thing. The C block in the second row is no longer needed as long as we place a pointer from the position of that piece of data to the original C block in the first row.
This process is continued throughout the second row and the third row of the original data, and so all that is left is the deduped data, which is simply A, B, C, and D.
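The walkthrough above can be sketched as a small Python function. This is a toy model, assuming whole-block matching: the first time a block is seen it is stored, and every later occurrence is replaced with a pointer (here, just an index) back to the original.

```python
# Toy block-level deduplication: keep each unique block once and
# replace repeats with pointers (indexes) to the stored original.
def dedupe(blocks):
    store = []      # unique blocks, in first-seen order
    seen = {}       # block value -> index into store
    pointers = []   # one pointer per original block position
    for b in blocks:
        if b not in seen:
            seen[b] = len(store)   # first occurrence: store it
            store.append(b)
        pointers.append(seen[b])   # repeats become pointers
    return store, pointers

# Three rows of A B C D, as in the example above:
store, ptrs = dedupe(["A", "B", "C", "D"] * 3)
print(store)  # -> ['A', 'B', 'C', 'D']
print(ptrs)   # -> [0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2, 3]
```

Twelve blocks shrink to four stored blocks plus twelve tiny pointers, which is exactly the savings the slide illustrates.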
Data needs to be protected, and the best way to protect it is to have it encrypted. You encrypt and decrypt data with unique keys: you need encryption keys and you need decryption keys. These keys are only as effective as the organization's ability to securely manage those keys.
Key management refers to the set of systems and procedures used to securely generate, store, distribute, archive, revoke, and delete keys. Key management policies are very important.
Considerations include roles and responsibilities, which is who has access to the keys and who can use the keys;
key generation, which is how keys are generated through random number generators, using the desired key lengths and making sure they are sufficiently random so that they cannot be guessed;
distribution, which is how keys are given to other people, and how those people are authorized and authenticated;
expiration of keys, to make sure that keys are deactivated and tossed away when they are no longer needed, or after a certain period of time;
revocation and destruction, which is getting rid of keys that have been compromised or are no longer valid;
audit and tracking, which is that all key management operations should be written down in event logs or records, to prevent unauthorized access and modification;
and emergency management, which is a policy that specifies the emergency replacement and revocation of encryption keys.
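The key-generation consideration above can be illustrated with a short sketch using Python's standard `secrets` module, which draws from a cryptographically secure random source. The function name and the 256-bit default are my own choices for the example, not anything mandated by the SSCP material.

```python
# Sketch of secure key generation: use a cryptographically secure
# random source and a sufficient key length so keys can't be guessed.
import secrets

def generate_key(bits: int = 256) -> bytes:
    """Return `bits` of cryptographically secure random key material."""
    return secrets.token_bytes(bits // 8)

key = generate_key(256)
print(len(key))  # -> 32 (bytes), i.e. 256 bits
```

The point of the sketch is the source of randomness: an ordinary pseudo-random generator seeded predictably would undermine every other key management control on the list.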
Rights management assigns specific properties to an object, such as how long the object may exist and who or what may access it.
Next, data retention and disposal.
Once data has reached the end of its time, or if it's no longer needed or has become out of date, it is important to dispose of the data in such a way that it can no longer be retrieved or seen by anyone else.
Thankfully, there are several different ways to dispose of this data, depending on the policy of your organization. One way to securely dispose of data is through shredding. Shredding is cutting documents into tiny little pieces, more or less the size of confetti. You see this a lot in most office buildings: you just put the documents into an actual shredder, and they come out the bottom in a thousand little pieces.
To get rid of data that's on a hard disk, you can do reformatting. Reformatting is more or less like doing a factory reset on your hard drive: it removes all pointers to the data, so the operating system can no longer see the data. Now, it's important to note that with this method the data does not actually get disposed of or destroyed. The pointers are simply removed so the data can no longer be accessed, but the data is still on the hard disk until it gets overwritten.
For more methods of disposing of data, we have disk wiping, or overwriting. This is where existing data is overwritten with a steady stream of zeros, ones, or both. So on a hard disk, if you have 1 0 1 0 1 1 1, et cetera, all of that gets overwritten so it's all zeros, all ones, or a random pattern, so no one can piece the data back together.
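The overwriting idea can be sketched for a single file. This is a simplified illustration only: a real wipe tool works on the raw disk device, not on one file, and the number of passes is set by your organization's policy. The function name and the three-pass pattern (zeros, ones, random) are my own choices for the example.

```python
# Simplified file-wiping sketch: overwrite a file's contents in place
# with zeros, then ones (0xFF), then random bytes.
import os

def wipe_file(path: str) -> None:
    size = os.path.getsize(path)
    patterns = [b"\x00", b"\xff", None]   # None means a random pass
    with open(path, "r+b") as f:
        for p in patterns:
            f.seek(0)
            f.write(os.urandom(size) if p is None else p * size)
            f.flush()
            os.fsync(f.fileno())          # push the pass to disk
```

Note that, as with reformatting above, deleting a file only removes pointers; it is the overwrite passes here that actually replace the stored bits.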
Another method of disposing of data is known as degaussing. This erases magnetic data on a disk or tape using a degausser. Now, the degausser in this photo, in the bottom right corner, is a machine that emits, or sends out, magnetic waves. The hard drive is put into the degausser, and the hard drive's data gets all scrambled up because of the magnetic waves. Data is stored on a hard drive in magnetic patterns, and since the degausser also sends out magnetic waves,
it scrambles up and destroys the magnetic patterns that are on the disk. In the picture on the left, we have before degaussing and after degaussing. As you can see, before degaussing the data is stored in specific magnetic patterns, and after degaussing, all of that is scrambled up.
In summary, we discussed data: how to manage it, maintain it, encrypt it, and destroy it.
A process that scans an entire collection of information, looking for similar chunks of data that can be consolidated, is known as:
A, degaussing;
B, data scrubbing;
C, data deduplication;
or D, data purging.
If you picked C, data deduplication, you are correct. Remember, deduplication scans entire bodies of information, looking for similar pieces that can be removed and replaced with a pointer that points back to an original piece.
Thanks for watching, guys. I really hope you learned a lot in this video. I'll see you next time.