Architecture, Monitoring and Additional Controls

Video Transcription
Data security architecture; monitoring, auditing, and alerting; and additional data security controls are all topics that we will be covering in this video.
As a cloud customer, you rely on the provider for strong metastructure security. We previously discussed the financial incentives the provider has to maintain this level of security.
Otherwise, nobody will want to use them.
That metastructure security protects the management plane, but it also protects the overall cloud network from compromise. You can then architect your data security knowing it's safer to keep things within the provider's metaphorical walls.
As an example, let's consider transferring data between resources that reside in different AWS regions. VPC is the AWS term for a virtual network that you define using SDN. A VPC cannot span more than one region, which in this example means the different cloud resources that need to talk will be in different VPCs.
You could create public endpoints in each VPC and have the resources communicate over the general Internet. Even assuming you encrypt the data in transit between the two endpoints, this still exposes an attack surface that you can avoid with other architectures.
AWS's VPC peering approach allows you to connect two VPCs, even if they are in completely different regions, and the resources within those separate VPCs can communicate. However, all the communication goes over the AWS network backbone, which means it does not get exposed to the general Internet. If you've employed network isolation and the VPCs have overlapping IP space, this specific approach won't work.
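To make the overlapping-IP-space caveat concrete, here's a minimal Python sketch (standard library only) that checks whether two VPC CIDR blocks overlap before you attempt to peer them. The CIDR values are purely illustrative.

```python
import ipaddress

def cidrs_overlap(cidr_a: str, cidr_b: str) -> bool:
    """Return True if two CIDR blocks share any addresses.

    VPC peering requires non-overlapping CIDR ranges, so this is
    a quick pre-flight check before trying to peer two VPCs.
    """
    net_a = ipaddress.ip_network(cidr_a)
    net_b = ipaddress.ip_network(cidr_b)
    return net_a.overlaps(net_b)

# These two VPCs could not be peered: the second range sits inside the first.
print(cidrs_overlap("10.0.0.0/16", "10.0.128.0/17"))   # True
# These could: distinct private ranges.
print(cidrs_overlap("10.0.0.0/16", "172.31.0.0/16"))   # False
```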
Depending on the payload size of the information that you're transferring between resources in the two VPCs, there are other options, like message queuing services and even S3 cross-region replication, both of which also benefit from keeping the data within the provider's backbone. I hope these different examples hit home about using architecture. Leveraging the provider's infrastructure to keep your data secure is possible by keeping it within the walls, within the constraints and control of the provider's resources.
Domains 3, 6, and 7, which are covered in modules 4, 7, and 8 of this training, talk about monitoring, auditing, and alerting in more detail.
Looking at these through the specific lens of data security, there are a few information sources you'll want to lean on more than others.
For example, for metastructure logging, focus on capturing API activity logging as well as logging related to PaaS services that you might use.
Then there's applistructure logging. This is pulling traditional event logs from virtual machines and applications running on those virtual machines, and you're going to pipe them all into a security information and event management (SIEM) system.
Also, consider using database access management to monitor and keep track of all the data use. Be sure these log files are all in a safe and secure location. This makes it hard for an attacker to remove evidence of their escapades, and it also establishes a good chain of custody in the event that you need to rely on those logs in some sort of prosecution.
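As an illustration of why tamper-evident log storage matters, here's a minimal hash-chaining sketch in Python. Real deployments would use a SIEM or write-once storage; this only demonstrates the idea that altering or removing any earlier entry invalidates everything after it.

```python
import hashlib

GENESIS = "0" * 64  # starting value before any entries exist

def chain_logs(entries):
    """Build a tamper-evident hash chain over log entries.

    Each record's digest covers the previous digest plus the entry
    text, so modifying an earlier entry breaks every later digest.
    """
    prev, chained = GENESIS, []
    for entry in entries:
        digest = hashlib.sha256((prev + entry).encode()).hexdigest()
        chained.append((entry, digest))
        prev = digest
    return chained

def verify_chain(chained):
    """Recompute every digest; any mismatch means tampering."""
    prev = GENESIS
    for entry, digest in chained:
        if hashlib.sha256((prev + entry).encode()).hexdigest() != digest:
            return False
        prev = digest
    return True
```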
We're going to cover a few additional data security controls to take into consideration. Cloud platform- and provider-specific controls can be very valuable. They'll vary based on the specific provider, and they're constantly expanding, especially among the IaaS providers.
For example, Azure has Application Gateway, which provides a built-in web application firewall capability.
Google's Cloud Security Command Center gives you anomaly detection using machine learning, and it also provides some data loss prevention capabilities.
You're really going to want to rely on your provider and on the provider's documentation to understand the specific capabilities that are out there and that are coming, and, more importantly, how to use them and integrate them into your cloud-based architectures.
We spoke about DLP earlier in this very domain. Since it is repeated in the CSA Security Guidance, it's something you really want to know about for the CCSK exam, so we'll summarize a few key points about DLP here.
It detects data exfiltration and data misuse; however, it requires a lot of configuration and training. Often this will be done by the DLP provider itself and then tuned for your specific scenario and the kind of data that you're on the lookout for. It sits at either endpoints or network egress points, or it can be pointed at specific data storage locations to monitor those.
And that's how it gets integrated into the full picture, so that it can monitor the use and transfer of data accordingly. We also spoke about CASBs and the fact that DLP often comes bundled with, or integrates closely with, CASB offerings, since the two technologies often go hand in hand when we're looking specifically at the use of cloud and the use of SaaS-based services.
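To show the basic idea of DLP content inspection at an egress point, here's a deliberately simplified Python sketch that flags credit-card-like numbers in outbound text. Real DLP engines add Luhn checksum validation, many more data classifiers, and context rules, so treat the pattern below as illustrative only.

```python
import re

# Very simplified classifier: 16 digits, optionally grouped in fours.
CARD_RE = re.compile(r"\b(?:\d{4}[- ]?){3}\d{4}\b")

def scan_for_cards(text: str):
    """Return any credit-card-like strings found in the text.

    A DLP engine would run checks like this against content at an
    egress point and block or alert on matches.
    """
    return CARD_RE.findall(text)

hits = scan_for_cards("order ref 12, card 4111 1111 1111 1111, thanks")
print(hits)  # ['4111 1111 1111 1111']
```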
Enterprise digital rights management allows you to control the actions performed on specific media. Personally, I'm quite familiar with this. You may recall that I worked in the entertainment industry, and as a consumer, you've been exposed to the consumer equivalent, which is referred to as digital rights management (DRM). This is where they're keeping track of, and putting controls around, who can use which music files, who can watch videos, replay videos, or send videos to friends. Long gone are the days of MP3s, which had absolutely no digital rights management in them.
So literally, if you had the file, you could play the music. In more modern situations, such as the iPod, the music that you get can only be played by you, and if you were to take that file and send it to somebody else, they couldn't use it. Under the covers, they're implementing digital rights management technologies, and these same concepts can also be applied in enterprise rights management, since you're going to create digital media yourself that you may want to have tighter controls on.
Full DRM actually relies on encrypting the file. So technically, the files that you're getting on your iPod are not MP3s; they're encoded using a particular technology called AAC (Advanced Audio Coding). But then there's an additional layer wrapping around them to encrypt that information, and it can only be decrypted through the use of a centralized server, which determines who you are and whether you have the right to view this file, and then allows the decryption to take place on the device. The same situation applies for enterprise rights management: if you were to send this encrypted file to some sort of SaaS provider, they're not going to know what to do with it unless they themselves support the particular technology used for rights management of the content.
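The centralized rights check described above can be sketched as follows. This is an illustrative stand-in, not any real DRM protocol: the key, function names, and file identifiers are all invented for the example, and a real system would, of course, keep the signing key only on the server.

```python
import hmac
import hashlib

SERVER_KEY = b"demo-secret"  # illustrative; held only by the rights server

def issue_token(user: str, file_id: str) -> str:
    """Rights server side: sign an entitlement for this user and file."""
    msg = f"{user}:{file_id}".encode()
    return hmac.new(SERVER_KEY, msg, hashlib.sha256).hexdigest()

def may_decrypt(user: str, file_id: str, token: str) -> bool:
    """Device side: only release the decryption step if the
    entitlement token verifies for this user and file."""
    expected = issue_token(user, file_id)
    return hmac.compare_digest(expected, token)
```

The point is simply that possession of the encrypted file is not enough; decryption is gated on an authorization decision made centrally, which is the same model enterprise rights management applies to business documents.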
However, providers may have their own controls available. For example, you can restrict how someone can interact with an Office 365 document that you're sharing: you can restrict the device they can view it on, the actions they can take, etc. Data masking and test data generation are based on tokenization, something we previously covered.
You may recall this is where you preserve the format of specific data, commonly textual string data, but you're altering the values of that actual data using either substitution techniques, data shuffling, format-preserving encryption, or just a standard mask-out. So here I have a simple example: you have a credit card number, and with the mask-out technique, you would be X-ing out all of the values except for the last four digits of the credit card.
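That mask-out technique can be sketched in a few lines of Python; the card number below is a standard test value, not real data.

```python
def mask_card(number: str, keep: int = 4) -> str:
    """Mask-out technique: X out every digit except the last `keep`,
    while preserving the original grouping so the format survives."""
    total_digits = sum(c.isdigit() for c in number)
    out, seen = [], 0
    for c in number:
        if c.isdigit():
            seen += 1
            out.append(c if seen > total_digits - keep else "X")
        else:
            out.append(c)  # keep separators so the format is preserved
    return "".join(out)

print(mask_card("4111-1111-1111-1234"))  # XXXX-XXXX-XXXX-1234
```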
You can accomplish data masking through two different approaches. There is test data generation, also referred to as static masking. In this approach, you're going to extract data from the production database, then you're going to transform it; in other words, you're going to do a pass through and perform the masking activities, applying the different algorithms to essentially clean your data, and then you're going to load it into a test environment.
Alternatively, there is the concept of dynamic masking, which typically involves some sort of proxy on the path the data takes as it leaves, either leaving the storage or leaving the underlying database; it intercepts that data in transit and then modifies and alters it on the fly.
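Dynamic masking can be sketched as a wrapper that intercepts rows on their way out of the data store; the fetch function, column names, and sample values here are invented for illustration, and the stored data itself is never modified.

```python
def fetch_rows():
    """Stand-in for the underlying database or storage read."""
    return [{"name": "Ada", "card": "4111111111111234"}]

def masking_proxy(fetch, sensitive=("card",)):
    """Dynamic masking: wrap a fetch function so sensitive fields are
    altered in transit, before rows ever reach the caller."""
    def wrapped():
        rows = []
        for row in fetch():
            masked = dict(row)  # copy: leave the source data untouched
            for col in sensitive:
                if col in masked:
                    masked[col] = "X" * 12 + masked[col][-4:]
            rows.append(masked)
        return rows
    return wrapped

safe_fetch = masking_proxy(fetch_rows)
print(safe_fetch()[0]["card"])  # XXXXXXXXXXXX1234
```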
In this video, we went over data security architecture, leveraging the platform provider's capabilities. We talked about monitoring, auditing, and alerting, specifically the kinds of logs that you want to take into account and how you want to be sensitive in the way you treat those logs. Then we went over a variety of different data security controls that exist: cloud provider controls, DLP, enterprise rights management, and data masking and test data generation.