Welcome to Lesson 2, Module 1 of the ATT&CK-Based SOC Assessments training course. In this lesson, we're going to discuss a few methodologies you can use for running an ATT&CK-based SOC assessment.
To kick off this lesson, we're going to look back to 2017, when we ran our first ATT&CK-based SOC assessment.
This assessment was for an organization that was doing a really good job with cyber hygiene, protecting their perimeter, and generally running a modern, up-to-date SOC.
But they were looking at their practices, looking at everything they were doing, and asking: how can we get better? How can we start orienting our operations towards the real threats, the real behaviors adversaries are exhibiting in the wild, and take our security posture to the next level? Running the assessment helped them a lot to understand where their current gaps were and where they should pivot to advance their security operations.
Since that first assessment, we've run a variety of differently scoped assessments, from smaller-scale ones where we only look at a portion of the ATT&CK framework or a portion of the SOC, to full ATT&CK-based SOC assessments where we look at, well, everything.
Throughout the process, we've learned a lot about how to run assessments, and we've also seen a lot of organizations express a ton of interest in running one. Many organizations recognize the value in the ATT&CK framework and threat-informed defense, and see assessments as a nice stepping stone for pivoting what they're doing towards more threat-informed operations.
We've also had a variety of positive outcomes when running these assessments. More often than not, when you run an assessment, you identify gaps, and you can help an organization build out a more structured analytic development program. You also see other positive outcomes in the form of general growth: better tooling, more data collection, and enhanced SOC processes. These all come out of running an ATT&CK-based SOC assessment.
Now, speaking a little bit about our methodology.
We really designed it to map SOC capabilities back to the ATT&CK framework, but to do so in a way that paints broad strokes of technical capability. Our goal is to provide a rapid first-look view into the SOC's current state, our target audience being SOCs wanting to integrate ATT&CK into their day-to-day operations.
From a process perspective, it's completely hands-off: we just do documentation analysis and interviews with SOC personnel, with no hands-on access to systems.
The idea here is we want a scalable, rapid timeframe, ranging between one and four months, that is super low-overhead for the SOC we're working with. From an input perspective, all we take in is the SOC's documentation and then less than 16 hours of interview time.
As an output, we produce an ATT&CK coverage heat map, a prioritization plan for things the SOC should focus on, and recommendations the SOC can use to really improve their operations.
Diving a little deeper into the methodology, you can see here this nice little arrow showing the four main stages we like to walk through. In the first stage, we set the stage: we work with the SOC to make sure we're on the same page. We'll provide a preview of what we want to talk to them about and what the interviews will look like. We'll make sure to set expectations, and we'll also discuss the timeline to make sure we're executing at a pace that's acceptable to the SOC we're working with.
Once we're on the same page, we'll start asking the SOC for documentation, and we'll analyze their tooling, their analytics, and their data sources. Here we're looking for documentation specifically about the tools and sensors they've deployed, the analytics they've put into their SIEM platform, log collection guidance, and any other relevant documentation the SOC can provide: things like incident summaries or reports, red team reports, and their hunting procedures. Really, anything the SOC thinks is relevant, or even on the fringes of relevance, we like to ask for, so we can review it and see how we think it maps back to the ATT&CK framework.
After doing a first pass of the documentation, we'll then interview SOC staff. Here we're looking to ask a set of standard questions, things that generally help us understand operations, as well as questions that help clarify the documentation. I can't tell you how many times we've done an assessment and found that what's written down in the documentation is just a little bit different from what's being done in practice. And then, of course, we ask about any known gaps or strengths. The SOC often knows what their strengths and gaps are, and we can help them communicate those with the ATT&CK framework.
Finally, we'll produce and deliver results. Here, we don't just deliver them blindly; we work with the SOC to make sure they agree with what we're saying. In some cases, you might find what look like huge gaps or huge strengths, show them to a SOC, and have them look at you and say, "No, that's not us." So incorporating feedback is a huge part of this process. Once we have that feedback, we then compile the final heat map, an out-brief, and a report to deliver to the SOC, to help them improve their operations and orient around the threats.
Now, this methodology has been super helpful for running these third-party assessments, but it's not very generic. And this has been a shortcoming in the methodology: for SOCs that want to run their own assessment to really understand what they're doing internally, it's hard to scale this approach towards that other use case.
That said, we started looking at this assessment methodology from the perspective of how to make it generic, and diving into that, there are three main ingredients that should go into any ATT&CK-based SOC assessment.
The first is, of course, a heat map showing coverage.
This is one of the key parts of an assessment. You really want to use the heat map not just to measure what the SOC is doing, but to communicate and convey what their current ATT&CK posture looks like.
The second is a summary report. A heat map is, of course, great, a key ingredient you should always have as part of an ATT&CK-based SOC assessment, but a report is essential because it helps describe the heat map and highlight the key areas the SOC should focus on when they're trying to build out improvements.
And towards improvements, recommendations are the other big part of an ATT&CK-based SOC assessment. It's not enough to just run an assessment, produce a heat map, and then walk away. Rather, you want to run an assessment, produce a heat map, produce a report, and then also provide the SOC with useful recommendations that allow them to build a better threat-informed defense.
So with those in mind, we've come up with a generic way to run an assessment that's useful both for those doing this as a third party and for those doing it in-house.
In the first part of the assessment, you want to frame the assessment. Here, you determine the scope and you set expectations. Your goal is just to make sure everybody's on the same page and wants to run an assessment.
After that, you'll set your rubric.
Here, what you're doing is defining what coverage means, and you'll find throughout this course that I say the word "coverage" a lot. This is very important when you're working with the SOC, because it's a key part of running the assessment.
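As a concrete illustration, a coverage rubric can be sketched as a small ordered scale. This is my own minimal example, not part of the course material: the category names and their meanings below are assumptions, and your rubric's definitions should come out of the framing discussion with the SOC.

```python
from enum import IntEnum

class Coverage(IntEnum):
    """Hypothetical rubric defining what 'coverage' of an ATT&CK
    technique means. Using an ordered integer scale makes scores
    comparable and easy to aggregate later; the specific category
    names and definitions here are illustrative only."""
    NONE = 0     # no tool, analytic, or data source addresses the technique
    SOME = 1     # relevant data is collected, but no analytic fires on it
    PARTIAL = 2  # an analytic exists but covers only some procedures
    FULL = 3     # reliable detection across the known procedures

# Example: record the score agreed on for one technique
score = Coverage.PARTIAL
print(score.name, int(score))  # PARTIAL 2
```

The key design choice is that the scale is ordered, so "better coverage" is well defined when you compile results later.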
Once you've framed the assessment and set your rubric, you're ready to start deep diving. Here, you analyze components. The goal is to look at the technologies in the SOC and map them back to the ATT&CK framework.
From there, you'll then interview staff. Here, you'll gather real experiences to understand how things actually work.
With that in hand, the interview results and the analysis of each component, you're ready to compile your results. Now you combine all of this analysis to get your final heat map and your report, and from there you're ready to propose changes. Don't ever just run an assessment and walk away; always provide recommendations to help the SOC get better.
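To make the compile step concrete, here is a minimal sketch of combining per-component results into a heat map. Everything in it is an assumption for illustration: the technique IDs and component names are made up, scores use a hypothetical 0-3 scale, and I assume a technique's final score is the best score any single component achieved for it (other aggregation rules are equally valid).

```python
# Hypothetical per-component analysis results: each SOC component
# yields a coverage score (0-3) per ATT&CK technique ID it addresses.
component_results = {
    "EDR":  {"T1059": 3, "T1055": 2},
    "SIEM": {"T1059": 1, "T1078": 1},
    "NIDS": {"T1071": 2},
}

# Compile the heat map: take the maximum score across components,
# i.e. a technique is as covered as the best component covering it.
heatmap: dict[str, int] = {}
for scores in component_results.values():
    for technique, score in scores.items():
        heatmap[technique] = max(heatmap.get(technique, 0), score)

# Low-scoring techniques are natural candidates for the
# "proposed changes" section of the report.
gaps = sorted(t for t, s in heatmap.items() if s <= 1)
print(heatmap)  # {'T1059': 3, 'T1055': 2, 'T1078': 1, 'T1071': 2}
print(gaps)     # ['T1078']
```

In practice the interview results would adjust these scores before compiling, since documentation often differs from what's done day to day.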
To close out this lesson, some summary notes and a few takeaways. First, assessments can be conducted either in-house or by a third party. Either way is viable, and there are pros and cons to each approach.
Assessments should always produce: number one, a coverage heat map; number two, a written summary of coverage; and number three, proposed changes that can help the SOC improve their coverage.
And then, lastly, most assessments should follow a relatively straightforward script or methodology: first, framing the assessment; second, setting a rubric for what coverage means; third, doing a deep-dive technical analysis of the SOC's components; fourth, interviewing SOC staff to get that on-the-ground understanding of what the SOC is doing; fifth, compiling the results into a final heat map; and, lastly, proposing changes and recommendations the SOC can enact to improve their coverage.