Communicating with ATT&CK®

Video Transcription
Welcome to Lesson 2, Module 3 of the ATT&CK-Based SOC Assessments training course. In this lesson, we're going to discuss how you can use the ATT&CK framework to communicate results from an ATT&CK-based SOC assessment.
This lesson fits into the compile-results phase of our generic assessment methodology, and in particular is a precursor to compiling results: in order to actually come up with the final heat maps, you need to choose the heat map you're going to present to the SOC at the end of the assessment.
We have three primary learning objectives for this lesson.
Number one. You should walk away understanding the types of heat maps that you can present.
Number two. You should understand the trade-offs between the different heat map styles. And number three. You should ultimately be able to choose the right type and style of heat map for a given context and assessment.
So a heat map ultimately has three primary ingredients to weigh when you're working out how to construct it to communicate your message.
the first one is the scope,
including the right tactics and displaying or hiding sub-techniques. This is very important. You might have run the assessment over all the tactics and every single sub-technique, but the final heat map you present maybe doesn't need to show everything; maybe it only shows the portion you specifically want to highlight.
Or maybe it does show everything, if you can get it all to convey the right message.
The second key ingredient is the measurement abstraction. This is mainly either categorical or quantitative: what are the buckets, and what do the scores measure?
This is slightly different from the scoping done in the prior module, in that you're really choosing how you want to bucket or present things at the abstract level.
And then, lastly, the color scheme.
It's very important to choose the right colors to convey your message, and we'll give you a couple of examples of good color schemes, as well as some that might not get the message across quite as easily.
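To make those three ingredients concrete, here is a minimal, hypothetical sketch of assembling a heat map as an ATT&CK Navigator-style layer. The technique IDs, scores, and hex colors are illustrative, not from a real assessment, and the dictionary shape only approximates the Navigator layer format:

```python
import json

# Hypothetical categorical scores: 0 = no, 1 = low, 2 = some, 3 = high
# confidence of detection. Scope: only the techniques listed here are shown.
scores = {"T1059": 1, "T1003": 3, "T1566": 2, "T1547": 0}

# Color scheme: one color per bucket, indexed by score (illustrative values).
palette = ["#ffffff", "#f2f2f2", "#fff7b3", "#8ec843"]
labels = ["No confidence", "Low confidence", "Some confidence", "High confidence"]

def build_layer(name, scores):
    """Assemble a minimal Navigator-style layer dict from per-technique scores."""
    return {
        "name": name,
        "domain": "enterprise-attack",
        "techniques": [
            {"techniqueID": tid, "score": s, "color": palette[s]}
            for tid, s in sorted(scores.items())
        ],
        # An explicit legend so readers can decode the buckets.
        "legendItems": [{"label": l, "color": c} for l, c in zip(labels, palette)],
    }

layer = build_layer("Detection confidence", scores)
print(json.dumps(layer, indent=2))
```

In this sketch, changing the measurement abstraction means changing the buckets in `palette` and `labels`, while changing the scope means changing which technique IDs appear in `scores`.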
So here's an example heat map that was inspired by some work we did many years ago. This heat map is super useful, to me at least, because I can look at it and say: no confidence, low confidence, some confidence, or high confidence of detection. Those are my red, orange, yellow, and green.
And then in gray I've got this other category: some static detection might be possible. Maybe I'm not looking at things behaviorally, but my SOC is doing a great job of looking at things from an IOC perspective. This seems okay, even great, at first glance, but the reality is it's really not that great of a heat map.
It has too many categories, which can really make it confusing for readers and consumers of the heat map; it's easy to get lost in the labels here.
Remember, when you're going through an assessment, your task is to paint broad strokes, and your heat map should convey that ultimately
Instead, make sure to settle on something that conveys the right information at the right layer. I know it's smaller, but in the lower right-hand corner is the same heat map. Instead of using low in orange and no in red, we've replaced those with just a low confidence of detection in white. We've also changed the gray for static detection to a semi-transparent yellow that shows it's just a little different from some confidence of detection.
Just making these two changes totally changes the face of the heat map.
A lot of the content is still there. There's less distinction between no and low, but most of the information is still being presented.
Regardless of your use case for ATT&CK or for the assessment, it's always important to have a good scoring scheme: always define categories that are relevant to your domain and, when possible, avoid mixing category types.
Confidence and likelihood are a good example. You might say, hey, I have the confidence of detection and the likelihood of it being executed. It makes a lot of sense when you think about it, but in practice it's hard to combine these kinds of categories.
Always know your audience: leadership wants the big picture, whereas the SOC's staff need to focus on the details for actually implementing changes.
Always choose good color schemes; we'll talk more about that later: whether the scheme is gradient or discrete, what colors you use, et cetera. And lastly, metrics can be great
if you're using metrics as part of your assessment or deliverable, but particularly with the final heat map you produce, it's imperative to always have a good justification for your numbers and your categories.
Here's another example heat map. It's a little simpler; we've only got three categories: low, some, or high confidence of detection.
This heat map looks functionally great, but in practice it really has a problem.
When people look at this heat map, they see a lot of flashing lights. They see the color red, and that red says: hey, this is a problem.
Ultimately, red often communicates the wrong message. People see it and they'll often assume it's saying: look here, this is a huge problem, a glaring red error you've got to fix right away.
Really, you should use red sparingly. Use it only as needed to call attention to specific areas of focus, only when something really is a glaring gap or a critical issue. That's when you use red.
Instead, whenever you're coming up with a heat map, try to settle on a softer color palette.
This heat map replaces red with white, and it conveys the exact same message. But for many, it's easier to digest, because you're not coming in and saying, hey, huge glaring problem; you're coming in and saying, hey, here's a heat map.
This does a great job of positioning the results as
less antagonistic.
These are areas of improvement, not areas where you failed.
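One way to operationalize "use red sparingly" is to keep two palettes for the same categorical scheme and default to the softer one. The category names and hex values below are illustrative, not prescribed by the course:

```python
# The "alarm" palette uses red/orange for gaps; the softer palette swaps them
# for white so gaps read as areas of improvement rather than failures.
ALARM = {"no": "#d62728", "low": "#ff7f0e", "some": "#ffdd57", "high": "#2ca02c"}
SOFT = {"no": "#ffffff", "low": "#ffffff", "some": "#ffdd57", "high": "#2ca02c"}

def cell_color(category, highlight_gaps=False):
    """Pick a cell color, reserving red for when something truly needs alarm."""
    return (ALARM if highlight_gaps else SOFT)[category]

print(cell_color("no"))                       # "#ffffff" — softer default
print(cell_color("no", highlight_gaps=True))  # "#d62728" — red, used sparingly
```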
Thinking more abstractly, even outside of assessments: always be cautious when using red on a heat map. I know it makes the most sense to say a problem is red, but
you can get into trouble using the wrong colors.
And then, being realistic, heat maps are not perfect in and of themselves.
They're great: they're easy to understand, tangible, straightforward, and useful to tons of people. But there are a couple of gotchas when you're working with them.
Number one, coverage: ATT&CK doesn't always align with execution in practice. Technique execution and the corresponding detection can vary, and there can be ambiguity between the two. And per-technique detection is not always the right level of abstraction. That's why we like to recommend that when you're delivering an ATT&CK-based SOC assessment, you don't just turn over a heat map,
but always include a paragraph or two that describes the trends.
Because when you just look at the technique level, you don't often get the right impression of coverage.
Coverage is also not static. What's green or well covered today can easily change tomorrow. Attacker TTPs and defender practices rotate: attackers change their stuff all the time, and from a defensive perspective, what we do this month might be different next month.
Ultimately, don't take what you cover today for granted.
Just because it's green now doesn't mean it'll be green in the future.
Then, lastly, always remember: ATT&CK heat maps are almost always approximate. If you're doing this as a third party, always make sure the SOC you're working with understands this.
If you're doing this in-house, make sure your colleagues understand as well.
So we're now going to walk through a couple of example heat maps and offer some commentary.
We've designed these as exercises: again, pause the video, and we'll go through each answer afterwards. Your task is to look at this heat map,
come up with a couple of comments, maybe score it a little, and then try to figure out the right audience, be it a CISO or executive board, SOC leadership, or SOC engineers. Try to figure out who might be best served by this heat map; in some cases your answer might be nobody. So pause the video, and we'll get back to it in just a second.
Okay, let's now walk through what we think of this heat map,
really using those three ingredients, the three things to look for in heat maps. There are a few things to see.
Scoping-wise, this is all tactics, with no sub-techniques.
from a measurement abstraction perspective.
We've got an interesting scheme being used: detection and mitigation, each distinct. I have low detection and low mitigation, low detection and some mitigation, and so on and so forth.
These are categories, but there are nine of them, which is relatively on the high side. And the color scheme, I'd just use the phrase hodgepodge to describe it: it's almost random, with no real rhyme or reason to what the colors mean.
Ultimately, the conclusion is that this map is just really hard to use and understand. If someone had to use it, there might be utility for SOC engineers, just because it does give a good level of detail when you think about it from a theoretical perspective.
But the way that detail is conveyed makes it hard to understand and hard to put into practice.
Here's another heat map, same exercise: try to choose an audience, pause the video, and then we'll walk through a solution.
Let's now analyze this one in a little more detail,
again using the three key ingredients. From a scoping perspective, we have all tactics, no sub-techniques.
For measurement abstraction, this has a small number of categories: unlikely to be detected or mitigated, some likelihood, and high likelihood.
And then we've combined detection and mitigation, which has pros and cons to it.
The color scheme is super simple. If you've been following along in the course, you've seen that this is our favorite color scheme, so that's definitely a positive.
Ultimately, the conclusion is that this is
really useful at the abstract level. It's great for leadership, and maybe some engineers might benefit from it. The reason there is a distinction between leadership and engineering here is that by combining detection and mitigation, we're really painting a very broad stroke. It's really a big picture of what things look like, and it might be harder for engineers, who'd rather ask:
is this detected, or is it mitigated? There's not much differentiation here.
Here's the last one. It's
the same exercise as before: choose an audience, and then we'll walk through it.
So let's now walk through this one. Again, it's very similar to before: we're looking at detection and mitigation.
Looking at the three areas we focus on: scoping is the same, all tactics, no sub-techniques. Our abstraction again combines detection and mitigation, but here we're now using some sort of quantitative scheme. We don't see the numbers here, but we know they're underneath.
And then the color scheme is a relatively straightforward gradient. Green means a high ability to detect or mitigate, and white means no ability to detect or mitigate.
Our conclusion is that this provides a reasonably good, actionable level of detail.
It's really useful across multiple layers. Leadership can understand the big picture from it, but SOC personnel might benefit the most, just because it gets into a little more detail than the previous example, which was just high, some, or no. Because I'm using a gradient, I see a little more nuance in my coverage.
We still have the downside of not having differentiation between
detection and mitigation, but we think that's offset by the increased differentiation between the techniques themselves. So ultimately, just about anyone could use this, and maybe SOC engineers have a little more advantage using this heat map. Of course, a lot of this is indeed up to interpretation.
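A white-to-green quantitative gradient like the one in this last example can be sketched as a linear interpolation between two hex colors. The endpoint colors and score range below are illustrative:

```python
def gradient_color(score, max_score, low="#ffffff", high="#2ca02c"):
    """Linearly interpolate between two hex colors: white for no ability to
    detect or mitigate, green for high ability."""
    t = max(0.0, min(1.0, score / max_score))  # clamp to [0, 1]
    lo = [int(low[i:i + 2], 16) for i in (1, 3, 5)]   # parse "#rrggbb"
    hi = [int(high[i:i + 2], 16) for i in (1, 3, 5)]
    rgb = [round(a + (b - a) * t) for a, b in zip(lo, hi)]
    return "#{:02x}{:02x}{:02x}".format(*rgb)

print(gradient_color(0, 10))   # "#ffffff" — no ability to detect or mitigate
print(gradient_color(10, 10))  # "#2ca02c" — high ability
```

The continuous scale is what gives a gradient map its extra nuance: two techniques that would share one bucket in a three-category scheme get visibly different cells here.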
The ultimate point is to make sure that when you're crafting these,
you think about the audience and how they're going to receive it.
Some closing commentary: a lot of our examples just focus on primary techniques. Sub-techniques are definitely things you can include, but if you look at this chart, you can see that including all of them
is really not great from a communication perspective.
Sub-technique information is useful, but it can be harder to visualize. Some strategies you can employ for
including sub-techniques in your visualizations are: one, visualize a subset of the sub-techniques you've looked at, maybe only the few you think are most relevant, and then let the SOC know you've hit a few others.
Also, know your audience.
Looking at sub-techniques is great
for engineers at the lower level, but including them for leadership might be harder; it might be too much detail. Of course, that depends on the SOC you're working with.
Ultimately, as long as you know your audience, you'll get a better feel for how much detail and how many sub-techniques are worth including. Then, lastly:
if you do end up leaving some sub-techniques out of the visualization, you've still done the analysis, so keep that sub-technique information in an appendix or another form. You can always consider turning over two heat maps: one big-picture, abstract one, and one more detailed one that dives in deeper.
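The two-heat-map idea can be sketched as deriving an abstract view and a detailed view from one set of scores. The IDs and scores are hypothetical; sub-technique IDs follow the standard `Txxxx.yyy` convention:

```python
# Hypothetical assessment scores; IDs containing a dot are sub-techniques.
scores = {"T1059": 2, "T1059.001": 1, "T1059.003": 3, "T1003": 0, "T1003.001": 1}

def split_views(scores):
    """Return (abstract, detailed) score maps: the abstract view drops
    sub-techniques for a big-picture heat map; the detailed view keeps all."""
    abstract = {tid: s for tid, s in scores.items() if "." not in tid}
    return abstract, dict(scores)

big_picture, deep_dive = split_views(scores)
print(sorted(big_picture))  # ['T1003', 'T1059'] — parent techniques only
print(len(deep_dive))       # 5 — full analysis preserved for the appendix
```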
so a few summary points and takeaways to close out this lesson.
Number one, heat maps have three important facets: the scope, which is the parts of ATT&CK to include; the measurement abstraction, which is really what scores you present; and the color scheme, which dictates how your scoring is presented.
Your heat map delivery strategy ultimately depends on your audience.
In my experience, leadership tends to prefer the simpler, easier-to-digest charts, whereas technical staff can make more use of the details and nuance.
Almost everybody can still use almost all heat maps, provided they are conveyed in a way that's digestible, but there tends to be a bit of a bias depending on who you're talking to.
Avoid red. This is a silly, simple recommendation, but it can go a long way toward making sure you can effectively communicate your message.
and then, lastly, you don't need to present everything. Include more detail in an appendix if you feel it's necessary.