Okay, so let's think about cognitive bias and logical errors. As I was saying in the introduction, one of the biggest challenges with these kinds of concerns is trying to identify when a bias or other kind of logical thinking error has occurred, or is occurring; maybe you notice, mid-thought, your brain trying to send you down a path which may not be the right one.
This is certainly something that, as I was saying before, is a core feature of humanity.
We're thinking creatures. So in a general sense, what we're often trying to do is find patterns that we recognize, trying to find a shortcut to get to a desirable result.
And from an evolutionary standpoint, that's certainly a positive thing.
But when doing some kind of hard analysis, looking at lots of different types of evidence, trying to fit together timelines, trying to correctly assign attribution, and so on, there are so many different ways that we can get tripped up by our thinking.
So it is worthwhile to dive a little bit deeper into this topic and think about how we can better monitor our thinking, knowing some basic definitions of how these things can go wrong. One of the first things to consider is what a bias is to begin with.
And generally the definition is accepted to be some kind of deviation from objective, rational judgment.
I might have a favorite tool that I like to use for analyzing log files, for instance.
It's my favorite because I started using it,
and I think it's better than other tools that are available.
And I may continue to use that tool knowing that there are other ones available that could actually be better. Maybe the tool I'm biased towards is out of date and doesn't have some of the advanced capabilities other tools might have. That same type of favoritism could also extend to conclusions, like preferring a certain conclusion over others because it's comfortable and because it's consistent with other work that's been done.
So that's a very basic definition of what a bias might be. But we can dig a little bit deeper than that as well, into other aspects of this overall process. For instance, sometimes our judgment is just incorrect. We look at a series of pieces of evidence, draw some conclusion, and it turns out to be just plain wrong.
You could think of lots of different reasons why that might happen. It could be that the analyst was distracted while they were doing their work. Maybe they were trying to tackle too many tasks at one time, multitasking, even though that's something people often brag about.
Research that has been published over the last several years indicates that multitaskers don't necessarily get more done. They might feel like they're getting more done, but in reality, that may not actually be the case.
So there is something to be said for putting on the blinders and focusing in on the task at hand, especially since that helps prevent some of these problems from creeping into your work.
There's also the concept of perceptual distortion. This means that your response to some action or some kind of stimulus is different from what would be expected.
Perceptual distortion can occur for lots of different reasons. For instance, if you are not feeling well when you come to work, you might have a hard time focusing. You might be thinking about wanting to be home getting better instead of doing your job, to use a simple example. And that could alter your ability to make the logical decisions and logical conclusions that would otherwise be more or less easy for you to reach.
People who are under the influence of alcohol or drugs have perceptual distortion. In many cases, they do this on purpose, right? They want to perceive what their senses tell them differently than what they would normally expect.
So any time your surroundings, your distraction level, your current health status, having too much on your plate, whatever you want to call it, comes into play, that could certainly have an impact on your ability to clearly see through the evidence in front of you and reach the correct, or the most correct, conclusion that's available.
Then we have to consider irrational conclusions. Someone who's irrational is perhaps overreacting, or even underreacting, to a situation, or in this case, to some evidence that they're analyzing. It might be that, again because of some combination of different factors, a conclusion is reached that doesn't seem to make sense when someone else looks at it. They might look at the evidence and say, well, I see this, this, and this, and you reach this conclusion, but that doesn't add up; there's something missing here.
And this hopefully would be something that would happen very infrequently, because as a CTI analyst, you're certainly going to be expected to have a finely trained mind, and being irrational should be something that's very, very rare. But yeah, it happens,
maybe for some of the reasons that I've already cited: having a bad day, being stressed out about family life or home life, having too many different conflicting pieces of information that cloud the issue that's being studied. All of these are possibilities that should be considered to make sure that there's no major flaw in thinking.
Now, if we move on to cognitive bias errors, one of the biggest challenges is trying to avoid taking shortcuts. As I stated earlier, this is a feature of human evolution that certainly has served us well.
If you're faced with doing some kind of task, especially if it's repetitive, our minds are generally designed to look for a shortcut to try to get to the same result. And that's a good thing in most cases. But in this line of work, taking shortcuts can cause a ripple effect of other kinds of errors.
If something early on in the analysis wasn't done correctly, if a little shortcut was taken, then that may ripple out to other pieces of the work, because you're basing subsequent decisions on conclusions that were reached incorrectly. That's a dangerous thing to engage in, and it has to be identified and rooted out wherever possible. We also have confirmation bias.
This is a tendency for people in general to try to find information that supports the conclusion that they already have.
I might think, OK, I've got enough data here to say that this incident was caused by this threat actor. I already believe that, but maybe I haven't finished my analysis. So as I do more analysis, I might continue to pay more attention to details and evidence that seem to support the conclusion I've already reached. I already know who the bad guy is, and so I keep myself on that path, perhaps paying less attention to things that would allow me to deviate from it.
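To make that idea concrete, here's a toy sketch (my own illustration, not part of any real analysis tool) of confirmation bias as asymmetric evidence weighting. The function and the numbers are entirely hypothetical: each piece of evidence either supports (+1) or contradicts (-1) a hypothesis, and a biased "analyst" amplifies the supporting items while discounting the contradicting ones.

```python
# Hypothetical illustration: confirmation bias modeled as uneven
# weighting of supporting vs. contradicting evidence.

def weigh_evidence(evidence, prior_belief, bias=0.0):
    """Score a hypothesis from a list of evidence items.

    Each item is +1 (supports the hypothesis) or -1 (contradicts it).
    prior_belief is the analyst's existing leaning (+1 or -1).
    bias=0.0 weighs all evidence equally; bias > 0 amplifies items that
    agree with the prior belief and discounts items that disagree.
    """
    score = 0.0
    for item in evidence:
        supports_prior = (item > 0) == (prior_belief > 0)
        weight = 1.0 + bias if supports_prior else 1.0 - bias
        score += item * weight
    return score

# Perfectly balanced evidence: four items for, four against.
evidence = [+1, -1, +1, -1, +1, -1, +1, -1]

neutral = weigh_evidence(evidence, prior_belief=+1, bias=0.0)
biased = weigh_evidence(evidence, prior_belief=+1, bias=0.5)

print(neutral)  # 0.0 -> balanced evidence supports no conclusion
print(biased)   # 4.0 -> the same evidence now "confirms" the prior
```

The point of the sketch is that the evidence never changed; only the weighting did. An analyst who already "knows who the bad guy is" can look at an evenly split body of evidence and still come away more convinced.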
You know, anyone who's doing any kind of investigation, whether it's in a cyber threat scenario, a criminal investigation, or a legal investigation, for instance, has these same things to worry about.