Lesson 2.5: Incidents Involving Insider Threats
The objectives for this lesson: first, we will look at understanding how to prepare for and respond to incidents involving insider threats; second, we will identify common non-technical indicators of insider threat activity; and third, we will identify common technical indicators of insider threat activity.
When we look at insider threats, we generally divide them into two categories: the first is negligent insiders, and the second is criminal insiders.
These are further defined as follows. Negligent insiders are those who, through misconfiguring things, not following policy, or otherwise creating some sort of unnecessary risk, became a threat to the organization, but whose intent was not to cause harm or steal information or anything like that.
Still certainly a threat.
Criminal insiders, however, are those who are knowingly, willingly, and intentionally going after information, data, or access they shouldn't have, or are trying to destroy things; that would be considered a crime. So those are generally the two categories we see.
When we look at statistics, we see that about 62% of insider threats are in fact negligent, still causing threat and risk to the enterprise, but not intentionally. We see about 14% of insider threat activities being criminal, intentional insider threat activities.
I've got some personal experience investigating insider threats. I sat on an insider threat working group, was part of bringing insider threat capabilities to one of my jobs, and have actually investigated several insider threats, including some that were referred for criminal prosecution. These can be very interesting cases. They can also be difficult cases, difficult even just to investigate and identify. If you are working insider threat cases, the things we've talked about in the past, rigor, structure, chain of custody, and handling evidence, are extremely important.
We'll look next at how IR teams manage insider threats. First, don't wait until you have an insider threat to figure this out.
You should not have your first conversation about insider threats while you are investigating one. Just as we talked about not flying the airplane while you build it with your IR plan, the same holds true for insider threat activity.
You should develop a foundation and discuss with executives and your stakeholders what insider threats are, what you're doing to be prepared for them, how you monitor for them (or maybe you don't, and you want to add that capability), and, if you were to identify an insider threat, how you would handle it. Who would need to be notified? What happens if the insider threat is somebody on the insider threat working group or within this team?
So you should have several different scenarios figured out.
You should also identify high-risk employees, segments, and activities, and develop personas. That means you look across your user base and identify those who would be most devastating if they were to become insider threats, or perhaps those with a higher likelihood because of the type of access they have or the type of data they work with. You develop personas, meaning you create examples: if this persona, maybe a database administrator, were to do the following things, the risks would be X, or the way we would catch them would be X. Those are the reasons you want to develop these personas and build out that playbook.
Develop a business case for an insider threat capability. Build the structure and the processes that I've talked through a little bit already, and then also develop a monitoring and investigation protocol.
You have to be a little careful here about just randomly looking through people's information. That goes back to the acceptable use policy I spoke about earlier: whether people have no expectation of privacy, or perhaps some expectation of privacy, on company devices.
There have been all sorts of court cases; this has been litigated multiple times. For example, some companies give employees a section of their hard drive. I've seen this done where a private folder is placed on the hard disk as part of the baseline image, and anything in there is the employee's data, not the company's.
But if you start looking through that folder because you see employees moving things there that you don't think are appropriate, is that okay? If you find something in there, can you use it?
These are the kinds of questions that need to be played out well before any type of incident.
Also: are you collecting intelligence on employees, or are you just looking for insider threat activity? You want to be really clear on that, have it well defined, and get your legal department heavily involved as well.
Some considerations for insider threat programs. Consider an insider threat working group: this is typically a cross-functional group of people representing HR, employee relations, legal, IT, cybersecurity, and executive staff that will review anything brought to it and make determinations on next steps.
Collaborate closely with HR and legal on all of this.
Consider building insider threat indicators, and I'll give you some examples of what those might look like.
Ensure notifications between IT, HR, and cybersecurity are happening.
So if you have, for instance, an employee who's worked at the company for 10 years, and they're moving because they got a promotion from legal to HR, is there any notification that goes to cybersecurity? Oftentimes, no. Organizations are getting better about communicating "we're hiring this person, we're firing this person, this person resigned."
But internal moves, whether promotions, demotions, or transfers, normally don't get well communicated.
In my example, the employee moves out of legal, where they've got access to the legal SharePoint sites, shared folders on the server, and who knows what else. Now they move to HR, where they'll probably get access to all of HR's files that they need to do their job. Rarely does their old access ever get terminated. So now we have an employee with access to legal and HR data, both of which are a treasure trove for attackers or insider threats. You want to make sure that you've got a process in place for HR to notify IT and cyber of any movement of employees. Also have a reporting structure in place that allows for bypassing the chain of command. So
what happens if the CIO is the suspected insider threat, but your document says that the CIO is notified any time there's a suspected insider threat? You need to have a written protocol that says you can jump over the chain if necessary and go straight to, for example, the CEO, the chief risk officer, or the chief legal officer, and report that way. It should only be used in circumstances where you can articulate that you believe the next person above you, or whoever has been identified, may be suspected of being an insider threat.
Here are some example indicators that are not technical that you might want to watch out for. Work performance decline: you wouldn't know this unless HR told you, which is why you need that link, but if somebody shows a clear and distinct difference in their work performance, that may be an indication of something going on.
Increased absenteeism. An increase in violations of company policy. Foreign travel: not just somebody taking a vacation to one place, but travel to countries that are known problems, or multiple, increasing trips to foreign countries.
Social media posts that are inappropriate, strange, or out of character, maybe posting information about the company. Overuse of words that express negative emotions: there are companies out there with tools that will scan instant message communications and emails, looking for words associated with negative sentiment, and report back. So if you have people writing emails back and forth about how much they hate their job, how much they hate their boss, or how they'd like to see the company fail, that is an indication that something may be going on that might need to be looked at more closely.
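As a rough illustration of the kind of scanning those tools do, here is a minimal sketch, assuming messages arrive as plain strings. The word list and function name are hypothetical, and real products use far more sophisticated natural language analysis than simple keyword matching.

```python
# Minimal keyword-based negative-sentiment flagging (illustrative only).
# NEGATIVE_TERMS is a hypothetical, deliberately tiny word list.
NEGATIVE_TERMS = {"hate", "fail", "quit", "revenge", "destroy"}

def flag_message(text: str) -> list[str]:
    """Return the negative-sentiment terms found in a message."""
    # Lowercase each word and strip trailing punctuation before matching.
    words = {w.strip(".,!?").lower() for w in text.split()}
    return sorted(words & NEGATIVE_TERMS)

# Example: a message a monitoring tool might report back on.
hits = flag_message("I hate this job and want the company to fail!")
```

A real deployment would feed these hits into the insider threat working group's review process rather than acting on a single match, since one angry email proves nothing by itself.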
A sudden and unexplained increase in wealth: maybe they're being paid as a spy, or maybe they're embezzling money. That would certainly be something to look into. And finally, demonstrating ties to high-risk people or organizations.
Now, some technical risk indicators that you would look at. The introduction of unauthorized devices, like a USB drive: maybe you have a company policy that no USB devices are allowed, or only certain ones are allowed, and you're seeing evidence of unapproved devices being connected.
Installing unauthorized software: they get caught installing software that they're not supposed to install, especially software that could be used to tap things, like Wireshark. Not that Wireshark itself is malicious, but that, or actual malicious software, anything like that, would raise a red flag.
Attempting to increase privileges: maybe they're trying to make themselves an administrator.
High printing volume: all of a sudden, they're printing way more than normal. Maybe they're smart enough to know that you're monitoring data flow out of the organization, and they don't want to email all the secret documents, but maybe they'll print them and carry them out in a briefcase.
Access during non-standard times: of course, people check their email every once in a while over the weekend, and an established pattern of behavior is one thing. But if all of a sudden an employee starts coming into the office or logging in late at night, and that's not normal for them, that might be something to investigate.
Unusual network traffic is something else you might want to look at: excessive uploads or downloads, accessing prohibited websites, and accessing systems without authorization.
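To make the off-hours access indicator concrete, here is a minimal sketch, assuming login events are simple (user, timestamp) pairs and that "normal" means weekday business hours. All names here are hypothetical; a real program would baseline each user's own pattern of behavior rather than apply one fixed window to everyone.

```python
# Minimal off-hours login detection (illustrative, fixed business-hours
# window of 07:00-19:00 on weekdays; real tools baseline per user).
from datetime import datetime

def is_off_hours(ts: datetime) -> bool:
    """True if the login falls outside weekday business hours."""
    weekend = ts.weekday() >= 5           # Saturday=5, Sunday=6
    after_hours = ts.hour < 7 or ts.hour >= 19
    return weekend or after_hours

def flag_logins(events):
    """Return the (user, timestamp) events that occurred at unusual times."""
    return [(user, ts) for user, ts in events if is_off_hours(ts)]

events = [
    ("alice", datetime(2023, 6, 5, 10, 30)),  # Monday mid-morning: normal
    ("bob",   datetime(2023, 6, 4, 2, 15)),   # Sunday 02:15: flagged
]
suspicious = flag_logins(events)
```

The same structure extends naturally to the other indicators, for example comparing upload byte counts or privilege changes against a per-user baseline instead of timestamps.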
All right, quiz question for insider threats: which type of insider threat is most prevalent? A, malicious insiders; B, criminal insiders; or C, negligent insiders?
If you answered C, negligent insiders, you are correct. That is the most prevalent kind of insider threat we see.
Second question: which is not an example of a technical indicator of insider threat activity? A, work performance decline; B, installing unauthorized software; or C, excessive uploads or downloads.
The answer here is A. All three of these are potentially indicators of insider threat activity, but B and C are technical indicators, while A is a non-technical indicator.
In summary of the insider threat lesson, we talked about how to prepare for and respond to insider threat activities within an organization, and we also talked about common technical and non-technical indicators of insider threat activity.