Incidents Involving Insider Threats


Difficulty
Intermediate
Video Transcription
00:01
Lesson 2.5: Incidents Involving Insider Threats
00:05
The objectives of this lesson are to understand how to prepare for and respond to incidents involving insider threats,
00:13
to identify common non-technical indicators of insider threat activity,
00:19
and to identify common technical indicators of insider threat activity.
00:25
When we look at insider threats, we generally divide them into two categories: the first is negligent insiders, and the second is criminal insiders.
00:34
To define these further: negligent
00:36
insiders are those who,
00:38
through misconfiguring things, not following policy, or doing other things that created some sort of unnecessary risk, became a threat to the organization, but whose intent was not to cause harm or steal information or anything like that.
00:54
Still certainly a threat.
00:56
Criminal insiders, however, are those who are knowingly, willingly, and intentionally going after information, data, or access they shouldn't have, or trying to destroy things, which would be considered a crime. So those are generally the two categories we see.
01:14
When we look at statistics, we see that about
01:17
62% of insider threats are in fact negligent (still causing threat and risk to the enterprise, but not intentionally), and about 14% of insider threat activity is criminal and intentional.
01:34
I've got some personal
01:37
experience investigating insider threats. I sat on an insider threat working group, was part of bringing insider threat capabilities to one of my jobs, and have actually investigated several insider threats, including some that were referred for criminal prosecution. These
01:55
can be very interesting cases, can be difficult cases, and can be difficult just to investigate and identify. And if you are working insider threat types of cases,
02:07
the things we've talked about in the past on rigor, structure, chain of custody, and handling evidence are extremely important in these cases.
02:17
We'll look next at how IR teams manage insider threats. First, don't wait until you have an insider threat to figure this out.
02:27
You should not have your first conversation about insider threats when you are investigating one. Just as we talked about not flying the airplane while you build it with your IR plan, the same holds true for insider threat activity.
02:40
You should develop a foundation and discuss with executives and your stakeholders
02:46
what insider threats are, what you're doing to be prepared for them, how you monitor for them (or maybe you don't, and you want to add that capability), and, if you were to identify an insider threat, how you would handle it. Who would need to be notified? What happens if the insider threat is somebody on the insider threat working group or within the IR team?
03:05
So you should have several different scenarios figured out.
03:08
You should also identify high-risk employees, segments, and activities, and develop personas. That means you look across your user base and identify those who would be most devastating if they were to become insider threats,
03:23
or perhaps those that have a higher likelihood because of the type of access they have or the type of data they work with. You develop personas, meaning you create examples: if this persona, maybe a database administrator, that is, if that type of
03:43
employee were to do the following things,
03:46
the risks would be X, or the way we would catch them would be X. Those are the reasons you want to develop these personas and how you could build out that playbook.
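For illustration only, here is a minimal sketch of how one of these personas might be captured in a structured form; the fields, example values, and playbook path are assumptions, not a standard schema.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class InsiderThreatPersona:
    """One high-risk persona and the playbook notes attached to it.

    The field names and example values are illustrative assumptions,
    not a standard schema.
    """
    name: str
    typical_access: List[str]
    highest_risk_actions: List[str]   # what this persona could do that hurts most
    detection_signals: List[str]      # what telemetry would reveal those actions
    playbook: str                     # where the response steps live (hypothetical path)

# Example persona from the lesson: a database administrator.
dba = InsiderThreatPersona(
    name="Database administrator",
    typical_access=["production customer databases", "backup storage"],
    highest_risk_actions=["bulk export of customer records",
                          "silent permission or schema changes"],
    detection_signals=["unusually large query result sets",
                       "exports to removable media",
                       "logins outside the DBA's normal hours"],
    playbook="playbooks/insider-threat/dba.md",
)
```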
03:57
Develop a business case for an insider threat capability.
04:01
Build the structure and the processes I've talked through a little bit already, and then also develop a monitoring and investigation protocol.
04:10
You have to be a little careful here about just randomly looking through people's information. That goes back to the acceptable use policy I spoke about earlier: whether people have no expectation of privacy, or maybe some expectation of privacy, on company devices.
04:26
There have been all sorts of court cases; this has been litigated
04:30
multiple times. For example, some companies give employees a section of their hard drive. I've seen this done where a private folder is actually put on the hard drive as part of the baseline image on the hard disk,
04:46
and anything in there is your data; it's not the company's.
04:50
But if you start looking through that folder because you see employees moving things into it that you don't think are appropriate, is that okay? If you find something in there, can you use it?
05:01
These are the kinds of questions that need to be played out well before any type of incident,
05:08
but also: are you collecting intelligence on employees, or are you just looking for insider threat activity? You want to be really clear on that, have it well defined, and get your legal department heavily involved as well.
05:20
Some considerations for insider threat programs: consider an insider threat working group. This is typically a cross-functional group of people representing HR, employee relations, legal, IT, cybersecurity, and executive staff, who will look through anything that's brought to that group and
05:41
make determinations on next steps.
05:44
Collaborate with HR and legal closely on all of this.
05:47
Consider building insider threat indicators, and I'll give you some examples of what those might look like.
05:53
Ensure notifications between IT, HR, and cybersecurity are happening.
05:59
So if you have, for instance, an employee who's worked at the company for 10 years and is moving because they got a promotion from legal to HR, is there any notification that goes to cybersecurity? Oftentimes, no. Organizations are getting better about notifying when
06:16
we're hiring this person, we're firing this person, or this person resigned.
06:20
But internal moves, whether promotions, demotions, or transfers, normally don't get communicated well.
06:28
In my example, the employee moves from legal, and they've got access to all the legal SharePoint sites and shared folders on the server and who knows what else.
06:39
Now they move to HR, where they'll probably get access to all of the HR files they need to do their job. Rarely does their old access ever get terminated. So now we have an employee with access to legal
06:50
and HR data, both of which are a treasure trove for attackers or insider threats. So you want to make sure you have a process in place so that HR notifies IT and cyber of any movement of employees, along the lines of the sketch below.
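Here is a minimal sketch of the kind of check an identity or IR team might run when HR reports an internal transfer, flagging access left over from the previous department. The department-to-group mapping and the group names are hypothetical placeholders, not a real directory schema.

```python
# Hypothetical mapping of departments to the access groups they own.
DEPARTMENT_GROUPS = {
    "legal": {"legal-sharepoint", "legal-fileshare"},
    "hr": {"hr-sharepoint", "hr-personnel-files"},
}

def stale_access_after_transfer(current_groups, old_department, new_department):
    """Return group memberships carried over from the old department that the
    new role does not need and that should be reviewed or revoked."""
    old_groups = DEPARTMENT_GROUPS.get(old_department, set())
    new_groups = DEPARTMENT_GROUPS.get(new_department, set())
    return (set(current_groups) & old_groups) - new_groups

if __name__ == "__main__":
    # Example from the lesson: an employee moved from legal to HR but kept
    # access to the legal SharePoint site and file share.
    groups = {"legal-sharepoint", "legal-fileshare", "hr-sharepoint"}
    print("Access to review after transfer:",
          sorted(stale_access_after_transfer(groups, "legal", "hr")))
```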
07:10
Also have a reporting structure in place that allows for bypassing the chain of command. What happens if the CIO is the suspected insider threat,
07:14
but your document says the CIO is notified any time there's a suspected insider threat? You need a written protocol that says you can jump over the chain if necessary and go straight to the CEO, for example, or the chief risk officer or the chief legal officer, and report that way.
07:32
But it should only be used in circumstances where you can articulate
07:36
that you believe the next person above you, or whoever has been identified, may be suspected of being an insider threat.
07:45
Here are some example indicators that are not technical that you might want to watch out for. Work performance decline: you wouldn't know about this unless HR told you, which is why you need that link, but if somebody has a clear and distinct decline in their work performance, that may be an indication of something going on.
08:03
Increased absenteeism. An increase in violations of company policy. Foreign travel: not just somebody taking a vacation to one place, but travel to countries that are known problems, or an increase in trips to multiple foreign countries.
08:22
Social media posts that are inappropriate, strange, or out of character, maybe posting information about the company. Overuse of words that express negative emotions: there are companies out there with tools that will scan instant message communications and emails, looking for
08:41
words that are associated with negative sentiment
08:43
and report back. So if you have people writing emails back and forth about how much they hate their job, how much they hate their boss,
08:50
or how they'd like to see the company fail, that is all an indication, of course, that something may be going on and might need to be looked at more closely. (A toy sketch of this kind of keyword flagging follows this list of indicators.)
09:00
A sudden and unexplained increase in wealth: maybe they're being paid as a spy, or something else is going on, like embezzling money. That would certainly be something to look into. And finally, demonstrating ties to high-risk people or organizations.
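Commercial tools do this kind of communications scanning with much richer analysis; the toy sketch below only shows the basic idea of keyword-based flagging, and the word list and threshold are assumptions, not values from any real product.

```python
import re

# Illustrative word list only; real sentiment tools use far richer models.
NEGATIVE_TERMS = {"hate", "fail", "quit", "revenge", "useless", "destroy"}

def flag_message(text, threshold=2):
    """Return the negative terms found if the message crosses a simple threshold."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = [w for w in words if w in NEGATIVE_TERMS]
    return hits if len(hits) >= threshold else []

# Example: an email expressing strong negative sentiment about the employer.
print(flag_message("I hate this job and I'd love to watch the company fail."))
# -> ['hate', 'fail']
```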
09:16
Now, some technical
09:18
risk indicators that you would look at: introduction of unauthorized devices, like a USB drive. Maybe you have a company policy that no USB devices are allowed, or only certain ones are allowed, and you're seeing evidence of unapproved devices being connected.
09:33
Installing unauthorized software,
09:37
so they get caught installing software they're not supposed to install, especially software that could be used to tap things, like Wireshark, or some other malicious type (not that Wireshark is malicious). Malicious software, anything like that, would raise a red flag.
09:56
Attempting to increase privileges: maybe they're trying to make themselves an administrator.
10:01
High printing volume: all of a sudden they're printing way more than normal. Maybe they're smart enough to know that you're monitoring data flowing out of the organization, and they don't want to email all the secret documents,
10:13
but maybe they'll print them and carry them out in a briefcase.
10:16
Access during non-standard times. Of course, people check their email every once in a while over the weekend, and if that's an established pattern of behavior, that's one thing. But if all of a sudden an employee starts coming into the office or logging in late at night, and that's not normal for them, that might be something to investigate.
10:35
Unusual network traffic is something else you might want to look at:
10:39
excessive uploads or downloads,
10:41
accessing prohibited websites
10:43
and accessing systems without authorization. (A simple sketch of checks for off-hours access and excessive upload volume follows.)
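Here is a minimal sketch of two of those technical checks, off-hours access and excessive upload volume, run over hypothetical log records. The record formats, business-hours window, and threshold are assumptions that an organization would tune to its own baselines.

```python
from datetime import datetime

BUSINESS_HOURS = range(7, 19)    # 07:00-18:59 local time (assumed window)
UPLOAD_THRESHOLD_MB = 500        # per-day upload volume worth a closer look (assumed)

def off_hours_logins(logins):
    """logins: iterable of (user, ISO-8601 timestamp). Flag weekend or night access."""
    flagged = []
    for user, ts in logins:
        dt = datetime.fromisoformat(ts)
        if dt.weekday() >= 5 or dt.hour not in BUSINESS_HOURS:
            flagged.append((user, ts))
    return flagged

def heavy_uploaders(daily_upload_mb):
    """daily_upload_mb: dict of user -> MB uploaded today. Flag excessive volume."""
    return {u: mb for u, mb in daily_upload_mb.items() if mb >= UPLOAD_THRESHOLD_MB}

if __name__ == "__main__":
    print(off_hours_logins([("jdoe", "2023-06-10T02:14:00"),      # Saturday, 02:14
                            ("asmith", "2023-06-12T09:30:00")]))  # Monday, 09:30
    print(heavy_uploaders({"jdoe": 820, "asmith": 40}))
```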
10:48
All right, a quiz question on insider threats: which type of insider threat is most prevalent?
10:56
A, malicious insiders; B, criminal insiders; or C, negligent insiders?
11:05
If you answered C, negligent insiders, you are correct. That is the most prevalent kind of insider threat that we see.
11:11
Second question: which of these is not an example of a technical indicator of insider threat activity?
11:18
A, work performance decline; B, installing unauthorized software; or C, excessive uploads or downloads?
11:30
The answer here is A. All three of these are potentially indicators of insider threat activity,
11:35
but the technical indicators are B and C, whereas A is a non-technical indicator of insider threat activity.
11:46
In summary of the insider threat lesson, we talked about how to prepare for and respond to insider threat
11:52
activities within an organization, and we also talked about common technical and non technical indicators of insider threat activity.
Up Next
Before the Incident: Good Cyber Hygiene and Vulnerability Management
Resources to Protect an Organization
Partnerships Between IT and Security
Service Level Agreements (SLAs) and Metrics to Monitor Protection Abilities
Zero Trust Networks (ZTN), Edge Computing and Other Considerations