This step in building your insider threat program is a big one. Just look at the video duration. So why is this such a big step?
Well, this is where you define your insider triggers.
The concept of insider threat triggers has dual meaning, both of which are critical to understanding and mitigating the various types of insider threat risks your organization may face.
The first is what an insider does: these are the observable actions that indicate a risk.
The second is the events or situations that motivate an insider to commit their act.
At this point in the process, you should try to anticipate what an insider is thinking and what an insider might do, because your plan needs to include workflows around various triggers for insider threat behaviors.
And for that, we're going to look at three things to help you define your triggers: intent, action, and motivation.
Okay, an insider threat can be anyone from an employee or contractor to a third-party vendor.
Regardless of their status in your organization, their risky actions fall along a spectrum of intent.
On one side, you have the oblivious: these are the insiders who are unaware of the policies, rules, and risks.
Take, for example, phishing attacks or other methods of account compromise
If a user clicks a link that installs malware on their computer, the attack came from the outside, but the threat is now on the inside because the user made a mistake.
Obviously, such a user needs more training on what not to click. But what about reporting of this incident?
Your insider threat program should allow for self-reporting of mistakes such as this without fear of reprisal, as long as the mistake is reported quickly.
Self reporting in itself can be one of your triggers.
In the middle of the spectrum, you'll find the complacent.
The complacent user may understand that what they're doing is not exactly following established procedures, but feels the time savings is worth the risk.
Take, for example, a user who enables a personal cloud service on their work system that accidentally syncs and exposes all their work data from their personal cloud.
Most likely, they were prioritizing productivity or collaboration over established security protocols.
Sure, this user could also use some more training about proper tool usage. But ultimately, as much as you may try to block these kinds of actions, users will find other ways to make their jobs easier and thereby put assets at risk.
Your insider threat program should allow for the realities of the collaborative workplace, monitoring for and responding to these triggers without blocking the legitimate uses of workplace technology.
On the other side of the spectrum, you have the malicious.
These insiders are purposefully seeking personal gain or attempting to harm the organization or other individuals for various reasons.
And malicious insiders have all the classic motivations, ranging from financial gain to wanting to harm an organization in a specific way.
Malicious insider threats have a significant degree of motivation behind them and are on a completely different level than those who either don't know any better or are simply trying to make their jobs easier.
For the malicious actor, additional training probably won't dissuade them. If they're motivated, they'll find a way.
Having proper controls in place can mitigate the risk, and proper monitoring can help speed up the response.
Ultimately, because insider actions fall on a spectrum, your insider threat program should account not only for the oblivious and accidental, but for the malicious as well.
Of course, you can't directly monitor for intent.
While motivation will give clear context to your defined triggers, it's the actual activities that we can systematically monitor for.
the easiest part of defining your triggers is simply listing out the observable actions that expose you to risk,
for example, improper use of a personal cloud account.
It could be accidental, or it could be malicious; monitoring for that one action covers a range of possible scenarios. And to talk about the scenarios, let's hear from Jake Dwoskin.
Generally, IT sabotage is carried out by a disgruntled employee with elevated privileges, either as a last act before leaving or by creating a backdoor to be used after they depart.
Some acts of sabotage require physical access, but others could be carried out remotely.
Separation of duties and using the principle of least privilege can help mitigate these types of threats from bad actors.
Although IT sabotage from within is statistically infrequent, this threat vector uses information technology to impair the availability of data or services. Not only can it cause damage to your systems, but it can introduce additional risk, which can impact your organization's reputation.
Conducting audits and checks of the admins for your most critical systems at regular intervals helps to identify excessive privilege, privilege creep, and legacy elevated privilege. Privilege audits help validate appropriate levels of access and inform the use of controlled access and identity management programs.
Some common triggers that could indicate that a possible insider risk is afoot include the addition and removal of user rights in a short window,
logging in while away from the office or even unexplained system performance or disk capacity issues.
Another thing to look for is disabling of security controls or even the installation of hacking tools.
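To make one of these triggers concrete, here's a minimal sketch of what monitoring for rapid privilege changes might look like: scanning an audit log for admin rights that were granted and then revoked within a short window. The event names, log format, and two-hour window are all assumptions for illustration, not tied to any particular SIEM or directory product.

```python
from datetime import datetime, timedelta

# Hypothetical audit-log events: (timestamp, user, action).
# The action names are illustrative placeholders.
events = [
    (datetime(2023, 5, 1, 9, 0), "jdoe", "grant_admin"),
    (datetime(2023, 5, 1, 9, 45), "jdoe", "revoke_admin"),
    (datetime(2023, 5, 2, 14, 0), "asmith", "grant_admin"),
]

def rapid_privilege_changes(events, window=timedelta(hours=2)):
    """Flag users whose rights were granted and then revoked within `window`."""
    grants = {}   # user -> time of most recent grant
    alerts = []
    for ts, user, action in sorted(events):
        if action == "grant_admin":
            grants[user] = ts
        elif action == "revoke_admin" and user in grants:
            if ts - grants[user] <= window:
                alerts.append((user, grants[user], ts))
            del grants[user]
    return alerts

print(rapid_privilege_changes(events))
```

Here jdoe's admin rights were granted and revoked 45 minutes apart, so jdoe gets flagged, while asmith does not; in a real program an alert like this would feed the workflow you build around the trigger.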
Closely related to IT sabotage is insider fraud, which consists of people who manipulate data for personal or professional gain, typically monetary gain.
No one person should have excessive, unchecked privilege over systems allowing for the manipulation of data to their advantage. An unfortunately common scenario is an employee who uses or passes along sensitive information, such as financial statements or details about mergers and acquisitions.
Critical data, and those who have access to it, should have special controls and enhanced monitoring to capture unusual and/or suspicious behavior.
Then there's insider theft.
The classic insider thief is stealing data and selling it to an outside party.
When it comes to insider attempts to exfiltrate data, you may want to use the Pareto principle.
Some refer to this as the 80/20 rule.
Typically, 80% of the data is exfiltrated using 20% of the methods available. Common threat vectors for insider theft include removable media,
email attachments, and cloud syncing and sharing.
So defining triggers on those methods is a great place to start.
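As a quick sketch of the 80/20 idea applied to exfiltration methods, the snippet below picks the smallest set of threat vectors that covers roughly 80% of incident counts. The vector names and counts are made up for illustration; in practice you'd pull them from your own incident history.

```python
from collections import Counter

# Hypothetical incident counts per exfiltration vector, for illustration only.
incidents = Counter({
    "removable_media": 40,
    "email_attachment": 25,
    "cloud_sync": 15,
    "printing": 5,
    "ftp_upload": 3,
    "screenshots": 2,
})

def priority_vectors(incidents, coverage=0.8):
    """Return the smallest set of vectors covering `coverage` of all incidents."""
    total = sum(incidents.values())
    covered, chosen = 0, []
    for vector, count in incidents.most_common():
        chosen.append(vector)
        covered += count
        if covered / total >= coverage:
            break
    return chosen

print(priority_vectors(incidents))
```

With these sample numbers, the top three vectors (removable media, email attachments, and cloud sync) cover the 80% threshold, so those are where you'd define triggers first.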
And while insider actions and behavior are a good basis when defining your triggers, don't forget events or circumstances that can prompt insiders to act.
These are triggers too, and are generally associated with malicious actors.
From reviewing past cases, the IT saboteur has typically been with an organization for 2 to 5 years with elevated privileges on multiple systems. Most of these cases involve some kind of aggravating factor, such as poor performance, being overlooked for promotion,
concerns about business stability and or life changing events.
Keep in mind the use of these human behavioral indicators within your insider threat program should be vetted carefully through your HR and legal stakeholders.
More recently, workforce turnover is factoring into insider incidents. The simple act of changing jobs can tempt employees to take data. Roughly two-thirds of employees admit to taking data when they depart. While some are merely trying to make their next job easier, others believe the files belong to them
because, after all, it's their work.
More malicious employees might make use of sensitive data as leverage when negotiating a new job offer.
And these motivations can change depending on whether the departure is voluntary or not.
A voluntary departure could be someone announcing their retirement, starting a new career, or moving to a different organization,
and an involuntary departure is when the person is leaving the organization, but not on their own terms.
Studies have shown that people going through a layoff move through the five stages of grief and loss, and once the anger stage kicks in,
this employee could turn into a malicious departing employee or possibly an IT saboteur.
Your insider threat program should have a procedure for dealing with involuntary transitions.
Departing employees most likely would use removable media, unsanctioned cloud shares or attachments to personal email to exfiltrate data.
All of these events can be triggers for your insider threat program workflows.
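One lightweight way to wire departure-related triggers into workflows is a simple lookup from trigger to response steps. Both the trigger names and the steps below are hypothetical placeholders for illustration, not an established taxonomy; your own program would define these with HR, legal, and security stakeholders.

```python
# Hypothetical mapping from departure-related triggers to response steps.
DEPARTURE_WORKFLOWS = {
    "resignation_notice": [
        "enable_enhanced_monitoring",
        "review_recent_file_activity",
    ],
    "involuntary_termination": [
        "disable_accounts",
        "collect_devices",
        "audit_recent_exfiltration_vectors",
    ],
    "removable_media_use_during_notice": [
        "alert_security_team",
        "preserve_evidence",
    ],
}

def workflow_for(trigger):
    """Look up the response steps for a trigger; unknown triggers get a default."""
    return DEPARTURE_WORKFLOWS.get(trigger, ["log_and_review"])

print(workflow_for("involuntary_termination"))
```

The default branch matters: a trigger you haven't planned for still gets logged and reviewed instead of silently dropped.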
To be fair, not all departing employees are malicious; some easily land closer to the oblivious side of the spectrum.
But a non-malicious departing employee may still collect files that are considered intellectual property of the organization, erroneously believing themselves to be the owner of the files.
One of the biggest steps in building an insider threat program is defining your insider triggers.
This includes what an insider does, namely the observable actions that indicate a risk,
as well as the events or situations that motivate an insider to commit their act.
Because the intent behind insider actions falls on a spectrum, your insider threat program should account not only for the oblivious and accidental, but for the malicious as well.
Using the Pareto principle, otherwise known as the 80/20 rule, your insider threat program can identify the most common methods of exfiltration and prioritize these areas.
For accidental threats, you should have a reporting system in place that allows users to report their mistakes quickly without fear of retribution.
While the first signs of IT sabotage and insider fraud cases involve changes in behavior, additional process auditing or other checks should also be put in place to protect from the various malicious insider threats.
It's understandable that you'd want to jump right to the end and have the perfect insider threat program that covers every scenario and vector possible and detects all possible triggers.
Instead of building a monster program with classification schemes and policies that attempt to monitor every potential scenario and ultimately fail, start by focusing on the most common scenarios.
With technology ever changing, you really don't know where your threat vectors are going to be in 3 to 5 years, but you can identify the most likely ones.
Be sure to keep your plan flexible and reassess it with your stakeholders to be better prepared for new potential indicators as they become apparent.
And now, with your insider triggers defined, it's time to build workflows around them.