OSINT Ethics and Moral Dilemmas

Video Transcription
Welcome back.
In the last video, we did a dive into the OSINT cycle, and in this lesson we will talk about the ethical and moral questions that could and will appear at some point in your investigations. The ethics and morality of OSINT is a big field, since it can easily cross over into the legal side.
There are a great number of potential gray areas in OSINT. Intelligence work has always been associated with morally shady practices,
not because the data itself is illegal, but because of how you obtained that data and in what context it will be used as information, since it may have been clearly marked as confidential.
when doing open source intelligence. It is key to learn the laws around data privacy in any country and to ensure that you're working within those laws at all times.
What you share online can have a lot of personal data attached to it. Digital footprints are just that: footprints and traces of your digital activity. Since we are talking about the legal, ethical, and moral side of OSINT, I would like to mention a great description of OSINT by Mr. Robert David Steele,
and I recommend that you check out his work.
I have left it in the resources section of this course. He describes OSINT as intelligence based on information that can be obtained legally and ethically from public sources. In my opinion, every human being knows when they are doing something wrong. You know the feeling when something is off.
Now I will briefly touch on misinformation and disinformation.
These two words sound very similar, but they are different things, and either one of them could become relevant to your work. Misinformation is information that is false but not created with the intention of causing harm.
Disinformation is information that is false and deliberately created to harm a person, a social group, an organization, or a country.
I say this because you may find yourself exposing misinformation or disinformation in order to verify conflicting media, or you may even be the one creating that kind of information in order to obtain the valid information. And here I have to emphasize again: consult your legal team or a lawyer if you find yourself in this kind of situation.
now, I have a hypothetical example for you to think about.
Let's say you are doing your investigation and you stumble upon a database that is exposed but clearly marked as confidential. Should you immediately contact the owners and tell them about it? Or will you first look to see if there is anything interesting?
This is the point where you should really think about it, because if you go deeper into your investigations, these kinds of questions will appear more than once. There is a school of thought that says the fact that data is openly available does not mean that it can be processed without regard to legal and ethical standards. Put another way,
the mere fact that data is publicly available does not imply an absence of
restrictions on researching it. It comes back to the same idea: if you get a bad feeling about something you are doing, then that something is probably bad, or even illegal. We could go into a philosophical discussion on what is right and what is wrong, from Plato and Aristotle to modern philosophers like Zizek or Chomsky,
which I highly recommend for you to check out.
But in today's world, we have many legal frameworks that tell us what could happen if we behave badly, so I will mention it again: please always, always consult your legal team or a lawyer when you conduct these kinds of activities. Or try to think about it this way:
will your OSINT actions affect somebody's life, or someone's organization or company, in a bad way?
In this video, we covered, or better to say touched on, the ethical and moral side of conducting OSINT, and in the next one we will do a quick recap of this module.