Vulnerability Disclosure Program Part 3


Time: 8 hours 10 minutes
Difficulty: Advanced
CEU/CPE: 8
Video Transcription
00:00
Hi, I'm Matthew Clark. This is Lesson 6.6, Vulnerability Disclosure Programs, Part 3. This lesson will continue our conversation about bug bounty programs and talk about their pros and cons. We'll also discuss more about safe harbor and responsible disclosure programs.
00:17
And finally, we'll discuss what it takes to build a vulnerability disclosure program.
00:23
So we've been talking about bug bounty programs, which involve crowdsourcing. So let's enumerate some of the pros and cons.
00:30
One of the pros is that it is proactive. People are potentially looking for problems all the time. It's also cost effective
00:38
because you're only paying for actual issues, not the potential ones.
00:42
You're not paying for how long it took someone to find a problem, and you're not paying for them to learn on the job, either.
00:49
Another pro is that organizations get to limit impact,
00:53
because bug bounty programs have been accused of buying silence. In fact, I've left a reference to an article called "Bug Bounty Programs Are Being Used to Buy Silence" in the reference section and encourage you to read it.
01:07
The cons are that crowdsourcing is not really designed for inside the perimeter. I was watching the movie The Hobbit with my son the other day, and Bilbo Baggins is astounded when 12 dwarves come knocking at his door and expect dinner. They eat everything that Bilbo has in his kitchen.
01:23
There's even one dwarf that eats, like, three wheels of cheese,
01:26
and, just like Bilbo Baggins, you don't invite people inside your perimeter that you don't know, because you can't always be certain that they're going to sing as they throw around your priceless antique china, either.
01:38
The reality is that most bug researchers don't make a lot of money doing this.
01:42
In the CSO investigative report about bug bounties by J.M. Porup, he found that the number of people who made over $100,000 over the entire time they did bug bounties was actually in the low hundreds.
01:57
And that's $100,000 over the entire time they were doing bug bounties, not annually.
02:04
And bug bounty programs can have restrictive terms. In his March 31st, 2020, article for Fortune.com, David Morris details some of the difficulties that security researchers were having while trying to use bug bounties.
02:16
MIT researchers found bugs in Voatz, an online voting application.
02:22
Voatz was offering a bug bounty. However, the researchers almost immediately ran into trouble because the bug bounty terms set by Voatz restricted their ability to conduct proper testing.
02:32
The terms stated that security researchers could not test on the actual application but had to use a test app provided by Voatz. The test app did not function the same as the actual app.
02:45
There's a link to the article in the resources, although it's behind a paywall.
02:50
This is the article I spoke about by J.M. Porup. I included a link to it in the resource section. It's a great article because it sheds light on the topic of crowdsourced penetration testing. The crux of his article is that security researchers are being bought off. And how does the author say that's happening?
03:07
He says it's simple: they're being encouraged to trade safe harbor for a non-disclosure agreement.
03:15
Porup uses PayPal's terms of service to illustrate the catch-22 that security researchers often find themselves in.
03:23
PayPal reserves the right to withhold safe harbor at their own sole discretion, and the only way for researchers to achieve safe harbor is if they agree to PayPal's NDA.
03:36
So, going back to our previous lesson, can you now see why "they made me a better offer" looks so tempting?
03:43
Porup sums up this type of conditional safe harbor as: sign this NDA to report a security issue, or we reserve the right to prosecute you under the Computer Fraud and Abuse Act and put you in jail for a decade or more.
03:55
So what is responsible disclosure? Well, responsible disclosure programs give companies a mechanism to accept disclosures. They're generally a formal program with policies and processes. They may include safe harbor in exchange for an NDA. They are a mechanism to limit impact and to control outcomes.
04:14
And one of the main differences between this and bug bounties is the lack
04:16
of a bounty, and disclosure is performed after a reasonable amount of time, usually about 90 days.
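As a rough illustration of what such a program boils down to, here is a minimal sketch in Python of a disclosure policy represented as data. The program name, the field names, and the 90-day default are assumptions for illustration only, not any particular vendor's actual terms.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical sketch of a responsible disclosure policy as data.
# All names and values here are illustrative assumptions, not real terms.
@dataclass
class DisclosurePolicy:
    program_name: str
    offers_bounty: bool               # bug bounty vs. plain responsible disclosure
    safe_harbor: bool                 # promise not to pursue legal action
    nda_required: bool                # some programs trade safe harbor for an NDA
    disclosure_window_days: int = 90  # the "reasonable amount of time"

    def public_disclosure_date(self, reported_on: date) -> date:
        """Earliest date the researcher would publish under this policy."""
        return reported_on + timedelta(days=self.disclosure_window_days)

# Example: a plain responsible disclosure program with a 90-day window.
policy = DisclosurePolicy(
    program_name="Example IoT Vendor VDP",
    offers_bounty=False,
    safe_harbor=True,
    nda_required=False,
)
print(policy.public_disclosure_date(date(2020, 1, 15)))  # 2020-04-14
```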
04:23
Which takes us to Google and Microsoft. Google has a 90-day policy to disclose vulnerabilities. So in January of 2015, Google published details about a Microsoft Windows vulnerability, including the exploit code, one day before Patch Tuesday. Microsoft then
04:39
publicly blasted Google for reckless behavior and irresponsible disclosure, but it had hit their 90-day window.
04:45
Now,
04:46
there are a lot of different frameworks for building a vulnerability disclosure program. We're not gonna go into each one of these, but feel free to pause the video to review.
04:57
So let's talk about what it takes to build a vulnerability disclosure program. It's probably surprising that it just takes the same things that we use to build any other successful security program: policies and process and people and technology.
05:10
On the policy side, what are the internal expectations? Can everyone agree on what you're trying to accomplish? Is legal going to write the policy, or do you need to bring in external counsel? And it will certainly take more than one discussion with senior management.
05:26
And the process: regardless of how you ingest a disclosure, whether through a bug bounty program or a responsible disclosure program, you have to have a process for what you're going to do next.
05:35
Is there a RACI involved? Who's responsible? Well, that's at least one person. Who's accountable? Well, that's only one person.
05:44
Who's consulted? Who's informed? Does engineering just get the disclosure to verify it? If so, who in engineering gets it? How long do they have to review it? Are they simply verifying that it's a possibility, or are they trying to recreate it? Can you recreate the conditions that were reported? What if the error goes beyond what was reported?
06:03
What if there are actually many conditions where this behavior could be observed?
06:08
Who's prioritizing one disclosure over another? What priority does this initial investigation have over normal work?
06:15
If it is verified, what's the communication out to the security researcher? How do you communicate? What do you communicate?
06:23
Who decides who's going to communicate?
06:26
How long will it take to draft the communication? Who's going to research the fix? How does that patch process work, and how is it prioritized against other enhancements?
06:36
Is there a scale to determine which bug fixes have priority, and how long will that patch process take? Is it 90 days?
06:45
And what do you do with the security researcher? How are you going to communicate with them?
06:50
And what's the culture gonna be?
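To make these questions a bit more concrete, here is a minimal Python sketch, assuming a hypothetical RACI assignment, a simple severity scale, and a 90-day patch target. The role names and rules are illustrative assumptions, not a prescribed process.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical sketch of triaging a single disclosure.
# Role names, severity bands, and timelines are illustrative assumptions.
RACI = {
    "responsible": ["product-security-engineer"],  # does the verification work
    "accountable": "ciso",                         # exactly one accountable owner
    "consulted":   ["engineering-lead", "legal"],
    "informed":    ["senior-management", "support"],
}

@dataclass
class Disclosure:
    reported_on: date
    summary: str
    severity: str           # e.g. "critical", "high", "medium", "low"
    verified: bool = False

    def patch_deadline(self, window_days: int = 90) -> date:
        # Aim to ship the fix before the researcher's expected disclosure date.
        return self.reported_on + timedelta(days=window_days)

def triage(d: Disclosure) -> str:
    """Decide the next step for a report, per the questions above."""
    if not d.verified:
        return (f"Assign to {RACI['responsible'][0]} to reproduce; "
                f"{RACI['accountable']} is accountable for the outcome.")
    if d.severity in ("critical", "high"):
        return f"Prioritize over normal work; target a patch by {d.patch_deadline()}."
    return f"Schedule with other enhancements; target a patch by {d.patch_deadline()}."

report = Disclosure(date(2020, 3, 31), "Auth bypass in device API", "high")
print(triage(report))   # verification step comes first
report.verified = True
print(triage(report))   # then prioritization and a patch deadline
```

The point of the sketch is simply that each of the questions above ends up as an explicit decision somewhere: an owner, a deadline, or a rule.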
06:53
On the technology side, do you have the right workflows to even get the right people involved and to track the process?
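One way to picture those workflows, purely as a sketch, is a small state machine for a disclosure ticket. The states and transitions below are assumptions for illustration, not the model of any particular tracking tool.

```python
# Hypothetical sketch of tracking a disclosure ticket through a workflow.
# The states and transitions are assumptions, not a specific tool's model.
WORKFLOW = {
    "received":  ["triaged"],
    "triaged":   ["verifying"],
    "verifying": ["verified", "rejected"],
    "verified":  ["patching"],
    "patching":  ["disclosed"],
    "rejected":  [],
    "disclosed": [],
}

def advance(current: str, target: str) -> str:
    """Move a ticket to the next state only if the transition is allowed."""
    if target not in WORKFLOW.get(current, []):
        raise ValueError(f"Cannot move from {current!r} to {target!r}")
    return target

state = "received"
for step in ("triaged", "verifying", "verified", "patching", "disclosed"):
    state = advance(state, step)
print(state)  # disclosed
```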
06:59
On the people side, do you have the right people involved? Are enough people assigned to this? How does this get prioritized against people's normal day job that they have to keep up with?
07:10
Hopefully, you can see the purpose of having a dedicated, talented, professional CISO to help lead and guide this process.
07:17
During a vulnerability disclosure event, one of the most important roles the CISO is gonna play is working with the business. This is where having a defined process for handling incident management is so crucial.
07:30
You have to shield your developers and engineers from people in the business who are constantly asking, "Is it done yet?" and "How much longer?"
07:36
If this sounds like "Are we there yet?" coming from the back seat of the car during a road trip, it is, but the effects are much more damaging. Whenever you break the concentration and effort of the people thinking through a solution, you literally halt their progress for longer than it takes to hear and answer that question.
07:56
That's just the way your train of thought works.
07:59
Well, that's it for this lesson. In this lesson, we completed our journey into the mysterious world of vulnerability disclosures. We peered into the underbelly of bug bounty programs, finding safe harbor and responsible disclosure programs as well. And we asked ourselves, how do I make my own vulnerability disclosure program?
08:18
Well, we decided we needed a framework and policy and process and a great CISO.
08:24
I'll see you next time.