Ep.04 Casey Ellis | Bugcrowd and Responsible Disclosures
In this episode of the Cybrary Podcast, we sit down with Casey Ellis, the Founder and CTO of Bugcrowd. Speaking with Cybrary's VP of Engineering Mike Gruen, Casey explains how Bugcrowd is a community for creating feedback loops between people who think differently, and how the term "responsible disclosure" got its name.
Mike Gruen, VP of Engineering at Cybrary, and Casey Ellis, founder, chairman, and CTO of Bugcrowd, have a candid chat about their experiences and thoughts on pen testing and DevSecOps. Casey also shares his career transition from an IT apprenticeship to entrepreneur and founder of Bugcrowd.
Casey talks about his curiosity, interests, and moral code from a young age, and the different experiences that made him an entrepreneur: bailing out of university, finding an IT apprenticeship that paved the way to pen testing, and then moving into sales and solutions architecture. That mix of experiences led him to found Bugcrowd. He also shares how the idea of a bug bounty program started appealing to customers during a meeting in Melbourne, the drawbacks those customers raised, and how he set out to level the playing field and kicked off Bugcrowd.
Mike shares his thoughts on the SQL injection attacks against Cybrary. Casey agrees about the hostility and explains how the attacks are ever-changing. They also talk about the bugs found and how to use that feedback to make appropriate changes. Casey describes the bug bashes, or live hacking events, that Bugcrowd conducts to bring skilled talent to one table.
Mike also shares his experience as a developer and engineer, learning from Black Hat sessions, and how he unknowingly DDoSed his campus network during his college days. They both share their stories and thoughts on DevSecOps, attack surface management, and vulnerability disclosure programs. Mike excitedly announces that Cybrary is launching a responsible disclosure program, and shares how Cybrary is trying to enable people by providing labs and environments to learn in a safe space.
If you are interested in hearing more about Casey's journey and the upcoming trends in the pen testing world, don't miss the fun-filled discussion between Mike and Casey.
Mike: So, this is Mike Gruen from Cybrary, the VP of Engineering here, with Casey Ellis from Bugcrowd. I'll let him introduce himself.
Casey: Yeah, thanks Mike. So, yeah, Casey Ellis, the founder, chairman, and CTO of Bugcrowd. I'm originally from Australia, so that's where the funny accent you're hearing comes from. I've been out here in the US for about seven years now. We started Bugcrowd midway through 2012, got into an accelerator program, and things started working rather quickly. It was always the plan to actually relocate over to the Bay Area, raise some financing, and do all that stuff. So yeah, here I am.
Mike: Cool. How'd you get started? What got you into cybersecurity?
Casey: Ah! Ooh! Wow! At a young age, you know, I would say I've always had this, I've always appreciated the idea of thinking like a bad guy and kind of enjoyed that, but never wanted to be one, if that makes sense.
Mike: Yeah, that totally makes sense.
Casey: Like seeing a locked door and automatically wondering what's behind it and how you could find out, and all that sort of thing, but always having this... you know, being raised, I think, with a fairly strong kind of…
Mike: Conscience! [laughs]
Casey: Yeah, conscience and moral code. Like, it's this idea of: that's all fascinating, but don't do it in ways that cause harm to people. So, that was kind of the precursor to it. My old man was a science teacher, so I was, you know, pulling apart radios and playing with computers in elementary school back in the eighties and stuff like that.
And really, you know, my career started after I finished high school in Sydney. I took a gap year, as you do down there, and came back to start university, and loved what I was learning but couldn't really connect it with any sort of purpose. So, I bailed out after a couple of weeks and kind of tripped over into an IT apprenticeship, which was where my pen test career started, because I was, you know, doing break-fix and things like that.
And then I just started hacking things, you know, taking what I learned on BBSs and IRC and through friends and stuff like that, and actually applying it. And then, you know, I pretty quickly realized that this sort of feedback is really valuable to organizations. Like, they didn't realize how bad things were, especially back at that point in time.
And I think we're having a renaissance of that now, but it was so easy to do, and the demonstration of that risk to the organization was, in and of itself, a really valuable thing. So that's what got me started from a career standpoint. Yeah, I mean, from there, I did the tech side of things for about six or seven years, then moved across into sales and solutions architecture.
It was a bit of a weird move, but basically, I got married. [chuckles] My wife sat me down at one point and said, Hey, like you computer good, but you people good too. And I don't think you realize that's not that common.
Mike: Right, not that common.
Casey: Yeah. Maybe you should try your hand out the front of the house and see how you go there. And at that point, I was intrigued by the idea of not just solving problems, but actually connecting the solution to where the problem exists, which is more around the sales and marketing side of things than the tech side. So, I did that for a period, and then I think those two sets of experience got together and conspired, and I got it in my head that I wanted to be an entrepreneur.
So, I broke bad and started doing that, and, you know, Bugcrowd was really the evolution of a bunch of different experiences across all three of those phases of my career. At the time, I was running a pen testing company that was white-labeled and outsourced. You know, it was a great business.
You know, frankly, I miss it from time to time, because I was working a lot less hard than I am now. But looking at, you know, the setup: one person being paid by the hour, no matter how good they are, is eventually going to be outsmarted by a crowd of adversaries, especially when you consider the fact that they're trying to secure the product of a crowd of creative people on the builder side as well.
Casey: So, the math is wrong. That's the problem that was kind of keeping me up at night, and, you know, I saw the opportunity to solve it with Bugcrowd, and so far, so good.
Mike: That's really interesting. So, what were the early days of Bugcrowd like?
Casey: Yeah, it was fascinating. I think the origin of the idea, when it kind of coalesced... like, I'd already started bringing in incentives and game theory with the pen test team I was running at that time, and that was really a precursor to some of the game mechanics that we use in Bugcrowd now. But the thing that was really the, call it the napkin moment or the light bulb moment, was a business trip I took down to Melbourne to meet with a set of customers we were working with in the traditional consulting context, from all sorts of different verticals: telcos, you know, technology companies, there was a bank in there, whatever else.
And what was happening at the time was that Google and Facebook were starting to pretty actively market the VRP, the vulnerability rewards program, their bug bounty program. They'd sort of gotten to the point where it's like, this is working, we want to go out and tell the board about it. And, you know, everyone I spoke to on that trip wanted to talk about it, and they were asking me, what do you think? Like, this seems logical, it's subjectively cool, and it's this kind of crazy Bay Area thing that people are doing that has a certain, like, sex appeal to it.
Casey: But it was more than that. It's like, this seems like a logical way to level the playing field. If we've got this, you know, economic and resourcing imbalance in how we're trying to outsmart the adversary, the idea of throwing the net as wide as we possibly can to get a diversity of talent seems like a logical approach to basically creating a defender's advantage, instead of being stuck with the attacker's advantage. Right?
So, my question at that point was, well, what's stopping you? And they all said the same things. It's like: hackers are scary; I don't know how to pay someone in Uzbekistan; my team's constrained as it is, so how am I going to actually manage the overhead of having a conversation with the internet?
Like, there was a set of maybe seven or eight things that they all basically said without having spoken to each other, and it was literally on the flight home that I realized, hang on, those are all potentially solvable problems. If we can build out a platform, and a service layer behind that, to actually address those issues that people have in accessing crowdsourcing in the first place, and then start to build out this army of people that can be brought to bear, you know, maybe we can take this concept from being a crazy Bay Area company thing and actually make it into something that looks more like the future of work inside cybersecurity. So…
Casey: Yeah, and literally, it's still a vivid memory. It was, you know, a can of Pringles and a Crown Lager on a flight from Melbourne to Sydney. I came up with the name Bugcrowd on that flight. I came up with the initial reward model on that flight. I got home, 'cause this is pre buying things on your mobile phone, and registered the domain, got the Twitter handle set up and all that sort of stuff. So, if you do a whois on bugcrowd.com, you can actually see what day that happened, and that's all part of our folklore.
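The whois lookup Casey mentions really does expose a domain's registration date. A minimal sketch of pulling the creation date out of raw whois output; the sample text and date below are invented for illustration, not bugcrowd.com's actual record:

```python
import re

def extract_creation_date(whois_text):
    """Pull the 'Creation Date' field out of raw whois output.

    Field names vary by registry; this handles the common
    'Creation Date: ...' style that .com registrars return.
    """
    match = re.search(r"Creation Date:\s*(\S+)", whois_text)
    return match.group(1) if match else None

# Illustrative sample of whois output; NOT a real registry record.
sample = """\
Domain Name: EXAMPLE.COM
Creation Date: 2012-01-15T04:12:00Z
Registrar: Example Registrar, Inc.
"""

print(extract_creation_date(sample))  # 2012-01-15T04:12:00Z
```

In practice you'd feed this the output of the `whois` command-line tool (or a TCP query to the registry's port-43 whois server, per RFC 3912) rather than a hard-coded string.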
Mike: Oh, wow!

Casey: Yeah, that's really how it all kind of kicked off.
Mike: So, Cybrary suffers a little bit from the same thing, which is, you know, we serve the cybersecurity community, but we ourselves don't really think of ourselves as being a cybersecurity company, right? We're sort of in this weird position, and I wonder if you guys feel the same way. Like, I have to worry a lot about security, because, you know, fun fact: our SQL injection course page is the one that gets hit most by SQL injection attacks, and it's the same with cross-site scripting. So, I imagine you guys are sort of in a similar boat, where you're probably a valid, you know, a valuable target, or…
Casey: It's an extremely hostile platform.
Mike: But even more hostile than ours.
Casey: Yeah, totally. When you consider, you know, the power and the creativity and, like, the chaos, and I mean chaos in the positive sense, like the chaotic thinking that comes with being a breaker, containing all of that on the one platform... we've learned a lot, especially in the early days, but even still, you know, we're learning a lot about defensive architecture, and even the new attack techniques that people are using. Because I think the thing that's been really interesting, that we've observed out in our customer base, but even with our own experience and the conversations we have with the crowd, is that half the attacks that are going to be really relevant tomorrow aren't broadly known, or even considered, today.
Casey: So, like, there's new stuff that comes over the hill that, as it turns out, was always there the whole time; we just weren't thinking about it or looking at it in that way. So yeah, that side of it's really fun, and obviously we get to see that for the customers as well. Like, we started eating our own dog food very early on in the piece, just because we thought it was the right thing to do.
Mike: Right. I think you should drink your own champagne if you're going to try and sell it.
Casey: Yeah, but I'd be failing to invoke the Aussie self-deprecation if I talked like that.
Mike: So, at the last place I worked, we also ate our own dog food, despite our sales team's best efforts to say, no, drink your own shit.
Casey: Don't say that, or say it's not official. So, yeah, it's an interesting one, 'cause, you know, I do think as well, and this is probably, I sort of heard you tapping into this at the front of the question, like, my view…
And I think our view as a company is that security, like, this is not about bugs. It's not about, you know, vulnerability scanning. It's not even necessarily about some of the higher-order things like, you know, disclosure, and legislation around good-faith hacking and things like that. It's actually about creating feedback loops between people that think differently.
Casey: You've got builders who are very good at, you know, making a thing work, and they're incentivized to make that thing work, but they're not necessarily KPI'd on making it not do all the stuff it shouldn't. Right? So, logically, you need a mindset that can come in thinking that way, to identify the risks that are obvious and need to be fixed straight away. But I think it gets even better when you start to get almost, like, you know, a Vulcan mind-meld thing happening from one side to the other, and you end up with builders that, you know, aren't security experts necessarily, but are just mindful of the fact that, yeah, bad stuff can happen if you aren't thinking about this as you do your job.
Mike: Yeah, no, I mean, that was my experience. So, my background: I'm a VP of Engineering, but all the way back, I've been a software engineer, right? And I got into cybersecurity kind of similar to you, in the sense that I've always been a tinkerer. I've always taken things apart. I always wondered, yeah, I wonder how I could break that; probably shouldn't. And then, flash forward a little bit, I got exposed to it while working as a contractor. I was the liaison between the company that was doing the security testing on the platform, the contractor that had built the thing, and the client. Right? And it was amazing. And it speaks to one of the things you mentioned, about being well-spoken and also being able to do the work.
The particular person who was doing the testing was a really good communicator, wasn't in any way threatened by my presence, and showed me all these cool tricks. And I was like, oh wow, now I know how to prevent this. Now I can test this stuff on my own and make your life easier next time you come to do the scan, to do the testing, so he wouldn't have to write up a big report.
Casey: Well, and, you know, I think this has always been, and continues to be, an issue in the pen test industry, and with automated tools as well. You find a thing, and you come back 12 months later and find that same thing again, and then 12 months after that, the same thing's still there. And it's because, really, it's a failure to actually sell the bug to the engineer. It's like: you get what's going on, you've heard what I've said, you can probably even understand the technical side of it, but you don't care enough to prioritize it above all of the other things that you need to get done in a given day.
Mike: Maybe a little bit, but I think in a lot of cases it gets back to what you were saying before, about that sort of education, just being mindful of it. I don't think any engineer goes into it dismissing those things. They just forget, or don't really think about it, or whatever. And to me it's never exactly the same thing; it's always some slight variation on a theme. And it's like, if I could just teach you the theme: sanitize your inputs is the solution, not just fix this one SQL injection. You know, there's other things that…
Casey: Fix that one thing in that one spot. Yeah, no, totally. And just to be clear there, I'm not talking about it being a product of ambivalence; it's really priority, right?
Casey: 'Cause, like, as I was saying before, if you are building things and that's the KPI, that's the thing that makes the business money, then logically any other goal needs to fight to get to the top of that list.
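Mike's point about teaching the theme rather than patching one instance can be shown in a few lines. This is an illustrative Python sketch using the standard-library sqlite3 module (not Cybrary's actual code), contrasting string concatenation with a parameterized query:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

user_input = "alice' OR '1'='1"  # a classic injection payload

# Vulnerable: concatenation lets the payload rewrite the query logic.
leaked = conn.execute(
    "SELECT email FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: a parameterized query treats the payload as a literal value.
safe = conn.execute(
    "SELECT email FROM users WHERE name = ?", (user_input,)
).fetchall()

print(leaked)  # [('alice@example.com',)] -- the OR clause matched every row
print(safe)    # [] -- no user is literally named "alice' OR '1'='1"
```

The "theme" is that the parameter placeholder works for any query and any input, where escaping one field on one page fixes only that one spot.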
Casey: And I think that's the hard part. So, you know, the bit that we've seen be really effective is when you start getting reports in from the outside world, from the crowd. It feels, to a security team, but then also to an engineering team and anyone else around, like you're being hacked for real. It's like, holy crap, this kid from, you know, 8,000 miles across the ocean just owned my stuff. I didn't think the boogeyman was real in the way that I now do. Because, you know, he's friendly and we're going to pay him, and that's all great, but I wonder what his next-door neighbor's like. And all of a sudden there's this awareness of the fact that not only do vulnerabilities happen, but they can be exploited by the outside world. The "oh, shit" moment is what we refer to it as.
Mike: Yeah, no, I agree with that. Making it a little bit more real, more tangible, makes a lot of sense, because I think a lot of devs don't necessarily think about boogeymen breaking their stuff; they think about inept users breaking their stuff, which is also a very valuable thing, and QA teams are there for that. And it's the same sort of thing: engineers are good at building, but not everybody has the same capability when it comes to destroying. It's definitely a different skill set, and it's very, very hard to find someone who has both skill sets.
Casey: A different mindset. Yeah, exactly.
Mike: And so, yeah, just having some awareness of how your bridge may fail is an important part of building bridges, right? And so, I think that education, and then also making that boogeyman real, or more real, is good. That's been my experience as well.
Casey: Yeah. To me, when you think about bridges, it's a good analogy, 'cause, you know, for structural engineers and mechanical engineers, a lot of what they learn is around failure.
Casey: It actually starts with failure, I think, for a lot of the training and the courses those types of people do. And I think there's a lot of that sort of anti-pattern training around in engineering in general, and security has definitely tried to create more of those and get them across. The reason that I keep bringing it back to this boogeyman concept, and the priority stuff we were just talking about, is that once the awareness of the importance of this is internalized, the behavior changes. All the while that it's external, you can choose to do it, or you can choose to deprioritize it. That might not be a function of, you know, ambivalence or understanding or anything else; it might just not be the most important thing that needs to be done at that point in time. Right?
Casey: So, once the need and the importance of doing it is internalized, that's when the behavior starts to change more long-term, I think.
Mike: Yeah, definitely. And I think it's actually more than just engineering. For a lot of engineers, especially backend engineers, it's already important; it's already sort of there. I think it's actually at the business level where the struggle is, in making it more real to them. And, you know, the problem with the bridge analogy, or the building analogy, is that when a bridge collapses, people actually die, right? When a website collapses, probably not so much. But it's becoming more of a thing, right, as these breaches happen and as there's way more money on the line; there's businesses that can go under.
Casey: Well, and you're absolutely right. I think the ties to safety criticality from the cyber domain are way less obvious.
Casey: Than they are with, you know, bridges and aircraft and cars and dams and different things like that. But the thing that we're seeing more and more, and actually getting a lot more engaged with over the past couple of years, is the fact that we're putting the internet on cars and dams and...
Casey: You know, bridges and all of that other stuff. So, I think the bridge between the cyber realm and the physical realm, you know, safety-critical engineering and safety-critical testing, is kind of having its "oh crap, we need to get to that really quickly" moment.
Mike: Yeah. I mean, you look at all of the parts of the grid that are online and available online. There was the incident out in California where they shut down the power, and somebody actually did, unfortunately, pass away because they were on an oxygen tank. So, we are definitely crossing that boundary; I agree with you. Also, I do think that businesses are becoming more aware, and one of the things I like about Bugcrowd, one of the reasons we've moved forward, and we're happy to announce that we're doing a responsible disclosure program with you guys, is that it'll make it a lot easier for me to report up to my management the things that we're seeing. It doesn't just come in as yet another bug in our pipeline; I can actually segment out the vulnerabilities and those issues. We have a really good DevSecOps mentality here.
Our whole build process has vulnerability scans and dependency checks, and checks for vulnerabilities on those dependencies, and so on and so forth. But in the end, those can't catch the more complex or more nuanced problems. And so, having reports from those systems, plus reports from the outside, makes it so much easier to say: this is where we're at, this is how much time we're spending on it, this is how I feel from a security perspective. It makes it more real for them as well.
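The dependency-check stage Mike describes boils down to matching pinned versions against an advisory feed. A toy sketch of that idea; the package names, versions, and advisory IDs here are all invented, and real pipelines use maintained tools and databases rather than a hard-coded dict:

```python
# Hypothetical advisory feed: package -> (vulnerable version, advisory id).
ADVISORIES = {
    "examplelib": ("1.2.0", "EXAMPLE-2020-001"),
}

def audit(requirements):
    """Return (package, advisory) pairs for pinned deps with a known advisory."""
    findings = []
    for line in requirements.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        name, _, version = line.partition("==")
        vuln = ADVISORIES.get(name)
        if vuln and vuln[0] == version:
            findings.append((name, vuln[1]))
    return findings

reqs = """\
examplelib==1.2.0
otherlib==2.0.1
"""
print(audit(reqs))  # [('examplelib', 'EXAMPLE-2020-001')]
```

This is deliberately simplified: it only handles exact `==` pins, whereas real auditors compare against affected version ranges.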
Casey: Yeah, definitely. And it aligns the business behind it as well. I think, going back to what you were saying, the idea of security... like, it's a fascinating space. It's really interesting. The stuff that we get to see is just fun; there's elements of it that are just fun, frankly. But when you boil it dry, it really is just a subset of QA. Like, these are bugs with benefits, right?
Casey: And, you know, functionally, it also really is just a subset of business risk management, in the same way that other financial risks are, like forex or arbitrage or whatever else. And I think where we're really up to as a market, everyone, like, the hackers and the defenders and the practitioners in the middle as well, is beginning to realize that, but then starting to work out how to operationalize it.
Casey: If this is a risk issue, how do we quantify it? How do we make sure that we're investing appropriately in it, not too much, not too little, all of that sort of stuff? I think the need to do that is becoming pretty clearly understood, but we haven't quite figured out how yet. But, you know, the whole idea of what you referred to, a responsible disclosure program…
I think part of it is this idea of a business being mature enough to say, okay, I'm ready to accept, and actually invite, feedback from the outside world on my security posture. You know, it's interesting, 'cause that's not a new thing; that's been going on for 30 or 40 years. It's the use of the word "responsible" that's been a hot debate pretty much that entire time. Because, you know, I actually like and have defended the use of "responsible disclosure" versus, you know, "vulnerability disclosure" or "coordinated disclosure" or one of the other ways you could frame it, because there is a responsibility on both sides to see the conversations through, to actually measure the risk, to communicate, to empathize, to do the right thing by the user.
Casey: That's where the responsibility is. I think the challenge with that term is that, historically, it's been used as a way to put researchers back in their box. There's a loading to it, almost.
Mike: I'm interested.
Casey: If you're a hacker and you're trying to help me, and you're not doing everything exactly the way I ask you to do it, you're somehow therefore irresponsible. Which, you know, can be true, but I think the way it's been used has been a pretty heavy hammer that's been dropped on the heads of a lot of people historically.
Mike: That's really interesting, because I've never really spun it that way, the responsible part being applied to the person doing the submitting. I've always considered the responsible part being also the company, owning that responsibility of disclosing: we know that we have these problems.
Casey: Which is perfect, and it's the way I think it should be used. I think there's been some really interesting debates. It's one of these ones where, if you start a fight about it on Twitter, it'll go on for, like, three days.
Casey: Almost without fail. But, you know, the whole idea of people having that default, on your side, as an engineering leader, like, the default is: this is our responsibility, and it's the responsibility of the submitter to do the right thing as they do that, and we're all going to be responsible and shit's going to get fixed, and it's great. That's not historically how it's been perceived or interpreted, I think.
Mike: Right, now I can see that. Like, I'm not from that world, right?
Mike: So, I can see not because I'm naive, I assume. And my naive take came from like, assuming that the reason, why it's not called a vulnerability disclosure program is because some marketing person at a company said, we don't want to put up that we have vulnerabilities. And so, let's come up with a different term. So that.
Casey: Honestly, that was one of the other reasons that we continued using "responsible" as a term after we started Bugcrowd, for a period. Like, we've only really shifted away from it over the past two years or so, for that exact reason: "responsible" works as a marketing term. You know, when you think about it, you've got someone in the security team or engineering team, or whoever within an organization, that decides this is a good idea and wants to do it, but then they need to sell the concept to the rest of the business. And, you know, going out and saying, cool, we're going to have a bunch of hackers look at our stuff and tell us what's wrong…
That can be frightening to some people, right? So, whatever could be done to soften the language around it, that was actually part of it as well. But yeah, I mean, "vulnerability disclosure" I think is technically accurate. And we're at a point now, especially for, you know, cloud-first kinds of companies, where the idea of vulnerabilities just being a product of human creativity, and not this thing that we never have because we don't ever talk about it…
Casey: That, I think, is becoming more normalized now. So, you know, I can see pathways forward on this one. We'll see how we go.
Mike: It makes sense.
Casey: I'm definitely not going to die on that hill, but it's one of those ones where it's just interesting to understand that there's a lot of history behind what we're doing. Like, Bugcrowd didn't invent the idea of bounties or crowdsourced vulnerability disclosure. We pioneered the concept of intermediating that through a platform and then extending the use case; that's the part that we did invent. But the rest of this stuff's been around for a really long time.
And I think, you know, some of the history behind it is actually really important and good to understand, so as to avoid similar mistakes in the future.
Mike: Yeah, that makes total sense. So, when you think about what you're doing, do you also see the benefit to the hackers, in terms of, by providing them this outlet to do this work and get recognition for it, that it's also having a positive effect, you know, sort of in that world?
Casey: A hundred percent! My very favorite stories are the ones where it's, you know, kids who, frankly, have access to a lot more ability to create damage than I did as a young hacker. Like, everything's far more connected and moving far more quickly, so a nine-year-old can reasonably do some pretty evil stuff.
Casey: And that happens, right? That's actually a thing that does happen. So, you know, there's this phenomenon where you've got these digital natives that think in the quirky ways we were talking about before, that end up in a position where their power and their ability has kind of outpaced the development of their moral compass.
So, it's not like they're actively deciding to be a bad person; they're just kind of wandering off into sketchy territory. And the challenge is that, for a lot of the people that get recruited up into gangs and organized crime and all that kind of stuff, that's often where they get picked up. Because, you know, they'll get deemed a juvenile offender, they'll have a record, they'll actually get set on a path that repeats that, and then they get identified and basically drawn into some pretty shady territory. So, the fact that we get the opportunity to intercept that, and to create, like, you know, the Sith-versus-Jedi kind of thing... it's like, here's the Jedi option that you've got now, it's actually pretty easy for you to get into, let's roll. That's pretty cool.
Mike: Yeah, no, that makes a lot of sense to me. I feel like, had I had a place like that early on in my career... like, there was a moment where I thought about going more into the cybersecurity realm or into engineering, and, for whatever reason, I continued to pursue building things. But I wonder, if a platform like yours had existed at that time, whether I would have been more inclined to at least have that outlet on the side, in the evenings and stuff like that.
Casey: Honestly, you know, what I would love to see... I've got a theory that DevSecOps is evidence of the fact that security and engineering are eventually going to merge.
So, it won't be, you know, us versus them. Like, there are different specializations and skill sets, and the builder-versus-breaker thing still happens, but they're not seen as different teams. You see that with, like, security champions. I mean, even the story that you told, your origin story, is a little bit of an example of that: bridging those gaps and bringing the whole thing together.
So, the idea of engineers that do a little bit of hacking in their spare time to stay sharp, or to learn new things, or to make extra money or whatever else, to me that would be an ideal goal state. Like, builders that learn how to break are oftentimes the most effective, because they understand where the bodies are buried, so to speak, from that standpoint.
Mike: Do you think it works in the opposite direction?
Casey: Yeah, it's an interesting one, because breakers that learn how to build can have a tendency to get tripped up by all of the bad stuff they know can happen. But if they've got the right kind of mental shape and the right kind of approach to product development, then that can be really effective as well. To be frank, though, I would say that the skills deficit in cybersecurity is a lot more pronounced than the skills deficit in engineering, so I haven't seen as many go in the opposite direction.
Mike: That makes sense. I think it's also a lot easier to get into building things than it is to get into breaking things, right? Like the first things I ever did on a computer... I was a mechanical engineer in college, so it was just writing little shell scripts to do this, this and this. And incidentally, my first foray into cybersecurity was when I accidentally DDoSed Colby College trying to see if my girlfriend was online.
Casey: Ping, ping, ping of death or something like that.
Mike: No, just tons of finger commands. So yeah, I think maybe that's true. And I've also seen a lot of people make that transition from building into breaking, because they have the right mindset and they're interested in it.
Casey: Well, I think that goes back to... I've been involved in cybersecurity in various forms since the 90's, really, and working in it from a career standpoint since the early 2000's. And the thing that's shifted a lot over the past five or six years is that people care now. Previously, they kind of didn't. There were those folks that got it, and that was about it. There were people that were fascinated by it, like myself, that were trying to push that whole thing forward, but there wasn't this backdrop of, okay, we understand that this is an issue that needs to be solved and improved. My theory is that Snowden was actually one of the trigger points in people shifting how they think, because all of a sudden it's like, oh wow, this actually does affect me. It's a thing and it is connected to me in some way. That's something that the entire population got exposed to as a concept all at the same time.
And then when you think about what happened the year after that, you know, 60% of the credit cards in the US got popped. The year after that you got Ashley Madison and healthcare stuff getting breached. The year after that you've got election interference and all that fun stuff.
And then it's just gotten progressively more dystopian, but I think easier to understand for the average person over that space of time. So what that nets out to is, if it's repeated enough times at the dinner table, it makes its way into the boardroom, which means all of a sudden a whole bunch of people get recruited into that space, which creates opportunity.
Mike: So, yeah, I'm looking forward to a whole bunch of much better hacker movies and television shows that are a little more accurate.
Casey: Yeah. I think the problem with that is hacking is really boring to watch. No, I know, right? You spend half your time waiting for scripts to run, and it doesn't really make for good television.
I will say the exception to that: we do a thing called bug bashes, which are live hacking events. And it's almost like trying to construct a mini kind of drama, you know, an eight-hour TV thing. There's a lot of prewiring that goes on, and, you know, I know maybe half the stuff that they're going to do on the day. But we fly in hackers from all around the world that have skills relevant to the target, and the organization will ship in their engineering team and some executives and whatever else.
And we basically just knuckle down and break stuff over a weekend. Doing that with cars, for example, is actually pretty fun to watch, because people are getting the crowbars out. I mean, IoT is interesting as well, because you're, like, JTAGing into stuff and dumping firmware, where you can actually see the manifestation of what happens when you hack stuff.
Mike: Right. Turning up their thermostat.
Casey: Yeah. Yeah, exactly. Exactly. I mean, the vehicle stuff is crazy, because they're such big, complicated systems, and we've been slapping the internet on them for the past 10 years without necessarily thinking of it as a safety issue until the last four or five.
Casey: And there's a lot of, you know, there's a lot of stuff to test that, which makes it really fun.
Mike: You know, my favorite Black Hat session that I went to a few years ago was on the car hacking stuff.
Casey: Right, one of your interests.
Mike: I think that was it, yeah. And one of my friends also, as a side gig, consults for Detroit. So he was the one who was like, you've got to go see this. It might not be applicable to what you're doing, but it's going to be the best thing you see at Black Hat.
Casey: Was he wrong?
Mike: No. Well, he was wrong in one way: it was applicable to the stuff I was doing. I might not be building cars, but at the time we were building out a message bus type system. And I came back from Black Hat and I was like, okay, here's a whole bunch of new requirements for you guys, because I don't want replay attacks, I don't want all of these different things that we hadn't even considered. We had just assumed that our little bus was going to be completely inaccessible. Like, yeah, of course we'll trust every message on this damn thing.
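The replay-attack requirement Mike describes can be sketched roughly as follows: sign each bus message with an HMAC over its payload plus a nonce and timestamp, and have consumers reject anything tampered, stale, or already seen. This is a minimal illustration in Python, not how Mike's actual system worked; the key, message fields, and freshness window are all hypothetical.

```python
import hashlib
import hmac
import secrets
import time

# Hypothetical shared secret, assumed pre-provisioned to producers and consumers.
SECRET_KEY = b"demo-shared-secret"


def sign_message(payload: bytes) -> dict:
    """Attach a fresh nonce, a timestamp, and an HMAC so consumers can
    verify origin and detect replays or tampering."""
    nonce = secrets.token_hex(16)
    timestamp = int(time.time())
    mac = hmac.new(SECRET_KEY,
                   payload + nonce.encode() + str(timestamp).encode(),
                   hashlib.sha256).hexdigest()
    return {"payload": payload, "nonce": nonce, "timestamp": timestamp, "mac": mac}


class Consumer:
    """Rejects messages with bad MACs, stale timestamps, or reused nonces."""

    def __init__(self, max_age_seconds: int = 60):
        self.seen_nonces = set()
        self.max_age = max_age_seconds

    def accept(self, msg: dict) -> bool:
        expected = hmac.new(SECRET_KEY,
                            msg["payload"] + msg["nonce"].encode()
                            + str(msg["timestamp"]).encode(),
                            hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, msg["mac"]):
            return False  # tampered or unsigned
        if time.time() - msg["timestamp"] > self.max_age:
            return False  # too old: outside the freshness window
        if msg["nonce"] in self.seen_nonces:
            return False  # replayed: nonce already consumed
        self.seen_nonces.add(msg["nonce"])
        return True
```

A real deployment would also need key rotation and a bounded nonce cache, but the core point stands: once you assume a hostile actor can reach the bus, every message needs to prove both who sent it and that it hasn't been sent before.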
Casey: And that's literally, when we're talking about critical infrastructure and all that stuff, the same problem that's playing out now in aviation, in shipping, in power, all of that. These are networks that have been around for 30 or 40 years.
Sorry, my phone's ringing here. So, networks and systems that have been around for 30 or 40 years, and all of a sudden the Internet's inside them. And it's like, oh, hang on, we didn't design access or permission models, or even traffic control models, that actually support the potential for a hostile actor being within this network. What are we going to do about that? The first thing, obviously, is keep them out. But then, okay, assume that they get in at some point: how do you start to redesign the things that you've built?
Casey: So, like what you went through with the message bus thing, I think it's still an anti-pattern that exists in a lot of places. Oh, no one's ever going to touch this.
Mike: Right. If we have a hard enough outside, then we don't need to worry about the inside.
Casey: Yeah, and obviously this is shifting as everything turns inside out with cloud and digital transformation. The older organizations in particular are really struggling to get their heads around that, because it's a fundamental shift in threat model.
Mike: Hmm. Yeah, that makes sense. I think there are some people even within those organizations that have their heads wrapped around it, but you're steering a giant ship. The larger the organization, the slower it is to turn.
Casey: Yeah and the more moving parts you've got to line up in order to turn, I think that actually ends up being the bigger issue.
Mike: Right. Yeah, definitely, and all of the legacy stuff. I did a brief stint at a larger organization who I think were taking the right steps in a DevOps, DevSecOps type direction, but they're just so large that I was like, I don't know that I have the patience to be here to see this all the way through.
Casey: Rolling DevOps out, rolling Agile out, you know, digital transformation teams that work in companies that are 30 or 40 years old. Honestly, I tip my hat to those folks, because that's a lot to manage. You've got the folk that have been doing it the old way looking at some of the newest stuff and saying, well, why aren't we doing that?
So they have to keep their arms around becoming accidentally Agile. But then if you want to fully transform, it's not just engineering processes that need to change. It's literally the entire cadence of the business.
Casey: It needs to be basically forklifted at that point to be able to support it. So...
Mike: Right. And sometimes...No go on...
Casey: It sounds like you've spent a lot of time with a leg on either side of that fence, I think.
Mike: Yeah, no, I mean, I was in several meetings with the business side, the people who were in a lot of ways the most affected, because they didn't have direct access to engineers anymore. Before the shift it was like, oh, there's programmers assigned to these guys, and they just tell them what to do and they hack together whatever needs to get hacked together, in the hacking-as-building kind of way, not the...
Casey: The original definition of the word.
Mike: Yeah, exactly. So, yeah, building some things. And then all of a sudden there's more processes in place and this notion of, no, no, no, you don't each have your own team. We're going to pool these guys and build these multifunctional teams, and you're going to be a member of a team. And none of it was quick. There was no way to roll it out in a phased manner; it was definitely a "we have to do this" kind of big step, and they were...
Casey: Big bang. Yeah.
Casey: Yeah. I mean, it's interesting. We just announced a new product called Attack Surface Management, and part of it is actually to address the problems that organizations going through that have. Usually there are two big challenges: the idea of "where's my old stuff," but then also "where's my new stuff."
Casey: The legacy IT issue. Like, where are the things that came in through that acquisition that we did back in 2002 that we thought we switched off? It's probably good to go and find them and make sure, because usually by that point they're forgotten, and you need to literally go out to the entire internet and look for stuff that's believed to belong to you.
Casey: And then on the shadow IT side, especially for these larger organizations where there are pockets of transformation happening but it hasn't been rolled out right across, you end up with a whole bunch of stuff happening not necessarily under the governance or oversight of the security team, or even the IT or engineering team.
Casey: It's just, like, a credit card and an EC2, and off you go. And that's a really common thing, which is reasonable, because you're trying to build a product and get things done. Right?
Casey: But the risk is …
Mike: As long as you keep it under this budget, my corporate card is going to be okay.
Casey: Yeah... exactly. So that's really what attack surface management was born out of: watching what happens with bug bounty programs. A lot of the risk comes from things that people have just forgotten about that are on the internet. That's what we've seen, because in the traditional bug bounty model, the first to find each unique issue is the one who gets paid, and the more severe the issue is, the more they'll get paid for it.
The different hackers have developed their own workflows to look in places they feel others aren't, which has netted out to this whole "holy crap, we don't know where our stuff is" kind of realization for the companies we've worked with, and I think the internet at large. This has been a problem for a long time, but I feel like it's actually had the light shone on it, partly with credit to the bug bounty hunters that have gone and done this.
So yeah, we've optimized to do just that part. It's like, all right, let's go find your stuff. You know, it's 11:00 AM, do you know where your assets are? And then prioritizing it, and actually being able to use the context we have from having done this now for seven years to say, okay, we're not going to test each one of these assets, but based on what we've seen before, these are the ones that you should go and look at first.
Mike: Yeah, the whole shadow IT thing happens even at a small company. I've worked predominantly at small companies, and in the last couple of jobs I've been in charge of IT. And then as the VP of engineering, I'm head of security and IT and infrastructure and all sorts of...
Casey: Many hats.
Mike: Yeah. Many, many hats, right, because it's a small company. But it's amazing how much procurement and other things don't necessarily go through the process, even in a small company, because you're trying to move fast. You don't have all the processes in place. Nobody even knew, oh, I needed to talk to somebody before I bought that thing, you know? And I think that happens a lot, so I don't even think it's just a big business problem. I think it happens at any company. And there's more and more technology coming online for people. My go-to example is marketing, right? Marketing has access to all of these tools and all of these capabilities, and making them more security aware when they're looking at how they go about doing their thing is really important, because now the choices they make can actually affect your security posture.
Casey: Most definitely. Yeah. And adding to that, I think the default route to market for any of these platforms is actually to go through that back door.
Casey: You know? Okay, how do I find the person with the credit card and the idea, or the pain point or whatever, have them try this thing out, then kind of dig yourself in and upsell from that point? I love that as a go-to-market motion, because, given the fact that digital natives are growing up and getting checkbooks, it's probably how things are going to look more often than not in the future. But the risk it creates: you've gone and gotten this platform.
And then the next thing you're doing is integrating it with Google Drive and plugging your emails through it and all of this stuff. And all of a sudden you've got this authenticated, basically, backdoor into your organization that was actually built to be that. It isn't the work of a malicious actor. It's like, okay, that's a fun problem.
Mike: Right. Yeah, I mean, when I think about Chrome extensions that potentially have the ability to read all of your emails, it's very difficult for me to control that. Or all the Slack apps. And then people complain that I've cut off their access. It's like, I'm sorry, we need to worry about these things. Slack is our major form of communication. We don't use email, so everything's in there.
Casey: My favorite quote around that, and it's aspirational, because I don't think anyone really does it that well yet, is the idea of making security easy and making insecure obvious as the goal. Because this whole "I need to use that Chrome extension," this tradeoff of usability for security, that's not going to go away.
Casey: So the problem becomes, okay, how do you allow as much freedom as possible to get your job done whilst making insecure things obvious to the user and to the business as well?
Mike: Right. And that's actually Cybrary. I mean, it's almost as if I planted you as a shill for Cybrary. But that's a lot of what we're looking at. We call it security enablement, which is really: how do you educate more and more of the tech team? That's where we're getting started, with DevSecOps and engineers and others, and helping them to make good decisions. And beyond that, I see security going well beyond security awareness, which is like, don't click the link. Okay, we get it. But that's actually really hard to not do in many cases. And then...
Casey: The internet's kind of built on links, so... sound advice.
Mike: That was one of my other favorite Black Hat sessions, by the way. I can't remember her name, but she basically invited somebody to try and spear phish her. She's like, I'm a security researcher, there's no way I'm going to fall for this. And sure enough, she fell for three out of five of their attacks, because it's just that easy.
Casey: Yeah. I mean, people, you know, people are the weakest link in all of this.
Casey: It's a phrase I trot around a lot. Like cyber security is, is a human problem. The technology just makes it go faster.
Casey: The idea of like people making mistakes and the idea of, you know, people being motivated to exploit those to their own advantage and to the detriment of the person who's made the mistake. Like that's, that's a concept that predates the internet by a couple of thousand years.
Casey: But now we're in this position where it's, it's all, you know, hyper-connected growing and changing. Like by the second and really the ease of entry to be able to, you know, both make those mistakes, but also exploit them is going up alongside that. So, you know, that becomes the challenge.
Mike: Yeah, it reminds me of one of my favorite quotes, and I'm probably going to get it wrong. It's Mitch Ratcliffe from like '93, something along the lines of: "No other invention in human history lets you make mistakes as fast as a computer, with the possible exceptions of tequila and handguns."
Casey: And if it was today, you'd probably add Twitter to the top of that list. But yeah, that's a great quote. I like that.
Mike: The other one, you know, we were talking about shadow IT and finding that person with the credit card to just get you in and get you embedded. It all goes back to another one: it's easier to ask for forgiveness than it is for permission.
Casey: Getting something at a lower level.
Mike: Yeah. Because it's so much easier once people rely on a thing. I mean, at every place I've worked where we've sold solutions to a team like that, that was our way in, right?
Like, mostly in the government space: how do we get this in? The approval process is impossible, but we have this awesome natural language processing engine, and if we can just make ourselves invaluable to them, then somebody up top will say, yeah, they need this, we'll figure it out, we'll take the risk.
Casey: Totally. And it's interesting, because on that forgiveness-not-permission thing, Grace Hopper is practically a patron saint of Bugcrowd. We're massive fans of hers and have been for a long time, and that quote is usually attributed back to her. It's a funny one, because I think the direction that technology and business are taking right across the board, and obviously this is true in tech, but I think it's happening elsewhere as well, is that model of forgiveness, not permission. It's like, how hard can you push a thing to get the things that you need done, either as an individual or as a company with your view of what the future should look like?
And then be able to seek forgiveness if you get it wrong. Right? So for security, the challenge becomes: how do you minimize the damage? How do you minimize the impact of the people that might need to ask for forgiveness later? Because it's not like you can ask them not to do that.
Mike: Right. Right.
Casey: That's, that's part of the challenge.
Mike: Yeah. That's very interesting.
Casey: Very cool.
Mike: I know we're getting close to the end of the time we had booked. Is there anything else that we want to cover before we sign off?
Casey: Yeah, I'd love to hear a little bit about this whole offensive security side of security, because there's obviously a lot to it, and the idea of breaking things is just one piece. What's Cybrary like? How are you guys thinking about the evolution of that over time? Because it clearly is rising in popularity.
Mike: Yeah, let's see.
Casey: How do you teach your team to be breakers? Or let me reframe it: to think like breakers.
Mike: Right, to think like breakers.
Mike: So, I have an advantage in that, as a small team, I've hired some pretty senior folks. It's a matter of having a couple of the right folks on the team, where you sort of interviewed for that mindset at a senior level, and then they're the ones who push those ideas down to the others. So I really looked for the senior engineers to have some of that already. And to an extent I have that background as well, so I can push on that when talking to people. I don't know if I'm a good representative set or not, but definitely having a couple of people on the team who have those ideas and have been through it before helps. And DevSecOps, I mean, there's a reason why I've really embraced it.
It's more than just, oh, let's treat our infrastructure as code and all the rest of it. It's: how do we really secure the entire pipeline? And part of what we have to do in order to make engineers care is make it so that you can't merge to master if you have some dependency that has a vulnerability in it. Like, go fix that. Just making it part of their job in an automated way, like any other sort of test. I'm curious, actually, if you guys have ever thought about some sort of model where you can get more involved earlier on. By the time we're out in production, or even in a QA environment, I feel like things have gotten a little too far. The earlier I can bring some people in to look at what we're doing, the better off, or happier, I would be.
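The merge gate Mike describes is what tools like `pip-audit` or `npm audit` do against a real advisory database. A minimal sketch of the idea in Python is below; the package names and advisory data are entirely made up for illustration, and a real gate would query a live vulnerability feed rather than a hardcoded map.

```python
# Hypothetical advisory database: package name -> set of vulnerable versions.
# A real CI gate would query a live feed (e.g. an advisory database) instead.
KNOWN_VULNERABLE = {
    "leftpad": {"1.0.0"},
    "oldcrypto": {"0.9.1", "0.9.2"},
}


def check_dependencies(pinned: dict) -> list:
    """Return (package, version) pairs that match a known advisory."""
    return [(name, ver) for name, ver in pinned.items()
            if ver in KNOWN_VULNERABLE.get(name, set())]


def ci_gate(pinned: dict) -> int:
    """Exit-code style result: 0 = merge allowed, 1 = merge blocked."""
    findings = check_dependencies(pinned)
    for name, ver in findings:
        print(f"BLOCKED: {name}=={ver} has a known vulnerability")
    return 1 if findings else 0
```

Wired into CI as a required check, a nonzero return fails the build, which is exactly the "you can't merge to master with a vulnerable dependency" behavior: the fix becomes part of the engineer's normal workflow, like any other failing test.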
Casey: Yeah, definitely. I mean, part of what we've done there: we've done partnerships with, like, DAST vendors, and we've done some partnerships with security training, like security awareness, specifically around remediation and prevention of bug classes in code, that particular section of it. And the interesting thing we get to do is, because the crowd is so diverse in its skillset, but also because they're incentivized based on impact, we get a pretty good view of what I'd call systemic anti-patterns that exist within an organization.
It's like, okay, you've got a lot of reflected XSS kicking around the place. Is that an education issue, or is there a framework fix that you could deploy? Or, you know, doing things like what you're saying: putting controls in the CI/CD pipeline, or linting, or whatever.
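The "framework fix" for reflected XSS that Casey mentions usually means escaping user input by default at render time, so no individual engineer has to remember to do it. A tiny hedged sketch in Python, using the standard library's `html.escape` (the function and page shape here are illustrative, not any particular framework's API):

```python
import html


def render_greeting(user_input: str) -> str:
    """Template-style rendering that escapes by default, so reflected
    input is rendered as text and can't inject markup or script."""
    return f"<p>Results for {html.escape(user_input)}</p>"
```

Modern template engines (Jinja2 with autoescaping, React's JSX, etc.) bake this in, which is why a framework-level fix can wipe out a whole bug class at once, whereas per-developer education only reduces its frequency.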
Casey: There are different outcomes for different classes of vulnerability, obviously, and some are far more common than others. But there's the ability to use the crowd as almost a temperature gauge on what this team, or even groups within a team, most need education around when it comes to coding best practices.
That's something that we've been doing now for probably six or nine months, I think, getting deeper into "shifting left," to borrow the trope. Some are like, why is it left? Why isn't it, like...
Mike: It's funny, because I was talking to someone the other day who called it shifting right.
Casey: I've heard people say east and west, and it's like, I don't know what we're doing here, just tell me...
Mike: We're just linear. Right?
Casey: There you go. It's all a line, apparently, but it was a circle, so that's confusing as well. But anyway, the idea of being able to... we did a gig for the Air Force, which we talked about at Black Hat, where they basically got the crowd in to help out with source code analysis, amongst other things. Which is not something you'd necessarily think, oh yeah, the crowd could do that, because it's like, oh, this is intellectual property. Can you trust these people to the degree that you're comfortable doing that? And we were able to satisfy those burdens of trust to the level that the Department of Defense said yes to it. So I think there are examples of really being able to connect the creativity, through the platform, into parts of the development pipeline, to your point from earlier.
That said, our bread and butter is production and dev. If it's on the internet, we can hit it. That's just easy, and there's a lot we can do from there, working backwards. But as you said, the earlier you can capture a problem, the cheaper it becomes to fix.
Casey: And I think honestly the more impactful learnings can be to the organization as well, because at that point, the commit is fresh. Right?
Mike: Right. Not just that. I think it's also easier because, and it's not that I want to blame anyone, it's a little bit easier to track back to how this happened and make sure that the right people are being educated. I can't tell you how many times we'll get that sort of, oh, there's a vulnerability here, and then it gets kicked over to one of the senior engineers, who goes in and fixes it but had no hand in its creation. The important thing is, let's just get this fixed. Sometimes there's an educational moment around it and sometimes there isn't, but frequently it's just seen as any other bug that gets resolved.
Casey: Definitely, and that part really does go to what we were talking about earlier around the diplomacy, the communication skills, and then honestly the acceptance on both sides, not just the security side but the engineering side, that these things happen. Like, if you can write a 20,000-word essay without making a spelling mistake, congrats! But...
Mike: Right, or a grammar mistake, or any number of others.
Casey: Yeah, exactly. There are things that are going to happen naturally, just as a product of the fact that humans are creatively powerful but we're also not perfect. So starting with that as the benchmark and being able to say, hey, here's the thing. Like the whole allergy to git blame that you can see, I understand it.
Casey: And I've been on various versions of the receiving end of it at different points in my career, so I get it. But at the same time, if you can't give that feedback, and if that feedback is seen as inherently toxic, then no one's going to learn anything.
Mike: Right. So, I think...
Casey: There's some sort of balance.
Mike: Right? I mean, I think there's a lot of roles to play in that, like making sure that that feedback isn't toxic, Right? It's...
Casey: Yeah. A hundred percent.
Mike: It's not just how it's delivered, because everybody has communication issues at some level. It's also being able to hear it in that positive way of, oh, we're all on the same team, we're just trying to make things better. But then it's also on the manager, or lead, or whoever, to make sure that people don't feel like they're being punished for having made a mistake. It's got to be okay to make a mistake, as long as you're willing to say, oh crap, I made a mistake, let me go in there and fix that. And hopefully, I can't guarantee I won't make it in the future, but I'll certainly be less likely to.
Casey: Absolutely. So, all pulling towards the same North Star, and having empathy, seems to be key in all of this.
Mike: Empathy. Yeah. That's a great way to think about it. There was... What was that?
Casey: That's a good place to land this.
Mike: I did want to get back to the Cybrary question, because one of the things you mentioned is trying to educate more of the engineering side, and that's really what security enablement is. That's one of the things we're leaning towards, right: giving these environments, giving training and labs and hands-on exercises to engineering teams, with the goal of educating them a little bit. We don't expect them to become security experts, but we want them to be making good decisions.
Casey: Yes. Yep, definitely. That's cool. So you're talking about it from an awareness standpoint and from an enablement standpoint: actually knowing how to make a good decision, as well as understanding the need for it and the importance of it. Is that right?
Mike: Yeah. And giving them those environments to work in and really learn in, more of a safe space than just on their own application. Right?
Casey: Yeah. Yeah. That's awesome. Very cool.
Mike: Very cool. Yeah. I really appreciate you taking the time to talk to us now.
Casey: It's been a fun chat as always. Appreciate it.
Mike: Take it easy.