In this episode of the Cybrary Podcast, Jeff Man, who is currently working as a Senior Information Security Consultant, speaks with Cybrary's Head of Infrastructure, Jonathan Meyers. Jeff is a well-known name in the hacking community. He takes us from how he got his start in cybersecurity to contributing to the book Tribe of Hackers: Cybersecurity Advice from the Best Hackers in the World, on sale now.
Jeff has had an interesting path through life. He started his career as an intern at the Naval Surface Weapons Center (formerly the Naval Ordnance Laboratory), where he was asked to read The Hunt for Red October, and where he also met his first PC. His learning curve on that PC was mostly kick and poke, trial and error, just figuring things out. After graduation, he returned to the same laboratory, working as a clerk typist. Later he went to work at the NSA, where he spent 10 years, starting as a cryptographer. He left the NSA 23 years ago to work for a commercial organization that was setting up computer security networks. He continued working in credit card security and the payment card industry, joining Trustwave at the end of 2004 and doing PCI work for about 10 years. About six years ago, Jeff went to work for Tenable as a PCI expert.
He found his way to being one of the hosts of Paul's Security Weekly, and after a stint of freelance work he joined a company called Online Business Systems. Jeff and Jonathan discuss the development of the security market, breaches, and the hacking community.
At the end of the episode, Jeff shares the story behind the book Tribe of Hackers: Cybersecurity Advice from the Best Hackers in the World, which was written by Marcus J. Carey.
The main takeaway from this episode is that security awareness is important not only for security professionals but for all the members of an organization. It can be increased with the help of training courses.
Jonathan: So if you kind of want to just give like a brief background of yourself, I know you have a background.
Jeff: I can't promise I'll be brief.
Jonathan: Oh, that's fine. Cause I know your first degree was not anything related to cybersecurity, information technology, anything like that. So how did, how did that kind of come about if you will?
Jeff: So, my background is very much not technology related or computer related. I was actually a business major in college, and that was probably my fourth major in college. I basically kept looking for the one where I had to do as little work as possible and still pass. And I grew up in the area, in Silver Spring, and both of my parents worked for the government, for the Department of the Navy. My dad was a physicist, and he actually was involved way back in the '50s in some of the early testing of the hydrogen bomb. He actually was on one of the ships that watched the detonation of the first hydrogen bomb, which eliminated a little atoll called Enewetak (you can look that up in the history books), but he worked at the Naval Research Laboratory. My mom was in HR at what used to be called the Naval Ordnance Laboratory, which later became the Naval Surface Weapons Center. And my start in all of this really was that I got a summer job between my junior and senior year of college as an intern, working at the Naval Surface Weapons Center. I went to work for a physicist who did anti-submarine warfare research. On my first day on the job, he was trying to explain what he did, and he said the easiest thing would be to read this book that had just come out recently, and the book was called The Hunt for Red October. Might've heard of it. So in my first week on, you know, my first real job, I got to read a book, which I thought was kind of cool. But my project for the summer as an intern was this: the physicist had a locked five-drawer filing cabinet filled with research material that he'd collected over the years. He had gotten some money and was able to buy one of these newfangled things called a desktop computer, and he got a database program, and he wanted me to basically go through this cabinet full of all sorts of research material, catalog it, and build some sort of rudimentary searchable database.
So I spent the summer going through all these different documents and books and articles and research papers, trying to extract keywords. And I got to play on this PC and learn how to use this database program. I want to say it was dBASE II.
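The rudimentary searchable database Jeff describes is essentially an inverted index: keywords mapped to the documents they appear in. A minimal modern sketch of the same idea (the titles and keywords below are invented for illustration):

```python
from collections import defaultdict

class Catalog:
    """A tiny inverted index: keyword -> document titles. The same idea as
    the dBASE II catalog, minus the filing cabinet."""

    def __init__(self):
        self._index = defaultdict(set)

    def add(self, title, keywords):
        # File the document under each of its keywords (case-insensitive).
        for kw in keywords:
            self._index[kw.lower()].add(title)

    def search(self, keyword):
        # Return every cataloged title tagged with this keyword.
        return sorted(self._index.get(keyword.lower(), set()))

# Invented example entries:
cat = Catalog()
cat.add("Acoustic Signatures of Submarines", ["sonar", "acoustics"])
cat.add("Towed Array Design Notes", ["sonar", "hardware"])
print(cat.search("sonar"))  # ['Acoustic Signatures of Submarines', 'Towed Array Design Notes']
```
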
Jonathan: And what was the learning curve on that? Did you have to read like three more books to learn how to?
Jeff: No, I mean, back in those days, there weren't three more books to study on anything, the PC itself included. And this was before the days of Windows; this was 1984. When you turned it on, you had the choice of booting into one of two operating systems, DOS or something called CP/M. And I think in the first couple of weeks, I effectively blew it out; I accidentally deleted, basically, the operating system, so we had to start over. So the learning curve was more just kick and poke and trial and error, just figuring things out. And as part of learning the database program, not only was I doing the cataloging, but I was very much into sports at the time. The Redskins were sort of on the upswing; that was back when they were good and winning Super Bowls. So I kept track of all the stats, which you could see in the newspaper every day, and I tried to translate that into learning how to do the database: putting in stats for a game and then compiling them within the program, so it would have the cumulative totals and averages and things like that. But my lesson in security, and I've talked about this several times at conferences, was really this: I came in one day and opened up the safe, and found in the drawer a pink sheet of paper saying, please come and visit the security office. It turns out I had left the safe unlocked overnight. And I was a young kid in college, and I said, well, what's the big deal? You know, I'm in this government facility that's got fences around the borders; to get into the building, you have to go past the guard desks; and the office itself was locked. So to me, there were all these layers of security. What's the big deal if a safe is left unlocked inside a locked room, inside a locked building, inside a locked, secured facility? Well, it turns out it was a big deal, and I didn't really appreciate it until many years later, when I realized:
Oh, well, when it comes to security, there are lots of moving parts, and any one little element in isolation may not seem like a big deal, but it all works together to secure, in this case, classified materials that were locked in a safe. And I have reflected on that over the years, especially the last couple of years, when I've been asked to come do talks and try to explain the state of what we now call cybersecurity. We didn't call it that back then. And I think the biggest lesson I've learned, or the thing that I've come to realize, is that back then, with security at the organization I was working for, everybody understood the rules, everybody knew their role, and everybody followed the rules. We were mentioning before we started broadcasting or recording that you guys are trying to move towards explaining cybersecurity to the whole organization, rather than making it the responsibility of one little group that's kept in a corner. Back then, everybody understood security, everybody knew the role they had to play, and everybody knew what was important about following all the rules. A lot of it was physical security procedures, but I think there's an analogy there, a crossover to what we call cybersecurity today. And I think it's something that we've lost as a society, if not as an industry. Within an organization, everybody has a role to play, and everybody should not follow the rules just because.
Jeff: But because it's part of the whole. It shouldn't be the type of thing where security is invisible and security is something that's done by somebody else. If I'm opening up my tablet or my laptop or my smart device, there shouldn't be the assumption that everything is secure and I can just use it in whatever way I want to. I have a responsibility for what I have control over in terms of security. And it's a little bit discouraging in some ways, cause I think a lot of that has been lost somewhere along the way. But anyway, that's how I got my start. I went back, finished my senior year of college, and graduated. Actually, I went back to the Naval Surface Weapons Center, the same place, and was working just as a clerk typist for the research library at the time. But my mother happened to have a friend in HR whose daughter had gotten a job at NSA. I grew up in Maryland, and I had never heard of the NSA back in those days. It didn't exist; nobody said that they worked at NSA. But they were hiring, and so I filled out the application form and sent it in, and they got in touch with me, and I ended up going up to Fort Meade for two or three days of testing. I took all sorts of different types of aptitude tests, and the long and the short of it was I scored well and they hired me. And they hired me without any job in mind; they hired me simply because I was qualified, or I had the aptitude, for doing the things that NSA does.
Jonathan: The typical government stuff. Right. They test your aptitude, then they tell you what you're qualified for.
Jeff: Right, right. So I went to work at NSA, and I was there for 10 years. I started out as a cryptographer. I actually worked on what we called at the time the InfoSec side of the house, the defensive side of NSA, which unfortunately more or less doesn't exist anymore. I was there for a couple of years doing some fun things, and then eventually became a cryptanalysis intern, and part of being an intern was that you did these six-month tours in different offices, so you could learn all the different aspects of cryptography and all the other things that NSA does. On my last tour, I ended up back on the InfoSec side of the house, working for an organization that was called Fielded Systems Evaluations. Their mission came about because somebody had figured out: hey, the way NSA, on the operations side, very often exploits the communications traffic of our adversaries is by taking advantage of the fact that the systems our adversaries were using were being misused. People wouldn't change default settings, people would reuse keys that were only supposed to be used one time, or maybe for a week, or they would find bypasses to the security altogether. And somebody said, well, gee, NSA produces the best cryptographic systems in the world, we're experts at it, but how do we know that these great systems we produce are being used properly once they're deployed in the field? And very often they're deployed to 18-, 19-year-old communications officers. You've been in the Army, you know the type.
Jonathan: I was that guy.
Jeff: You were that guy. And so you, your job is to get the message through and whatever shortcuts and things you can do to make the job easier. You're going to do it.
Jonathan: Especially when people are shooting at you and you're just tired at the end of the day. Like you're just, sometimes you just put it on autopilot after you're.
Jeff: Right, right.
Jonathan: Check out. And then that's where it.
Jeff: So I started in that office, finished off my intern program, and ended up just staying in that office. And there was a seminal event that happened while I was in that office, which was the release of Mosaic, which was sort of the first commercially available free web browser. That's what really made the internet begin to explode and made it accessible to the general public. So part of the office that I was in started focusing on what we called at the time network systems: how do we test the security of network systems? There was a small group of us that were aware of this whole thing called hacking and breaking into computers, and so we started to learn that, and to make a long story short, we ended up setting up what came to be known as the first red team at NSA. In doing that, we had to overcome all sorts of barriers, one being the reticence of NSA management, let's say, to venture into software land and into computers. We were accustomed to building little black boxes where messages came in and code came out, that type of thing. So there was a sort of reluctance to get into the whole computer space. And then there was also, I wouldn't say a political issue, but a very key issue, where NSA has this charter that says NSA only does what NSA does to foreign adversaries and foreign nationals, and NSA very specifically doesn't do what NSA does to US citizens. And of course, when we were trying to do good guy, ethical, white hat hacking, whatever you want to call it, against our own stuff, that very quickly became a political and legal issue of: can we do that? So we had to navigate the waters of how to do ethical hacking for the greater good.
Jeff: We were trying to determine the security of systems by breaking into them, and to do it under the auspices of NSA, but not violate NSA's charter. There's a very long story involved in that, which is best told over brown liquor, but it is a story that I sometimes tell at various conferences under the right circumstances. The long and the short of it is, like many, many people that have grown up in the military and in the DoD and gotten this type of experience, we saw the greener pastures and the more lucrative opportunities out in the private sector. And so I left the NSA 23 years ago and went to work for a commercial organization that was setting up computer security and network security consulting type operations, doing ethical hacking. We didn't really call it red teaming back then; we called it vulnerability and threat assessment.
Jeff: But essentially it was: we're going to break in and find all the holes and tell you how to improve your security by letting the good guys do it first. Some of the things that inspired us were movies like WarGames or Sneakers, and much more so probably Sneakers, cause that was all about hiring an expert firm to come in and show you how bad you were. So I did that for several years, ended up working for a dot-com startup, and limped along that way for several years, always having a lot of good experiences in terms of the customers. But organizationally, the vision of management was: we need to produce a product, cause that's where you get the force multiplier and the big payout, whereas we were just doing consulting and advisory work. Somewhere along the line I ended up working for some people that I had known from the NSA days and previous employments, and I went to work for what at the time was a startup company, started by a bunch of NSA people, called Trustwave. That got me into credit card security, the payment card industry. So I went to work for Trustwave at the end of 2004, and they handed me this document called the PCI Data Security Standard and said, read this, this is what we're going to do. So that got me involved in PCI, and I ended up doing PCI for about 10 years. One of the people that I used to work with at NSA, someone you guys are familiar with, Ron Gula, had gone off and founded Tenable, and we'd always kept in touch and saw each other a couple of times a year. He finally approached me, about six years ago now, and said, hey, I finally got a spot for you at Tenable. Come work for us, be our PCI expert, and help us market our products and make sure our products are fine-tuned to work in the PCI space. So I left consulting land and became more of an advisor.
I think technically I was a product marketing manager when I started at Tenable, but one of the things that Ron asked me to do was: I want you to go out and start speaking at conferences. I want you to represent Tenable. You know, you've been in the business for a long time; I want you to tell stories. So I started doing that, and that's been gradually ramping up over the last several years. One of the people I met at Tenable was Paul Asadoorian, and somewhere along the line I found out that he was doing a webcast, podcast type of thing. I found my way to his studio, and at some point I started being on the show. And of course, I'm still there; I'm one of the hosts of Paul's Security Weekly. So I did that for several years. My time at Tenable ended, and I did some freelance work for a while, and finally, mostly because my wife wanted me to have a steady paycheck and benefits, including healthcare, I went back to work basically with the guys I had been working with right before I left Tenable. I was with a team working PCI primarily, and my old boss had left the company where we worked and had started a security practice at a company called Online Business Systems. I don't know if the camera can see that.
Jeff: And he basically was building a security practice at an IT services company. Again, we had always kept in touch, and I finally came back to him and said, do you have a spot for me? So a little over a year ago, I went back to work for them. I'm back in PCI land, back doing advisory and consulting work, which is really my passion. What I've enjoyed most over the years is being able to go out and meet different companies, meet different people, and teach them about security, explain to them what security is all about. And the crux, very often with PCI and with most companies I've seen that have to do PCI or any other regulatory compliance, is that, like we talked about earlier, this institutional knowledge either was never there in the first place, or has been lost, or just didn't transfer very well. I have found great success, personally and professionally, in being able to go into companies and explain to them how this thing works and how it needs to work for them, beyond what the words are on the page and beyond the mentality of, oh, well, we just have to do the bare minimum and check a box.
Jonathan: Yeah. So I think that's really interesting, because of how fast technology was adopted by the mainstream. Like you said, in the early days, back at the NSA with that safe, everybody in the room knew why it needed to be locked up; everybody touched it basically on a daily basis, and it was a bunch of specialized people, not everybody. And then technology became so ingrained in our daily lives that nobody really thought about it, I would say, until all these hacks started happening, very public hacks, where people actually realized what was happening to them. Like, what was the credit agency that got hacked?
Jonathan: Equifax. I think that was probably one of the first ones where your average person realized how it could affect them. Like, seriously, it made it so much easier for their social security number to be leaked and things like that. And I think that's where it started to hit home. Before that happened, everybody just kind of broad-brushed it and was like, oh, security, we have a guy.
Jonathan: He does the security. Where now I think people are starting to finally take notice of how everything they do can impact the bigger organization and cause serious issues, and companies can now topple over hacks like that happening.
Jonathan: And I think it's very interesting how it's finally coming full circle, and now Apple is super on the privacy bandwagon and making that their number one marketing strategy. I think it's great for a cybersecurity-type mindset, but I still think we're super far off, because these super large organizations have so many people, and it's tough. And so at Cybrary we've started talking about this thing called security enablement, where it's like, how do we start to educate everybody on all of these things? It doesn't have to be a deep dive; we're just trying to let everybody know: these are generally the best practices, so that when they're making that game-time decision, they err on the right side, as opposed to the wrong side.
Jonathan: And things don't actually kind of build up.
Jeff: Right. Yeah. I certainly agree that it's a big problem, and I agree on what the current state is: there just isn't enough institutional knowledge about basic cybersecurity hygiene, if you will, and it's lost on too many companies. And I think, you know, I don't know, chicken or egg, cart before the horse, our industry has helped to promulgate the bad. It's ironic. Me having been in this business, depending on how I think about it, 37, 38 years, I chuckle very often when I hear people giving a talk at a conference, or when I listen to somebody we're interviewing on Security Weekly, and they're so excited because they've come to this realization about something, and I'm like, wow, that's sort of fundamental, and we've known about that for 30 years, if not longer. But it's new to you, so that's really exciting. I think where the industry struggles and has really contributed to the problem is feeding into the mentality that, well, security is just something that somebody else handles, somebody else takes care of. It's done for you. So if you're just the average rank-and-file employee at a company, you don't have to worry about it.
Jonathan: It's not in my job description.
Jeff: It's not in your job description. Why should I care about it?
Jeff: And ironically, in the vast majority of the companies that I've dealt with as an advisor or a consultant over the years, the worst offenders are typically the ones that hold the keys, the admins. They know what the rules are, but they've got to get their job done, and so they routinely do all the bad things that they shouldn't be doing. And it starts there, but it certainly doesn't end there. Take PCI, for instance, and usually, when I say PCI on Security Weekly, we all have to drink, cause I say it too often. One of the requirements in PCI is to make sure you have a security awareness program for all of the employees in the company. And that's been productized, and basically boils down to, and we've all seen it, some 30-to-45-minute online training course. I think Cybrary to some degree had its origins in that, or at least saw that and said, we could do better. And that's part of where Cybrary is coming in and trying to fix things or change things and make things better. But the notion that 30 minutes of what's a good password, what's a bad password... Or, I've seen too many companies out there now that are focusing on phishing attacks and thinking, well, if we can just get people to not click on the link.
Jeff: I happened to get a phishing email yesterday on my work account. It was from some weird-looking online address, and it said, "Hey, we've just set you up in the system. We need you to click here to register." And I'm like, yeah, yeah. I followed the rules and forwarded it to our IT security group, and put it out on the Slack channel we have for our employees. I said, hey, be on the lookout, I got one; others are probably going to get the same thing too. And a couple of people responded, "Oh yeah, I saw that," or, "Yeah, that's a really good-looking one." And it was very convincing, because the link said OBS Global. That's our domain. But if you hovered over it...
Jeff: You know, it was all exposed as being a bad link, but there's more to it than just...
Jeff: Don't click on the links. There's more to it than pick a good password. And again, these fundamentals... the notion of cybersecurity is, as I like to refer to it, part of the culture of the company. Everybody needs to understand their role, no matter how big or small that role is in the business function of the company. What they do and what they don't do matters to the greater goal, or greater cause, of protecting whatever it is that they're trying to protect.
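The hover check Jeff describes, comparing where a link claims to go with where it actually points, is easy to automate. A minimal sketch, assuming we already have the link's visible text and its href (the domains below are invented for illustration):

```python
from urllib.parse import urlparse

def domain_of(text: str) -> str:
    """Pull a hostname out of a full URL or bare text like 'obsglobal.com'."""
    parsed = urlparse(text if "://" in text else "https://" + text)
    return (parsed.hostname or "").lower()

def looks_spoofed(display_text: str, href: str) -> bool:
    """True when the visible link text names one domain but the href points
    somewhere else: exactly what hovering over the link exposes."""
    shown, actual = domain_of(display_text), domain_of(href)
    if not shown or not actual:
        return False  # nothing to compare
    # A subdomain of the displayed domain is fine; anything else is suspect.
    return actual != shown and not actual.endswith("." + shown)

# Example domains are made up:
print(looks_spoofed("obsglobal.com", "https://mail.obsglobal.com/register"))       # False
print(looks_spoofed("obsglobal.com", "https://obsglobal.evil-site.example/reg"))   # True
```

This only catches the crudest mismatch; real phishing defenses also check lookalike characters, newly registered domains, and redirect chains, which is part of Jeff's point that there's more to it than one rule.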
Jonathan: Right. I would also say that I've experienced in some of my other companies that some of the key security experts tend to silo themselves off and don't really let other people know what they're doing. I don't know if it's because of job security or other things, but they tend to make it seem like cybersecurity is this super tough thing that only they can do, and they're not about the knowledge sharing. I remember once, at a startup I was working at, we got acquired, and we then had to send our software, at the end of each development release, through a penetration test or a vulnerability assessment. And I'd get on the call with these guys, since I was the guy that basically installed it at all the customers. So I was the guy that knew the ins and outs of how it's installed, how it runs, how to break it, how to fix it. And I would get on these calls and they would talk about their findings. I remember the first couple of times I got on the call, these guys got on and they're like, oh, well, we're just going to run a penetration test against it. And me coming from the startup, where I was also the guy running the security assessments back in the day, I started trying to dig deeper and ask, like, "Oh, cool. What are you doing to test? How are you?" Cause, you know, I'm trying to make it more secure. I understand the security walls that we put in place. So I was trying to figure out: are they trying to go behind that security wall and actually figure out what's broken, or are they just typing an IP address into an automated scanner and hitting scan?
Jeff: We can say it, they just run a Nessus scan.
Jonathan: Correct. Well, they weren't running Nessus.
Jeff: Or something.
Jonathan: That's a lie. They were running Nessus.
Jeff: Or the equivalent.
Jonathan: Right. And I was kind of upset, because, I mean, we were a modern enterprise software startup and I was running a lot of the DevOps pipelines. And I was like, no, this is automatable: when I tag a release, the Nessus scan should just kick off and then give me that report, because that's exactly what you're going to do for me.
Jonathan: And they wouldn't tell me what software they were using. They were like, no, this is our department; you need to get in line. The other departments submit theirs to us, and you're late this month, so you're now at the back of the list; you can't release your software until we've passed it, and things like that. And it was super frustrating trying to get past those gates when I knew what they were going to do. I could run the scan myself, but the way the institution was set up, that person was the gate checker for that software. And it just brought the whole release process to a screeching halt, which is frustrating for developers, because, you know, they're working overtime to hit this date that we told the customer it would be out, and now the security guy was like, no, I don't have time, it's going to get bumped.
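Jonathan's point, that a scan could kick off automatically when a release is tagged, is a small scripting job. A hedged sketch against the Nessus REST API (scans are launched with `POST /scans/{id}/launch`, authenticated via the `X-ApiKeys` header); the server URL, scan id, and keys below are placeholders, and a real CI job would also poll for completion and fetch the report:

```python
import json
import ssl
import urllib.request

def api_key_header(access_key: str, secret_key: str) -> str:
    # Nessus-style API key header value.
    return f"accessKey={access_key}; secretKey={secret_key}"

def launch_scan(base_url: str, scan_id: int, access_key: str, secret_key: str) -> str:
    """Launch a pre-configured Nessus scan and return its UUID."""
    req = urllib.request.Request(
        f"{base_url}/scans/{scan_id}/launch",
        data=b"",  # empty POST body
        headers={"X-ApiKeys": api_key_header(access_key, secret_key)},
        method="POST",
    )
    # Many internal Nessus servers use self-signed certs; verification is
    # relaxed here only because this is a sketch.
    ctx = ssl._create_unverified_context()
    with urllib.request.urlopen(req, context=ctx) as resp:
        return json.load(resp).get("scan_uuid", "")

if __name__ == "__main__":
    # Placeholder values; in CI this would run when a release tag is pushed.
    uuid = launch_scan("https://nessus.example.internal:8834", 42, "ACCESS", "SECRET")
    print("scan launched:", uuid)
```

Hooked to a tag-push trigger in the pipeline, this removes the human gate entirely for the routine case, which is exactly the friction Jonathan is describing.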
Jeff: And that's a great example of how an organization was sort of following the script of what they're supposed to be doing, whether it was PCI or some other regulatory compliance standard; somebody, some way, somehow was dictating: we need to have these processes in place. But there wasn't, from what you're describing, the deeper understanding of what's a better way to do this, what's more productive and more effective, especially more cost-effective, what can we do to make sure that things are happening earlier and more thoroughly, and why not involve others. You mentioned that a lot of times the security practitioners seem to pigeonhole themselves. I think the reasons why cover the gamut; it's different for most people. But there is certainly a train of thought within the industry that cybersecurity is somehow mysterious and there's an aura about it. And again, I think we very often promote that, especially within the hacking community. If you go to a hacker conference, the people that are lauded as the heroes and the uber hackers, at least we used to call them uber hackers, now we call them rock stars, they're talking about the latest 0-day that they've discovered, or they'll drop one, and that's cool, or they'll talk about the cool hack that they've done, and that's sexy and that's cool. And me being somebody that likes to come in and talk about how nothing's really changed, how we haven't learned, or have lost, the lessons that were learned over hundreds of years, at least in the existence of this country and our military and the idea of keeping secrets... nobody wants to hear that. They're like, oh, that's not fun and sexy, that's just a little boring. So I'm old and boring. But I still think the message is there: the fundamentals of security are not that terribly mysterious, and they're not that terribly difficult to understand.
Most of the time they're pretty intuitive. But I think it's a combination of things, including: if you tell everybody that, maybe my job goes away.
Jeff: Or certainly within the vendor side of things that doesn't help us sell our product.
Jonathan: Right, a hundred percent.
Jeff: So I've had many conversations with many different people where I've gotten into what really needs to change, what really needs to happen, and they'll agree with me, but they'll ultimately say, yeah, but there's no money in that.
Jonathan: So I actually have similar problems. Since I run all the infrastructure for us, I get a lot of like security questionnaires from customers. And you know, a lot of it is they want you to check these boxes of like, do you have an intrusion detection system?
Jonathan: It's tough for me, because I understand what somebody down the line is trying to cover with that question.
Jonathan: But for us, it's like, well, yes and no. In the modern architecture we run, a traditional intrusion detection system isn't what you're thinking of. If you can give me the five things that you want the intrusion detection system to do...
Jonathan: I can tell you how we are protected from that across the board, across a totally different toolset. And it's tough, because most of the time the people giving us those questionnaires don't understand. You give the answer that, you know, they're looking for, but that, you know, is not the technically right answer. And if anybody on their security team dug deeper, they'd be like, no, that's wrong.
Jonathan: In my heart, I feel like I'm answering the question of what they want to know, based on their level of understanding. And that's super tough for us, trying to walk that line of: yes, we know we're secure, but if I tell you no on this checkmark, you've now basically sent us down this black hole where we're never going to get approved.
Jonathan: Because we don't meet that one little check.
Jonathan: And so I struggled with that a lot, especially as I started to architect systems and things like that. What was the... Back at our old company, they wanted us to... I forget, what's the level of encryption that's standard across DoD?
Jeff: I've been out of the DoD for a long time.
Jonathan: Anyways, there's a standard level of encryption. But it's basically a lesser encryption standard than we were using.
Jonathan: And so to sell anything to the government, you had to support it. I can't think of the name. It's kind of...
Jeff: The lesser encryption.
Jonathan: And it was like, well, no, we actually use a much stricter thing, with better keys and better ciphers. And they were like, sorry, you have to meet this. It's not like it's a standard and a baseline where, as long as you prove you're over it, you're fine. No. And what was tough for us is we used a lot of open source projects, and they didn't support it. So it was like, well, we can never fully support this unless we basically downgrade the security. And they were like, yes, please downgrade the security. That's basically what it came down to. And it was super tough trying to kind of…
Jeff: Because that box has a particular size and shape, and they're accustomed to that particular peg being the thing that fills it.
Jonathan: Yeah. It reminds me of this meme I saw the other day, where a father is watching his little daughter with one of those shape-sorter toys. She couldn't figure out how to fit the star into any of the holes, so she just picks up the lid, drops the star in under the lid, and puts the lid back on.
Jeff: Problem solved.
Jonathan: I was like, if only I had something like that where I could just, you know... Nope. Yep.
Jeff: But that's the essence of the hacker mentality. And yeah, the reason I was waffling on how long I've been in the business: as I look back on my life, I had several part-time jobs before I had that official DoD job in my college years. One of my first jobs that was more or less a real job was working part time to try to put myself through school, for a state agency that was doing defaulted loan collections, and actually trying to prevent people from defaulting on their loans, because they were guaranteed by the government, the state. So they would rather get people to pay and not have to pay out to whoever the...
Jeff: They were the underwriters for student loans. And as I look back on it now, I was learning, because my tool was a phone. I had a mainframe terminal and access to this huge database that had all the records of all the people. I would get a list of people to call every time I went to work, and I'd get through as many as I could, but my tool was a telephone, and we had rules. This is ancient history now, but back in those days, you could call the operator and ask, you know, what's the phone number for a particular address? We were allowed to call a certain number of times a month. We could submit and find out the phone numbers of the neighboring houses around a particular address, so we could call neighbors and say, “Hey, I'm trying to get in touch with Joe next door. Apparently their phone's out. Would you mind running over and knocking on their door?” So looking back on it, there's this whole social engineering exercise of trying to figure out how to get to the people, to get them on the phone, to talk to them and try to get them to pay so they wouldn't default. That was the whole essence of it. But I was also on a mainframe terminal with access to these huge records, and I would spend my idle time: where can I go? How far can I get? Can I look up my brother? Can I look up all my professors? Oh, look, this one professor at my school has got a defaulted loan, shame on them. You know, stuff that I shouldn't have been looking at, but there weren't necessarily the controls in place to prevent me from getting there.
Jeff: So if I take that kind of stuff into account, that's when I start saying my career as a hacker goes back 37, 38 years. But getting back to how we've failed to translate cybersecurity into the commercial world, or into society: it goes all the way back to the very beginning, where the idea was, everybody's going to start connecting to the internet, but we need to secure it, especially from a company or organization perspective. So what was the first thing that was invented? The firewall. And the firewall, I think, in many ways set in motion this idea that security is something that's taken care of for us; we don't have to think about it. The idea of the firewall was, we need a barrier between the bad, evil internet, where anything out there could be attacking us, and our safe interior corporate network environment, where God knows what goes on. And of course, in the early days, we were trying to explain to people: no, you need to do stuff on the inside as well to secure things. There's all sorts of stuff that goes on on the inside. And if your security is largely relying on a single point of failure, can somebody get past the firewall or not, you're not really buying much of anything. But there's certainly this nice illusion of, we're behind the firewall.
Jonathan: Which is crazy, because that still exists, right? Like, borderless networks are still not a common practice that a lot of companies think about trying to protect. Like, they still are…
Jeff: But some of the new marketing pitches, if you will, the buzzwords, the things that are getting people all excited, are things like insider threat. Like that's something new. But that's one of the big buzzwords.
Jonathan: It was my last startup…
Jeff: ...at RSA this past year, that was one of the big things. And again, I chuckle every time I see a company, and I deal with lots of companies, because I get bombarded, since I'm a media person, with all sorts of, you know, right now I'm getting ready to go to Black Hat and I've got a dozen emails a day: “Hey, do you have time to talk to our expert on this?” You have no idea who you're talking to. But very often they're like, whoa, insider threat is a big new thing, we've got to do something about it. Or this idea of borderless security, or zero trust security. There are all sorts of new buzzwords. But if you peel off the fluff around it, the razzle dazzle and the flashy blinky lights, it still boils back down to basic security hygiene, which, for you and me coming from a DoD background, was something that was just built into our DNA. It was something you did and you knew how to do. And the worst irony I see nowadays is that too often the DoD itself is turning to private industry to try to get that knowledge back. I mean, not only has it been lost, it's been totally lost. The institutional knowledge seems to have vanished; I guess it's because we all left and went out into the private sector.
Jonathan: Well, it's also crazy, because now the contractors have the seat in that organization, and they do the work. It's not like the contractor is training the other people, the soldiers, the active duty people; the contractor is doing all the work. So for those soldiers, say in the Signal Corps in the Army, the guys that run all the networks, it's the contractors that have that specialized knowledge and those certs. It's not the soldiers that are assigned, you know, the 17, 18, 19 year old guy who would be perfect to learn that stuff, because you'd think, oh hey, he'll progress through his career and he'll know these things, and by the time he gets up to the top, he'll know and understand what he did at that base level. And it's like, no, now he's responsible for plugging in switches. That's what we've relegated his role to.
Jonathan: And so then it just compounds and compounds and then you need a higher paid contractor to be in charge of more contractors and...so, but yeah.
Jeff: So it's ugly.
Jonathan: It is. But yeah, it's funny you mentioned the insider threat. That was our last startup, Red Ally Analytics, which was interesting. It was basically sucking in all the available data and trying to spot trends and things like that, which is a very interesting way of thinking about security. It's like, yep, nope, there is no security anywhere; we just take all the data and look for anomalies and trends across all these points. Very, very hard to do in practical terms when you talk about the size and scale of the data.
Jonathan: A lot of our customers were very large banks, where they have the SEC rules that so much of the communication has to be reviewed and things like that. So when you're talking about a couple-hundred-thousand-person organization, the amount of data they generate is crazy. And then how do you do all that analysis with a reasonable amount of compute power? Because, you know, you're not fancy, you're not the new firewall, you can't charge a firewall's price; the firewall is "way better" than yours.
Jonathan: And then, how do you start to provision hardware that can do the analysis in real time? Because everybody's used Facebook: when I click on a link, I instantly get all that person's photos, already loaded, and people want your software to work just like that.
Jonathan: So those are, those were fun days.
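The "take in all the data and look for anomalies" approach Jonathan describes can be sketched in miniature with a rolling z-score over an event stream. This is only an illustrative toy, not how the startup actually worked: the window size, the 3-sigma threshold, and the `anomalies` function name are all assumptions for the example.

```python
from collections import deque
from statistics import mean, stdev

def anomalies(stream, window=20, threshold=3.0):
    """Return indices of points more than `threshold` standard
    deviations away from the rolling mean of recent points."""
    recent = deque(maxlen=window)   # sliding window of recent values
    flagged = []
    for i, x in enumerate(stream):
        # Need at least two prior points to estimate a spread.
        if len(recent) >= 2:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(x - mu) > threshold * sigma:
                flagged.append(i)
        recent.append(x)
    return flagged

# Steady login counts with one spike: only the spike is flagged.
counts = [10, 11, 9, 10, 12, 10, 11, 9, 10, 500, 10, 11]
print(anomalies(counts))  # [9]
```

At bank scale the hard part is exactly what Jonathan says: doing this across billions of events in near real time, not the statistics themselves.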
Jeff: So I have this core belief, and I think I'm in the minority, but when I talk to people, they usually say, yeah, that makes sense. My firm belief is that to really begin to grasp and understand cybersecurity, organizations and individuals need to understand that security is a conversation to have before you start talking about technology.
Jeff: But so much of this industry is technology driven.
Jonathan: What's easy to change? Technology.
Jeff: It's easy to change. And we assume that new technology comes with all the bells and whistles and security built into it. So we as consumers or users of the technology think, “Oh, we're good, because surely somebody has looked at it,” or “It's good because it's been outsourced to a third party, it's a hosted environment, it's a cloud environment.” The assumption is that somebody somewhere is doing security.
Jonathan: You mean, it went through the firewall, so this email must be good.
Jeff: But one of the things that I'm trying to do with what little time I have left, in terms of having the attention of a public audience, is to try to educate people and get them to think more, or think differently, about what this thing is that we're calling cybersecurity. I base it on the way that I learned about security back in the DoD, back at NSA, which is based on what we used to call the risk equation. There are lots of risk equations out there; if you Google it, you get all sorts of really complicated formulas, and people really try to dig into coming up with some sort of measure. How do we know we're secure? How do we know we're doing things well? You'll hear these terms bandied about, especially in the vendor pitches and all the marketing, but the basic risk equation, and I try to simplify and boil things down to just a basic understanding, is that risk is a function of the vulnerabilities that exist, the threats, and what we used to call in the DoD countermeasures. So your risk is the result, using your pick of mathematical formulas, of the presence of vulnerabilities, the presence of threats, and what you do to counter them, either the threats or the vulnerabilities.
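As a toy illustration of the risk equation Jeff describes, here is a minimal sketch in Python. The multiplicative form, the 0-to-1 scales, and the `risk` function name are illustrative assumptions for this example, not the DoD's or NSA's actual formula:

```python
# Toy sketch: risk as a function of threat, vulnerability, and
# countermeasures. The multiplicative form and 0-1 scales are
# illustrative assumptions, not a published standard.

def risk(threat: float, vulnerability: float, countermeasure: float) -> float:
    """Score risk on a 0-1 scale.

    threat, vulnerability: likelihood/severity factors in [0, 1].
    countermeasure: effectiveness of controls in [0, 1]; 1 = fully mitigated.
    """
    for v in (threat, vulnerability, countermeasure):
        if not 0.0 <= v <= 1.0:
            raise ValueError("all factors must be in [0, 1]")
    return threat * vulnerability * (1.0 - countermeasure)

# With no countermeasures, risk is the product of threat and vulnerability;
# perfect countermeasures drive it to zero regardless of the other factors.
print(risk(0.8, 0.5, 0.0))   # 0.4
print(risk(0.8, 0.5, 1.0))   # 0.0
```

The point of the sketch matches Jeff's argument: vulnerability and countermeasures (security) are distinct variables, so work on one is not automatically work on the other.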
Jeff: And that gives you the risk. When you explain that at a high level, most people can track. Where I have fun is, especially if it's a vendor event with a vendor floor, I'll go up to the booths, and I try not to just talk to the marketing people; I try to find the SEs, the people in the booth that are more technical. And if they're advertising that they've got some sort of threat solution, I'll simply ask them, “What is a threat?” and watch them fumble with trying to define it. Or “What is a vulnerability?” and watch them fumble trying to define that. Lately what I've been doing is cutting to the chase and asking, “What is this thing called security?” I mean, we are all security professionals, but what is it? I put that question out a couple of months ago on Twitter. It was to a specific individual who I've asked a couple of times, somebody that's very well known in the industry and very much respected. And I respect him; we have mutual respect. I had asked him in public, because we were both at a conference several months ago: what is security? And he had responded, “I'm not prepared to answer that question.”
Jeff: So I saw him a couple of months later and asked him the question again, and he responded, but I didn't like his response. So then I went to Twitter and put the question out there. The responses were fascinating. And I asked the same question again, in a slightly different way, just about a month ago. It's probably been a month now. I just basically put the question out there: what is security? And the amount of responses, especially ones that were like, “Well, it's simple, it's this, this, this, and this”... The point of what I was asking was that there's no universal agreement on the definitions, and everybody that would respond and say, “I know the definition, it's this,” underscored the fact that nobody agrees on what the definitions are. So I give a talk, and I'll be giving it in the future, trying to think of what conference... Oh, I'll be out at DEF CON and I'll be speaking at the Packet Hacking Village, the Wall of Sheep, a little sub-conference within the conference. I'm giving a talk that's based on this concept of rethinking security. My premise is basically that it seems to me like 95% of this industry, the security industry, is focused on vulnerabilities.
Jeff: Whether it's vulnerability detection, vulnerability prevention, teaching developers or whomever not to put vulnerable code out, hardening systems, keeping up with patching, all the different tools that detect and protect: so much emphasis is on vulnerabilities. My question is this. Vulnerability is one component, one variable, within this thing that we call the risk equation, and there's something else in the risk equation. I mentioned countermeasures, but let's say security is a synonym for countermeasures. If you know any kind of mathematical equation, when one variable is labeled one thing and another variable is labeled something else, they're not the same. So in this case, vulnerability is one thing, but security is something else. If everything we're doing with vulnerabilities, by definition, based on the risk equation, is not security, then what, then, is security? I suggest that all the things we're doing with vulnerabilities, which is seemingly 95% of this industry, what if that's not security at all? What if that's just our job? If we're a developer, we should know to put out secure code that doesn't have all the OWASP Top 10 vulnerabilities built into it. And as users, we should know how to...
Jonathan: It should be like English grammar rules. It should just be, these are the rules for…
Jeff: For IT administrators, administering our servers and systems and desktops and whatever else these days: what if keeping them secure and patched and up to date isn't security at all? That's just part of the job.
Jonathan: Yeah, that's your normal job.
Jeff: That's table stakes. So if you take all of that away and say all that stuff that we talk about, all that focus, isn't security, then what's left, and what is security? I don't have a specific answer; I have an idea, but I don't have a specific answer. My goal is simply to get people to pause and say, hmm.
Jonathan: Yeah, it’s interesting.
Jeff: I never thought about it like that.
Jonathan: Yeah, because you would also have to think there's probably a sliding window along with that as well. Like, when does it stop being security and move over to just your job function? Is it a major breach, and then everybody's like, oh, this is a normal thing now? Yeah.
Jeff: Well, I mean, you mentioned it earlier: the general public didn't seem to notice that cybersecurity is a thing until the Equifax breach, which was how many years ago at this point? Three years ago, maybe.
Jonathan: I mean, the DoD people realized that after the OPM breach.
Jeff: Right. And there were major breaches before that. In my early days of PCI, I happened to work for a practice that was involved in helping several merchants, and in one case a service provider, that were victims of, at the time, the biggest credit card breaches that had happened. This was back in 2006, 2007, 2008. It turned out it was the Albert Gonzalez hacking ring. They figured out ways of harvesting millions and tens of millions of cards.
Jonathan: Was it the Target one?
Jeff: No, no, this is years before Target. This was T.J.Maxx, Hannaford; the service provider was Heartland Payment Systems. They were, at the time, rather famously breached, all supposedly PCI compliant, and that caused all sorts of disruptions. But I happened to be part of the practice. We weren't doing the forensics; we were the PCI assessors, QSAs, qualified security assessors, who came in and said, okay, we've got to get you back in compliance, we've got to get you following the security standards. And invariably, all these different organizations were checking boxes, not really knowing what they were checking or not. They just didn't take that logical step back to look at the big picture: how does it all work together? And they missed some things, let's say. So a lot of getting them back into a secure state, or a compliance state, had to do with educating them on what it all meant in the first place and how it all worked together. And again, that goes back to that early lesson I learned: why bother with locking a safe if it's five levels deep in the security program? Because it matters. Because with most of the breaches, the ones that you've cited that are more recent, to the degree that you get to understand what really happened, invariably there is no single point of failure. There's a cascade of bad habits: not following processes, people trying to do the right thing to get their job done, knowingly circumventing the security precautions and rules that are in place, but feeling like they had to because the boss needed them to do this type of thing. Invariably it's this whole cascade of failures. When the Target breach happened, I was working at Tenable, and my boss asked me to respond to something that somebody from Gartner had written about the Target breach, somebody who had very peripheral knowledge of PCI, so they were drawing a lot of wrong conclusions.
So I went point by point, line by line, through what happened. PCI has 12 major requirements, and just based on what I was seeing in the media, I could pick off, I'm making up the number now, I don't remember exactly, but seven or eight of the 12 major requirements that were clearly not being met, just based on what was going on. And the irony to me was that within the PCI world, Target actually had a very good reputation for having a large security practice. They invested in security, they invested in the technology, they had the people. So they supposedly were doing the right things.
Jeff: And they had the right approach to it, but they were fundamentally missing on so many fronts because of this lack of institutional knowledge, the stovepiping or siloing of who's in charge of what. Or they threw a lot of technology out there and said, okay, we've got the technology, we're good.
Jeff: Not understanding what all is involved. You can use technology, but you've got to be able to use it well. You've got to know what it's telling you; you've got to have eyes on it, interpreting the outputs and things like that.
Jonathan: Yeah. What's interesting about the safe thing, and I just drew this random connection, is it's why in the military you make your bed every day. It's these little steps that you want to just kind of happen. And I think that's the tough part: trying to get all these little cybersecurity steps to be rote. It just needs to be your job, right?
Jeff: Muscle memory. It's part of the culture and the stuff that you do without even thinking about it.
Jonathan: Right. Because as those start to break down, that's how you get to the catastrophic failures and things like that. Just like that TV show that just aired, Chernobyl: these little things that seemed very small at the time, as they start to add up, cause these crazy, crazy outcomes. So let's talk about your book real quick. You're a contributor on Tribe of Hackers. How did that come about?
Jeff: Well, the person that is the brainchild behind the book Tribe of Hackers is a gentleman named Marcus Carey. He's also an ex-NSA guy. I met him several years ago on the conference circuit, and we found out we were both NSA creepies, so we kind of bonded over that. He had read a book on another topic, but it was Tribe of Something, and he said, this is a pretty good concept. Because the concept, and what this book is, is essentially that he put together a questionnaire and sent it out to people that he knew within the industry, people that are recognized and successful and thought of as leaders in the cybersecurity space. I don't know exactly how he chose people, and I don't know why I got included, other than that we were NSA creepies. But he put out the questionnaire to these folks and said, “Please respond to these questions.” And the questions are things like: What's the earliest lesson you learned? What's your biggest success? What's your biggest failure? What's your favorite hacker movie? What's your favorite hacker book? If you had it all to do over again, what would you change? Stuff like that. It covers the gamut of not just the professional, strictly-cybersecurity stuff, but who are you as a person? What makes you who you are? What got you to where you are today?
Jonathan: It's not just credentials and things like that. It's kind of what shaped your…
Jeff: ...experiences. Yeah, what shaped you. And so 70 respondents, Marcus being one of them, became the book. There are 70 people in the book, each with their own chapter, as it were, and everybody is responding to the same questions. What's most interesting to me about the book is a couple of things. One is that when I got the book and saw the list of everybody in it, and these are the best hackers in the world, I hadn't heard of a lot of them. And I think I know a lot of people, or at least I get to meet a lot of people, because I'm out there a lot. So there's a lot of diversity, because there are a lot of people making a difference who aren't all at the hacker conferences, or at Black Hat and RSA. They're doing things in their own little niche, whatever that is. There's also diversity in terms of, you know, there's a lot of talk these days in the industry about being more inclusive of women, for example. So there are a lot of women, people of color, people of other ethnicities. There's a lot of what I think is really cool diversity in the book, in terms of the spectrum of people that he involved, not just the typical one silo of people.
Jeff: And then what's most fascinating is to read through the book and see how all these different people, coming at the problem from different approaches, different backgrounds and experiences, respond to some of the bigger questions, like what's wrong with the industry, what's right with the industry, and to see the commonality in the responses. I think that's what's cool about the book. And the book has been much more popular than Marcus expected, to the point where he is now working on a series of books. I don't know how many he's got in his brain, but I know the next one they're planning to put out is a Tribe of Hackers book focused on red teaming. So there's a questionnaire focused on red teaming, and yeah, I was asked to be a part of that. I don't know if I'll make the final cut or not, but I got the questionnaire and filled out the responses. They're also apparently putting out a second edition of the original Tribe of Hackers book with an actual publisher; it was self-published initially, and they're going to come out with a more permanent edition. All I know is I had to sign a release for the photos in the book, because it's a real publishing house and they had to have a release from the photographer for the photos being used. So that's how I know there's a new edition of the book coming: they had to follow the rules, and they are following the rules, which is an important security lesson. So that's the book, Tribe of Hackers. It's available; you can go to tribeofhackers.com and find it, or it's for sale on Amazon. And all the proceeds go to charities, at least in this first round. It's on the back of the book: something called Bunker Labs, the Sickle Cell Disease Association of America, Rainforest Partnership, and Startup Kids Club. So for all of us involved, it's a way to give back to the community.
Jonathan: Yeah, nice. It seems like a great book for anybody that's just starting out or been in the industry, because especially as you're starting out, you're kind of wondering, are you doing the right things, taking the right steps? The book shows that there are tons of people coming from all different backgrounds, all different ways, trying to do their own thing. There's no one cookie-cutter path, like you go to college, you major in this, then you go to the NSA, and then you come out of the NSA.
Jeff: There's definitely not one specific career path to follow. And people my age get asked all the time: how'd you get into the business? And it's like, well, I've just kind of been here from the beginning, just kind of fell into it. And it's hard to fall into something that's, you know, we're still relatively young, but we're old as an industry: 25, 30 years, however you want to count it. The questionnaire for the red teaming book was interesting, because a couple of the questions were along those lines: How did you get into it? How would you recommend getting into this? What should you do to prepare yourself? Some of the classic questions, like, do you go the education route, or the training route, or the certification route, and what are the pros and cons? I get those types of questions a lot when I'm out meeting people at conferences, and I don't really have a good answer, other than this: to be a hacker, whether it's this book or the red teaming book, I am convinced more and more that hacking is something you have innately in you. It's the desire to learn, the desire to figure things out, how things work, just being inquisitive, not accepting what's put before you, trying to figure out different ways of doing things, like the meme you mentioned with the little girl: I figured out a way to solve the problem. I'm old enough to have watched the original Star Trek series on TV, on a black-and-white set, and later on it became movies. I often quote the second Star Trek movie, The Wrath of Khan. Running through that whole movie is the fact that Captain Kirk, when he was at the Academy as a student, was the only one to ever beat the one test they were given, called the Kobayashi Maru.
Jeff: And the Kobayashi Maru test is designed to see how you would operate in a no-win situation. It's set up on a whole mock bridge, and it's this war-gaming exercise where the ship that you're the captain of is surrounded by the enemy and you're about to get blown up and destroyed. What are you going to do? So there's not really a win scenario. It's more about testing your mental faculties: how do you respond under pressure, that type of thing. But Captain Kirk beat it somehow. And at the end of the movie, hopefully everybody's seen it by now, it's revealed that the way he beat it was that he broke into the computer and essentially hacked it, changed the program, and put in a scenario where he could actually win the battle.
Jeff: So did he really beat the test? I mean, he hacked the test. I talk to a lot of people who see red teaming and pentesting as the cool and sexy stuff, because that's what's presented from the front of so many conferences, but there's so much more to this industry than just that one thing. And if you talk to a lot of pentesters, a lot of them aren't really happy with what they're doing. They think it's kind of boring. It's not the way it's presented, where here's the result.
Jonathan: It doesn't show the six months and all the dead ends the guy went down. And with a lot of these, especially the more modern ones and zero days, there isn't a tool that's built to do the test and automate the scripting and run the whole thing. Yeah, you can run your scripts to find the open holes that are already there, but that's not the big, sexy one. The big, sexy one is the one where you probably did some crazy stuff for months, pulling your hair out, and finally got to it, you know? And...
Jeff: Yeah, I agree. And what I try to explain to people, when they're asking what their career path should be, how they should focus, is: expose yourself to as much as you can within this business, because there are so many elements to it. Look for the thing that you're good at, where you seem to have some aptitude, but more importantly, find the thing that you enjoy. If you can find something where you just think, oh, this is really cool, I really like it, pursue that. Whether it happens to be a red team position, or a researcher, or a developer, whatever it is: if you enjoy it and you feel like you have fun at it, that's going to make you better at it, and that's going to give you the ultimate satisfaction.
Jonathan: You won't get upset an hour in; you'll spend eight hours and time will just fly, because you're just having fun. To steal a quote from Nick Offerman: you gotta paddle your own canoe and find your own way. There's not a cookie cutter, like, oh, you want to do this? You go to college and stuff like that. But at the end of the day, you might find that, the muffins.
Jeff: I've known too many people that think, oh, I want to get into computers and technology, because they grew up playing computer games. So they think they're really into computers, and then they get to college and start doing programming and they're like, oh, this sucks, I don't really like it. And then they're like, what do I do now? And I'm like, find what it is that you enjoy. Hopefully what you enjoy doing is what you're good at doing, or vice versa. And that's the ultimate career path: what gives you satisfaction, what makes you feel like you've done something good at the end of the day. In terms of hackers, I know a lot of hackers that aren't in this book. I know a lot of hackers that most people don't know are hackers. But to me, the best hackers in the world, nobody knows who they are.
Jonathan: Yeah. Well, on that we can end it. I appreciate you coming in today, Jeff, and chatting with us.
Jeff: Well, it's been fun. Thank you.
Jonathan: Awesome, thanks.