In this episode of the Cybrary Podcast, Ken Gilmour, CEO of Knogin, joins Ryan Corey, CEO of Cybrary, to talk about building a security-conscious culture.
Ken Gilmour outlines the steps by which a company can build a security-conscious culture. One of the major problems is that we tend to forget people in security training. A janitor, for example, might clean up someone's desk where a critical document is sitting, and that is a security concern. According to Ken, security in an organization belongs to everyone and should be inclusive. So when providing security training for employees, we should take different needs into account and tailor the training to particular job roles. A developer, for instance, knows not to use public wifi, but their real part in the company's business is to code securely, so training for that must be provided; each job role calls for its own area of security training. NIST's NICE framework makes this easier for companies: it takes a job role and identifies all the skills needed for that particular job, including the security skill sets, so that training can be provided accordingly, or a person with that set of skills can be hired.

Another step toward a security-conscious culture is not to brush a crisis under the rug. Some entrepreneurs, especially in small organizations, hide a breach when they are attacked, but as Ken says, the companies that survive attacks are the ones that manage the crisis well, not the ones that brush it under the rug.

The next step is accountability. An accountable person needs knowledge and awareness of how critical the data is; otherwise, he or she cannot truly be accountable for it.

Another step is rewarding people. This technique is well appreciated, and a fun way to make a company's community security conscious.
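The role-to-skills mapping Ken credits to the NICE framework can be sketched as a simple lookup. The role names and skill lists below are hypothetical illustrations, not data from the actual NICE framework:

```python
# Hypothetical sketch of a NICE-style mapping from job roles to skill sets.
# Role names and skills are illustrative examples, not official framework data.
ROLE_SKILLS = {
    "software developer": {
        "core": ["version control", "unit testing"],
        "security": ["secure coding", "input validation", "dependency auditing"],
    },
    "database administrator": {
        "core": ["query tuning", "backup and restore"],
        "security": ["field-level encryption", "access control", "data classification"],
    },
}

def training_gap(role: str, current_skills: set) -> list:
    """Return the security skills a person in `role` still needs training for."""
    required = ROLE_SKILLS[role]["security"]
    return [skill for skill in required if skill not in current_skills]
```

With a mapping like this, a company can either train toward the gap, for example `training_gap("software developer", {"secure coding"})`, or hire someone who already has the listed skills.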
Rewarding people for finding security holes and the like inside your company can be economical and fun at the same time, because once a security hole is exploited it can cost the company a great deal. Rewards encourage employees to search for issues and report them to be fixed, or to fix them themselves. Another step toward a security-conscious culture is to promote personal privacy. This means that if training materials are explained with personal, real-life examples, the participants are more likely to take the points on board.
This matters because some security awareness training is boring for the presenter and the participants alike. It stops being that way once it is linked to their personal lives, for example, keeping track of your kids' online activity. Understanding security from a personal point of view, and then showing people what it looks like in a business context, can be much more effective. The last word on a security-conscious culture is that there should be trust among the members of a community. At Knogin, for example, if someone forgets to lock their computer and someone else sees it, that person will go to the Slack community from the unlocked computer and send "I love you all" to everybody. It is fun, and next time the person will be careful. That kind of trust is important to build: everybody trusts each other, which is why someone can freely pull a prank like that.
Lastly, Ken talks about Knogin (founded two years ago), a behavioral analytics application that builds a context of who you are and what you do, and alerts you when suspicious activity related to your account is seen. A free version of Knogin is available for individuals through knogin.com, and a paid version is available for businesses. Security, and security training, must be a continuous process.
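The behavioral analytics idea described here, learning a baseline of normal activity and alerting on deviations, can be sketched roughly as below. This is a minimal illustration of the general technique under assumed fields (country and hour of day) and a naive membership check, not Knogin's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Baseline:
    """Learned context for one user: where and when they normally act."""
    usual_countries: set = field(default_factory=set)
    usual_hours: set = field(default_factory=set)  # hours of day, 0-23

    def learn(self, country: str, hour: int) -> None:
        # Record an observed, trusted activity into the user's baseline.
        self.usual_countries.add(country)
        self.usual_hours.add(hour)

    def is_suspicious(self, country: str, hour: int) -> bool:
        # Flag activity from an unseen country or at an unusual hour.
        return country not in self.usual_countries or hour not in self.usual_hours

# Build a baseline from a few normal events, then screen new activity.
b = Baseline()
for country, hour in [("CR", 9), ("CR", 14), ("US", 10)]:
    b.learn(country, hour)

alerts = [evt for evt in [("CR", 9), ("RU", 3)] if b.is_suspicious(*evt)]
# alerts -> [("RU", 3)]: the login from an unseen country at an odd hour
```

A production system would use richer features and statistical scoring rather than exact set membership, but the shape, profile the user and alert on deviation, is the same.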
Thomas: Well, welcome back everybody to another episode of Cybrary’s podcast. Today we are talking about security conscious company culture, and today we have Ken Gilmour, the CEO of Knogin and Cybrary’s own CEO, Ryan Corey in the building today. So thank you both for being here today and take it away.
Ryan: Great. Great Ken, really glad to have you. You're visiting us from where?
Ken: I've come from Costa Rica. So yeah, warm and sunny there. And I had to come over with, I guess, my old snow coat. So
Thomas: Yeah, it's quite cold here.
Ryan: Sorry for the four degrees here.
Ken: I kind of have to warm myself up every now and then.
Ryan: Yeah, now your accent is not Costa Rican.
Ken: That's right. Yes, I am Irish. Don't hold that against me.
Ryan: No, no, we could listen to you talk all day. We're loving it.
Ken: Thanks. Yeah, that's good. It's cold there too. So Costa Rica, the warm place with choices, smart choices.
Thomas: It's the best place to be, Yeah, I'm going to be coming and visiting you soon.
Ryan: Awesome. So we here at Cybrary, we have a concept called security enablement which is how our product is being deployed in a lot of enterprise organizations and really excited to get you in here today. Cause I know that you and I have had conversations and you have a lot of thoughts on what it's like to actually build a security conscious culture at an organization. So we'd love to dive into that with you and, and just kind of pick your brain on it if it all possible.
Ken: Yeah, for sure. Okay. yeah, I mean, like I think one of the biggest problems with, you know, in general cybersecurity programs and awareness programs is that, you know, you tend to forget people like, you know, your janitor deals with confidential documents on your desk, have you trained your janitor?
Ryan: Right, right.
Ken: You know, so it's, it definitely needs to be an all encompassing thing for the entire company and it really helps to include everybody because in general, cybersecurity itself is inclusive. You know, you could be compromised by a man or a woman of any gender or race or any country.
Ken: And you know, having a diverse team and training everybody within your team gives you different aspects of understanding of how to defend your organization as well.
Ryan: Yep. Yep. So it's a concept that, you know, I think the world of security has been saying and preaching for years, like security belongs to everyone, but I don't honestly believe that organizations have taken that truly seriously. Right? So you gave the example of the janitor who deals with, you know, probably fairly sensitive documentation that they have to transfer into the trash or whatever. But what does "security belongs to everyone" sort of mean to you?
Ken: Yeah. So, I mean, again, you know, security awareness is often given as a very broad exercise, you know, so it gives examples like, you know, don't go onto public wifi and things like that, and it's kind of very generic. But then again, you know, having specific types of courses for people within the organization based on their role is a very important thing too, because, you know, developers think, well, I know I shouldn't use the wifi in the airport. Other people may not think the same way, but a developer may not know things about how to code securely, you know, because they code to make things work, and that's their job. That's their entire training. I remember asking at a conference one time if any of the audience had any security training whatsoever. No one put their hand up. And I asked, has anyone got a bachelor's degree in IT? And literally everybody put their hand up, and it was like, okay, so, you know, what's going on here? So it is important to train people in that way.
Ryan: That's just proof that we haven't taken it seriously, like, security belongs to everybody. It does, right? It needs to be baked into the entire life cycle of the technological products and services and applications that we're delivering, right? And it just literally hasn't been to date. Now we're finding the trend that some organizations are starting to think that way, right? I know I'm very proud that some very prominent financial institutions are using Cybrary to put security skill sets into the hands of everybody in the technological org, and I think that's what you were talking about, you know, when earlier you talked about how security awareness isn't enough, right? You need people like the developers to have an understanding of how to code securely. The DevOps professionals need to really understand what DevSecOps kind of looks like, and sysadmins and network administrators need to know security, right? All that kind of stuff. So that's an important thing. And I think that we're seeing that trend. So it's just so surprising that it's taken so long for people in these major organizations to get there.
Ken: Yeah, exactly. And even, you know, not just major organizations, I guess, you know, a lot of people try to follow what the major organizations do and do it that way, but that's not always the best way. And I think every business, regardless of your industry, really does things differently. You know, you could have pharmaceutical companies that you would actually call tech companies because they're just so automated, you know, and then other ones that are very manual and the types of training that you need to give needs to be, you know, broad enough for all of the people within your specific organization and not just, you know, your industry or what everybody else does.
Ryan: Yup. Then the organization NIST, they do a good job with their NICE framework, where they'll take a job role, like a technical job role at an organization, and they'll define what the skills are, you know, not only to perform the core tasks of that job role, but also the security skill sets that go with that job role. And that's kind of how our product is aligned, right? You just assign a work role to somebody, and these are the skills, and then also the security concepts that they need to know.
Ryan: That's fantastic. But you mentioned not passing up on a good crisis, or brushing it under the rug. Tell us your thoughts on that, cause I found that pretty intriguing.
Ken: So I think a lot of, well at least in small companies, you know, a lot of entrepreneurs, they try to hide their mistakes, right? They feel embarrassed and,
Ryan: Oh, I've never done that.
Ken: Yeah. I can't believe we got hacked, you know, and it's hard to take ownership of it. Whereas in, like, a very large enterprise, hundreds of your employees have publicly seen you make this mistake, so you can't really hide it, because one of them is going to speak out. So you just have to deal with it. But in smaller organizations, you know, people tend to hide it. And I think that's kind of tragic, in the sense that you're hiding this big learning opportunity, right? So you made a mistake. Only I am infallible, everybody knows that, right? But everybody else is not. So, you know, we expect everybody to make mistakes. And the key thing is, you know, if an organization is hacked, and I think it's probably inevitable that it will happen sometime, in some way, maybe minor, maybe major, really the best companies, the ones that survive after that, are the ones who have properly managed the crisis, not the ones who have brushed it under the rug.
Ryan: Right, that makes sense. So then, you know, as kind of a next step in building a security conscious culture, you've talked about accountability and what that needs to look like. And accountability, as we all know, is critical in business, right? If you're trying to drive outcomes, you can't, as a CEO or whatever, be the one doing everybody's job for them, right? So you have to delegate. You make it very clear what the objective is, you give the person all the tools and resources and so on to do that job well in the way that you're asking of them, and you hold them accountable for the results. What does that tend to look like when it comes to creating a security conscious culture, in terms of accountability?
Ken: So accountability needs knowledge and awareness, right? There was at least one instance where I've gone to an organization and asked the question, you know, who's accountable for your data? If you get breached, who's accountable? Not responsible, accountable? And they say the DBAs, and I'm thinking, well, hold on a second. Okay, it makes sense, the data is in a database. But do they know the sensitivity of it? Do they know that certain fields need to be encrypted, you know, like credit card numbers, things like that? And if not, why should they get fired for this? So who's the actual data owner? Who is the person who knows what that data is? That's the person who needs to be accountable, because they have the awareness and they have the training within that context. So it doesn't make sense to make someone accountable for something if they haven't had the correct training for that accountability.
Ryan: I get that loud and clear. So again, as a manager or as a leader, right, you need to give people the tools and the skill sets that they need in order to do their job properly. And without that awareness, that training and that knowledge, right, the DBAs in that case, if they don't understand what security looks like in their work role, then you can't hold them accountable. And ultimately it's then your fault.
Ken: Yeah, exactly. And it's not just security, it's the value of the data, you know. How important is it to the organization, and what would happen if it was lost? You know, should we back it up? If not, is it because of budgets? Is it because of value? What's the reason for it? And, you know, the entire architecture should be designed based on the understanding and the awareness within the organization, which requires training and awareness.
Ryan: Yup. Yup. So we mentioned DBAs there. How does this tend to look in, sort of, the software development life cycle? How does that play a role in building a culture?
Ken: Yeah. So I mean, in a lot of software development life cycles, even, you know, when you're going through QA and things like that, it's mostly unit testing in the sense of, you know, I want to test if this is working correctly. So you get a list of, these are the things that the unit should do, and it should work in this way. And then maybe QA will, you know, do a little bit of extra work and see, if they click it in the wrong way, will it crash, or will something else happen? But generally they don't do things like unintended uses from a cybersecurity perspective, you know, and they need to be trained as to what to look for, how to look for it, and, you know, what's important in that security context.
Ryan: Gotcha. Gotcha. Very cool. Um, you've mentioned as well then as another step in creating this type of culture, security conscious culture, rewarding people. what are your thoughts around that? Cause that's really interesting.
Ken: Yeah. So, you know, there's a lot of cool bug bounties and things, which are really, really sweet, right?
Ryan: HackerOne...
Ken: Yeah, or even, you know, direct companies. Like, if you find a bug with Google, they could pay you, I don't know, like a million dollars or something, or maybe it's Apple, I don't even know. So imagine Google, right? They have two options. They can give an external person a million dollars for finding a really good bug, or they can give an internal employee $5,000 for finding the same bug, you know. Which one makes more sense to the organization, right? So when you're doing cybersecurity training and people find problems within your network, you know, what's the incentive for them to find that problem? Is it a hundred dollars? You know, I don't love parting with my hundred dollars, that could be used well in my business, but if they hadn't found that, it could have cost you thousands of dollars.
Ryan: Way more.
Ken: Yeah, exactly. So, you know, rewarding them with these small rewards is significant for them and really good, but it also protects your future as well. It's an investment in your own security and your future, and it should be considered as part of your budget.
Ryan: Got it. And so rewarding people in the organization for finding the holes, finding the vulnerabilities, it kind of adds a little bit of a layer of gamification to it, which you kind of referenced as another step in creating a security conscious culture, making it fun. What does that look like? What does fun security look like?
Ken: You know, I think really the thing is you need a good community, right? You need your team to be able to trust each other and do fun things. So, you know, one of the things we do at Knogin, which I really love actually, is if someone forgets to lock their computer, the standard is you open Slack and you send a message saying "I love you all" to everybody in the company, and everybody responds to it afterwards, but that's it. I know it makes it fun, and everyone just knows someone forgot to lock his or her computer, and, you know, we all laugh about it. And then that person remembers next time, well, you know what, I should lock my computer, because I don't want to be publicly embarrassed anymore, even though it's not really that bad.
Thomas: Yeah, we do something pretty similar. I'm always known to change people's backgrounds. Jeremy, our person in the booth, can attest to it himself; I've changed his background a couple of times. So it's just like, yeah, if it's unlocked and you're not here, I mean, that is a security thing you need to think about. And it's just a fun way to kind of remind people without, you know, doing too much damage.
Ryan: Changing the background takes a few steps, right? I mean, it's a little,
Thomas: you can do it quickly.
Ryan: You can do it now. I guess you can get there quick. The way it started here at Cybrary was, we call it the garden, where if you leave your computer unlocked, you walk back and you open it up to a garden, a sea of Google image search results that are going to embarrass you. So like narwhals, yeah, unicorns, My Little Pony, right? Panda bears. You name it.
Ken: Ransomware screensaver.
Ryan: Yeah. With those kinds of things, if you have a little extra time, then you can really embarrass somebody pretty hard. But yeah, that's great. Is that kind of the main thought process there in making security fun? Like, just little games and things?
Ken: Yeah, just kind of, I guess, you know, making it so that everybody kind of has to participate. Well, you know, you don't really have to participate, especially if you forgot to lock your computer. You know, your participation wasn't really there. Somebody else did it for you, right?
Ryan: Yeah. Yeah. You're involved.
Ken: Yeah, exactly. So, yeah, I guess that's it, you know. Just find a way to get everybody to think about things as they're doing them, without, you know, causing problems. I mean, that thing that we do, it's great for us, but in some industries that may be frowned upon, you know, because someone could actually have sensitive information on the computer and some third party has gotten into it, and then that employee gets into big trouble because they have secret stuff. So, you know, it depends on the industry, but that's what works for us.
Ryan: Yup. Got it. Got it. And then finally, in building a security conscious culture, you talk about promoting personal privacy. What does that mean to you?
Ken: Yeah, so I think, you know, one of the things is, if you want to get people involved, make it personal, right? So one of the things I used to hate in my career was receiving and giving security awareness talks, because you get, you know, a hundred people in a room, you just talk there for an hour about random things, and it was mandatory, and you check the box at the end of the year for your PCI certification. Right. And that was it. Nobody likes sitting through it, they were just so boring. Plus, with my monotone voice, nobody listens anyway.
Ryan: I think it's pretty pleasant, quite frankly.
Ken: And so, I mean, you know, really, when you want to give people proper engagement, you make it personal to them. So doing, you know, how to protect your children online. It's a thing that, even if I wasn't a technical person, I would like to learn, right? I'd like to go to a course like that and ensure, you know, my kids are not going to these crazy YouTube videos and doing stuff while I'm not watching. Because, you know, as a parent, it's hard enough monitoring your kids, but then having to go through all of their history, see what's being done, and then talk to them about it, it's just very difficult, right? Because you just don't have time. You have a lot of things that you have to do. And so, you know, just bringing people into that, understanding on a personal level how to protect yourself and what the risks are, also gives them the mindset of, you know, what the world is like for business and other things. So while you're promoting security within your business, you're actually giving someone the value of securing themselves as well.
Ryan: Yep. Yep. That makes sense. So those are a lot of great points on creating a security conscious culture. So I appreciate you doing that. Why don't you tell us real quick what are you guys up to at Knogin?
Ken: Oh yeah, sure. Cool. Yeah, so we do a behavioral analytics application, and the idea is that we can find the context of you, who you are and what you do, and then anything that deviates from that. So if someone has compromised your account and they start doing things that you wouldn't ordinarily do, you receive an alert from us.
Ryan: Excellent. The company has been around for how long?
Ken: Two years now. Almost.
Ryan: Wow, that quick, man. It feels like it was two months ago. Yeah. So, you guys created a course on Cybrary, so people can take a Knogin course on Cybrary, right? And then is the product available as a download, like a free version of it or something like that? And where can people get that?
Ken: Yeah. So you can log in. Yeah. So just go to knogin.com, K-N-O-G-I-N.com, and there's a free version for individuals. If you've got a business and you want to monitor a bunch of people, you pay per user, which is much easier. But generally it's, you know, a product that helps you as a person to be able to protect yourself and your family without any cost.
Ryan: That's critical stuff, critical stuff. So keep up the good work. Again, that's a K-N-O-G-I-N.com where you can access that and definitely sign up for the course on Cybrary.
Thomas: Yeah, it's an easy course on Cybrary. I actually took it on Wednesday when I came in, just to go through it. So, yeah, I mean, it's nice that you make it available for pretty much anybody, and then companies as well, and it's just an easy download directly from your site that gets you all those analytical tools and everything, which is nice. But while I have both of you here, talking about a security conscious company culture, that is definitely something that, being like a lower level employee, kind of comes from the top down. I know it's definitely something we focus on here at Cybrary, and I know it's something you guys focus on at Knogin. I mean, how much input are you guys putting in when you're starting? We talked about it earlier, making sure that, you know, security and security training is something that is continuous, and not just, you know, the first thing you do the first day that you start work and then you never think about it again.
Ken: Yeah. Or an annual exercise. Yeah. I mean, it's definitely... So things change all the time in business, and, you know, people will find things. And as part of your gamification, I guess, and rewarding of people, you can actually notify people and give them training on that as well. So the person who found the problem could give the course, right? They won the award. They're the hero. So, you know, let them show everybody how cool they are and how they found this bug, but now it's fixed, so we don't need to worry, but watch for those things in the future and you too could get a hundred dollars.
Thomas: Yeah. Kind of giving you like a reward system for people who, you know, might find things or do things correctly, the way that, you know, you want them to be done.
Ryan: Yeah. Cool. You know, you're familiar with, we've talked about on this podcast before what we do here at Cybrary, but you know, I think it's easy when you're in this industry to assume that everybody thinks of the same things. Like it's just baked into what we do here as a company, right? So security is just baked into kind of everything.
Thomas: Yeah. I mean, it's something that, yeah, working in tech, you kind of almost take for granted after a while. It's just something that you think about. But anytime I'm home visiting my parents and I'm seeing my dad use any one of his same passwords, I'm just like, come on, please, can we do this a different way? And it's hard to make people understand who don't have to do this all the time, or don't, you know, utilize all the tools and things that we kind of take for granted.
Ken: Yeah, no, I think that's an important point as well. So I mean, the hardest people, I guess, to preach to are our parents, trying to get them to understand security awareness. You know, my own parents say, well, I have nothing to hide. But the thing is, if you don't really use your email account, someone can set up an email account purporting to be you and use that to gain access to your bank account online, even if you don't do online banking. In fact, especially if you don't do it, because you don't have an email address associated, and they can just create that and start doing all this stuff as you instead. So they haven't hacked you. They've just created an account and taken over your life.
Ken: Just because you're not protecting it properly.
Thomas: Interesting. Yeah. It's not something I had thought about before.
Ryan: Well, this was good, man. Appreciate you doing this.
Ken: Thank you. That was great. Thanks for having me on.
Thomas: Yeah, absolutely. Thanks for being here.
Ken: Thank you.
Thomas: Well, yeah, that is another episode of the Cybrary Podcast. So thanks for listening and tuning in, and make sure to check back for another episode with Ken Gilmour.
Ryan: That's right.
Ken: Thank you.
Ryan: Thanks guys. Bye.