Digital safety services and equity in cybersecurity
Leigh Honeywell, CEO and founder of Tall Poppy, a security company that is building tools and services to help companies protect their employees from online harassment and abuse, talks about her career: running security incident response at Slack, protecting infrastructure running a million apps at Salesforce.com, shipping patches for billions of computers on the Patch Tuesday team at Microsoft, and analyzing malware at Symantec.
We talk about how all of these demanding jobs prepared her for her work at Tall Poppy, get into what she learned about the intersection of First Amendment speech protections versus online safety from working at the ACLU, discuss why changing the culture of online harassment will probably have to be a marathon, not a sprint, and hear about Leigh's experiences with several startup accelerator organizations.
0:00 - Equity in cybersecurity
3:10 - Getting into cybersecurity
7:15 - From physics to computer science
12:30 - How Tall Poppy came to be
19:26 - Technology fellow at the ACLU
26:26 - What is Tall Poppy?
31:20 - Social platforms and change
39:53 - How to work toward equity in cybersecurity
43:02 - Y Combinator startup accelerator in cybersecurity
50:07 - LGBTQ+ inclusion in cybersecurity
54:27 - Learn more about Tall Poppy
56:06 - Outro
– Get your FREE cybersecurity training resources: https://www.infosecinstitute.com/free
– View Cyber Work Podcast transcripts and additional episodes: https://www.infosecinstitute.com/podcast
Transcript
[00:00:00] Chris Sienko: Is Cinderella a social engineer? Is that terrifying monster trying to break into the office, or did he just forget his badge again? Find out with Work Bytes, a new security awareness training series from Infosec. This series features a colorful array of fantastical characters, including vampires, pirates, aliens and zombies, as they interact in the workplace and encounter today's most common cybersecurity threats.
Infosec created Work Bytes to help organizations empower employees by delivering short, entertaining and impactful training to teach them how to recognize and keep the company secure from cyber threats. Compelling stories and likable characters mean that the lessons will stick.
So go to infosecinstitute.com/free to learn more about the series and explore a number of other free cybersecurity training resources we assembled for Cyber Work listeners just like you. Again, go to infosecinstitute.com/free and grab all of your free cybersecurity training and resources today.
Today on Cyber Work, I welcome to the show Leigh Honeywell, CEO and Founder of Tall Poppy. Tall Poppy is a security company that's building tools and services to help companies protect their employees from online harassment and abuse. Her career includes running security incident response at Slack, protecting infrastructure running a million apps at salesforce.com, shipping patches for billions of computers on the Patch Tuesday team at Microsoft and analyzing malware at Symantec.
We talk about how all of these demanding jobs prepared her for her work at Tall Poppy, get into what she learned about the intersection of First Amendment speech protections versus online safety from working at the ACLU, and discuss why changing the culture of online harassment will probably have to be a marathon and not a sprint. And Leigh shares her experiences with several startup accelerator organizations.
This is a real full meal of an interview. So get comfy and please give us your focus for this week's episode of Cyber Work.
[00:01:56] CS: Welcome to this week's episode of the Cyber Work with Infosec podcast. Each week we talk with a different industry thought leader about cyber security trends, the way those trends affect the work of infosec professionals while offering tips for breaking in or moving up the ladder in the cyber security industry.
Leigh Honeywell has more than a decade of experience in computer security incident response. Prior to co-founding Tall Poppy, she was a technology fellow at the ACLU. Her industry career includes running security incident response at Slack, protecting infrastructure running a million apps at salesforce.com, shipping patches for billions of computers on the Patch Tuesday team at Microsoft and analyzing malware at Symantec.
Leigh has a BSc in Computer Science and Equity Studies from the University of Toronto. Leigh is also a graduate of the StartOut Growth Lab, an accelerator program that offers mentoring, education and networking opportunities for companies founded or co-founded by LGBTQ+ entrepreneurs.
She lives in Ottawa, Canada, having returned to her hometown after a decade in the United States. And I'm really looking forward to this. We're going to be talking about digital safety online. We're going to talk about equity in cyber. We're going to talk about all the things.
Leigh, thanks very much for joining me today. And welcome to Cyber Work.
[00:03:07] Leigh Honeywell: Thanks so much for having me on, Chris.
[00:03:09] CS: My pleasure. Yeah. I always like to start out by kind of taking the temperature of our guest and their interest in tech and security. How far back does your interest in computers and tech go? I mean, one of your degrees was in computer science. I know it was in college. But were you excited about computers and tech when you were young as well?
[00:03:29] LH: It's so funny. There's this classic experience from the literature on why girls don't go into computers – that kind of academia – which is basically women saying, once they finally find their way into a computer science program, "Well, I didn't want to go into computer science because I didn't want to be stuck behind a computer all day." I apparently had that exact conversation with my mom towards the end of high school, where she's like, "Leigh, you're applying to these physics programs."
The one program I didn't get into that I applied for in university was midwifery. There's like an undergrad degree here. And I had the prerequisites. But I didn't already have a master's in social work.
I got into the physics program and ended up struggling with second-year quantum mechanics. And I had been doing some IT consulting on the side. And I eventually just sort of accepted that I needed to actually do a computer science degree. Because I had been – I didn't have that sort of straightforward path that you see in a lot of kids that are like hacking since they were two and they go straight into undergrad computer science. It really was a bit of a winding road even though I think it was obvious to everybody else that that's what I should be doing.
[00:04:41] CS: Right. Yeah.
[00:04:42] LH: I did a lot of like IT work and that kind of thing outside of university and eventually switched my major. The other sort of confounding factor in the whole thing is I now know I have ADHD. Did not know that in university. Might have something to do with my –
[00:04:57] CS: One-year diagnosis myself here. Go team. For those who can't see behind the scenes, we were comparing fidget spinners before the show started. So tell me about that.
[00:05:10] LH: It might have something to do with like the whole nine-year undergrad experience. But as much as it took me nine years to get through my undergrad, I came out of my undergrad with multiple years of work experience under my belt and a CISSP because that seemed like a fun thing to do at some point.
And I was really, really fortunate to have spent that time in Toronto where there's like an incredibly thriving security community. There's like monthly meetups. There's a couple of great con – like BSides Toronto is fantastic. SecTor is a fantastic conference. Just like really high-caliber security community, which kind of makes sense given that like all the banks are based there and like the Ontario government and all that stuff.
Really cut my teeth on the work there with my first like real security job being at MessageLabs, which was an email antivirus company. Got acquired by Symantec during the time I was there. And I think we've got a couple other points of talking about this as well. But that entry-level security job is so critical to like the trajectory of people's careers. And I really was fortunate that there was this employer there that had that – like antivirus is sort of a funny space because it is so technical. But there's also a lot of like grunt work.
And those jobs where you have that combination I think are really interesting ones. That was my entry point. But I was definitely like the computer kid since I was like very small. But I was raised by a family of lawyers. So we didn't like know what the path was. Yeah.
[00:06:42] CS: Ah. Yeah. Sure. Sure. Okay, got it. Okay. Yeah. That makes good sense. Yeah. As someone who started out in chemistry and had been writing music reviews my whole life and everything when I finally said maybe I – physics, me and physics, hit the wall so hard in sophomore year.
But, yeah. Sometimes everyone around you knows what you're supposed to do before you do it. I definitely vibed with that. I guess when you decided – you said, "Well, I guess I can do IT. I've been doing IT on the side." Did that feel like any sort of like a step down? Or were you equally excited about that once you sort of like got into it?
Because like you said, you were very stoked about like quantum physics and all this other stuff. Yeah. Where did the spark sort of reignite once you got into computer science?
[00:07:33] LH: There's actually like a very specific place that it happened. And so, I used to – I don't know if you remember Slashdot back in the day. It was like a big deal. I organized the Toronto Slashdot Meetup for a while. And we would have like 50 people show up and verbally put a dollar sign in Microsoft and sort of [inaudible 00:07:54] like that.
And it was this really awesome like community of enthusiasts that at that sort of like very cutting-edge early time of open source. And so, getting plugged into that community. Through that community, I discovered like the 2600 Community. And somebody showed up at a 2600 meeting with a bar code for the first ShmooCon. This is like 2005. The first ShmooCon down in DC.
And I'm still like a marginally employed like IT help desk kind of type at this point. And so, a bunch of us like bought a single hotel room, piled in five people into a car and drove to DC for the first ShmooCon. And it was just this like, "Oh, this is what I want to be doing with my life." Like these are people whose entire job is like to protect other people. But they kind of get to be like trickster like fucking around with stuff a little bit at the same time. And I was like, "Oh, this is like lighting up those neurons."
[00:08:58] CS: I love it.
[00:09:00] LH: From that moment at ShmooCon, figuring out like how to get into the field? What to do? How to get there? Started picking up incident response gigs. Just like web forums getting hacked and stuff like that. I was doing an internship at Bell Canada, one of the big telcos up here, in a group that was doing really early VoIP deployments. And this was like really interesting like security stuff around that early VoIP. It was super messy. Super, super messy. Just like everybody getting hacked. And then there's like toll fraud and it's expensive.
It's actually kind of like it reminds me in a way of the way that people deal with like – there's all these like hacking attacks on Bitcoin and cryptocurrency. Because there's such an immediate financial payoff. You hack into somebody, you steal somebody's wallet, you're stealing their money. You hack into somebody's PBX in 2005, their like VoIP PBX, you run up a bunch of toll fraud, you're basically just also stealing their money.
And there was a bit more capacity for like chargeback kind of stuff in the VoIP system, in the like phone system. But you really were just like costing people money when you're hacking that stuff. Yeah.
[00:10:11] CS: Interesting. I love that. Yeah. I mean, yeah, that's the part that I'm not necessarily going to be endorsing as part of the show. But I do want to sort of put an exclamation point on what you said there in terms of –
[00:10:25] LH: Just to be clear. I was cleaning up after that stuff.
[00:10:27] CS: Oh, no. No. No. I know. I know. I know. I'm making a little joke there. But, yeah, in terms of like getting excited about cyber-attacks, we appeal to early professionals, people in school. And a lot of the focus is like how do you find your community? And also, how do you stay excited about it? And I think this is a perfect example, Leigh, with you having this group that went to DC to this conference and everything and you all bonded over it. And that's going to carry you a lot further when you're on page 672 of your CISSP study manual or whatever than going it alone or typing something into Reddit or whatever. Yeah.
[00:11:06] LH: Yeah. I think having that sense of community. People talk a lot about like mentorship and careers. And I think one of the underrated pieces of mentorship is having peer mentors. It's having people. Like having a posse of folks that you're like growing up in your career with. And I was really, really fortunate to like have that set of peers within my like local community.
And this is like early days of like Security Twits if you remember? Sort of 2008 time. This was like very early days of Twitter. There was an amazing like security Twitter community really early on.
[00:11:42] CS: Oh, okay. Got you. Got you.
[00:11:43] LH: And, yeah, like identifying that sort of posse that you can bounce stuff off of. Maybe some folks are like a couple steps ahead of you. But it's not like a mentoring, mentoring relationship kind of thing.
[00:11:57] CS: Right. Right. Got it. No. I love that. Yeah. Yeah. And that is good because there's – we could talk about mentoring stuff. We could talk about sponsorship stuff.
[00:12:07] LH: Sponsorship. Yeah.
[00:12:08] CS: But peer mentoring is something that we don't really talk about that much apart from make sure you have a study group. But this isn't a study group. This is a reinforcement group. This is like a writing group for writers and things like that. You're giving each other feedback. Yeah.
[00:12:20] LH: Yeah. You've got to have the group chat. Whatever it looks like.
[00:12:22] CS: Yeah. Yeah. Exactly. Moving on to that, I want to sort of talk about your career arc so far. I mean, we named some of the cool things. You were securing infrastructure running a million apps at once via Heroku and Salesforce. You were a security response manager at Slack. You were a malware operations engineer at Symantec. Each of these feels like it provides a part of the story leading to the creation of Tall Poppy. Can you tell me about each of these and what skills or experiences you were able to take and apply to creating your own company?
[00:12:54] LH: Yeah. It's funny. Sort of like that's the training montage, right?
[00:12:57] CS: Yeah. Yeah. It is. It is. [inaudible 00:12:59] playing in the background. Yeah.
[00:13:01] LH: I think, like, a couple of the moving parts there – two of the biggest things that have informed what we eventually came to do with Tall Poppy are understanding what threats people are facing today, and what the incentives and high-level dynamics of those threats are.
The two things that I'll sort of tease out there: doing malware analysis back in the day, that sort of technical understanding of like how do threats spread? In that case, it was email. And now we have like social media. And then with Patch Tuesday, the team I was on at Microsoft, thinking about almost the economics of security issues.
And the sort of example I often give of like if you're anybody who's not like a nation-state level target and you have an iPhone, like you're probably not – your phone is not getting owned. Like the OS is not getting owned. If you're installing your security updates, like you're just not getting owned unless you're literally targeted by sort of Pegasus-style nation-state malware, which people do get targeted by. And I think it's really important to be aware of that set of threats.
And if you're someone who like hears like nation-state malware and you're like, "Oh, that could be me," turn on that new mode that Apple just shipped that's like the lockdown mode, right? Do that. It's mildly inconvenient. But the goal is to keep you safe from like the actual shady shit.
Thinking about that sort of threat model, like what are the economics? One of the big trends that I think we've seen in the past really the past five-plus years, but sort of the early bits of it of the past decade, is that mass exploitation of consumer accounts has, in many ways, replaced the kind of like low-level – I saw somebody present recently at this like continuing education program that I was part of. And they use the phrase internet street violence to describe sort of account takeovers. Like low-level grifting.
[00:15:08] CS: Okay.
[00:15:09] LH: And I was just like, "This is the most perfect summation of this stuff."
[00:15:12] CS: Yeah. Someone just threw a bottle at my head.
[00:15:14] LH: I was just, "Oh, my God." Like somebody just ATO'd my uncle. Like internet street violence, right? That stuff. That stuff used to be mass commodity malware. Drive-bys when you're like trying to pirate a movie or something and you get infected through your torrent client or whatever, right? That kind of attack, I feel like if you – this is very much based on vibes rather than like a quantitative study. But if you were to look at the frequency of that kind of like mass drive-bys versus the frequency of like all of the different account takeovers scams and just like general credential stuffing, you would have one of those graphs that was like that, right?
And what we see in our day-to-day work with Tall Poppy is very much the like internet street violence stuff. There have been a couple of cases in the five years we've been running the company where there was actually like a malware infection. There was actually like nation-state kind of stuff happening. But so much of what we see is password reuse. It's general sort of credential theft like that.
And I think having that perspective of like what does it take for a Microsoft to be shipping those Patch Tuesday patches? What's the level of security investment that's gone into Windows? Into Mac OS? Into the mobile OSes over the past couple of years? The bar for attackers has been raised so much that now they're going after the squishy human underbelly of all these things, which is going to be your credentials. It's going to be the sort of social engineering attacks.
The big one that we saw – the two big ones that have been on my radar over the past year are these financial scams that start out as like a romance scam. They're called pig butchering, which is a very disturbing name. But it's all about like if you fatten up the pig and then you kill it, which apologies to the vegetarians out there. But that's the analogy that's used to describe these scams.
If you've ever gotten one of these like wrong number texts that's like, "It was great to see you at the gala last week, Bob." Like, "I'm not Bob. Who's Bob?" Right? And it turns into – yeah. Don't answer wrong number texts. Tell your elderly family members to not answer wrong number texts. Oh, my God.
And then the other one is the like, "Hey, can you send me a screenshot of the message Instagram just sent you? And I need it for some contest." Right? It's your friend's account gets owned.
[00:17:39] CS: Oh, my God. Yeah. Yeah. Yeah.
[00:17:41] LH: Yeah. That style of attack. Like these aren't super technical attacks. I mean, the pig butchering scams. They like set up fake escrow websites. There's technical infrastructure involved. But it's still like it's a human attack. It's not like shady malware getting into your machine kind of stuff.
[00:17:59] CS: Right. Right. Right. Yeah. Yeah. It's more of that like MTV's Catfish level of like social storytelling almost. Yeah.
[00:18:07] LH: Yeah. Exactly. And it's funny you mentioned MTV's Catfish. Because some of the like creepy data broker websites that like advertise on Catfish and are like, "And then we use blah-blah-blah to hunt down the person." Those websites, 100% what scammers use to target people. Yup.
[00:18:25] CS: Oh, yeah. Oh, yeah. That's my wife and I Sunday night watching. And, yeah. It's a little grubby. But I'm into it. But, yeah. Yeah. I was going to say that it feels weird at this point in my life that we now have to tell our parents to turn the TV off during the day. And don't get into cars with everybody, you know? All the stuff that we supposedly were not going to take to heart they seem to be stuck with now. But, yeah. We could do security for seniors. Basically, I've just scared my parents off of everything. If it's not for me, then don't touch it. Yeah. Yeah. Yeah.
[00:19:04] LH: Or at least like the other option is like training them to send the thing to you first.
[00:19:09] CS: Yes. Exactly.
[00:19:09] LH: My folks are really good about like, "Hey, Leigh, is this real?" And usually, it's not real. But every so often, it is. And I'm glad we have that like open conversation about this stuff. Yeah.
[00:19:22] CS: I hate to move from trashy to classy, but I want to talk about your work as a technology fellow at the ACLU. Your profile notes that you applied your technical expertise to legal briefs, public writing and advocacy work, as well as working with ACLU IT and security and hiring, which I imagine also dovetails nicely with your equity studies major at the University of Toronto.
What aspects of your security background and expertise were you providing to the ACLU? In what way has your security knowledge been able to enhance the work that the ACLU does?
[00:19:54] LH: I love getting to talk about the ACLU stuff. There's been this sort of new job invented over the last like 10, 15 years referred to as a public interest technologist. This is someone who has a technical background, computer science, engineering, whatever, software development, who goes into sort of technology policy work and being able to apply that like deep dive technical expertise to the parts of our sort of ecosystem that are making laws, that are challenging laws, that are working on like how do we think about this computer stuff as a society?
And I feel really lucky to have had the chance to work at the ACLU at the moment that I did. We were litigating cell site location privacy to the Supreme Court. I got to review legal briefs and dig into like how do cellphone companies store your location data in their databases. And would it be onerous to like have them narrowly give over data versus just like, "Here's the dump from the entire cell tower of everyone who was ever near that." Right?
[00:20:56] CS: Yeah. [inaudible 00:20:58]. Yeah. Right. Right.
[00:20:59] LH: Exactly. Exactly. Making sure that as we – like as technology gets more complicated, that our civil liberties are protected. That people are able to assert their rights. That kind of stuff. Really fun working on that sort of technology policy side of the fence.
My interest has also been this issue of online harassment and online speech for a long time. And the ACLU was a really interesting place to do that work because it is fundamentally like a very First Amendment-focused organization. That that is the sort of one of their big battlegrounds.
And a lot of speech that is harassing is very much also like First Amendment protected. There's like an interesting tension there. And so, figuring out how to do this work that I wanted to do to keep people safe online while also like acknowledging that there's a free press and people have freedom of speech.
There's like there are some challenging tensions there. And the way that I've sort of threaded that needle is really focusing on the sort of defensive side of like how do we keep people's accounts safe? How do we keep people's data safe? That's not complicated from a First Amendment perspective in many cases. That's the sort of like technology policy side.
And then I did a little bit of work with the rest of the organization. I was under the legal side of the house within the ACLU. But there's also – obviously, it's a 1,100- or 1,400-person organization. There's a lot of infrastructure and IT kind of stuff. They were hiring their first full-time security person. And I helped with that hiring process.
And so, that was really interesting to be able to apply my sort of more corporate security, blah-blah-blah, to this very like essential important organization that's doing like critical civil liberties litigation.
[00:22:52] CS: Okay. The two things that I'm getting out of that. One is – because my initial thought was that it almost sounded – like my initial view of it was that it was like almost more like volunteering. I'm taking this insight that I have and I'm putting it to an organization that needs it, which to some extent, with the IT and the hiring part, it sounds like that works.
But what I really like about this, the first part, is that you're not just giving the ACLU your insights. You're getting something back in terms of – you said it sort of refined your idea of what can be done, First Amendment versus safety and so forth. And I think that's really interesting. And before I jump to the next thing, I want to ask how one would get involved in that?
I imagine there's ACLU branches elsewhere that could use similar insights. Do you have any thoughts on that?
[00:23:43] LH: Yeah. I think there's two things to think about with public interest technology. For folks that are super interested in the policy angle of things, I'll share a website that Bruce Schneier actually maintains that's like a job board and announcements and stuff specifically around public interest tech. Various NGOs, civil society organizations.
In the States, there are two fellowships that are really a big piece of the ecosystem. There's the TechCongress fellowship. I believe applications for that open pretty soon. This is literally like you get to go be the thing that I did at the ACLU but at a senator's, congressperson's, or committee's office. It's super, super badass. I know a bunch of people that have gone through it. It's a really neat program. There's also the Aspen fellowship that I'm forgetting the full name of. But I'll send it to you for the show notes.
[00:24:36] CS: Thanks. That's awesome. I'm super stoked that I asked about that.
[00:24:42] LH: Yeah. Sorry. Just to finish the thought really quick. That's the sort of – like, if you want to be a policy technologist and do this sort of intersection of ideas plus the actual technical stuff. Help shape laws, weigh in on lawsuits, that kind of thing. The other part of it is how can we, as technology people, help organizations.
And I think there's a risk there. The thing I always sort of tell people – watch out for this impulse – is "we'll go build an app to solve X." And it doesn't really solve X. And then the organization is left with this app that they don't know how to maintain. So that's like the –
[00:25:15] CS: And maybe a little cynicism about ever like trying again.
[00:25:17] LH: Yes. Ever working with techies again. Exercise caution around that particular impulse. But that said, if you want to go in and fix some printers and do other IT desk-side kind of stuff, that's a really good way to make friends in non-profit land: just show up and offer to fix the things that are broken.
From a purely like volunteer basis, I've had really, really positive experiences over the years doing that kind of work. And it's something that can be pretty low-touch. But there's always something that's like kind of broken. And if you're listening to this show, you're probably the person in your family that fixes the printer.
[00:25:55] CS: Sure. Yeah. Yeah. Completely. Yeah. Yeah. A friend of mine said like not everyone has to be the team leader. Someone needs to clean the toilet. Someone needs to take the notes, you know? And so forth.
[00:26:09] LH: And there's like a particular satisfaction in doing like IT janitor work. I love that stuff when it's like in reasonable doses.
[00:26:16] CS: And also, just the practicality of it is that like it frees up people who otherwise would be doing that to actually do the work that they're supposed to be doing.
[00:26:23] LH: Yeah, exactly.
[00:26:25] CS: So from here, let's talk about Tall Poppy. You're the CEO of the organization, which your profile says is "building tools and services to help companies protect their employees from online harassment and abuse."
Firstly, can you tell me about your decision to create Tall Poppy? What niche you needed to fill in the industry? And second, can you tell me about some of the tools and services you create to help protect people from online harassment and abuse?
[00:26:52] LH: Yeah. I've been doing this work sort of in my evenings and weekends for coming up on like 15 years at this point. Started out with various women in tech communities where we dealt with ongoing abuse and harassment kinds of issues. And really, over the years, developed a playbook of like, "Ah, you're dealing with this giant internet hate mob." Like what are the actual steps that you as an individual can take on your own regardless of what the platforms do? What law enforcement does? There's stuff that's outside of our control and then there's stuff that's inside of our control.
And it is an imperfect set of solutions. Like I would also like to solve the various social problems that lead to these issues. But I try to stick with what I can actually accomplish. And so, we came up with this playbook over the years. And somewhat coming out of my work with the ACLU, I finished up my fellowship there, which was a year-long fellowship. And was like, "Hey, nobody's done this thing that I've been thinking about doing for years, I guess –" and I've got my green card now. That was the other factor. "I guess I'm going to do it now."
Started the company. Had like a pretty clear vision that there was like a software tool we needed to build that would walk people through securing their online accounts. Understanding and remediating their online footprint. That's what we ended up building.
We have a SaaS app that is like a security training tool focused on this sort of like personal ecosystem. And then over the years of like deploying the app, working with big organizations, small organizations, public sector, private sector, all this stuff, we realized that there's a set of users who want something a little bit more high touch, a little bit less self-serve and more full-serve.
We built out a sort of executive offering where like we basically like consensually cyber stalk you like with your permission. Build out this like online profile. Remove stuff from all the catfish sites. Like all the sort of data brokers that exist. Because the US doesn't have any privacy laws. There's all these companies.
Anyway, the Consumer Financial Protection Bureau is actually doing like an open comment period about the impact of these data broker websites on people's lives. I'll send you a link for like where folks can weigh in if they've experienced harm from these creepy stalking websites. But, yeah. That's a whole other rant.
We do this executive service. We like basically consensually cyber stalk people. Remove data where we can. Identify the stuff that we can't remove. And then, optionally, as a second step, do a deep dive security review of their sort of security practices. What kind of online accounts do they have? What does their smart home setup look like? Online banking passwords. How do their kids interact with technology? All of that kind of stuff to build out this sort of fulsome picture of like what are the risks in your life and what are the steps that you as an individual can take to protect yourself?
And in both of these cases, both the app-based and the hands-on service, we also do incident response. And it's included as part of the package. Ideally, you've done all this prep stuff that's going to keep you safe. But inevitably, there's still crap that happens. So let's also make sure that you've got a person you can call or get on the phone, look at the situation, assess the threat. Is this something where you should actually be engaging law enforcement? With all the caveats that law enforcement is not so great at dealing with internet stuff sometimes – most of the time. So, challenges there. Yeah.
[00:30:25] CS: Yeah. Yeah. Your platform, it sounds sort of two-level. Like you have this sort of shelf that you put in front of you and you guys take care of all the – like you said, removing. When you say like data, like cyber – yeah, online cyber stalking. You're basically removing their personal data from those kinds of sites, like whitepages.com, things like that. Is that what you're saying?
[00:30:44] LH: Exactly. Exactly.
[00:30:44] CS: Okay. Got you. Got you.
[00:30:45] LH: Yes. We do the White Pages kind of stuff. With the executive service in particular, we go into like court records. We go into did they register their LLC with their home address instead of like a PO Box? Let's like amend that with the Secretary of State.
Secretaries of state are so nice. You get on the phone with them and they're just so helpful. Yeah. Really deep dive in sort of like, if somebody was really mad at Chris, what would they find online and what can we do about it?
[00:31:15] CS: I'll be requesting a free trial soon here. No. Well, okay. You mentioned something. You said I could talk all day about this other thing. And so, I want to talk about this other thing. I might be totally wrong about this. But I know that when we talk about security issues on a pure empirical level of defending the perimeter, you get a lot of different opinions on where to place the door with a giant padlock, metaphorically speaking.
We say, is this endpoint protection? Is it security awareness? Would it all go away if we just patch our shit? But similarly, I appreciate what you're doing in creating these services aimed at keeping people safer and reducing like the possible damage from cyber harassment. And I'd love to know if you have any thoughts on other aspects of the culture as a whole that should or can be changed so that rampant abuse isn't so easily engaged with in social anonymity.
I mean, we certainly know that there's plenty of social platforms that have figured out that they make more money if they just let some of the wildfires burn for a while and things like that. Do you have any thoughts on sort of – this seems like this could be a problem that's changed the way that seat belt regulations were changed or environmental protections were changed. It's this multi-decade of just picking apart little things until like the wall collapses. Do you have any thoughts on this?
[00:32:38] LH: Oh, gosh. There's a bunch of different ideas come to me from this. I think the first is the sort of what are the – is it just – would this go away if we all patch our shit? And I think we've raised the water level, the sort of like overall ecosystem safety level in so many ways, whether we think of like stack canaries and all of these sort of technical address space layout randomization. All these like very technical operating system-level things. And yet, we still have software bugs coming.
And I think at that sort of ecosystem level, there are things like moving to memory safe languages that really are important for that long-term. There's a group at Berkeley called the Center for Long-Term Cyber Security, which I just like love as like a concept that people are thinking about this.
[00:33:35] CS: Long-Term Cyber Security. Yeah. Yeah. Absolutely.
[00:33:37] LH: Okay. This is a very ADHD part of the conversation. I apologize. But I just had this moment last week when a couple of my CSO friends' kids were graduating from university and they're like, little Bobby is looking for his first full-time security job. And I realized that we're actually just hitting that first wave of second-generation cyber security professionals. You think about people that are like sixth-generation doctors. We're one generation in, in cyber security.
Thinking about what the next 50 years of this field looks like is just kind of awesome. There's like – we are such – we are early, early days. Memory safety. Hugely important. We have to go there. Because most of the bugs that people are getting owned at the OS level from, they're all just memory safety issues.
It's a big lift to like rewrite our browsers, rewrite our operating systems in memory-safe languages. But if I was getting into this field today like as just a sort of generic computer science student, I would be learning memory-safe languages like Rust because I think there's almost an inevitability that we have to shift to that for, again, that sort of like 30 to 50-year next steps.
The other big like big picture thing is just passwords, the use of passwords, consumer accounts. That's sort of like what does trust look like when we have 400 accounts each? Passkeys I think are a really important step in the right direction. I'm a huge, huge, huge fan of like U2F and the Yubikey and all of that sort of like other ways of thinking about authentication and trust even with consumer accounts.
But I think there are scaling questions there too of like how can the average like end user be expected to like understand or keep track of all of these things? And I have a bunch of ideas of where like what we do at Tall Poppy could go in terms of thinking about keeping individuals safe across all of these different account types. But it's really pretty like early days there.
But I think that's like the big picture thing that I've been thinking about, is on the end point, we have to solve memory safety. And on the like cloud consumer account, work account, all these things, like that identity question is so big both within organizations and for consumers.
[00:36:08] CS: Okay. Well, two things. First of all, for the purpose of our listeners who are ranked beginners – this guy here – could you explain memory? What was it? Memory retention? Memory security?
[00:36:19] LH: Memory safety.
[00:36:20] CS: Memory safety. Can you explain that concept to me? Because – yeah.
[00:36:24] LH: One of the like sort of origin story papers in computer security was from like a thousand years ago. It was called Smashing the Stack for Fun and Profit. And there's a whole class of security bugs. And I'm trying to come up with an explanation at like a high-level that won't also piss off the people that get it at a really low-level.
Basically, there's all of these different attacks that involve manipulating the memory of a computer by like shoving garbage at it. And when you shove a bunch of garbage at it, it flips out. And you can predict the ways that it flips out and make it do things that give you control of the computer. That's like a very like high-level summary of this class of attacks.
Memory-safe languages don't let you do that. That's the sort of TL;DR. They have a variety of different ways that they do that. But, fundamentally, you eliminate a class of security attacks, security vulnerabilities, by using these memory-safe languages. And there's a set of people who are like, "Oh, C++ is great though. You just have to know how to write memory-safe C++." And that's like saying you don't need seat belts. You should just not get in a car accident kind of thing.
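(A rough sketch for readers, not code discussed in the episode: one way the class of bug described above plays out, shown in Rust, the memory-safe language mentioned earlier. The function name and buffer size are hypothetical. In unchecked C, copying oversized attacker-controlled input into a fixed stack buffer silently overwrites adjacent memory – "smashing the stack" – whereas safe Rust forces an explicit, bounds-checked copy or fails loudly at runtime instead of corrupting memory.)

fn store_username(input: &[u8]) -> [u8; 8] {
    // Fixed-size buffer, analogous to a C stack buffer.
    let mut buf = [0u8; 8];
    // Bounds-checked copy: if `input` is longer than the buffer,
    // truncate explicitly rather than writing past the end of `buf`.
    let n = input.len().min(buf.len());
    buf[..n].copy_from_slice(&input[..n]);
    buf
}

fn main() {
    // 32 bytes of "attacker" input; an unchecked C strcpy/memcpy here
    // could clobber the saved return address on the stack.
    let oversized = [b'A'; 32];
    let stored = store_username(&oversized);
    println!("stored {} bytes, dropped the rest safely", stored.len());

    // Direct out-of-bounds access is caught with a runtime panic,
    // not turned into silent memory corruption:
    // let i = 100;
    // let _boom = stored[i]; // panics: index out of bounds
}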
[00:37:34] CS: Yeah. Right. Right. Right. Yeah.
[00:37:36] LH: When we talk about like adding seat belts to our security ecosystem when it comes to like what languages we use and how we build low-level systems, using memory-safe languages, that's the seat belt solution. That's like the system – or airbags. It's actually closer to airbags because like you don't have to like manually click on an airbag. It just works.
[00:37:58] CS: Yeah. Right. Right.
[00:37:58] LH: Yeah.
[00:38:01] CS: Obviously, we've just ticked off a C++ guy there a moment ago.
[00:38:05] LH: Sorry.
[00:38:07] CS: No. No. I love it. But I was going to say, what is the obstacle to sort of moving to this coding language in terms of rewriting browsers and all that stuff? Is it just an entrenched "we've always done it this way, we don't want to have to start everything all over again"? Is it, "But C++ is easier"? Obviously, as we said, this is probably going to be a multi-decade issue. But what do you think are the sort of impediments to this? Or is it just that people aren't thinking about it yet and you're kind of on the bleeding edge of people who are thinking in this direction?
[00:38:46] LH: And I will like caveat that this is me like conveying the opinions of people with much more educated opinions about this particular area than myself. It's not an area of my like deep expertise. But I think there's a whole set of things that are preventing the like glorious future of only using memory-safe languages.
There's just a lot of code to rewrite is a big piece of it. There's definitely like organizational inertia, like a lack of organizational learning around transitioning to memory-safe languages. And then there's also like a lot of like personal preference stuff that comes up. And I think that people will make arguments of like C++ is so much more performant. And there are like performance implications for some of this stuff that are legit. But there's also just a lot of like this is how we've always done it.
[00:39:35] CS: Okay. Yup. Perfect. Thank you. Well, I like all of this too. Because it sort of gives listeners a little sort of easter egg in terms of how to sort of future-proof their careers and think about things again that are going to be maybe very relevant in 10 years, 15 years, whatever.
And to that end, I want to transition over to the work side of Cyber Work here. For listeners who want to get involved with this type of work, whether it's on the advocacy side or the technical side of building and creating these protections, can you talk about what types of studies, skills, experiences or proficiencies they should be working on?
For example, if you hire entry-level people at Tall Poppy, what are some things you really want to see on the resume or cover letter, in the case that they're ahead of the curve and thinking about these things?
[00:40:19] LH: Yeah. We're really fortunate that we – and we're not hiring right at this second. But we have hired a number of folks that are relatively early career specifically to do the sort of one-on-one analysis work that we do with our executive clients. But also, to do incident response and other sort of parts of the puzzle.
The big things that we're looking for as like a set of characteristics in the folks that we work with, there's a certain level of technical depth. Like technical understanding. But especially, curiosity. We run into new and exciting ways that people are jerks on the internet and are like causing security problems and all these things. There's stuff that we're going to run into that just nobody's seen. We don't expect people to like be the experts in everything. But having that sort of like doggedness and persistence and like investigative mindset I think is really important.
The other thing that is super, super important to me and really important to our clients is, fundamentally, basically bedside manner. I asked on Twitter a while ago, what's a non-medical version of bedside manner for like IT help desk kind of work? Hack-side manner or desk-side manner were the suggestions. I like hack-side manner. That was really cute.
[00:41:34] CS: I love it. Hack side manner. Yeah. Yeah. Yeah.
[00:41:37] LH: It's really like there's a big difference between coming into a conversation with someone who's dealing with like a harassment or hacking kind of situation, "Oh, my God. You weren't using a password manager? What were you thinking?" Right? And like we've all been that person. We've all like worked with that person.
Really, that sort of attitude of compassion and approaching things with curiosity even when you're like, "Oh, somebody's messed something up here." But being like, "Hey, like let's talk about this. What happened?" And really being like non-judgmental. That's important. And we've seen – when I've interviewed folks for these roles, folks coming from like a ton of different backgrounds. I wish I could hire like 15 more people to do this stuff. And someday we will grow the team that big.
But I think I feel really fortunate to have this opportunity to build like a team that includes junior and early career folks. Because we're always talking about the cyber security skills gap. And it's like we need twice – I do actually think there are probably twice as many security jobs out there even with all the layoffs and stuff as there are security people qualified to do them. And some of that is because big companies are not building in paths to train up junior folks. Everyone's just picking over the same 15 senior people.
[00:42:57] CS: Yeah. Yeah. Absolutely right. I want to talk in a non-employment capacity. And you clarified this a little bit for me before the show. But I want to talk about your work with Pioneer Fund, which is an alumni organization that invests primarily in companies participating in the Y Combinator startup accelerator. And you're also part of StartOut Growth Lab, which is an accelerator program for companies that are founded or co-founded by an LGBTQIA+ person or persons. Can you talk about the work that each of them does? I know that they're kind of at different scales.
And also, I guess just the notion of a startup accelerator. I mean, we've talked to some – what people call serial startup people. They founded a business, they moved on to the next one. They founded a business, moved on to the next one. But what exactly does a startup accelerator provide – like the venture funds, the startup salary? And what is your role within these organizations?
[00:43:53] LH: I'll start with the second part of the question and go back to the first. Right after starting Tall Poppy, we got into the Y Combinator startup accelerator. At the time, it was like 150 companies in the cohort. I think they're like 200-ish these days. They do two cohorts a year. It is a very large but also like quite prestigious startup accelerator program.
And what they do, they give you a relative – I mean, it's bigger now. But at the time, like a small investment in your company for 7% of the company. Pretty decent chunk of equity. And then you run through a program of curriculum training you how to do a company kind of thing.
[00:44:31] CS: Oh, okay. Yeah. Yeah.
[00:44:33] LH: It's really like here's how to get customers. Here's how to find like employees. Here's how to raise money. Here's how to grow your company. All of these different pieces.
[00:44:42] CS: It's more than they want you to succeed. They need you to succeed if they're going to be –
[00:44:46] LH: Well, they're bought in. There's skin in the game. Right? And there's a lot of accelerators out there and some of them are much more just like a curriculum and they don't also invest in you. And I think that's one of like, as folks are evaluating, do I want to do this program? Making sure that the incentives are aligned.
There are even accelerators that like charge you money. And skeptical eyebrow goes up at those a little bit. But definitely, like looking at what the incentives are within the organization.
So StartOut, we finished YC. We raised a little bit of money out of that. Got some exposure to the sort of YC ecosystem. It's a really – like the network that you plug into with YC is really interesting. Companies like Airbnb and Dropbox are alumni of this program.
A couple months later, we got into the second program called StartOut Growth Lab. It is a startup accelerator focused on LGBTQ folks. My original co-founder is non-binary and I'm LGBTQ myself. And, specifically, bi. And this was a really interesting program both in terms of like the mentorship and network that it offered. But also, the specific like small group cohort that we worked with.
The big difference, 200 companies versus like five to seven. Right? It's a much more sort of like cozy, like you're working with a small cohort of folks. And it's a longer program. It was six months, which really gives you time to like build out your network, mature your business.
And StartOut – like the accelerator is part of a larger organization called StartOut, which is a nationwide non-profit focused on LGBTQ entrepreneurship.
[00:46:30] CS: Yes. That's how I found out about you. Actually, it was through Jonathan. But, yeah.
[00:46:33] LH: Yeah. Yeah. I love the StartOut folks. They've just been like a fantastic resource. And I've been really fortunate to get to give back to that community as well both like continuing to refer people into the program, mentoring founders who've gone through it after me and staying connected with folks.
But, yeah. I didn't realize that in some ridiculous number of states, banks can discriminate against you for loans if you're LGBTQ. It's bizarre. And obviously, there's all this scary legislative stuff happening that's way more severe than worrying about your bank loan in places like Florida. But thinking about the sort of broad spectrum of discrimination and abuse that LGBTQ folks experience, having this entrepreneurship program I think is really, really valuable.
And, really, just like the companies that have gone through it as well as the sort of broader network is just fantastic. And I'm really, really grateful for having had the chance to do that.
The model that StartOut works on is instead of taking like 7% of your company, you as an individual founder make a pledge that if you sell your company, you have an exit, you will donate 1% of the proceeds to the non-profit. There's skin in the game in the sense that like you want to succeed. They want you to succeed. They benefit as an organization from you succeeding. But in a much less like burdensome way than a typical for-profit accelerator would. I thought that was a pretty cool model. And, yeah, it's a fantastic program.
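(To make the contrast between the two models concrete – with purely hypothetical round numbers, since no dollar amounts are given in the episode – a quick back-of-the-envelope comparison:)

fn main() {
    // Hypothetical exit: the company sells for $10M.
    let exit_value: f64 = 10_000_000.0;

    // Equity model: the accelerator took 7% of the company up front,
    // so its stake is worth 7% of the exit.
    let equity_model_cost = exit_value * 0.07; // $700,000

    // Pledge model: the founder donates 1% of their own proceeds;
    // assume the founder still holds 50% of the company at exit.
    let founder_proceeds = exit_value * 0.50;
    let pledge_model_cost = founder_proceeds * 0.01; // $50,000

    println!("equity model costs the company: ${:.0}", equity_model_cost);
    println!("pledge model costs the founder:  ${:.0}", pledge_model_cost);
}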
Out of my work with YC, I also became involved with Pioneer Fund, which is the alumni investment fund – folks who did YC investing in subsequent companies that participate in the program. And that's been super interesting. I've learned a ton. It has helped me level up my own running of a company and my investor game and all that kind of stuff. And I've also gotten to just look at a ton of different YC companies and what they're doing and how exciting it is.
And so, I think there's a lot of interesting opportunities to be involved and engaged with startups at an early stage. Whether it's like looking at startups as places to go work. Investing in startups, there's like a bunch of rules around like being accredited and what the bar is for that. That's certainly like less accessible. Although there is starting to be stuff like equity crowdfunding that's really interesting.
And definitely, like learning about the startup ecosystem. I think organizations like StartOut or other sort of like local startup groups, you go find a meetup. Maybe there's a local startup accelerator in someone's community. Tons and tons.
Oh, the other amazing resource in the States – there's obviously the SBA. But there's also a group called SCORE, which is the Service Corps of Retired Executives. And it's folks that'll help you make a business plan and stuff like that. Yeah. If folks have that entrepreneurial bug, there's like – oh, and the one other thing I'll mention, YC Startup School. Startupschool.org. They have the whole curriculum. You can just go watch it. It is like, why don't you just go get an MBA in like 10 hours of YouTube videos? It's pretty great. Their content is available. Yeah.
[00:50:00] CS: I love it. All right. Yeah. That's great. And thank you also for answering questions that I was meaning to ask [inaudible 00:50:05] simpatico. Yeah. I want to talk a little bit about – we've talked about StartOut. And I want to talk about your own experiences as an LGBTQ person in the sort of monocultural hegemony of the cyber security space.
And I'm wondering if you wanted to sort of get out your crystal ball and talk about how this changes. I've seen steps. I've seen things. I was just at a Women Impact Tech event last Thursday. And I'm involved with – or I know organizations like Women Who Code, Women in Identity, LGBTech, Code Your Dreams and so forth. And it feels more exciting and more robust even than like five years ago when we started the podcast. But I know there's so much more to do. Do you have any advice, requests or recommendations for people who are actually trying to make cyber security more inclusive and a welcoming place?
[00:50:58] LH: I feel really fortunate to be doing this work now. I think we have come really, really far as a field. Actually, I stopped going to DEF CON in like 2013 because I just like had my butt grabbed too many times. And I'm going this summer. It's awesome. I'm so stoked. First time in 10 years. Actually, I was going to go last year and had some family stuff come up. But this year I'm going. I'm super stoked to go to Vegas and know that like if somebody's shitty to me, that the goons are actually going to like do a good job of handling the situation. Because I feel like they've really – yeah, it is so much better as a field.
[00:51:36] CS: They got the memo. Yeah.
[00:51:38] LH: They got the memo. They got the memo in like quite a bit of detail, I think. And so, I think there's that sort of like organizations need to change. A lot of the stuff that needs to happen to make the field better are at the organizational level.
At the individual level, whether it's like building – as you build that peer group, make sure that it's not 100% dudes. And at that sort of interpersonal level, for folks that are early career but not super senior, don't pull up the ladder as you advance in your career. Look for those opportunities to mentor folks. And mentor folks that don't necessarily look like you. I think there's just so much work to be done. We can't afford to exclude people from this field, right? Because we've got our work cut out for us.
[00:52:25] CS: Yeah. Well, and to that end, do you have any advice for people who might like the idea of these things but have been kind of sitting on the sidelines hoping that someone else would do the big work?
[00:52:34] LH: Oh, man. I think the – can you just dig in a little bit to what you mean?
[00:52:41] CS: Just in terms of, "Hey, it's great. It's more inclusive than it was five years ago." But not having like an active hand in it. Like what's a small thing that you would sort of recommend for people so that it's not just like good progress is being made? And instead, you have buy-in of like I'm helping, you know?
[00:53:02] LH: Yeah. I think when folks are in an organization, they've got that first job. Or they're like a little senior in their role. Taking a second, looking around. Like what does my team look like? Does everyone look like me? Does everyone have the same like background and trajectory? And from that, going to like is there something that our organization is missing in terms of how we recruit? In terms of how we build our pipeline?
I think there's a lot of the sort of – there's much better writing out there than I can sort of come up with in a short segment of like how to build that like early career. Like sort of general security diversity stuff. And I think taking that moment of introspection and saying like, "What does the current – what do we look like now? What is our trajectory? What's our funnel look like career-wise?" And making sure that we're not missing people.
And then the sort of second step beyond that I think would be to actively go out and say like, "Are we posting our jobs on job boards for underrepresented folks? Are we participating in that sort of community of whether it's like securing diversity or the Diana Initiative during Defcon and Black Hat?" All of these different sort of security initiatives. Making sure that organizations are making that deliberate effort to participate.
[00:54:26] CS: Yeah. That's great. Yeah, I mean, I could talk to you for hours, Leigh. This has been so much fun. But I just wanted one last question. If our listeners want to learn more about Leigh Honeywell, Tall Poppy, or get involved in their local startup accelerator, either as a learner or as a mentor, where should they go online?
[00:54:45] LH: Yeah. Leighhoneywell.com. Got a sort of directory of all the places I participate in. I think for StartOut specifically, they have local meetups in different cities around the States at startout.org. And then, yeah, I think in terms of the entrepreneurship piece, I really recommend the Startup School from YC. They even have co-founder matchmaking. If you're like, "I've got an idea but I don't know how to do sales," they'll find you a salesperson co-founder, kind of thing. Yeah.
[00:55:12] CS: Interesting. I love it. And Tall Poppy is just what? Tallpoppy.com? .org?
[00:55:16] LH: And we're tallpoppy.com. Yeah. I’m like the worst salesperson for my own stuff. Yeah. We're tallpoppy.com. And, yeah, we love being able to support organizations that are dealing with personal cyber safety and online threats.
[00:55:30] CS: Is this a 100% organizational clients? Or do you have like individuals that also can use –
[00:55:36] LH: We have a wait list that I'll share with you for our future consumer product. We typically do work primarily with organizations. We do some work on the executive side with individuals, whether it's like authors. We work with some streamers. Folks that have sort of like a high profile tend to –
[00:55:53] CS: Like a YouTube profile. Yeah. Yeah. Yeah. Okay. Got it.
[00:55:56] LH: Yeah. Exactly. We've also got a really extensive resources page that links to a ton of different DIY and self-service resources. I'll share that as well.
[00:56:04] CS: Cool. Beautiful. Well, Leigh, thank you again for providing our listeners with all this insight and enthusiasm. It was such a pleasure.
[00:56:11] LH: It was really great. Thank you very much.
[00:56:13] CS: And thank all of you who have been listening to and watching the Cyber Work podcast on a massive scale. We just tipped 70,000 subscribers on YouTube. We're so happy to have you all along for the ride. But before I go, I want to invite you to visit infosecinstitute.com/free to get a whole bunch of free stuff for Cyber Work listeners.
Our new security awareness training series, Work Bytes, is a live-action sketch-oriented show featuring a host of fantastical employees, including a zombie, a vampire, a princess and a pirate making security mistakes and hopefully learning from them.
Also, visit infosecinstitute.com/free for your free cyber security talent development ebook. It's got in-depth training plans for the 12 most common roles, including SOC analyst, penetration tester, cloud security engineer, information risk analyst, privacy manager, secure coder and more. Lots to see. Lots to do. Just go to infosecinstitute.com/free. And, yes, the link is in the description below. It's all here.
Thank you once again to Leigh Honeywell and Tall Poppy. And thank you all so much for watching and listening. And until then, we will talk to you next week. Take care.