The Modern .NET Show

S06E21 - Breaking the Compromise: Unravelling the Truths of Cyber Security with Lianne Potter and Jeff Watkins

Sponsors

Support for this episode of The Modern .NET Show comes from the following sponsors. Please take a moment to learn more about their products and services:

Please also see the full sponsor message(s) in the episode transcription for more details of their products and services, and offers exclusive to listeners of The Modern .NET Show.

Thank you to the sponsors for supporting the show.


Supporting The Show

If this episode was interesting or useful to you, please consider supporting the show with one of the above options.

Episode Summary

In this episode of The Modern .NET Show, guests Lianne Potter, a cyber anthropologist and head of security operations for a major retailer, and Jeff Watkins discussed the importance of understanding diverse perspectives and finding compromises in cyber security practices within organizations.

The episode emphasized the need for a more holistic and integrated approach to security, shifting away from viewing security as solely the responsibility of the cyber security team. Lianne and Jeff highlighted the concept of pushing security responsibilities to the left in the software development lifecycle, integrating security considerations early on to minimize risks and costs later.

Host Jamie, a developer and consultant, underscored the importance of thinking like a criminal when building secure code. He emphasized the need for varied voices in cyber security to build more secure systems and improve overall security practices within organizations.

Overall, the episode highlighted the importance of collaboration, proactive approaches, and continuous improvement in cyber security practices. By involving cyber security teams in the early stages of product development and considering potential threats, organizations can better identify and address vulnerabilities, ultimately reducing the risk of security breaches. As Lianne, Jeff, and Jamie discussed, it is crucial to prioritize people and human connections in cyber security, as well as to continually learn and adapt to the evolving landscape of cyber security threats.

Episode Transcription

What do they go for? They go for one that’s separated from the herd. And the idea behind cyber security nowadays should really actually be: put enough security controls in that they just go, "you know what? There’s someone down the road that’s got it all wide open. I’m just gonna go for them." And if you can just make yourself look as unappetizing and unappealing as possible, that’s half the battle.

- Lianne Potter

Welcome to The Modern .NET Show! Formerly known as The .NET Core Podcast, we are the go-to podcast for all .NET developers worldwide and I am your host Jamie “GaProgMan” Taylor.

In this episode, Lianne Potter and Jeff Watkins of the Compromising Positions podcast joined us to talk a little bit about the practical side of cyber security. Both Lianne and Jeff are cyber security professionals and have a ton of experience in the industry. But I had them talk about cyber security from a developer’s point of view: what can we do, what do we need to know, and how can we help our colleagues on a daily basis?

I think the other side’s true as well.

I think companies in general need to encourage a more holistic, shift-left, and integrated approach to security. We talk about that quite a bit: the idea that this should not be "over the fence," because there are two sides of the coin. One side says, "oh look, there's the security team, they're the Department of Work Prevention, they're the ones who are going to stop you." And the other side of that coin is where nobody's bothered to ever include people from the security team in their ways of working and daily practices.

- Jeff Watkins

So let’s sit back, open up a terminal, type in dotnet new podcast and we’ll dive into the core of Modern .NET.

Jamie : So, Jeff, Lianne, thank you very much for spending some time with us today. I really appreciate anyone who's going to be on the show, and I think that this conversation we're going to have is going to be a little different to what I usually put out on my show, but we'll get to that in a moment. I feel like the discussion around security, cyber security, safety, and all that kind of stuff tends to get pushed past developers, rightfully or wrongfully, straight onto the security people. And it seems to just stay there. And I feel like it's important that developers, testers, technologists should be involved in that conversation.

But I guess we’ll get back to that in a moment. So thank you both very much, and welcome to the show.

Jeff : Thank you for inviting us.

Lianne : Thank you so much. Can’t wait.

Jamie : Awesome. Yeah. So, for a little bit of background, I met both Jeff and Lianne at the Leeds Cyber Security Conference in 2023. That is a conference put on by our mutual friend Lee, and the goal is just to get people in a room talking about cyber security. And I think it went really well. But before we do any of that, I was wondering, could you both give the listeners a bit of an intro to yourselves? Like, you know, an elevator pitch or something?

Lianne : I was so tempted, Jeff, for us to just do it the way we do it on the podcast. So, I’m Lianne Potter. I’m a cyber anthropologist and head of security operations for a major retailer.

Jeff : And I’m Jeff Watkins, CTO for a major tech consultancy. Something, something. No.

I’ve been in engineering for pretty much all my adult life, really. I started programming when I was six, which I think makes me one of a dying breed who remembers actually coding on old 8-bit computers. I don’t think there are too many people doing that nowadays, thanks to the education system. So, I’ve been a lifelong technologist, but over the years, I’ve gained quite a lot of appreciation for cyber security as a really important topic along with other topics. And I’ve found it fascinating to balance both being a CTO and also running a small cyber security function—nothing anywhere near the scale or importance of what Lianne’s doing in her career, but I think having an interest in cyber security gives you a really, really good foundational knowledge of what’s actually important to most organizations nowadays.

So that’s why I talk a lot about cyber security in general, even though it’s not really my major career.

Lianne : And while you’re talking about it: you also talk about it with me every Thursday on our podcast, Compromising Positions: protecting your assets, never leaving you exposed. The big emphasis on "ass".

We are very much collaborators in this because, when we started speaking—I’m an ex-developer myself—we’d been talking for quite a long time about that user journey for people outside of the security team. Like, why are these security controls not working? Why is there so much friction with the business when this should be really important? And with the news stories and scandals coming out about cyber security incidents at the moment, why isn’t it right at the top of everyone’s agenda? Because this just keeps happening, and happening with more frequency. So we decided to get together and do talks and podcasts about it.

So we’re, I think, the only cyber security podcast that doesn’t interview anyone who works in cyber security. So I’m the only official cyber security person on the show. And we interview people like software developers, people working in user research and data, and then, outside of tech, people working in psychology, behavioural science, and other anthropologists as well, to get an outsider’s view on why insecure behaviours keep proliferating. I would say to listeners: do Google us, but if you are searching for us, search "compromising positions cyber security podcast". Do not just Google "compromising positions" on your work machines. I can’t be held responsible for what might come up. But…

Jeff : Yeah, I mean, there is a rather mediocre film with that name. But yeah, you’re probably going to get something a little bit more salacious if you’re not careful.

Jamie : I remember I was mentioning your podcast to a client of mine, and I was like, "yeah, this great podcast called compromising positions. You should look it up." And then I realized, I went, "yeah, you’re going to need to put podcast and, like, security or something at the end of your Google search. I am not going to be held responsible for you getting in trouble with your IT department."

Lianne : Yeah, yeah, very much so. Do you know what it is? It’s because me and Jeff have a really serious condition, and this podcast is kind of like our support group, really. Our serious condition is "silliness Tourette’s", and we can’t help it. We were just like, this is too good a name not to use. And so our podcast episodes, as well, are not SEO friendly. It’s doing us absolutely no favours, except we have charted in the top 20 recently in the UK for tech podcasts. But yes, our "Tourette’s silliness" isn’t doing us any favours in actually finding our audience or anything like that.

Jeff : Well, I’ll tell you what: it does drive traffic to our website though, because looking at our actual keyword hits, people searching for "compromising photos" do end up on our site. So…

Lianne : They do. They do. That is one of the most common search terms. I guess explaining what is a compromising position might be of some interest to your listeners. And why is it called compromising positions?

Well, the title comes about because, quite often in the discourse around cyber security, cyber security professionals and cyber security teams have said that it’s the people in their organization, outside of the security teams, that are the compromising position; that they are what they sometimes call the "weak link," or even "stupid" in some cases. Now, I absolutely hate that term. I think if you start calling people the weak link in your organization, you’re doing yourself no favours. But you also have such a wrong idea of how valuable people in your organization can be in helping you on your cyber security maturity journey.

So I actually think, if you believe that individuals in your organisation are the reason why your security controls are failing, it’s the security team that are the real compromising position. So it’s turning it on its head, really: understanding that it’s not individuals in an organisation that are out to do harm to your organization. The insider threat piece—someone inside your organization maliciously trying to, you know, exfiltrate data—is extremely rare, and we focus too much on that side of it, rather than on people just trying to do a job; people just want to get through the day. And it’s actually our security controls that are preventing them from doing that, because we don’t consult with people on how to do it. So the actual compromising position is the security team, in my opinion.

But also, compromise is literally what we need to do to enable the business and engage with people. We need to come to a compromise between what the business needs to do to be profitable, and therefore pay my wages, and what we need to do to keep it safe. So we have to find a middle ground. And what I aim to do with the podcast with Jeff is to understand other people’s perspectives, so people can start having those negotiating-a-compromise kinds of discussions.

Jeff : I think it’s important. One of the reasons we have so many people like behavioural scientists and people in marketing and UX on there is to bring in some of those ideas, skills, and frameworks. You know, you don’t necessarily need to go and do another degree in them, but understanding the core concepts of things like storytelling and bringing people on board is so important in an area that is actually pretty invisible. Security should be unobtrusive. And people don’t feel security unless it’s obtrusive or it’s been breached. It’s a secondary user concern. People don’t sign on to their banking app thinking, "oh, I want to log on somewhere today;" they have another task to do. People working in an organization have other tasks to do, and that means people don’t necessarily understand security or put enough importance on it until, of course, there’s a lack of it.

It’s like a seatbelt in a car. There was a big push, and it was really difficult, to get people to wear seat belts when they were introduced because, as long as you don’t crash, a seat belt is just an inconvenience. But of course, you can look at the figures for how many road deaths have been prevented by seat belts and by enhanced road safety measures; it’s hugely important. But people have a bias towards what’s in their view now and what’s important to them.

So having that storytelling, leaning into things like behavioural science, leaning into how you market it properly, that’s really, really important. And, of course, cyber security teams, in my opinion—I don’t know if Lianne will agree or disagree with me here—are quite often a mixture of people who are technical and understand cyber security quite well, and managers; not really very diverse. Now, we’ve been lucky to speak to a couple of people who are in, or have worked in the past with, cyber security teams but who come from other fields, like Bec McKeown, who I think was a behavioural scientist before she worked with the military and with their cyber security team. I think the enduring message is: we need more of that in cyber security.

Jamie : 100%. I think, just from my background as a dev and as a consultant, that the more varied voices we can have in any part of technology, but mostly in cyber security, the better, right? Because, how do I put it? I usually say to juniors and interns, when I’m talking about building secure code, "hey, we have to think like a criminal," right? The example I always give—and I’m sure that if folks search the website for the podcast, they’ll see it come up in the transcripts about 500 times—is this: if I just want to steal a car, for the feeling of stealing a car or just to achieve the goal of stealing a car, I don’t care what car it is. What I’m gonna do is go stand in a car park and watch people come and go. I’m gonna watch how they handle their keys and what they do with their car after they’ve left it. And I’m gonna pick the easiest car to break into.

Now, this metaphor doesn’t fully hold up in my experience of cyber security, but stick with it just for now so that I can make the point. Your thief in that instance is going to look for the easiest car to break into. And if everyone in that car park is operating at a level where, at the very least, certain security measures are in place—a lock on the steering wheel, maybe an immobilizer, maybe an alarm (maybe immobilizers have fallen out of fashion, I don’t know)—but you’re locking the car, you’re putting a steering lock on, you’ve got an alarm on there, and you’re making it very visible: "I am locking my car. You can’t break in, or it will be harder to break in." Then that will make it more difficult for that thief to steal the car.

Now, the problem is that in order to build those defences, we need people with lots of varied experiences so that those defences can be built up. Like you were saying there, Jeff, having people with lots of different, varied backgrounds, perhaps not even from a technology background, is probably a great idea. In fact, it would likely be more advantageous to have someone in a cyber security role who is not from a technology background. Maybe they are from a humanistic background—and I’m sure that we’ll talk about the anthropology side of it in a moment. But having someone there who says, "I understand people. I understand how people work," like you were saying about the behavioural scientists and the behavioural psychologists, means you can then put in place things that make it less likely for someone to break in.

Lianne : Yeah. And it’s always that ability to make it less likely.

So I think for a good few years now, we in the cyber security industry have kind of made peace with the fact that it’s not a question of if, it’s a question of when; and we kind of live on the principle of assumed breach, which basically means that you live your day-to-day assuming bad things are already happening in your network. So what do you do to help spot it? What do you do to help remediate it? And things like that.

When you think of it like that, you can understand why quite a lot of cyber security teams are very het up, highly strung, if that’s your constant default position: to be constantly paranoid and stressed out. So when, you know, an engineer comes up to us and says, "oh, I want to download this unapproved library or this unapproved tool," and they get their head bitten off, I’d really like to encourage the engineers to take a step back and just understand: we in the cyber security team are one of the most underfunded, under-resourced, under-loved areas of the business at the best of times. And we’ve had to say no to a lot of things out of self-preservation, because it’s easier to say no than to have oversight of yet another tool, another area of the business, when we don’t feel supported. However, you can come to your cyber security team with a really strong business case, saying, "look, I’ve looked at the risks, and compared to the return on investment of actually utilizing this tool, this is a good opportunity for us to go ahead and use it, and I will take ownership of any of the risks associated with it." Because one of the biggest struggles we have in cyber security is getting people to help us take ownership of things.

And along with being one of the smallest teams, we’re also expected to know more about every area of the business, from a technology perspective as well as, nowadays, from a business-minded—you know, commercially minded—perspective. So one of the struggles I have when I’m talking to new starters coming into the industry is, "where do I focus?" You have to be careful about being too focused in cyber security, because you will be expected to know everything there is to know about the cloud, everything there is to know about engineering, infrastructure, and compliance, and all the other things that go along with it.

And it’s really overwhelming. So there is a definite need—and the podcast hopefully addresses that by giving some insight into the different areas of the business—to share that load of responsibility. Because I’ll be honest with you: your cyber security team doesn’t know everything there is to know about cloud. They don’t know everything there is to know about engineering, etcetera. They need you to be a part of this journey, so you can tell us what risks might occur, because you can be the expert in that. When I was training up to work in tech, I could have decided, "right, I’m going to be the best JavaScript or .NET developer there ever is," focused right in on that, and considered nothing else in my career; I could have been really specialized in that. I don’t really get that opportunity in cyber security, because I have to be really agile to meet the needs of the business. It’s really tough, and that leads to interesting behaviours.

Jeff : I think the other side’s true as well.

I think companies in general need to encourage a more holistic, shift-left, and integrated approach to security. We talk about that quite a bit: the idea that this should not be "over the fence," because there are two sides of the coin. One side says, "oh look, there's the security team, they're the Department of Work Prevention, they're the ones who are going to stop you." And the other side of that coin is where nobody's bothered to ever include people from the security team in their ways of working and daily practices.

So then the project ends and someone goes, "we’re going live next week, you okay with that?" And suddenly everybody’s on the hop because nobody’s had any sight of it. I think taking an integrated approach to security, spreading that responsibility, and spreading some of that autonomy when it’s mature enough is going to be super important for the future of cyber security; and also taking it out of the hands of being purely technical. So, leaning further into that point of making security part of the product process: going, "hang on a second, we need to think at a fundamental level when we build this product or service. Are there actually some ways of thinking about who the personae non gratae are?" They’re like the opposite of the product personas that people will be quite familiar with: who are the people you don’t want, who are quite likely to actually try and do something nefarious with your application? Not talking about unsecured ports or things like that; it’s more like fundamental misuse.

And actually, you can get a lot of mileage out of that, because people are all the time trying to attack the application itself or attack the people in your organization. Human beings, unfortunately, are increasingly becoming the target of cyber security attacks as it gets much more difficult to hack systems directly, as automated defensive measures get better. So I think both sides need to meet, and there needs to be an olive branch from both sides, to build safe, sustainable, and continuous security into organizations. But that requires a whole bunch of organizational change, which is also why I think it was fascinating to hear from Paula Cizek on one of our recent episodes about how you actually make organizational change without tears. Because that’s certainly something I’m not an absolute expert in, and it is one of the biggest things an organization does: changing its ways of working or its structure.


Do you have a WPF application and want to take it to macOS or Linux? Avalonia XPF, a binary-compatible cross-platform fork of WPF, enables WPF apps to run on new platforms with minimal effort and maximum compatibility.

With a few tweaks to your project file, your WPF app and all its dependencies are ready for testing on new platforms.

Start your app transformation journey with a 30-day free trial.

Head over to http://avaloniaui.net/themoderndotnetshow to get started today.


Jamie : I 100% agree.

I have noticed over the last 10-15 years there’s been a big push as part of the DevOps revolution, something that’s been going on way longer than the last ten years. I have the original book on Kaizen, which was written the year I was born (I will leave it as an exercise to the listener to figure out when I was born), and the Toyota Production System book, which was then used to create a whole bunch of other books, including The Goal, which was then picked up by Gene Kim when he created this idea of "what does DevOps look like?"

And there’s this big push for "push everything to the left," because it is genuinely, genuinely important. For those who may not know what we mean here: imagine the software development life cycle as a straight line, with release on the right-hand side and the initial work on the left. "Pushing left" means pushing all of those responsibilities, for security, for design, for releasing, and all that kind of stuff, as far to the left as possible, because the further a problem gets away from someone who can make a positive change, the more expensive it is to fix. Right.

There’s a book called The Phoenix Project, which is the book that introduced a lot of developers to DevOps. In that book there is a security professional, and he talks a lot about "push it left, push it left, push it left." It’s a fictional tale about a company that releases software, and when it comes to releasing a new version, they usually release it first and then ask him to test it for all of the security things they need to test for. So he says "push it left, push it left" almost constantly.

And there has been a big movement in the last 10-15 years of saying "push it left," but, from my perspective, nobody actually doing the pushing left. Right. And I know you said there, Jeff, that it’s becoming harder and harder to break into things from a technical point of view; the software that we all use is getting much better. But the first step is including automatable tests in your software that test for some of those known vulnerabilities, right? And maybe having someone in your team who signs up to a CVE list. For the folks listening who don’t know, that’s a mailing list where you get told about the newest vulnerabilities in all the different software packages. So then you’re finding out as soon as it’s released to the public: "hey, we’re using this piece of software and there is a security vulnerability in it. And guess what? There’s already a new version with a fix in there, because by the time the CVE goes public, they’ve already fixed the problem in an update."
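As a minimal sketch of what that first step can look like in practice: recent .NET SDKs can report NuGet dependencies with known CVEs via `dotnet list package --vulnerable`, and a CI step can fail the build when any are found. The marker string grepped for below is an assumption based on the SDK’s current output wording, so verify it against your own SDK version before relying on it.

```shell
#!/bin/sh
# Minimal CI gate sketch: fail the build when a previously saved report from
# `dotnet list package --vulnerable` mentions any vulnerable NuGet package.
# The marker string below is an assumption about the SDK's output wording.
fail_on_vulnerable() {
  report="$1"  # path to the saved output of `dotnet list package --vulnerable`
  if grep -q "has the following vulnerable packages" "$report"; then
    echo "Vulnerable dependencies found; failing the build." >&2
    return 1
  fi
  return 0
}

# In a real pipeline you would generate the report first, e.g.:
#   dotnet restore
#   dotnet list package --vulnerable --include-transitive > vuln-report.txt
#   fail_on_vulnerable vuln-report.txt || exit 1
```

The grep-based check is deliberately crude; a real pipeline might parse the SDK’s structured output instead, or use a dedicated dependency-scanning service.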

So it feels like, from my perspective, the software side of this, like you were saying there, Jeff, is almost a solved problem that nobody seems to want to talk about or implement the solution to. Like, what the heck?

Lianne : Well, The Phoenix Project: an absolute seminal classic. Whenever I have a new starter, as in someone new to the career coming in, cyber security or not, I always recommend they read that book, because it gives you such an incredible insight into every different area of any kind of organisation. The business in it, from what I recall, is a bit like a Kwik Fit, or should I say like a Halfords; an American-style Halfords sort of thing.

And yeah, the CISO in that one, ee by gum. We were very lucky that on our very first episode of Compromising Positions we got to interview one of the authors. We were like, "dude, what’s with that character? He’s awful." He pretty much has a mental breakdown by the middle of the book, goes off-grid, and then comes back with a shaven head.

But I think the message still stands: there are ways and means of doing things in terms of getting buy-in from the business, and The Phoenix Project really encapsulates the wrong way to do things. In cyber security, you need to understand what is actually necessary for the business. You know, human beings are very good at taking risks if they think the bad thing is not going to happen; and if they think it’s not going to happen, then you’re never going to get buy-in from people. So you need to put a bit of tangibility into the things you’re discussing.

Jamie : Yeah, 100%.

If you can make it real. Long-time listeners will know that I trained as a teacher directly out of uni: I did computer science, then went to train as a teacher, and I feel like it’s helped me in my career. The other thing I did that I feel really helped me in my career is working in retail because, you know, shop-floor retail is all about dealing with people.

But working as a teacher taught me that when you need to impart some knowledge to someone, you need to make it real; you need to give the reason for it to exist. And I feel like this fits with something that you’ve both said several times, and something you just said there, Lianne, about making it tangible, making it real—I’d be interested in your opinions on it. Tell me why I need to do this. How do I put it? It’s not a case of setting the rule to be, "don’t click on any outbound links." It needs to be, "don’t click on any outbound links, because it could be this and it could be that. But don’t worry if you do, because we can maybe fix it, or we can set up a rule or something." Right?

Jeff : Yeah. I think this goes back to the power of storytelling, and I’d say that’s one of my key takeaways from my career, especially becoming more senior. When you’re trying to persuade people, quite a lot of the power of persuasion is in the storytelling. You can say, "don’t click on links." Okay. You can also say why you shouldn’t click on links. Then you can go even further and start to provide all that context. And it was really interesting to hear from Ray Blake of The Dark Money Files, I believe he was—correct me if I’m wrong here, Lianne—who shares stories of the devastating impact on victims of financial crime; and then you get even more into hitting the emotions as well.

And when you start to put in the why, the background context, and the impact, positive or negative depending on what you’re trying to sell, then you take people with you. Because I’m sure, when we’re trying to persuade people of stuff, it can seem pretty obvious to us, because all of the why exists in our heads; but it doesn’t exist in other people’s heads. I think that’s been one of my biggest career learnings: you’ve got to take people on the journey. You can’t just give people facts and figures and rules.

Lianne : Yeah, I think that’s exactly the way Ray described it. I can tell you, for example, that by 2025, $10.5 trillion will be lost to hackers; such a big number that we can’t really comprehend it. And it’s the same with bad news stories in general. You know, we hear about something awful in the news, but we’re hearing about so many awful things happening in the world that we become desensitized to it, and that’s only natural. But individual stories have so much pertinence and resonate so much more with people, and it’s about finding those micro-tales you can tell.

And, yeah, like you said, Jeff, telling people not to click on things: well, my whole job is pretty much email-based and clicking on things. So either you’re telling me not to do my job, or we’re gonna have to find some other way of managing this. And I much prefer educating people beforehand on the right things to watch out for. But I think what’s really important these days, and will become increasingly so—I’ve got to drop the buzzword here, hold onto your hats—is when AI really starts affecting things. Yeah, I’ve done it. I’ve done it. It’s gonna happen. When AI starts really impacting our ability to protect people because, for example, we’ve told people for years what to spot in phishing emails. Now AI is gonna ruin that, because it’s really good at writing phishing emails that look a lot more convincing. So I can’t tell you to watch out for the Nigerian prince any more, and I can’t tell you to look out for spelling mistakes, because ChatGPT, you know, is really good at grammar.

So we’re gonna have to keep thinking about, well, "what’s the other line that we need to consider?" Remember what I was saying before: we in cyber security are very much aware that it’s going to be a when, not an if. Therefore, we need to make sure people know what to do when it happens. To me, that’s the crucial thing. If someone’s sitting on something because they clicked on a link and they’re terrified they’re going to get into trouble for it, and they don’t report it for hours on end, or maybe at all, and we have to spot it through our monitoring systems, then you’ve failed. Your cyber security culture has failed. What we need is people going, "oh [CENSORED]. I clicked on something, let me report it to the cyber security team. They’ll deal with it." That, to me, is the more important message.

Jamie : Yep. And you can’t get that without having a psychologically safe workspace, right? A place where someone can actually put their hand in the air and go, "I’ve made a mistake," or, you know, "I think I’ve made a mistake," or "can someone please check this?"

Because, you know, I don’t know whether it’s a specific thing to my generation. I’m going to speak specifically from a developer standpoint here, but I don’t know whether it’s something specific to my generation of developers or not. We tend to be really worried about looking like we don’t know what we’re talking about. Whereas I’ll actually sit in a meeting and, if there are acronyms and jargon being thrown around that aren’t the standard set you would find in that conversation, I will literally interrupt the meeting. I’ll be like, "right, someone needs to tell me what these words mean, because I don’t know what they mean, and I’m not going to sit here and pretend to know what they mean. Because I’m going to make an assumption, and we all know what happens when you make an assumption." If you don’t, go watch an early episode of The Fresh Prince of Bel-Air, where you’re told, "you will look like an ass, and the ump will shun you." But, like, there’s this real fear of being seen to not know something.

And I feel like that’s a symptom of a psychologically unsafe workspace. And if you have a psychologically unsafe workspace, then you’re going to have a cyber security culture where people feel like they can’t report something. Right? Or maybe they feel like, if they did, they’d get fired. Whereas actually, what I took from what we’ve been saying so far is: "just let me know. Hey, if you think you did something wrong, if you think you’ve clicked a link or downloaded something or done something you shouldn’t have done, just let me know, because then I can fix it," right?

Lianne : Yeah, absolutely.

I totally get the whole reason why you wouldn’t, you know, and I think it’s beyond the workplace. As human beings, we’ve all got incredible egos about how people perceive who we are. That’s because it’s programmed in. You know, that whole thing about making micro judgments as soon as you meet someone, and then those judgments are very hard to change. You always want to make a good first impression, and regrettably, the workplace doesn’t always allow for it. Especially as hybrid and remote working increase, our interactions are also really time-boxed. So the first time I meet someone, it might be to talk about something really awful, like a really bad security control, or a project that’s not going well and needs support, or some other highly stressed environment. And if that’s your first engagement with someone, it kind of sets the tone for the rest of your working relationship.

And again, in the conversations I come into quite often, you know, I don’t ever claim to be the smartest person in the room. And you can find yourself thinking, "oh, they’ve invited me to this meeting because they’re expecting an answer, and I don’t have one to give, because I’m not an expert in this field, but I have to seem like it." I totally get the pressure to constantly feel like you’re proving yourself, and how that would have an impact on people’s ability to put their hands up and say, "yeah, I don’t know, and therefore I need some support."

Jeff : Yeah. I think one of the most freeing things you can do in your career is loosing yourself from the shackles of feeling like you have to be an expert. And getting comfortable with uncomfortable conversations is part of that as well. Unfortunately, they don’t teach that. We don’t teach that in any university that I know of, or really in many courses. I’m sure there are some, maybe some leadership courses, but it’s a very, very tricky one for technologists, because technology is very binary in some ways, but the people behind the technology are definitely not.

Jamie : 100%.

And it’s shocking to me that there are some things we don’t get taught. I mean, yes, okay, computer literacy is a big thing, because we’re all going to be using computers. Online literacy, incredibly important, because we’re all going to be online. There are people alive today who don’t know what the world was like before the Internet. Right. And that’s cool. Those are very important things. And there was a big movement a few years ago: get kids to learn to code, because they’ll need to in their life. Absolutely.

But I think the most important thing, and it always comes back to this, no matter what you’re doing: you need empathy, sympathy, and compassion. It’s a one, two, three step process. You can’t have compassion without empathy and sympathy, and you can’t have sympathy without empathy. And I feel like it’s always super important to have that, even if you’re in a positive situation. You can still be positively compassionate to someone.

But I guess the situation you were both talking about there is that maybe the first time I meet someone in person, because we’re working in a hybrid or fully remote situation, is when I have to go to their office, or to some third place, maybe a coffee shop, and have that conversation about, "there’s been a breach, and we tracked it back to your computer. Can you talk me through it?" Or indeed the worst one, which is, "there’s been a breach. It was you. We’re going to have to let you go."

From my perspective, there’s never any feeling of, how do I put this across in the most compassionate way, "I totally understand how it happened." But then, that’s also a learning experience, not just for the person who did it, not just the employee, but for the whole company, right? Because if you’ve recognized that there is a way a problem could present itself, a breach, an issue or whatever, then that should really be a learning exercise, right? If Jamie in accounting, oh look, I’m picking on myself, if Jamie in accounting clicked this link and caused this problem, well, maybe there’s some education or training we can give there, or maybe there’s a better way to catch it, right? Maybe that request went out to a known bad spot, but it went through some kind of proxy, some two or three hop process, and it still got there. So why did we allow that traffic? It’s less about, "oh, no, you’ve pushed the wrong button, so you have to be fired!" and more about, "oh, no, you pushed the wrong button. How can we make this better?" Right?

Lianne : Yeah. It’s really tough to kind of get people to admit that they’ve done something wrong. So the best successes I’ve ever had, though, are from the real life storytelling like that.

So one instance I remember, at another organisation I was working at, the CTO sent us an email to investigate, and it was a really weird email. Even to this day, I can’t quite work out what the scam was. There was definitely a scam in there. It wasn’t just a spam email. There were links that they wanted the CTO to click on, urgency, all the kind of red flags you would see. And we said to the CTO, "this is a really interesting email. Do you mind if we share it with the organisation?" And so, in our all hands, we went through this email, a genuine email to the CTO, and we said, "look, this is what you need to look out for. This is what you need to spot." And people were so interested, going, "well, what’s the scam here?" And I was like, "I really don’t know what the scam here is." This weird link went to a weird Facebook page that went to another Facebook page. We were following the breadcrumbs and we couldn’t work it out, but there was something going on. But people were more interested because it’s like, "oh, even the CTO gets these. What would be the implications of someone, a high privileged user, having their account breached?"

And then one thing I have a lot of success with is taking part in induction days. When I take part in induction days for new starters, I don’t stand up and say, "hi, I’m Lianne in the cyber security team. This is what you shouldn’t do in the organisation, and here is the policy," because they’re going to get the, I would say, boring security awareness training anyway. What I do is I tell them about every time I’ve ever been hacked. And it’s so powerful to hear that someone who lives and breathes cyber security all day has also been susceptible to it. So I take them through my sort of freaky stories. I tell them about when I was a student and my email got hacked and this hacker just spammed all my contacts, including my university lecturer, saying that I had lots of raunchy pictures for people to see. I didn’t, but it was really embarrassing, and I had to change all my details.

And then there was another time when I was working at an organisation and we didn’t have a big IT presence. And we all kept getting the pop-up that says, "you must restart your computer. We’ve got all these vital updates." We just clicked "remind us later" for about a week on end. And then one day we turned on the computers and everything was down, completely bricked by malware. And so we were out of action for a week.

And then finally, a recent story, where I was sat on the bus a couple of days before Christmas, and I got a 2FA ping to say that someone in Scotland, and I’m based in Leeds, was trying to buy lots of things through Amazon. Telling them little stories like that, particularly the MFA one, about how it actually stopped an incident from happening. An incident was happening to me, but because I had those precautions in place, I was able to prevent it. And that’s a much more powerful story than saying, "to protect our customer data and business data," which is very lofty, especially when you’ve just joined an organisation and don’t really know why customer data and business data are important. Turning that sort of messaging into a more personal message is so incredibly important, because, as an individual user, I would rather protect my own personal stuff than the business stuff. It’s easier for me to care more about my stuff than about a massive corporation’s.

But the same security principles apply. The same things you would do in your personal life apply to your work life. It’s just all about really good habits.


A Request To You All

If you're enjoying this show, would you mind sharing it with a colleague? Check your podcatcher for a link to show notes, which has an embedded player within it and a transcription and all that stuff, and share that link with them. I'd really appreciate it if you could indeed share the show.

But if you'd like other ways to support it, you could:

I would love it if you would share the show with a friend or colleague or leave a rating or review. The other options are completely up to you, and are not required at all to continue enjoying the show.

Anyway, let's get back to it.


Jamie : Yeah. What was it that I heard the other day? Somebody said something like, "locking your house door won’t necessarily stop someone from breaking in, but not doing it will make it even easier for them." And I feel like that’s, again, I talk in metaphors and analogies, a pretty good, not perfect, but pretty good analogy for using well-known, well, I hate to use the term "industry standard," because I know so many companies who, when you report something to them, say, "oh, we use industry standard x, y and z," and I’m like, "well, clearly you don’t, because I just broke into your system." But using strong passwords, using multi-factor authentication, using time-based one-time passwords. Yes, they raise the barrier to entry for a legitimate user, but they also raise the difficulty for someone who wants to break into a user-facing password system.

And I feel like, you know, you wouldn’t just lock your house using a padlock and a bolt, right? Like on those old gates, where you’d slide the bolt across and then put the padlock on. You wouldn’t just do that, because you know that’s going to be broken into. You’re going to hire an expert to fit a relatively good lock on your front door. And I feel like maybe it’s an attitude thing, maybe it’s, "because it’s at work, it doesn’t really matter." I don’t know. I’m not smart enough to be having these thoughts, I have to say.
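As an aside for developers: the time-based one-time passwords Jamie mentions (TOTP, RFC 6238) are simple enough to sketch with nothing but the Python standard library. This is an illustrative sketch, not production code; in a real system you’d use a vetted library and a constant-time comparison when verifying codes.

```python
import hmac
import struct
import time
from hashlib import sha1

def totp(secret: bytes, for_time=None, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 time-based one-time password using HMAC-SHA1."""
    counter = int(time.time() if for_time is None else for_time) // step
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII secret "12345678901234567890",
# time = 59 seconds, 8 digits -> "94287082"
print(totp(b"12345678901234567890", for_time=59, digits=8))  # 94287082
```

Both sides (the authenticator app and the server) run the same calculation from a shared secret; the clock is the moving part, which is why a stolen code goes stale within a step or two.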

Lianne : Well, if you do enjoy a good metaphor, another way to think about it, from the, sorry, the cyber criminal’s perspective: it’s a numbers game. Imagine the Serengeti. You’re a lion. I don’t know if I’ve got the right animal there, it’s lions or tigers in the Serengeti. And you see a load of animals. Which do you go for? The one that’s separated from the herd. And the idea behind cyber security nowadays should really be: put enough security controls in that they just go, "you know what? There’s someone down the road that’s got it all wide open. I’m just going to go for them." If you can make yourself look as unappetizing and unappealing as possible, that’s half the battle.

Jeff : I think it is always a trade-off, and it’s about finding the good trade-offs. MFA is a great trade-off, because it takes you a couple of seconds longer, but the amount of extra security it provides is worth it. You can make a system incredibly secure: take your system, disconnect all the network cables, seal it in 6 metres of reinforced concrete, and drop it at the bottom of the Mariana Trench. I would be reasonably sure that nobody would break into it. But it won’t be very useful, and it won’t be very accessible. It’s like the CIA triad of cyber security (Confidentiality, Integrity, and Availability): you’ve got to balance them.

It’s always going to be a fine balance, because if somebody really wants to get in, they will. Say you’re running a florist’s website, and the Russian government wants to run a state-funded attack on it: they will get in. But most of us are probably not the key targets. It is about finding the best trade-off. And that will mean procuring the services that are the best value for money for you, and the controls that provide usable security without too much of a cost. If you’re working in the NHS, that’s going to be a different level of trade-off to working at GCHQ, which is going to be different again to running a florist’s website. So it will differ, but it has to be a conscious effort to think about, "what are those trade-offs?" I think, from a human-centric point of view and from.

Jamie : From a business point of view, yeah, I like that. Having the trade-off and accepting that risk, or rather having an acceptable level of risk. All right, "if we get hacked today," because, I mean, you’ve both said it’s more a case of when. So if it happens today, what is the risk? What could actually go wrong? Is it just reputational? Is there customer data that could be released? Is there money that could be stolen? Having that acceptable risk, I think, is set at that level. And there’s an app security professional I know called Tanya Janca, and she talks about getting sign-off from the decision makers. Now, this is more of a developer thing, so obviously I can’t speak to your experiences on the cyber security side of it, and hopefully, if you’re willing to and it’s okay to share, I would love to hear them. But Tanya says that, as a developer, you present a problem and you say to the manager, whoever, "hey, we need to fix this." Think of the Log4J thing from two or three years ago, where it was possible to run arbitrary code via a logging framework.

"We need to fix this."

"What is the fix?"

"The fix is we update a package and we rerun our tests, then deploy as quickly as possible. Assuming that the tests pass."

"How long is that going to take?"

"Maybe two or three hours, because the fix is already out there. We just need to do all of the releasey bits."

And then if that manager or decision maker turns around and says "no," you say, "okay, right, I need you to sign this piece of paper that says I’ve told you what the risks are, and I’ve told you what the fix is, and you personally have decided not to do it." Now, of course, I wouldn’t want to work in a company that has such a low level of psychological safety that everyone feels like they have to cover their own backs.

But I feel like, when it comes to acceptable risk, there needs to be someone who’s willing to say, "yes, I will take the responsibility of having this risk." Or, "we, as a team, as a company, as a group of individuals, will all band together and take the risk: yes, we have decided that for our florist’s e-commerce website, we don’t need multi-factor authentication, because from our perspective it doesn’t matter." Or maybe it’s a website that lets you log in and just list the books you’ve read. Well, "there’s kind of no point in having multi-factor auth on that from our perspective, because all you’re giving us is a username, a password, and a list of books." The amount of PII there is probably pretty slim.

But, like, how do you both feel about that? About having, like, a responsibility sign off sheet?
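To make the sign-off sheet Jamie describes concrete: at its simplest it’s just a record of who accepted which risk, on what basis, and when. A minimal, hypothetical sketch in Python (the field names are my own invention, not from any standard):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskAcceptance:
    risk: str            # what could go wrong, in plain language
    proposed_fix: str    # what the team offered to do about it
    decision: str        # "accept" or "remediate"
    owner: str           # the named person signing off
    signed: date = field(default_factory=date.today)

register = []
register.append(RiskAcceptance(
    risk="Arbitrary code execution via a vulnerable logging package",
    proposed_fix="Bump the package, rerun tests, deploy (roughly three hours)",
    decision="remediate",
    owner="Head of Engineering",
))

# The point is the named owner: every risk either gets fixed or gets a signature.
assert all(r.owner for r in register)
```

In practice this would live in a risk register tool or a wiki page; the format matters far less than there being a named owner for every accepted risk.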

Lianne : I think in some areas of the business it’s just absolutely necessary, particularly in very heavily regulated areas of industry; you kind of have to. Now, Log4J, I’m sure every cyber security person still shakes when they hear it, but that’s the one occasion where it was actually an easy sell, because it was so incredibly public. It’s actually all the other stuff that is really hard to sell, as in why we need to fix that. And I would absolutely love a software engineer to come up to me and say, "I want to fix this. Let’s get this signed off."

But I think it’s a necessity at the moment. And it’s one of the reasons I’ve been very vocal and public about it. It’s one of the reasons why I don’t see CISO in my future career ambitions, because that’s a really interesting role where it feels like there is a lot of scapegoating and blame placed on people. And that’s why these processes of getting sign-off proliferate: you can get into really big trouble if you do something wrong, and the CISO carries a lot of that burden. It’s one of those kind of scary roles.

But I’d be interested, Jeff, because you’re a CTO, what about from your perspective?

Jeff : I think it’s a really interesting topic, because, on the flip side, when it comes to stuff where we think there is a real clear and present danger, I’d rather people did come to us with a "you need to sign off on this," because it then makes it really obvious that we believe this is of sufficient importance. And again, I think the storytelling plays into this. If you just walk up to somebody and say, "if we don’t fix this bug, then bad stuff might happen," that’s not enough. But then you talk about Log4J and what it’s used in, and it’s not just our software, it’s all the libraries our software uses across our estate. And, I don’t know, what, say 80% of Java solutions across the globe? I don’t know what the actual figure is, but it was a lot.

And then you start to see the scale of it, and show what the actual outcome could be: arbitrary code execution. And then say, "well, we need to sign off on this as a risk to the business." I think that’s actually kind of helpful, because unfortunately some people still have, I’d say, tunnel vision when it comes to risks. And I think this goes back to a talk we saw at the Leeds cyber security conference: the severity listed on some of these lists isn’t necessarily commensurate with how it’s being used in the wild or how easy it is to exploit. In the case of Log4J, I believe it was found in Minecraft, in the chat function, because somebody put a string in and went, "oh, hang on a second, I managed to hack it," and then it all kind of blew up. So it was incredibly exploitable and incredibly prolific, and it had been used in the wild. People don’t often score stuff, I’d say, in the most helpful way. But when you’ve got a really convincing argument and you say, "we need to sign off on this," I think it’s actually a positive thing.

Now, I do think there should be an element of psychological safety in bringing these to the table. And you don’t always want to invoke the contract; when you have to, you’ve kind of lost. But in some cases, for very important risks, I think it’s actually a healthy thing to keep a register of them.

Lianne : That was such a good talk, wasn’t it?

What was really interesting about that is, yeah, let’s just take Log4J out of the mix there, because that was a weird one, almost a little bit of a cyber security gift, I’ll be honest with you, just in terms of getting why cyber security is important on the radar. Unfortunately, that tends to be the way; a nice, big breach tends to do a lot of cyber security teams a lot of good rather than bad. Because, I don’t know if you’ve noticed, every time Uber has a breach, there are loads of jobs going in their cyber security team the next day. It’s good for business, keeps us going.

But in the talk that all three of us saw at the conference, it was that weird thing where, once you start actually breaking down how relevant a risk is to your organisation, it turns out that if you were to go just by the CVE scores, you’d probably be misplaced. Something that’s industry-wide considered a 9.8, which is, like, one of the highest you can possibly get, it goes up to ten, might, once you do the work of understanding how relevant it is to the organisation, how exploitable it is in your environment, whether there’s a fix, all those kind of things that Jeff was talking about just then, turn out to actually not be a big deal.

But then, when we’ve done a scan in your organisation, you have a load of things that, on the CVE score, would be very low scoring but are highly exploitable. And because of the way we organize our security work, particularly from an appsec perspective, we go, "oh, we’re only going to focus on the criticals, highs, and maybe some mediums." But actually, it’s the true mediums and the true lows that are going to really catch you out if you’re not careful. Which is why I always recommend that teams have a cyber security spring clean a couple of times a year and actually start looking at those mediums and lows, and don’t just be blindsided in your appsec by the criticals and highs. Yes, they are important, but if I was a cyber criminal, I would leverage those lows and mediums, because I know that your application security teams are all looking for the biggest fish out there, fixing those highs that have just become really public and had lots of Log4J articles written about them.

I’m actually thinking, "well, I’m going to go through a means that you’re not even thinking about at the moment." And that talk, I think, was absolutely spellbinding. I think it was from Precursor, just to give them a shout: Precursor Security.
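The re-scoring Lianne describes can be sketched as a toy function. The weights below are entirely invented for illustration (this is not CVSS environmental scoring or EPSS, though those are the real-world analogues); the point is only that context such as active exploitation, reachability in your estate, and fix availability can reorder raw severity scores.

```python
def priority(cvss: float, exploited_in_wild: bool, reachable: bool, fix_available: bool) -> float:
    """Toy contextual priority score: raw CVSS severity weighted by environment."""
    score = cvss
    score *= 1.5 if exploited_in_wild else 1.0   # active exploitation raises urgency
    score *= 1.0 if reachable else 0.2           # unreachable in our environment: big discount
    score += 1.0 if fix_available else 0.0       # cheap wins float upward
    return round(score, 1)

# A "critical" nobody can reach can rank below a "medium" that is being
# actively exploited and has a one-line fix:
print(priority(9.8, exploited_in_wild=False, reachable=False, fix_available=False))  # 2.0
print(priority(5.4, exploited_in_wild=True, reachable=True, fix_available=True))     # 9.1
```

A real triage process would use the CVSS environmental metric group or EPSS exploit-probability data rather than hand-picked multipliers, but the reordering effect is the same.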

Jeff : So, we’ve mentioned it twice now, I think on this episode, or maybe it was a different one. Actually, scratch that.

Lianne : We mention it a lot because we really enjoyed it. I actually got them to come into my organization and redo that talk again. That’s how much I enjoyed it.

Jeff : So I think we’re all waiting for the Log4Net breach, though, just so you can feel included.

Lianne : But Jamie, I think for your listeners, obviously, this might be of interest. So Jeff does a, well, we both do together, we do a lot of work around product security. And I don’t know, Jeff, if it’s worth you talking about your thoughts around product security.

Jeff : Yeah, so I’m not only CTO, I spend quite a lot of my time being chief product and technology officer, and at xDesign we take clients through that product journey. And product security, I think, is a much under-loved and under-referenced part of security, because both Lianne and I looked it up, and we found that in product books there’s barely a mention of security, and product doesn’t really feature in security books. And then throw Agile into the mix: there are one or two reasonable books on Agile security, but if you’re talking about Agile product delivery and baking security into that, there’s almost nothing there. So what we talk about and advocate for, and what we do, both Lianne and I, but also at xDesign, is product-centric security workshops. They’re not traditional threat modelling. They’re not the kind of thing where people look at network boundaries and firewalls and ports and things like that.

It’s more looking from the outside in with a variety of techniques, including, as we mentioned earlier, persona non grata. We look at the misuse cases, the abuser stories, which are the dark side of user stories. We look at attack trees, we look at what assets you have, do some asset mapping. And it’s all really non-technical, but it all looks at the background of your product and starts to think about the countermeasures at the most fundamental level. Like: we’re a banking application, what might people try to do? And then one of the countermeasures, I’ll mention it a few times, would be, "oh, maybe we need to consider multi-factor authentication." Well, I guess nowadays it’s mandatory.

But this process starts to make clear in your head where your threats are, and where your countermeasures could go, before you even start talking about which cloud platform you’re going to use or which ports you’re going to open or anything like that. And it’s a really engaging process. And then, of course, you can do your more traditional threat modelling processes, whichever flavour, whichever acronym you like to follow. You can do the SAST and DAST tooling for scanning, and other technical measures as well. But actually starting fundamentally, I think, is really powerful, and that’s why we like to talk about it. I mean, we’re presenting that, I think, as the opening keynote for the International JavaScript Conference in April as well. It’s an interesting topic, but it’s one where I think we’re only just starting to get some traction in the industry.
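For developers who haven’t met the attack trees Jeff mentions: they’re goal trees where an OR node is satisfied by any one child path and an AND node needs every child step. The scenario and effort numbers below are invented purely for illustration; a minimal sketch:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    label: str
    gate: str = "OR"                  # "OR": any child path works; "AND": all steps needed
    cost: float = 0.0                 # attacker effort, only meaningful on leaves
    children: list = field(default_factory=list)

def cheapest_attack(n: Node) -> float:
    """Minimum attacker effort needed to achieve this node's goal."""
    if not n.children:
        return n.cost
    child_costs = [cheapest_attack(c) for c in n.children]
    return min(child_costs) if n.gate == "OR" else sum(child_costs)

tree = Node("Take over customer account", "OR", children=[
    Node("Phish credentials", "AND", children=[
        Node("Send convincing email", cost=1.0),
        Node("Bypass MFA", cost=8.0),       # MFA pushes this whole branch's cost up
    ]),
    Node("Credential stuffing", cost=2.0),  # reused passwords, nothing in the way
])

print(cheapest_attack(tree))  # 2.0
```

Adding MFA doesn’t eliminate the phishing branch, it just makes it expensive; the tree makes it obvious that credential stuffing is now the cheapest route, which is where the next countermeasure should go.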

Jamie : Yeah. From my perspective as a dev, I would love to be able to, at the beginning of a development cycle and throughout it, actually pause work and go, "right, okay, where are we? What are we doing? What are the implications of the stuff that we’re putting out there?" Because I feel like it’s not been explicitly mentioned in this conversation, but it should be: security is a moving target. It’s not something that can just be fixed once and then don’t worry about it, it’s done. We were talking about Log4J for a little bit, and then obviously, Jeff, you were saying we’re waiting with bated breath for the Log4Net one, because it will likely happen. That’s not to say that Log4Net has bad code in it. It’s just that someone somewhere will find an exploit, it will be published somewhere, and then everyone who’s ever used Log4Net will have issues with the different versions of the software that use it. And maybe that’s the question, and please do point out if my assumptions here are wrong: cyber security, app security, platform security, I don’t want to say x security because people might think that I’m talking about Twitter, but you know what I mean, "insert name of thing here" security, it’s a moving target. Right?

Jeff : I remember many moons ago saying, "don’t worry, you can open an email. There can be no viruses in emails." Email clients got more advanced, and it got to the point where, for a while, there were vulnerabilities where just opening an email could infect your PC. That was a long time ago, but the point still stands. Any advice that we give today may be outdated within two or three days. Not just zero days being found, or other existing defects, but also when people update things, they rather unfortunately inject new vulnerabilities.

And also, things that were secure many moons ago are no longer secure, just because computing has moved on. So, absolutely, it’s a very movable feast. But one thing I can’t stress enough is that it’s probably easier to build in good security controls, good security practices, and all of the scanning and other pieces we’ve been talking about up front, than it is to try and slap them on later, when you’re on the back foot because something bad has happened.

Jamie : Yeah, absolutely. Absolutely.

So I guess my question then, as we’re coming towards the end of our allotted time together: what are some of the things that devs... if a developer is listening to this, first of all, go and subscribe to Compromising Positions. I’ve been listening since the start, because you both announced it at Leeds cyber security festival and said, "hey, the first episode drops in a couple of days," and I was like, "right, I’ve got to go subscribe now." So a genuine recommendation from me: go subscribe to that, because there’ll be things you learn that you may not really think about as a dev, especially if you’re in the weeds.

But is there anything you could both suggest to a developer who’s listening to this, going, "okay, right, let’s take app security, cyber security, platform security, security at work, security in my whole life. How do I make that better?" What’s the one thing you could suggest that folks do?

Lianne : Oh, one thing.

Jamie : I know there’ll be millions, but what’s the bare minimum, I suppose?

Lianne : I suppose I would say: if you are struggling on either side, either you’re working in a scrum team and those security tickets just stay on the backlog, or you know that you’re coming to your security team too late in the process and maybe they’re causing you blockers because you’ve said, "hey, I need help with this," and they haven’t got the capacity any more, and your project gets delayed, then talk to your security team. Talk to them about how you would like them to treat you. So, for example, when I get a pen test done for you: "how do you want that report? Do you want it in just a PDF, or do you want me to throw some Jira tickets on there? How do you want that information communicated? Do you want us to say, actually, we think this stuff’s important, this is where we want you to focus? Or do you want the autonomy to make those decisions yourself?"

Likewise, show some curiosity in what we do. As soon as anyone shows any curiosity about cyber security, I latch onto them with both my hands and my toes, and I say, "yep, let me tell you everything there is to know about what I know about cyber security," just so that I can get some help. It’s a two-way street, this. It’s gonna take a collective effort to even get anywhere close to this horrific numbers game, because you’ve got to think that the cyber criminals have totally different motivations to us. They are really well funded. They make millions doing what they do. It’s actually a much more lucrative career to be a cyber criminal than it is to be a cyber security enforcer. And so we really need all the help we can get. And it starts with a conversation.

Jeff : I think for mine, it’d be: show an interest and learn about it. It’s everybody’s responsibility, really, like safety. I think ISC2 are still doing their one million cyber security certs initiative, where you can get free basic cyber security cert training and the actual certificate itself. Go and take an interest, learn about it, up your game a little bit. Because I genuinely believe this: if you up your game in your professional life by learning more about cyber security, you’ll also learn more about how to keep yourself safe in your own personal online interactions.

The more you know about it, the more you understand about this area, which is just getting bigger and bigger every year. It’s showing no signs of slowing. By 2030, the funds lost to cybercrime will be more than the GDP of the USA, which is the largest economy in the world. So just take an interest, because I think it’s essential. Cyber security could be an existential threat if we don’t all pull together and learn about it.

Jamie : I like it. I like it.

Okay. Well, yeah, I think there’s a lot to be said. I’m gonna do the horrid thing of reducing everything you’ve both said over the last hour down to, like, a single sentence, and I don’t mean that to sound horrible when I say it, but it feels like it’s all about people, right? It’s about the people side of it. Yes, we’re all dealing with technology, but at the end of the day, the three of us are all people. We’re all discussing this, right? And if we were working together in a team to do something, we’re people trying to achieve a goal, right? And I feel like that human connection is being able to talk to someone and say, "hey, what can I do better? Is there something I can do better? Can I go do some training?" Like you said, Lianne, "how do I produce this data for you?" Like you said, Jeff, maybe if I go do this training, I’ll understand it a little bit more.

You know, it all makes perfect sense, and it loops all the way back around to Compromising Positions. Right.

Jeff : And you get to use your quote, your closing quote, for the talk there, Lianne.

Lianne : Oh, do you want me to? So my closing quote, and I use this quite a lot, is: when you follow the cables, you must remember that behind each cable is a person, a hacker, and a person trying to protect against that, and we should never lose sight of the human element.

Jamie : Nice.

So how can folks get in touch then? Is it just head over to the website for the podcast? Like, if someone’s listening in and going, "you know, Lianne and Jeff, I want to ask them a question. I’ve got a burning question." Is it go to the website and there’s a contact form, or are you okay with people finding you on LinkedIn?

Lianne : Yeah, we do have a contact form. You’re very welcome to reach out on there. But I think the best way to reach both Jeff and me is definitely LinkedIn. We’re pretty much glued to LinkedIn, so that is the fastest way to get ahold of us, for sure, and it would really mean the world to us. We have interviewed quite a few devs, and there are more devs coming on our show. Really interesting ones. The last episode that we had a dev on was a chap who was very security minded, actually, and he did an amazing project with Aberdeen’s outdoor lighting, you know, the streetlights. It was about protecting those from IoT attacks and how he went about that, and it’s a really interesting conversation. So I would say it’s more of a tech podcast than a cyber security podcast, and anyone can get something out of it; it’s not really aimed at just a cyber security audience. It’s aimed at anyone who has an interest in tech and how to make that tech better.

Jeff : And I think if you want to talk about things like product-centric security, product-centric threat modelling, do reach out to us. We can either provide some resources or, you know, organize workshops.

Jamie : Sounds good. Sounds good. I’ll get some links and put them all in the show notes, so nobody has to dive over their dashboards to make a note whilst they’re listening. You know, please drive safely.

But, yeah, it’s been a real pleasure talking to you both today. So thank you very much for being on the show.

Lianne : Thank you for having us. Been great.

Jeff : Yeah, thank you. It was a lot of fun. Thank you very much.

Wrapping Up

Thank you for listening to this episode of The Modern .NET Show with me, Jamie Taylor. I’d like to thank this episode’s guests, Lianne Potter and Jeff Watkins, for graciously sharing their time, expertise, and knowledge.

Be sure to check out the show notes for a bunch of links to some of the stuff that we covered, and a full transcription of the interview. The show notes, as always, can be found at the podcast’s website, and there will be a link directly to them in your podcatcher.

And don’t forget to spread the word: leave a rating or review on your podcatcher of choice (head over to dotnetcore.show/review for ways to do that), reach out via our contact page, or join our Discord server at dotnetcore.show/discord. All of these are linked in the show notes.

But above all, I hope you have a fantastic rest of your day, and I hope that I’ll see you again, next time for more .NET goodness.

I will see you again real soon. See you later folks.

Follow the show

You can find the show on any of these places