The Modern .NET Show

Episode 118 - Empathy, Sympathy, and Compassion For Our Users



Episode Transcription

Hello everyone and welcome to THE .NET Core Podcast. An award-winning podcast where we reach into the core of the .NET technology stack and, with the help of the .NET community, present you with the information that you need in order to grok the many moving parts of one of the biggest cross-platform, multi-application frameworks on the planet.

I am your host, Jamie “GaProgMan” Taylor.

This episode will be slightly different to most of the episodes of this show, as I’d love to talk to you all about why having empathy, sympathy, and compassion for the people who use the software, apps, games, services, and everything else that we build is so vitally important. I honestly think that these are the most important skills that anyone in the technology industry can have - almost anyone can learn to write code, and design patterns can be picked up as we go, but these (so-called) “soft skills” are the real cornerstones of being able to write great quality software.

If you like this kind of episode, let me know via Twitter (DMs are open) or via the contact page on the show’s website. If enough people like these kinds of monologues, then I’ll happily create more.

So let’s sit back, open up a terminal, type in dotnet new podcast and let the show begin.

Before we start, I’d like to talk directly to our neurodivergent friends: there might be some things in this episode that you may have difficulty in applying. This episode is not meant in any way to embarrass, trigger, or upset anyone. I do not have the language or knowledge to help you to better apply these points, and for that I am sorry. But there are experts out there who can.

May I suggest the resources listed in the show notes as a first port of call if you would like to learn more about this episode’s topic (for links, click through from the show notes into the full transcription).

There are also a number of books listed at the end of the show notes, too. So definitely check them out.

A Brief History Lesson

Before I talk a little about empathy, sympathy, and compassion, we need to go back in time to explain why I want to talk about them. As Dr. Janina Ramirez said in her 2022 book “Femina”:

You cannot be what you cannot see

- Dr. Janina Ramirez, Femina: A New History of the Middle Ages

If we want to be more empathetic towards, have greater sympathy for, and show greater compassion to our users then we must know what that means. So let’s take a look at what those things are and what they are not.

I’m going to switch into “Dad mode” and tell you all off (but only a little because, as we’ll soon learn, shame has the power to stop people from achieving anything). I’m going to give you an overview of some things that I’ve noticed over the last couple of years of my own development practice; working with a lot of different companies and to different deadlines means that I’ve met a lot of people in the last eight to ten years. I started my own development business a number of years ago, and I’ve run into a lot of developers who are… well, “rude” would be an understatement. So I want to talk about empathy, sympathy, and compassion, and why they’re required for the work that we do.

But in order to do that we have to go back more than a few years. In fact, we have to go all the way back to the 1960s.

It was the countercultural revolution, and people were discovering electronics, computers, and the start of digital technology as a whole. The Cuban Missile Crisis, the Second World War, and a number of other tense political situations which led up to and into the ’60s had forced the US government’s hand in providing an unprecedented amount of financial support for technological and scientific research in the United States. Throughout the middle of the twentieth century and the decades that followed, more and more money was being spent at US colleges which specialised in research and development into computers, digital technology, and their uses.

One of the colleges which received the most financial support for computer-based research was MIT. At this time, there was a small group of people at MIT called the TMRC (which stands for the Tech Model Railway Club). The people in this group were very interested in the layout and building of model train sets, and they would spend hours building the “best” possible model railway, in a room which the faculty thought was abandoned.

Because the folks in the TMRC used electronic train sets, they often needed to climb under the giant tables which held the model railway to fiddle with the electronics. These particular folks started using the phrase “hacking” to describe how they were putting things together with almost no engineering practices - for instance, junctions would be patched using cables with almost no thought to proper wire management, and circuits were generally unsafe to tinker with.

The TMRC were discovered by MIT security and, rather than being disciplined or kicked out of college, they were offered the chance to work on a recently purchased set of PDP computers. Their job: create a system which would allow anyone at MIT to access the PDP’s resources from across campus on smaller, much cheaper, “dumb” terminals. Fast forward a few years, and they’d done it. But with great power…

The team of engineers whose job it was to look after the PDP system and the network would receive messages from users asking for access to things, requesting password resets, and so on - the standard kind of IT support that we have all likely faced in our lives. These engineers started calling the users who would send these requests “users with a silent l”, aka “losers” - this doesn’t work too well in audio, but I’ve tried my best to convey the idea.

As Scott Hershovitz says in “Nasty, Brutish, and Short”:

If we treat people inhumanely, we should never be surprised when they return the favour.

- Scott Hershovitz, Nasty, Brutish, and Short

As soon as you start talking about people in a negative way, you’ve immediately lost compassion for them; you immediately don’t care about them. And this is something which continues to this day under the name “banter”.


Side note here: I absolutely despise “banter” in a professional environment such as an office. The “jokes” which are told with the defence of “banter” are almost always filled with venom. I’ve seen more than a few companies that have had negative (or even toxic) cultures because of the amount of “jokes” told under the umbrella of “banter”. Some of these have lost amazing people (through people quitting because of the culture) and two that I know of have actually ceased trading because of the “banter”.


If you have that little contempt for your users, then it’s going to come out in your everyday language, not just when you’re exchanging “banter”, being silly, and talking about what you do. You’re also going to react like that when the users report something that’s actually wrong with your software. And if a user knows that you’re going to scream and shout at them when they contact you, they’ll stop contacting you. This leads to either users dropping your software from their toolkit (which is extremely bad if you’re a software vendor, because they’ll stop paying), or user-created hacks to get around issues - sometimes this leads to Shadow IT, which is a big problem in its own right.

And the problem is that a similar view of users, managers, and non-developers persists in modern development. It’s almost as though the TMRC folks set the standard for behaviour for us all to follow to this day; whether we know it or not.

I’m obviously tarring all of us developers and technologists with the same brush by saying that “we all” do it; but it’s worth noting that almost every team of developers that I’ve worked with has had someone with this attitude of contempt for the users of the software that they manufacture.

It’s almost baked into how we are seen by those around us. There is a sort of stereotype for developers - think of the time between the ’60s and the ’90s. We’re talking about a time before geek chic. You know the type of character I’m talking about: the nerdy, antisocial person, wearing a shirt and slacks, with a large number of pens in their shirt pocket and a pocket protector, and square-rimmed glasses broken at the bridge, with tape holding them together.

Well, we’re not that any more - and never were, actually. We’re not antisocial. We don’t need to give that trope any more air time. I say we ditch that trope and just say:

Look. This is us. This is the 21st century developer. We’re not antisocial. In fact, we care about people. And there’s no one stereotype which fits us all; we are a diverse and inclusive group of people, and we want to help you to achieve digital greatness.

Empathy, Sympathy, and Compassion

Now that we’ve had our history lesson, let’s learn about empathy, sympathy, and compassion. But first, some ancient wisdom:

Do not think that

This is all there is.

More and more

Wonderful teachings exist -

The sword is unfathomable

- Yamaoka Tesshu

I’ve always really loved that quote. Yamaoka Tesshu (the author of the quote) was a swordsman from 1800s Japan; right around the time when the samurai were no longer useful as warriors. They became bureaucrats, lawmakers, and other high-level administrators - I’m greatly reducing what actually happened after the Sengoku Jidai (the “civil war” of Japan, which culminated with the battle of Sekigahara), but I digress.

Tesshu uses the phrase “the sword” to refer to swordsmanship, but we can take it to be a metaphor for any kind of knowledge, technique, skill, or learning. Whatever it is that you’re studying, there’s always going to be something new, something that you can learn; whether it’s within your area of expertise, or indeed, outside of it.

As an example of this, I’ve been reading a lot of books lately which have nothing to do with software development, .NET, or even C# - I recommend that you do it, too. And each of these books has brought something new to my development work, or has helped to reinforce ideas that I already sort of had. As mentioned in the official biography of Terry Pratchett:

Even the pulpiest piece of sci-fi or fantasy could provide what [Terry] called ‘an exercise bike for the mind: it might not take you anywhere but it tones the muscles that can.’

- Rob Wilkins, Terry Pratchett: A Life With Footnotes

Some of the things that I’ve recently picked up are:

This reflects something that Bob Martin talks about in Clean Agile: leaving technology and implementation-specific decisions until the last minute.

If you want to learn more about the non-development books I’ve been reading recently, and how they’ve affected my development practice, then check out my CPD logs (there’s a link in the transcription in the show notes).

But the most impactful for me have been the teachings of Dr. Brené Brown. In her book “Dare to Lead - Brave Work, Tough Conversations, Whole Hearts” she quotes Stephen Covey:

Seek first to understand, then to be understood

- Stephen Covey, The 7 Habits of Highly Effective People

We need to understand what it is that the user wants to achieve before we can help them do that. But - taking the task of understanding a little further, and combining it with Simon Sinek’s ideas in Start With Why - we also need to understand why the user wants us to manufacture a tool to help them do what they do.

It’s the reason why the “as a user…” form of user story takes that very form:

As a user, I would like to x so that I can y.

The key part there being “so that I can y.”

And you cannot truly understand why the user wants to do what they want to do, without using empathy, sympathy, and compassion. By understanding why the user wants to do what they want to do, we can get a greater appreciation for how we can implement it.

People don’t buy what you do, they buy why you do it

- Simon Sinek, The Law of Diffusion TED Talk

So what are empathy, sympathy, and compassion?

Dr. Brown has a talk called “Empathy vs Sympathy: Which one are you?” I’ll embed the talk (which is available on YouTube) into the full show notes, but the most important part for us is when she says:

Empathy is four qualities: perspective taking; no judgment; recognizing; and responding with relevant emotions

So why is this important?

Users of the software that we manufacture are going to report bugs, and when they see that the people who can fix those bugs (i.e. us) get angry at them for reporting the bugs, they’ll be less likely to report them in the future. Especially when the standard response they’ll receive is a twist on:

I can’t believe it! My system is perfect, and you’ve broken it! It’s all your fault!

Well, spoiler alert: it’s never the user’s fault. I’ll say that again, because it’s so very important:

it’s never the user’s fault

It’s our fault. We created the thing that they are interacting with. And not only that, we created it to help them achieve a goal or complete some task. So it’s on us to understand why they want to achieve that goal and work from there.

A Real-World Example

Let’s say that you approach a pedestrian crossing (“as a pedestrian, I want to stop road traffic, so that I can cross the street safely”), and press the button to request that traffic is stopped. But instead of stopping the traffic, you get a light electric shock.

Is that your fault, or the fault of the engineers who built the button?

Remember: this is a thought experiment and as such we need to abandon a lot of rules about the world in order to make it work. After all, it’s a truthy statement rather than a truthful one. For this metaphor to work, we have to abandon the idea of a malicious user: someone who might have intentionally messed with the button to give users a shock.

As long as there hasn’t been foul play (i.e. a malicious user messed with the controls before we got to them), then it is the fault of the engineers who built the button for the pedestrian crossing, and not the user who got shocked. And that is an almost perfect metaphor for when someone discovers a bug in your software: it’s certainly not their fault. All they wanted to do was cross the street safely.

But should we look to assign blame in the first place? I say no.

Blame, And Why You Shouldn’t Use It

Sure, the further away from the developer that a bug is found, the more expensive it is to fix (a maxim that has been cited in study after study since the 1970s); but is it something which should have blame attached? And supposing that we should attach blame, who do we blame?

If we are to assign blame, then where does that blame sit? We can’t, and definitely shouldn’t, point a finger at someone and say, “it’s your fault.” And why not? Because there is no fault. All that has happened is that someone has discovered that there is a requirement or use case that we didn’t think about, or that we didn’t think through fully.

Besides, as Dr. Brown says in Daring Greatly:

There is nothing productive about blame, and it often involves shaming someone or just being mean. If blame is a pattern in your culture, then shame needs to be addressed as an issue.

- Dr. Brené Brown, Daring Greatly: How the Courage to be Vulnerable Transforms the Way We Live, Love, Parent, and Lead

And Dr. Brown ought to know a thing or two about blame in the workplace: it’s one of the focuses of her research.

If we adopt the scientific method, then there is no “failure” here but a success: we’ve found a new requirement or use case.

And we should be excited to have a new requirement or use case for the function in our software. This means that people are either using it in ways that we never thought about (meaning that it’s seeing wider user adoption), or that it is missing a key feature. This is wonderful because it means that our software is becoming even more fit for purpose.

For the time being, let’s abandon the idea of Zawinski’s Law, which states that:

Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can

We’re going to ignore this because it’s a humorous take on the direction that all software projects take, and it’s not that helpful for our discussion.

If we approach the user who has reported the bug with anger, all we will do is inflict suffering on that user. And this will create anger which will be reflected back at us. It’s very easy for me to sit here in my recording studio and tell you not to be angry, but we all have to try.

In the late Thich Nhat Hanh’s book Anger, he raised this point:

When we suffer, we always blame the other person for having made us suffer. We do not realise that the anger is, first of all, our business. We are primarily responsible for our anger, but we believe very naively that if we can say something or do something to punish the other person, we will suffer less.

- Thich Nhat Hanh, Anger: Wisdom for Cooling the Flames

Obviously, I’ve mispronounced his name there, and for that I’m very sorry.

Whilst replacing that anger with compassion isn’t easy, we must try to do it. For our sake as well as the sake of our users.

Empathy

Let’s revisit Brené Brown’s description of Empathy from earlier:

perspective taking; no judgment; recognizing; and responding with relevant emotions

We need to try and take the perspective of the user: they are trying to accomplish a task, and the tool which we manufactured in order to help them has somehow become a hindrance. We, by the extension of what we have made, are stopping them from achieving their goal.

We need to not judge: this can be very hard to do, as it can be very tempting to judge the user for having done something which we have deemed to be “obviously” (hopefully you can hear the bunny quotes there) wrong. But what’s obvious to you and me may not be obvious to someone else. We must also remember that we have hundreds, if not thousands, of hours of experience using the tool that we manufactured, whereas our users cannot possibly have that experience as they weren’t involved in its manufacture.

We need to recognise their emotions: they are feeling a certain way because our tool, which we told them would solve all of their problems, has done the exact opposite. They may be feeling frustrated, angry, lost, hopeless, or any other mixture of emotions.

We need to respond with relevant emotions: if we respond with anger, it will make their pain worse. If we respond with relevant emotions and words along the lines of:

I totally understand. I remember when I was putting this part of the system together, and it didn’t seem logical to me either until I realised that…

or

That’s dreadful, and I’m sorry. Let’s see if we can figure this out. There must be something that I’ve misunderstood when making this part of the system. Let’s go back to basics, please tell me what it should have done…

then the person will see that they are not alone in their suffering, and that we can help them.

In short: Empathy is feeling with people.

Remember, we want the people who use our software to feel at ease doing it. No one is going to even vaguely enjoy using our software if there is a Sword of Damocles hanging over their head: if they know that reporting a problem (a bug or otherwise) with the software will lead to them being on the receiving end of pain, they just won’t do it. And that will be worse, because they won’t achieve the goal that they set out to achieve; and that could cause them to get fired.

If they have difficulty using your software, it is never their fault.

As another (slightly more grisly) example, if I designed a chainsaw which only had the goal of cutting down trees and didn’t take into consideration the safety of the people who would use it, there’d be an awful lot of users with missing limbs.


A Request To You All

If you’re enjoying this show, would you mind sharing it with a colleague? Check your podcatcher for a link to the show notes, which include an embedded player and a full transcription, and share that link with them. I’d really appreciate it if you could indeed share the show.

But if you’d like other ways to support it, you’ll find the full list in the show notes. I would love it if you would share the show with a friend or colleague, or leave a rating or review. The other options are completely up to you, and are not required at all to continue enjoying the show.

Anyway, let’s get back to it.


Be aware of what empathy is not, though. Empathy is not pity, sympathy, or knowing but not caring. In order to empathise with someone, you need to care about them on some level. Pity is when you look at someone in trouble and just say:

Well, I hope it gets better

And sympathy is very similar, in that it’s accepting that there is pain and hoping that it will go away eventually, without actually wanting to do anything about it. The pain will go away eventually, but just hoping won’t help the person who is suffering. Sympathy is when you say:

Oh that sucks. I remember when I was in the same situation, it’s horrid. I can’t help you though, so … er… bye

What About Compassion?

Compassion is when you combine empathy or sympathy with a desire to relieve the suffering of the person.

We write user stories, and by their very nature they are from the user’s point of view. One user story template might read:

As a user, I’d like to x, so that I can y

There should never be any technical detail in a user story. Simon Sinek would say that the technical detail is the how, and Bob Martin talks (in Clean Agile) about leaving the decisions of how we implement things until the last possible moment. We don’t want to know about database technologies, interfaces, dependency injection, or front-end frameworks; we also don’t want to know about programming languages. We want to know what the user wants to do, and why they want to do it - the golden nugget of user-centric design.

When writing a user story, we need to get inside the mind of a user, figure out what they want to do and why they want to do it, and then write it down. That way we shift our mindset to:

as the user, I want to put myself in the shoes of the user and do the thing so that I understand why they want to do the thing

That way we understand why the user wants to achieve the thing they are trying to achieve. If I may be hyperbolic for a moment: by understanding the struggle of what the user is trying to achieve, we can understand how to make the tool that we are manufacturing better suited to solving their problems. And by making it better suited to solving their problems, we are designing the software with compassion for the user in mind.

Doctors, nurses, and other medical professionals do this all the time. If you’ve ever been ill enough to need the help of a doctor or nurse, you’ve likely told them what the problem is, and they have done two separate things:

  1. They’ve logically analysed what you have said, and figured out the most likely cause for what troubles you. They’ve then taken their vast knowledge of medicine and treatments, and figured out the most effective way of treating you whilst doing the least harm.
  2. They’ve listened to what you have said, and responded with compassion - a real desire to relieve your suffering. No one would visit a doctor who laughs at them, gets angry at them, or gets exasperated with patients on a regular basis.

And our job as developers, whether we realise it or not - whether we like it or not - is to manufacture the tools that people need in order to do things. And that requires us to understand where the user is coming from, what they want to do, the why behind what they want to do, and the (metaphorical, emotional, or physical) pain they might be feeling.

Another Real-World Example

Back in episode 48 of the show, I spoke with Dylan Beattie about many different things, including the Rockstar programming language. But one of the things which came up was compassion for the user. Here’s a direct quote from him:

There are imbalances here.

I remember years ago, a friend of mine was studying to become a lawyer and, for various arcane historical reasons, the legal qualification process in the UK basically involves jumping through arbitrary hoops for three years in a row.

They have these application windows, where you have to submit applications to various legal firms to do internships and stuff and bits. It’s all run through a centralized agency, and the software that’s used for this was, I mean it was horrific. It was one of the worst pieces of user interface design and reliability I think I’ve ever seen.

And to the people who built it - I’m guessing - it was just one outsourced project of thousands that they did that year.

But to these students, who know that if they miss this window, they are not going to qualify as a lawyer because there is no redress. There’s no negotiation. There’s no appeals process. You have to work out how to get through their [censored] website, to get your details on that, and then you go, "did that even work? I don’t know."

You know, you sort of want to sit those developers down with people doing it and go, "your code is making somebody cry at three o’clock in the morning because your crappy error handling might jeopardize their entire career. You know, how do you feel about that?"

So I think some developers would be like, "Yeah, we didn’t get paid enough to do it properly." I think some developers would be like, "it was just a job". I think some developers would be like, "Oh, my God, we need to fix this. This is not acceptable. We need to do better."

In this example from Dylan there’s a real, honest pain that users of the system felt: they might not qualify for a job that they have worked their entire life towards. In the UK, you need to take four years of undergraduate study at a university, followed by three years of specialised study at law school before you can start the application process that Dylan is referring to.

That’s seven years on top of primary, secondary, and further education. According to the Law Society, most solicitors qualify around the age of 29 or 30. That’s a lot of someone’s life that you are going to delay if you don’t care about the people who are using the tools that you create.

This is the potential pain that we’re putting out there when we think:

it doesn’t matter. It’s just a null reference exception which leads to a stack overflow. It’ll work, just don’t enter a minus three into this box.

But to someone out there it is a big deal. Because they won’t know that you aren’t supposed to enter a minus three. Or they might enter a minus three because they have a legitimate reason to do so. And when the software breaks, they’ll come to you and you’ll just be angry at them. Unless you try to empathise, sympathise, and show compassion - and understand why they entered a minus three.
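As a concrete (and entirely hypothetical) sketch of the alternative, here’s what catching that “minus three” at the boundary might look like in C#. The class name and messages below are my own invention, not anything from the show; the point is simply that the software explains its limitation and invites the user to tell us why they needed the value, rather than crashing or blaming them:

```csharp
public static class QuantityInput
{
    // Validate user input at the boundary and explain any limitation,
    // instead of throwing an exception from deep inside the system.
    public static (bool Ok, string Message) Validate(string input)
    {
        if (!int.TryParse(input, out var value))
        {
            return (false, $"'{input}' doesn't look like a whole number. Could you check it and try again?");
        }

        if (value < 0)
        {
            // Maybe the user has a legitimate reason for a negative value;
            // invite them to tell us, rather than blaming them.
            return (false, "Negative values aren't supported yet. If you need one, please let us know why so that we can support it.");
        }

        return (true, $"Got it: {value}.");
    }
}
```

Whether the message ends up in a UI label or an API response, the shape is the same: assume the user had a reason, tell them what happened, and make the next step obvious.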

And that’s even if the user can get a message to you. The system that Dylan was talking about has no way for a legitimate user to send a message through asking for help. There is just a form with a large number of inputs, where users will take a long time typing them in, then a button for submitting. If the page times out, if there are illegal characters, or if the user pasted from a word processor, they simply get a generic error page and are sent back to the beginning - with all of their hard work removed, so they have to start all over again.

Being Open and Frank

There were lots of other contributing factors, and certainly a lot of other key players, in creating the trope of the developer who doesn’t care for the people who use their software, treats them like dirt, and approaches them with a distinct lack of empathy, sympathy or compassion. But, as I said earlier, we’re not that group of people any more.

We’re more than twenty years into a whole new century and we, as an industry, do fantastically well at accepting people into our communities. We’re empathetic, sympathetic, and compassionate towards those in our community or those who want to join. So why can’t we extend that outside of the “us” of our communities? Granted, we have a long way to go, but we’re already way better than most other industries.

But how do we get there?

In my opinion, every time something breaks or every time something goes wrong for a user, we need to have an open, frank, and honest conversation about the software that WE built. We build things as teams, and we all need to discuss what we’ve built. Do we discuss who built what? No. Do we discuss who is to blame? No. We need to talk about the thing that we built and the impact it has had on a person.

In every company there will be some blame (even in those which claim not to have a blame culture); it’s a sad fact of how companies and teams are run:

Jamie was the one who pushed to production, so it’s his fault.

If we spend time navel-gazing over whose fault it was, we’ll never get to the important bit: the fact that there are users out there who are in pain. They are the ones who matter at a time like this. In fact, Will Smith (of all people) has been quoted in Adam Grant’s Think Again as saying:

It doesn’t matter whose fault it is that something is broken if it’s your responsibility to fix it.

- Adam Grant (quoting Will Smith), Think Again

Shame and Blame

It has become the in-vogue thing for companies and technology teams to claim that they don’t have a blame culture. Whilst I have not worked with every company on the planet, I’ve worked with enough to know that most of those companies who claim to not have a blame culture actually use blame and shame on a very regular basis.

Dr. Brené Brown is one of the leading researchers on the subject of shame and blame. Reading any of her books will be a masterclass in how shame and blame will destroy a team’s culture - be it a team at a workplace, a sports team, a classroom full of students, volunteers, whatever.

I’ll repeat one of my favourite quotes from Dr. Brown’s work, from her book “Daring Greatly”:

There is nothing productive about blame, and it often involves shaming someone or just being mean. If blame is a pattern in your culture, then shame needs to be addressed as an issue.

- Dr. Brené Brown, Daring Greatly: How the Courage to be Vulnerable Transforms the Way We Live, Love, Parent, and Lead

When software is broken or a user is facing an issue and is in pain, we should only care about the most productive way to fix that problem. Blaming and shaming will get us nowhere near fixing the problem, let alone doing it in a productive manner.

So what do we need to do in order to be productive in fixing the problem for the user?

Being User-Centric

The only way to fix the problems that a user faces is to be user-centric. Every thought, decision, and design must be made in a way which is focussed on why the person wants to achieve their goal. They want to use a tool to do something: what is that thing, and why do they want to do it? Only then can we be productive at fixing the problem for the user.

So let’s get past blame, shame, and navel-gazing, and focus on what the user is saying when they report a bug or error:

This has caused me pain. This has caused me an issue. This has stopped working. It’s not helping me to achieve what I need to do.

Because that is what matters.

We need to take a step back and think about why the code allowed the user to end up in that situation. You need to ask questions like:

Is that a valid input that we thought was invalid? Does the UI have accessibility concerns? Have we built this to be colourblind friendly? What about people who use screen readers? Does the workflow make sense to someone who hasn’t been intimately involved in the software’s design?
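Some of these questions can even be checked in code rather than argued about. As a hedged sketch (not something from the episode itself), here is how a build step might verify that a foreground/background colour pair meets the WCAG 2.1 contrast-ratio guideline of 4.5:1 for normal body text; the colour values below are made-up examples:

```python
def relative_luminance(rgb):
    """WCAG 2.1 relative luminance of an sRGB colour given as 0-255 ints."""
    def channel(c):
        c = c / 255
        # Linearise the gamma-encoded channel, per the WCAG 2.1 definition.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colours, from 1:1 (identical) to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background: the maximum ratio of 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 2))  # 21.0
# Mid-grey on white fails the 4.5:1 threshold for body text.
print(contrast_ratio((128, 128, 128), (255, 255, 255)) >= 4.5)  # False
```

A check like this won’t answer every accessibility question - nothing automated will - but it turns “have we built this to be colourblind friendly?” from an opinion into something a pipeline can fail on.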

You need to talk about the execution of the software, too. Is it running in a way that allows the user to interact with it? We all know the pain of the spinning mouse cursor followed by the active window greying out, followed by our operating system telling us that the application has stopped responding. We know what that means, but the people who use your software may not.

According to Colourblind Awareness, around 4.5% of the UK’s population are colourblind, with an estimated 300 million people worldwide - that’s the same number of people as the entire population of the United States of America (according to the latest census at the time of writing this episode). That’s a lot of people who you are leaving unable to use your software if you don’t design with them in mind. And what about blind people? According to the Royal National Institute of Blind People, there are 340,000 people in the UK who are registered blind, with an estimated 40 million blind people worldwide.

(sources for those statistics can be found in the show notes, by the way)

And what of our neurodivergent friends? They may have accessibility concerns too.

All of these people could be your users, and they could be struggling to use your software right now. Designing with empathy, sympathy, and compassion in mind would help them out greatly.

You might think that you don’t have to design with these users in mind. You might think:

Our software is about selling books. So we don’t need to worry about blind people.

And if the decision makers can back that decision up with a clear business rule and justifications, and are ready to face any backlash over it, then fair play to them. The problem is that there are accessibility laws and policies for software, and there may be laws in your own jurisdiction which state that you have to design software with accessibility in mind.

Whatever you decide, or most likely whatever is decided for you, it is never the user’s fault. If they push a button and it drops the production database, it’s not their fault for pushing it. Even if the button is clearly labelled “drop database. Don’t click it, Jamie, because it will break the world,” someone is going to click that button simply because it’s there. And that’s not their fault - unless they really are malicious users.

But if it’s an average person who’s using the software, and they click that “drop database” button that you’ve served to them, and you’ve allowed them to drop the database without challenging them - without prompting for authorization or anything like that - then that’s on you, not on the user.
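One way to put that challenge into practice is to make the destructive path refuse to run until the user has been both authorized and asked to state their intent explicitly. This is a minimal sketch, with made-up names (`drop_database`, the `admin` role, and the re-typed database name are all assumptions for illustration), of the kind of guard being argued for:

```python
class RefusedError(Exception):
    """Raised when a destructive action is blocked by a guard."""

def drop_database(user_roles, typed_confirmation, db_name="production"):
    """Drop a database only after an authorization check and an explicit,
    typed confirmation - so a stray click alone can never destroy data."""
    # Challenge 1: authorization - is this user even allowed to do this?
    if "admin" not in user_roles:
        raise RefusedError("authorization required: admin role missing")
    # Challenge 2: intent - the user must re-type the database name,
    # which a curious click or a cat across the desk won't produce.
    if typed_confirmation != db_name:
        raise RefusedError(f"confirmation mismatch: expected {db_name!r}")
    return f"database {db_name!r} dropped"

# A plain click with no typed confirmation is refused, not executed.
try:
    drop_database(user_roles={"admin"}, typed_confirmation="")
except RefusedError as error:
    print(error)  # confirmation mismatch: expected 'production'
```

The design choice here is that the safe outcome is the default: the code refuses unless every challenge passes, rather than executing unless someone thought to object.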

I’ll say it one more time for clarity:

It is never the user’s fault.

Next Steps

How do we move forward from here? Well, the first thing is to start thinking of the effects that our decisions have. Steve Worthy is a good friend of mine, and he recently said the following in an episode of his podcast Retail Leadership with Steve Worthy:

The effects of your (good) decisions can change the world around you.

and he’s spot on. I added the “good” into the quote, but the point still stands: whatever we do, whatever decisions we make, they affect the world around us. So the first thing we need to do is not see error and bug reports as personal attacks, because they are not.

When we focus on what’s being said:

I am in pain, and this thing caused it. I can’t do the thing that I need to do with this thing, please fix it.

rather than any perceived attack that we feel because of it, we can start to understand the why a lot more than the what.

We’re not going to get it right every time, and that’s not the point. We can’t be perfect (regardless of what Aristotle taught with Virtue Ethics). We just need to be a little better tomorrow than we were today. In the words of Michael Schur:

To demand perfection, or to hold people to impossible standards, is to deny the simple and beautiful reality that nobody is perfect.

- Michael Schur, How to Be Perfect

So I’m not saying that we should be perfect. But I think that we should attempt to apply an Overton Window to our attempts at being better.

An Overton Window is a concept from political science: it describes the range of ideas on a topic that mainstream society is willing to accept at a given time. Over time, the Overton Window moves with popular culture, and a topic which was initially seen as controversial or difficult to reason about becomes accepted and part of daily life.

The examples that I could give on Overton Windows fall out of the scope of this episode, so I’ll leave it as an exercise for the listener to go check them out.

But just being better tomorrow than you were today is enough… until the Overton Window moves on, that is.

We should be the change that we want to see. If we want to see more user-centric discussions about our software - more thoughts about the why rather than the how or the what - then we should start having them. When we are gathering requirements, we should be asking what the user’s why is; and if the person we’re asking doesn’t know, then they should be asking too - or we should find the person who does know.

We need to be more careful with words, too. I was at a talk given by the comedian Robin Ince in the summer of 2022, and he said:

Words are like shrapnel, they cause a lot of hurt in many tiny ways.

- Robin Ince

The words and attitudes that we choose to use can have dire consequences for those who hear, read, or consume them. So we need to think about the words we’re going to use before we say them, type them, or otherwise communicate them. Because you really don’t know whether the words you’re about to use will hurt someone.

Never blame the user. If you served them a page that has a big red “destroy the universe” button, they click it, and it destroys the universe, then it’s not their fault at all. Some users might click the button out of curiosity, and some might click it accidentally. Others still might click the button because their cat ran across their desk and it knocked the mouse.

It is never the user’s fault.

But then maybe I’m wrong. Neuroscientist, musician, and creativity researcher Dr. Charles Limb had this to say about telling someone that they have their facts wrong:

[I tell them] humbly. Because that assumes that you think you have yours right.

Could I be wrong? Sure. But will it hurt anyone to apply these thoughts and ideas? I’m pretty sure it won’t cause too much harm for us all to be more empathetic, sympathetic, and compassionate to those people who use our software. And, of course, we must demand it from the vendors of the software we use, too.

Resources

I’ve used a lot of resources to build up this episode of the show, including a number of books, videos, podcast episodes, and discussions with people. Here are some of the resources that I would recommend people take a look at when they’re done listening to (or reading) this episode:

I’ve also had many long conversations with a tonne of very smart people, all of which led to the creation of this episode. Some of those amazing people are:

Wrapping Up

That was my episode on Empathy, Sympathy, and Compassion. Be sure to check out the show notes for a bunch of links to some of the stuff that we covered, and a full transcription of the episode. The show notes, as always, can be found at dotnetcore.show, and there will be a link directly to them in your podcatcher.

And don’t forget to spread the word: leave a rating or review on your podcatcher of choice - head over to dotnetcore.show/review for ways to do that - and reach out via our contact page. If a friend or colleague of yours has sent you this episode, head on over to dotnetcore.show/follow for ways to keep up with the show. And come back next time for more .NET goodness.

I will see you again real soon. See you later folks.

Follow the show

You can find the show on any of these places