Computer says no

How coders are especially susceptible to a lack of empathy when talking to other humans

Emil Ong
6 min read · Apr 30, 2016

I’ve been coding since I was very young, and one of my earliest memories is the utter frustration of a compiler spewing syntax errors at me because I didn’t yet understand how to form a program. Once I got further along and learned to appease the compiler, I discovered pedantic flags, -Wall, and linters to enforce even stricter standards. I’ve spent hours of my life cleaning up code to make the errors and warnings go away.

What I found was that both I and many of my fellow programmers at the time became incredibly pedantic and annoying individuals. We corrected each other as if we were taking on the persona of a strict compiler, whether we were discussing code or anything else. We had been treated badly by the computer, learned to act like it toward each other, and came to accept that behavior as normal.

We were not popular with non-coders.

Eventually, I became interested in other pursuits and in people with different interests, and I found this culture so toxic that I didn’t want to pursue computer science or programming ever again. For better or worse, I went back and have now been programming for over 20 years. What I’ve learned in that time is that we can be better humans to one another and still get our syntax right.

Why coders?

If you’re familiar with the Little Britain sketch I referenced in the title, usually identified by the catchphrase “Computer says no,” you know that a lot of people who depend on computers for their work are susceptible to treating other humans the way the computer treats them. This problem certainly isn’t limited to people who write code, but that’s what I do, so I know at least that we have it.

A friend of mine who works in a finance department recently asked me what it’s like to code all day long. She works on a computer all day as well but is incredibly kind and generous, by the way. I thought about it for a moment and said, “It’s like working with a powerful genie who can grant every wish, but takes everything way too literally.” She thought that sounded awful. Maybe she’s right, but that power is so useful that we do it anyway.

Making it better

There are at least a couple of strategies I can think of to address this problem: fixing ourselves and fixing the computers.

Who do you think you’re talking to?

By working on ourselves, I simply mean being mindful of when you’re communicating with other humans versus computers. Notice when your mindset is set to 'splaining and think about how this makes other people feel. Notice if you do this more after an intense coding session or sprint. Find ways to bring the right mindset to the right context based on what you discover about yourself.

I said this is simple, but simple is not easy. If you’ve never practiced mindfulness, try this exercise: mentally note every time you stand up or sit down in a day. It’s simple to state and to do once or twice right after you think of it, but remembering and sustaining that mindfulness is a lifelong practice.

One area in which I find myself being especially annoying and fussy is when someone is trying to communicate an idea that may not be completely fleshed out. As someone who writes code, if I’m being asked to translate that idea into code, my instinct is to pay forward the pedantry of the computer to the person pitching the idea.

Instead of perpetuating that unpleasantness, a practice called active listening may be useful. In particular, practice stating what you heard back to the speaker without judgement. Often, simply hearing the idea reflected back will be enough to get the speaker to flesh it out. Together you can work to put definition to the ambiguity. (There’s more to active listening than what I’ve mentioned here; this description is closer to reflective listening. Brené Brown’s work in Daring Greatly around repeating “the story I’m telling myself” has also been helpful to me.)

Technology: The cause of and solution to every problem

The other approach we can take to fixing this problem is to make computers nicer to us when we don’t give them exactly what they’re looking for. Of course this topic is incredibly broad, so I’ll just mention a few tactics specific to the practice of coding.

The first thing we can do is automate the fixes for anything the compiler (or your code reviewers) can nitpick about. In the JavaScript world, codemods are emerging as a way to automate fixing lint violations and enforcing other stylistic patterns. The Ruby world has rubocop, and I’m sure most other mature languages have similar tools. Put these in your automated toolchain, stop arguing about whitespace, and start discussing architecture and other topics relevant to humans. In other words, focus your discussions with other developers on things best solved by humans.
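To make that concrete, here’s a minimal sketch of a jscodeshift codemod (one of the tools the JavaScript codemod approach is built on) that mechanically rewrites var declarations to let so nobody has to leave that nitpick in a review. The file name, the rule, and the run command are only illustrations, not a prescription for any particular codebase.

    // transform.js: a toy jscodeshift codemod that rewrites `var` to `let`
    // so the machine handles the nitpick instead of a human reviewer.
    module.exports = function transformer(fileInfo, api) {
      const j = api.jscodeshift;

      return j(fileInfo.source)
        .find(j.VariableDeclaration)                   // every var/let/const statement
        .filter(path => path.node.kind === 'var')      // keep only the `var` ones
        .forEach(path => { path.node.kind = 'let'; })  // mechanical fix, no argument needed
        .toSource();
    };

    // Run it across a codebase with something like:
    //   jscodeshift -t transform.js src/

Wired into CI or a pre-commit step, a rule like this gets applied consistently, and the review thread can go back to design questions.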

Now for a controversial, anecdotal opinion.

As you may have gleaned from my introduction above, I started with compiled (and strongly-typed, without type inference) languages, and that led me down a painful path. Nowadays I see new developers picking up JavaScript, Ruby, Python, and other languages which don’t (necessarily) have these features, and they’re much happier folks than I was starting out. They’re excited about the possibilities of computing and able to indulge their curiosity much more easily.

Last year, I had the distinct displeasure of working in Java on Android, where I spent much more time remembering how generics worked and fighting with finicky followers of Joshua Bloch (for whom I have the greatest respect and admiration; I’m only talking about those who follow anyone blindly) than learning or discovering how to actually create a quality product, much less enjoying my time at work.

There’s something going on in these communities that makes some more pleasant to work in than others. Right now, I’m much more inclined to tell new developers to learn interpreted, weakly-typed languages than compiled, strongly-typed ones. It may be that the latter make putting up with an annoyingly pedantic compiler a barrier to entry. If anyone has a better statement or a more precise cause that explains why working in certain types of languages or platforms feels better, please let me know. This is admittedly only a hypothesis.

Summary

The UX of coding can easily lead us to be worse humans to each other. We transfer the pedantry of the machine onto each other, making each other feel like unwanted imposters who can’t hack it. My ideas for fixing this are:

  • Being mindful of the effect computers have on our treatment of one another
  • Practicing listening without judgement
  • Making the computers fix the things they care about while we fix the ones humans care about
  • Picking communities, languages, and platforms that are nice to us so we’re more likely to be nice to each other

These experiences may just be me. I might be a jerk just looking for an excuse in the computer for my jerkiness, but if it’s not just me, I hope this helps you find some ways to be better to your fellow humans. 😃
