Things I don’t do that make me a bad developer as a result.

Thomas Gebert
4 min read · Mar 4, 2018


I’ve been doing this software engineering thing for some time now, and while the narcissistic part of me feels I am the most handsome, smartest, and greatest developer to ever exist, I occasionally have to succumb to a bit of modesty.

I was looking at some old Java code I wrote five years ago, and while I have (thankfully) gotten slightly better at structuring and optimizing my code (partly thanks to the fact that I don’t use Java anymore), I also realized that there are skill-sets I seem to have forgotten, and losing them might make me a worse developer overall.

It’s easy to tell yourself that you have gotten universally better at your profession, and it’s possible that you have, but I like to entertain the idea that there might be some forgotten knowledge waiting to be rediscovered. And so, let us begin this journey into why I’ve become a shitty engineer.

Debuggers

I haven’t used a debugger in several years now; not since that aforementioned Java project, actually.

It’s not like I have any real problem with debuggers, and I do believe that they serve a purpose, especially in interpreted languages.

Part of this distaste stems from the fact that I refused to keep using IDEs after dealing with Eclipse crashing six times in one day, which solidified me as a die-hard (Neo)Vim user.

But of course, there are command-line debuggers, even for weird languages like Haskell, and there are almost certainly reasons for using them.

But instead, I sit in my bedroom, spitting out a bunch of print statements and occasionally running some stuff in a REPL and calling it a day, feeling an unearned bit of superiority.
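
For illustration, here’s a rough sketch of what that print-statement style looks like in Haskell, using Debug.Trace to sprinkle output into otherwise pure code; the function being “debugged” is made up for the example.

```haskell
import Debug.Trace (trace, traceShow)

-- Hypothetical function under investigation: sum of squares of a list.
-- trace/traceShow print their first argument and return the second,
-- which is the classic "printf debugging" trick for pure Haskell code.
sumSquares :: [Int] -> Int
sumSquares xs = trace ("sumSquares called with " ++ show xs) $
  foldr (\x acc -> traceShow (x, acc) (x * x + acc)) 0 xs

main :: IO ()
main = print (sumSquares [1, 2, 3])
```

The same sort of poking-around also happens interactively in GHCi, which is usually where the “running some stuff in a REPL” part comes in.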

Multiple Monitors

I used to work at a job that put three monitors on my desk. I was using a MacBook for that job, so that gave me a whopping four monitors to play with and be the most optimal person I could be.

I enjoyed it, but ever since that job, I have purposefully only ever had one monitor at a time.

I’m not sure of the reason for this, but for the last couple of years I haven’t been able to work with more than a single monitor; I find multiple screens really distracting, and instead opt for a shitton of macOS workspaces.

This is probably bad; there are plenty of reasons to want multiple things on screen at once, if for no other reason than to type whilst looking at a Stack Overflow answer.

And it’s not like I have anyone to blame but myself; my employer would happily buy me more monitors if I asked. I’m just annoying and have to feel special, I guess.

Languages that people actually give a fuck about

For the last year and a half, I’ve been doing F# for a living. Before that, I spent about eight months writing servers in Erlang, and before that, I was writing a video-archival system in Haskell. My personal projects are generally done in Haskell, or Racket, or (most recently) Coq, and I’ve largely forgotten most of my C, Java, and JavaScript that I used to know.

I don’t actively seek out languages because they are unpopular, but I find that I have increasingly become attracted to a lot of the reasons that they are unpopular.

Haskell is unpopular because the type system is very strict and scary; this makes me happy because I think strong type systems are neat and make me feel like my code is awesome. Racket is unpopular because it has a billion parentheses, but I think that’s cool because then I don’t have to worry about operator precedence.
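
As a toy (and entirely made-up) illustration of what that strictness buys me: GHC refuses to compile nonsense outright, instead of letting it blow up at runtime.

```haskell
-- A hypothetical little function with an explicit type signature.
halve :: Int -> Int
halve n = n `div` 2

-- Both of these are rejected at compile time, not discovered in production:
-- halve "ten"        -- type error: a String is not an Int
-- halve 3 ++ "oops"  -- type error: an Int is not a list

main :: IO ()
main = print (halve 10)
```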

There’s nothing wrong with being curious and playing around with new languages, but I have become increasingly worried that I’m typecasting myself into a situation where I can never leave my current job, since there are so few jobs in the whacky-language space.

If I were really smart, I’d probably learn how to use Go correctly, or just re-learn JavaScript, but I’ll instead stick with my eccentricities.

Unit Tests

This has been a surprisingly controversial subject in my life, if for no other reason than that I’m not entirely convinced I’m wrong.

I tend to avoid writing unit tests, much to the chagrin of my coworkers and managers. I find them time-consuming, prone to giving a false sense of security, and a way of making updating code more difficult and arduous.

But I do understand the appeal, in theory. There is this nice comfort in knowing that you have a program automatically checking to make sure your code is performing as expected.
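
In fairness, here’s a minimal sketch of what one of those automated checks looks like in Haskell, using the HUnit package; the function under test is hypothetical.

```haskell
import Test.HUnit

-- Hypothetical function under test: turn spaces into hyphens.
slugify :: String -> String
slugify = map (\c -> if c == ' ' then '-' else c)

-- Each case pairs a label with an "actual ~?= expected" assertion.
tests :: Test
tests = TestList
  [ "replaces spaces"          ~: slugify "hello world" ~?= "hello-world"
  , "leaves single words alone" ~: slugify "hello"       ~?= "hello"
  ]

main :: IO ()
main = runTestTT tests >>= print
```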

So I’ll admit that I’m probably the one who’s wrong here; it’s pretty rare that everyone else is mistaken and I have some sort of divine messianic knowledge proving me correct.

Anyway, I’ll probably think of more in the future; these are just the ones that crossed my mind.
