“Bugs are an inevitable byproduct of writing software. Sure, there are all sorts of techniques and potions that promise to decrease how many of the damn critters run about…”

DHH, “Software has bugs. This is normal.”

What bothers me here is why, in modern software development, so many people fail to use these techniques and tools. I used to work on “life-critical systems”. We had formal specifications of behaviour, and strongly-typed, functional languages to which we could apply pre- and post-conditions. We had static analysis tools that could analyse the code and tell us if a variable was used without being defined, or if an array was accessed outside of its bounds. And yes, OK, having bounded arrays was a pain in the bum, and we didn’t have things like hashes, so we had to make our own data types for that sort of thing. But the time we spent doing that extra bit of (very simple, highly understandable) coding was more than made up for by the fact that the compiler/interpreter would tell us “you did this wrong”, or “are you sure this is what you want to do?”.
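As a rough sketch of what a pre- and post-condition buys you, here is the idea written as plain runtime assertions in JavaScript, since that is the closest analogue most web code has. The withdraw function and its contract are invented purely for illustration; in the languages described above, these checks would be verified before the program ever ran, rather than failing at run time.

// Hypothetical example: a contract expressed as runtime assertions.
// In a language with real pre-/post-condition support these would be
// checked statically instead of blowing up when the code runs.
function assert(condition, message) {
  if (!condition) {
    throw new Error("Contract violation: " + message);
  }
}

function withdraw(balance, amount) {
  // Pre-conditions: both arguments are numbers, and the amount is affordable.
  assert(typeof balance === "number" && typeof amount === "number",
         "balance and amount must be numbers");
  assert(amount > 0 && amount <= balance,
         "amount must be positive and no greater than the balance");

  var newBalance = balance - amount;

  // Post-condition: the balance never goes negative.
  assert(newBalance >= 0, "resulting balance must not be negative");
  return newBalance;
}

withdraw(100, 30);   // 70
withdraw(100, "30"); // throws: balance and amount must be numbers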

I now develop for the web, where I have to use languages that are loosely typed and whose behaviour is often complex and unpredictable.

I should not be able to do things like:

var x = "a";
var y = 2 + x; // no error: y is silently the string "2a"

because then I can do:

var x = "2";
var y = 2 + x; // no error: y is the string "22", not the number 4

It doesn’t make any logical sense. It’s nonsense. Those aren’t two things of the same type. The compiler/interpreter knows this, so why doesn’t it tell me?
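To make that concrete, here is what any current JavaScript engine (Node, or a browser console) silently does with these expressions. Every line runs without an error or even a warning:

console.log(2 + "a");  // "2a"  (the number is coerced to a string)
console.log(2 + "2");  // "22"  (concatenation, not addition)
console.log(2 - "2");  // 0     (subtraction coerces in the other direction)
console.log(2 == "2"); // true  (loose equality coerces as well)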

Yes, the languages that the web uses make coding very accessible, but they don’t help people to write good code.

I’m sad that articles like this have to be written, because I think we can do much better.