F*** You, I Quit — Hiring Is Broken
Sahat Yalkabov

Now in contrast, let me describe my first “programming job” and how I got it in 1996:

Fresh out of the Air Force, I had an AS in Computer Studies from the University of Maryland and an AAS in Avionics from the Air Force.

The only programming experience I had was reconfiguring F-15 electronic warfare systems, which meant a keypad and everything in hex code. But hey, I thought that was fun, so I took night courses with UM on the base in Germany and learned Pascal and C. Beyond that, I had ZERO programming experience. I could, however, write searches and sorts in C using linked lists from memory; that was the final exam for those classes. The instructor was an old soldier who had paid his dues in Vietnam, and he felt that anything as easy as programming a computer should never be half-assed (but he stuck to the course).

Upon getting out, I did one interview, with a company in Great Neck, NY. It was a two-on-one interview with the direct manager, a woman in charge of the programs that handled all of the reports, and her boss, the big kahuna of all of IT, whose last project in college had been automating an entire home with a computer programmed on punch cards.

I was hired. Why? Because all I needed to do was show that I could “solve a problem with a computer.” The programming language I would work with for the next few years was FORTRAN; a lot of the programs had been written in the 1950s.

That was old school. Back in the day, “programmers” came from a pool of technicians, HAMs, engineers, and hackers. Over the years I have seen things change, and to me it’s as if hirers now expect “that perfect fit,” specified at such high granularity that they might as well expect the Almighty Himself to apply. Why is this? I’m not entirely sure. Maybe frameworks and specialization make it possible. I have been writing code for decades, and I don’t know most of the stuff mentioned in this article; yet even now, in my daily work, I run into situations where I have to crash-course languages I didn’t know existed and learn just enough to run tests.

Or maybe hirers need that perfect 100 percent super fit because they burn people out? Like: hey, we need a part for this engine that is prone to breaking, so it needs to do its job perfectly before it breaks and we replace it. From the descriptions of these interviews, the kind of knowledge being asked for was, back in the day, the sort of knowledge you would expect only from someone who already worked there.

Or maybe they are risk-averse? Do people change jobs so often now, or are people so unreliable, that the investment of bringing in a dev who would take some time to learn new tricks is a waste? Do people just work for a few months and then find a new job, so that for those few months you need to know exactly what they want and be able to do it in a week? I come from a time when people worked for the same company for years (until the H-1B wonders came, or the company got sold, outsourced, or went bankrupt). Maybe the field is just getting that bad?

But that’s the way we did it. In the 1990s, we didn’t see much concern for specifics, because the approach of hirers (and I have also been on the other side, reading resumes and doing interviews, and I stuck to this principle) was simply to ensure that an applicant was capable of solving problems with a computer and had the mental agility to learn a language and learn the systems. That was all. Nobody said “I’m a C programmer” or anything like that. If you could do it in Pascal or C, you could do it in C++, .NET, FORTRAN, or Java too. Now you have to be a human super-brain?

What a contrast. And it does not appear to be an improvement to me.