Are developers getting smarter or dumber?

I personally didn't start coding because it was a huge passion. I wanted to go down the road of building super cool things for NASA (or CSA, the Canadian Space Agency), but while doing my undergrad I really fell out of love with the #Engineering program at the University of #Toronto.

As I contemplated my career during my (personal, or existential) year off, I took a job as a Business Analyst at PricewaterhouseCoopers (PwC), working at Nortel Networks, where I mostly wrote SQL all day to produce the specific insights and answers the business needed (PwC was then managing Nortel's payroll, accounting, HRIS and more). The money was really good (for an 18-year-old) and the freedom was great, so when it came time to sign back up for a different major and the courses that would come with it, I decided to transition from Engineering to Math, Physics and Computer Science.

Having come from the engineering department (and having been a Dean’s list student there) I found most things easy in the new majors (Computer Science and Math) and minor (Physics), as the rigour that the engineering department (at that time) forced upon its students made for a very easy transition to a normal Bachelor of Science.

Things were more difficult (for students) then

In the late 90s, we did not have the readily available answers, quizzes, tests, exams, etc. from every possible school published online the way students do now. We really had to work to get our assignments done: most modern version control systems (VCS) did not exist and would not for almost a decade (Git only arrived in 2005), there was no readily available cloud, very little WiFi, and most of us still took notes with pen and paper.

For those of you too young to remember what the web (and search specifically) looked like back then: you could, at times, find some very eager, early-adopter professors who had put their courses online, but the majority of those lived on university intranets and were not readily available to the general public (i.e. students of other schools).

As such, we had to do the work, and we had to learn — or we’d fail.

Not connecting to most

At this point in my life, having come back from my year off and jumped into a new major, I had lost the network of students I had known and worked with and was starting fresh. I knew that I had one year before I would take the University of Toronto's Professional Experience Year (a co-op program aimed at professional degrees, including but not limited to engineering and computer science), so I diligently studied, did all the work and made a paltry three new friends (I should qualify that by saying I was friendly with all the students and still social; these three, however, are friends I still talk to 20+ years later).

It may have been snobby of me, but I did not connect with most students because so many of them were headed toward becoming web developers (not something I really realized then, but I do now) who would scratch the surface of actual development rather than truly understanding fundamentals like algorithms and data structures. I was still trying to be an engineer rather than a computer scientist (at a time when Software Engineering as a discipline was non-existent). Tellingly, my three friends from that time have gone on to become CTOs and VPs of Engineering and to run their own companies.

I will now go through some of the tips I have learned and want to share with all burgeoning developers and software engineers who fail to see that everything old becomes new again and again, not just in fashion but in software development as well.

Everyone getting caught up in the hype

Do not pay attention to hype. Every year there is a new programming language, framework, library, pattern, component architecture or paradigm that takes the blogosphere by storm. People get crazy about it. Conferences are given. Books are written. Gartner hype cycles rise and fall. Consultants charge insane amounts of money to teach, deploy or otherwise fuck up the lives of people in this industry. The press will support these horrors and will make you feel guilty if you do not pay attention to them.

In 1997 it was CORBA & RUP.

In 2000 it was SOAP & XML.

In 2003 it was Model Driven Architecture and Software Factories.

In 2006 it was Semantic Web and OLPC.

In 2009 it was Augmented Reality.

In 2012 it was Big Data.

Since 2015 it has been machine learning, deep learning, #artificial intelligence and self-driving vehicles.

Most of what you see and learn in computer science has been around for decades, and this fact is deliberately hidden beneath piles of marketing, books, blog posts and questions on Stack Overflow. Every new architecture is just a reimagining and readaptation of an idea that has been floating around for decades.


So many developers

Have you heard of FOMO, the fear of missing out? Well, people in the development world have these fears too.

[Chart: CNN projection of the growth in software jobs]

This little image produced by CNN tells a typical story for people looking to get into the business. Everyone sees (or hears) that developers are making a good salary, can often expect raises throughout their careers (although, there is a natural ceiling to this) and they all jump on board.

When I graduated, all the jobs that existed for software development went to people who came from one of two faculties: Computer Science or Engineering (mostly computer or electrical).

Web development bootcamps

Nowadays, so many people take a Bachelor of Arts and then transition into software development without the right foundation that you end up with tons of people dragging down the profession as a whole.


Many of the bootcamps that potential web devs now attend try to make them employable in 12 weeks or less by teaching the absolute basics of putting together something like a MEAN stack, a Rails app or a mobile app.

No fundamentals.

No computer science.

No algorithms or data structures.

Crazy, right? This type of developer is what has driven the web 2.0 movement, and really bad code everywhere along with it.
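To put a small, concrete example behind that claim: the two Python functions below (contrived, and mine, not anything from a bootcamp syllabus) produce identical output, but only a grasp of data structures tells you why the first one collapses on real data.

```python
# A contrived illustration of why data-structure fundamentals matter.
# Both functions remove duplicates while preserving order; only the
# underlying data structures differ.
import time

def dedupe_naive(items):
    # O(n^2): `item not in seen` scans the whole list every time.
    seen = []
    for item in items:
        if item not in seen:
            seen.append(item)
    return seen

def dedupe_with_fundamentals(items):
    # O(n): a set gives (amortized) constant-time membership checks.
    seen = set()
    result = []
    for item in items:
        if item not in seen:
            seen.add(item)
            result.append(item)
    return result

if __name__ == "__main__":
    data = list(range(20_000)) * 2  # 40,000 items, half duplicates
    for fn in (dedupe_naive, dedupe_with_fundamentals):
        start = time.perf_counter()
        fn(data)
        print(f"{fn.__name__}: {time.perf_counter() - start:.2f}s")
```

On this input the naive version performs hundreds of millions of list comparisons while the second does 40,000 set lookups; that difference is exactly the kind of thing a 12-week curriculum never gets around to explaining.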

Massive open online courses (MOOCs)

In the last few years, people have put platforms online that purport to teach the fundamentals of just about any university degree. Places like Udacity and Coursera teach anyone with time (and sometimes money) some of the topics that the bootcamps fail at. The problem here is that people taking these courses stop learning the moment they complete them, never apply themselves to anything new, and believe they are ready.

Continuous learning is a must

Many common concepts in Computer Science have been around for decades, which makes it worthwhile to learn old programming languages and frameworks, even "arcane" ones. First, it will make you appreciate the current state of the industry; second, you will learn how to use the current tools more effectively, if only because you will understand their legacy and origins.

Learn at least one new programming language every year, but go beyond the typical "Hello, World" stage and build something useful with it. I usually build a link shortener with whatever new #technology I learn. It helps me figure out the syntax and makes me familiar with the SDKs/APIs, the IDE, etc.
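To show what I mean, here is a minimal sketch of that link-shortener exercise in Python, using only the standard library. The in-memory dict storage, the base-62 codes and the localhost address are simplifying assumptions for the sake of the example, not how you would ship it:

```python
# A minimal link-shortener sketch using only the Python standard library.
# Assumptions for brevity: links live in memory (lost on restart) and the
# service answers on localhost:8000.
import string
from http.server import BaseHTTPRequestHandler, HTTPServer
from itertools import count
from urllib.parse import parse_qs, urlparse

ALPHABET = string.ascii_letters + string.digits  # 62 "digits"
links = {}     # short code -> original URL
ids = count()  # monotonically increasing id per shortened link

def encode(n: int) -> str:
    """Encode a non-negative integer as a base-62 short code."""
    if n == 0:
        return ALPHABET[0]
    digits = []
    while n:
        n, rem = divmod(n, 62)
        digits.append(ALPHABET[rem])
    return "".join(reversed(digits))

class Shortener(BaseHTTPRequestHandler):
    def do_GET(self):
        parsed = urlparse(self.path)
        code = parsed.path.lstrip("/")
        if parsed.path == "/shorten":
            # e.g. GET /shorten?url=https://example.com
            target = parse_qs(parsed.query).get("url", [None])[0]
            if not target:
                self.send_error(400, "missing url parameter")
                return
            short = encode(next(ids))
            links[short] = target
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(f"http://localhost:8000/{short}\n".encode())
        elif code in links:
            # Known short code: redirect to the original URL.
            self.send_response(302)
            self.send_header("Location", links[code])
            self.end_headers()
        else:
            self.send_error(404, "unknown short code")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Shortener).serve_forever()
```

Small as it is, rebuilding something like this in each new language walks you through the same checklist every time: HTTP handling, string manipulation, the core data structures and the tooling around them.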

Learn Unix

I started my university career without a laptop, so I was at the whim of our computer labs, which ran SunOS. I very quickly learned to love Unix and never went back. When I did buy my first laptop, I had to compile almost everything, build drivers, use the plain X Window System, and it was just FUN! I think if you geek out like this on something as mundane as compiling software and learning about an OS, you're someone I'd probably hire!


Then I moved to the corporate world and noticed that everyone’s desktops were just being used as terminals to connect to a Unix environment. Aha! The world really does not operate on Microsoft or Apple. Go to any successful company and their web servers are more than likely running one of:

Debian

Red Hat

Ubuntu

CentOS

All of these are Linux distributions, and Linux is a flavour of Unix. If you have a laptop, install Linux on it; I prefer Kubuntu, but it really doesn't matter which specific distro you use, just pick one. Linux is far more customizable than Windows or Mac. Want a machine to compile code? Want a desktop OS running one of several GUIs? Want to build a high-performance server tailored to your exact needs? Linux can be all of these.


Speed is another huge factor. Windows 7 and Mac OS X take up about 1 GB of RAM (okay, this really depends, but it's not too far off the mark in general); on my Kubuntu Linux system it's usually only around 300–400 MB, and startup time is about 20 seconds. Everything runs much, much faster, and whenever I'm stuck using a non-Linux system I really begin to notice how much I take the speed of my own system for granted. It's extremely useful, practically and psychologically. The wasted time adds up.

Linux is only difficult in the beginning. Once you get over the initial, steep learning curve, you realize that a lot of things become easier. You also realize that you have a lot more control over your system. The ability to change the basics of how you experience your OS is essential to the general philosophy of Linux and its users, and is something you won't be able to do with Windows or Mac.

Executives and board members, product managers and sales teams

Yes, I am an #executive and a founder in most endeavours I undertake. That also makes me (oftentimes) part of the board, one of the key product managers (since I am the product owner) and, more often than not, part of sales and marketing. Let me tell you, I fight tooth and nail for everything a developer would need and want for their craft, and I know I still don't always deliver. Now extrapolate that to typical executives, teams and boards, and you end up in a very messed-up situation.

Conclusion

I do not think that developers are actually getting dumber. What I think has happened is that most companies have figured out they need developers in all facets of their organization, and as such the demand for them has gone through the roof. This has led the market to react by creating many get-rich-quick schemes like the web dev bootcamps, which charge exorbitant amounts to teach you how to build a Rails app in 12 weeks or less.

This is not the only reason, but it is one of them. Managers, sales teams and executives force developers' hands into short-sighted decisions in code and then never give them time to truly refactor. More often than not, these same executives and boards want to drive the cost of technology down and outsource the work to some far-off place where there is nearly no accountability and generally not the attention to detail that you would get from a local (in-house) team.

When I, as an architect, executive, code reviewer, database specialist, etc., see some of this code, I truly cringe. My companies are all built from the ground up (that is, the ones I found, not the ones I join). I am a constant teacher and mentor to all my staff, and I make sure they get the massive benefit of being part of the team.

Come join one of my companies — we build some of the coolest technology and find the best people to work with.
