There has been much talk recently about the relevance of a degree in tech. It’s a long-standing discussion, this time triggered by the publication in the Wall Street Journal of an article suggesting US students can skip college. Unfortunately, in the UK’s current climate, I happen to side against the degree.
Before I go into it: I love the cutting edge of academic research. I’d have loved to do a PhD and have previously turned down two offers with scholarships. My reasons then are the same as my reasons now. The UK IT industry, as a cohort, doesn’t value it. Few roles need a doctorate or see it as a differentiator, and back in the day you were considered overqualified, so you would never make it onto a graduate internship. That left post-doctoral job candidates working for minimum wage.
The Defining Moment
Here in the UK, university education underwent a major shift a few years ago. The cost of a degree trebled overnight, compounded by market-rate interest on loans, which now starts accruing earlier and puts them in the same arena as mortgages. Most students expect to leave with debts of £53K or more. That isn’t as high as in some countries, but the UK has also struggled with glass ceilings in permanent tech roles and a culture of extremely low job security.
Compared to those starting work earlier, graduates of a UK university who work in the UK can, on average, expect to earn £150,000 more over their lifetime (the ‘graduate premium’). But factor in interest payments, together with the deferral of repayments for anyone earning under £21,000 a year, and this advantage is wiped out by the time the loan is written off 30 years later. The graduate premium doesn’t truly reflect the projected financial position of university education, and that’s before the effects of inflation on the remaining disposable income stream, which will lower the cash value of our £1 over time.
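The arithmetic behind that claim can be sketched with a toy repayment model. All the figures here are illustrative assumptions on my part (a £53K loan, 6% annual interest as a stand-in for roughly RPI + 3%, repayments of 9% of income above the £21,000 threshold, write-off after 30 years, flat 2% pay growth), not official loan terms:

```python
# Back-of-envelope sketch of the graduate-premium argument above.
# Every number is an illustrative assumption, not an official figure.

def total_repaid(starting_salary, pay_growth=0.02, loan=53_000.0,
                 interest=0.06, threshold=21_000.0, rate=0.09,
                 years=30):
    """Simulate annual repayments until the loan is cleared or written off.

    Returns (total repaid, balance outstanding at write-off).
    """
    balance, salary, paid = loan, starting_salary, 0.0
    for _ in range(years):
        balance *= 1 + interest                       # interest accrues first
        repayment = max(0.0, salary - threshold) * rate
        repayment = min(repayment, balance)           # can't overpay
        balance -= repayment
        paid += repayment
        salary *= 1 + pay_growth                      # modest pay growth
    return paid, balance

paid, left = total_repaid(30_000)
print(f"repaid £{paid:,.0f} over 30 years, £{left:,.0f} written off")
```

On a modest starting salary, repayments never outpace the interest: the loan is written off with a large balance still outstanding, and the money repaid along the way eats deep into the £150,000 premium, before inflation does the rest.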
It’s no great secret that a lot of UK companies opening offices in the States have lost their developers within a few months to roles paying four times as much. The cost of living in New York is comparable to that of London (with some aspects, crucially house prices, being lower in New York on average), yet salaries in some national digital/tech sectors around the world dwarf those of the UK! Compare all this to the cost of living in other countries and the results are a bit grim. In the top 40 countries for developer salaries, the UK comes a pretty average 17th, whilst the USA, Israel and Germany occupy the top three spots.
So all in all, university education is even more of a life-defining moment than it has ever been here in the UK. And that is to say nothing of those with degrees outside technology, who are going to have it much worse!
Some argue that those with engineering degrees offer greater benefit to the market and are better problem solvers. Even ignoring the tech industry’s lack of meritocracy, this is only partially true (much as I’d like to believe it). The number of software jobs for real engineers, those with genuinely analytical backgrounds (mathematics, statistics, physics), is small, making up at most around 2.5% of the software development jobs posted in the UK each quarter. The rest do not require the kind of advanced problem-solving ability nurtured in environments like MIT. Most companies ‘know’ they want a developer but have no idea how to hire one, or that there is a better way to engineer software. Indeed, Gen Y probably don’t know any different, having grown up with red-brick courses taught the way they have been in recent years. It’s more or less an unknown-unknown. There are many more of those ‘middleweight’ problem-solving jobs, and those folk fill that gap nicely (skills shortage aside), leaving the hard problems to specialist problem-solving companies such as Google, Facebook, IBM and Microsoft. Honours even, but the number of roles at such companies, relative to the market, is tiny. So in opportunistic terms, the need for a rigorous degree is pretty much non-existent.
For example, applied mathematical thinking is one of the main differentiators in problem solving, not one’s ability to TDD software. Certainly, all but one of the best software engineers I’ve ever met have a first degree in mathematics, with a master’s in computer science, software engineering or similar. They also do TDD and BDD extremely well, which is, and should be, bread and butter to any problem solver. By the same token, though, the one exception in that respectable bracket hasn’t got a degree, yet more than holds his own in the same space. He has a better grasp of mathematics than any of the graduates of software engineering degrees I’ve met so far.
In addition, the proliferation of managed frameworks and IDEs, whilst bolstering productivity (thumbs up), has, as I expected when I ranted about it 15 years ago, lowered the barrier to entry. That has allowed many people who otherwise wouldn’t be in the profession to take it up (thumbs up: everyone should code), but it has also flooded the market with really, really bad problem solvers (thumbs down), and neither interview processes nor recruitment agencies can filter them out confidently. It has created a two-tier system in which those with good problem-solving skills have sometimes found themselves marginalised in favour of those who can JFDI but can’t continuously improve.
As one of the commenters on the original WSJ article stated very directly:
You can teach a monkey to code. Arthur Andersen / Accenture created scores of 10-week wonders over the years.
The difference is that they can’t do the heavy lifting.
It’s like trying to compare an orderly to an LPN, to an RN, to a doctor, to a specialist.
While they all work in medicine, don’t think you can assign an orderly to do the work of a specialist.
For me, this sums up what we’ve lost! I’ve certainly seen an erosion of problem-solving skills in the marketplace over the last 18 years or so.
Sadly, it appears that advanced problem solving isn’t something the majority of companies value or care about (or, more accurately, even know they should care about). Yet these are the same skills staff will need in other roles across the organisation. Business analysis has been watered down to absurdity over the same period.
Similarly, the emerging field of ‘data science’ will see scores of people add the title to their name without a grasp of statistics, or an understanding of analysis, experimentation, segmentation or any of the other crucially important tools and techniques.
Understanding the theory of constraints, queuing theory, non-linear dynamics, enterprise-level workflows, data acquisition rates, throughput of work, context switching and the impact of bottlenecks is conceptually the same, and just as important (if not more so), at the enterprise level as it is in software/systems engineering (to be exact, the software engineering is the same as the ‘maths’). To an advanced problem solver, software development skills are what picking a language is to a software engineer.
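To pick one of those tools: Little’s law from queuing theory (L = λ · W) relates work in progress, throughput and lead time, and it applies as readily to tickets on an enterprise workflow board as to requests in a server queue. A minimal sketch, with illustrative numbers of my own:

```python
# Little's law: work in progress (L) = throughput (λ) × lead time (W).
# Rearranged, lead time = WIP / throughput. Numbers are illustrative only.

def lead_time(wip, throughput):
    """Average lead time W = L / λ, e.g. tickets in flight / tickets per day."""
    return wip / throughput

# A team with 24 tickets in progress, finishing 3 per day, has an
# average lead time of 8 days, whatever their process looks like:
print(lead_time(24, 3))  # 8.0
```

The same one-line relationship exposes bottlenecks at any scale: if WIP grows while throughput stays flat, lead times must lengthen, whether the ‘queue’ is a CPU run queue or an enterprise change programme.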
What does it say about our prospects for innovation if we can’t, as a society, find, value or nurture the problem solvers? Have companies lost interest in real innovation in the race to bring things to market quicker? That is a debate for another day!
For me, the jury is still out on the problem solving. But in terms of degree education, I have a fear that in another 30 years, when a significant proportion of student loans has been written off and perhaps added to the national debt, tomorrow’s graduates will wonder whether the degree was worth it at all. Only time will tell!