Leaving academia? Think twice before going into data or software.

Working in the information industry has many perks, but you should be aware of the trade-offs.

Niels Cautaerts
The Modern Scientist
26 min read · Jun 26, 2023


There comes a time in every academic’s career when they have to reevaluate their career plans. A majority of PhD students start out aspiring to become a professor, but only about 0.45% of PhD graduates actually do.

Some are forced out by the system. The academic job market is a brutal game of musical chairs. Over the past decades, the number of awarded PhD degrees has ballooned but the number of faculty positions has stagnated. There is simply no space for everyone.

Many also leave of their own accord, fed up with the meaningless publish-or-perish grind, the winner-take-all funding cycle, the flimsy short-term contracts, the toxic egos, the countless additional responsibilities, the strain on the family, the administrative burden, and the mediocre salaries.

Yet it can be a struggle for academics to secure a role in “industry”, especially if they have post-doc experience. An extended period of fruitless job hunting leads to insecurity and desperation, especially in academics who have sacrificed the best years of their lives collecting stacks of academic achievements that were prestigious at the time but now seem like little more than worthless trophies.

The long job hunt phenomenon usually reflects quirks — if not to say general dysfunction — in the job market, rather than a lack of competence on the academic’s part. Regardless, it has spawned a parasitic industry of career coaches and counseling services that prey on these lost souls. The pitch is that academia is a dark and unwelcoming place (it partially is), whereas industry is the promised land where the grass is always green (it is not). For the bargain price of a few thousand dollars you too can [ join their community / get personal coaching / get access to training material / … ], which will give you the skills to finally land your first dream industry job!

Data and software are industry career paths that are often recommended to soon-to-be ex-scientists. Researchers can put their analytical minds and technical skills to work to solve the biggest challenges in AI/ML, Big Data, IoT, and Digitalization! Time to hop aboard the hype train! For good measure, this advice is generally paired with a friendly “learn to code” slap across the face.

The message has been internalized in the academic community. Hype generates funding opportunities, and research follows the funding breadcrumbs by sprinkling the mandatory AI buzzwords into any proposal. PhDs in AI/ML are proliferating. Why embark on a PhD in physics or mathematics when you can get hired to work directly on a topic that generates buzz in industry? Academic influencers on LinkedIn parrot the “learn to code” meme on a regular basis. The internet and the media make “tech” seem like the ultimate Valhalla, and academic career consultants are adamant that industry is begging to hire highly educated PhDs.

The propaganda works. Many budding researchers now go into a PhD with the explicit aim of getting a career in data or tech afterwards. Washed-up academics are signing up for Python boot camps and MOOCs, and treat a data job in industry as a solid fall-back to an academic career. Deriving insights from complex data is a core competence of scientists; doing the same in industry as a data scientist should result in a smooth transition, right?

The reality is that most work in the data industry is far less glamorous than the hype, blog posts, and newspaper headlines would suggest. Very few of those who left for industry come back to tell their tale, so most academics have no clue about life on the other side. This article aims to illuminate the good and the bad, from the perspective of a former academic turned data engineer.

My personal story

As statistics would have it, I’m part of the 99% of washed-up academics. I obtained a PhD in Physics and graduated with honors. Afterwards, I spent two more years during the COVID pandemic as a post-doctoral researcher at a “world renowned” research institute. I enjoyed research and science, but after my post-doc contract ended there was no clear pathway for an academic continuation. So began my search for “a real job”.

I thought I was well prepared for this eventuality. Even though my primary research activities were performed in a lab, I taught myself Python programming during my PhD in order to analyze my data. I also dabbled in machine learning because in the mid-2010s it seemed like everyone had to know about XGBoost. During my post-doc I contributed new algorithms with custom GPU kernels for image analysis to open source libraries that were popular in my field. At this point I considered myself a seasoned self-taught Python developer and potential data scientist, someone who should be able to transition into an industry career in no time.

Ultimately it still took me over half a year of applying to find a job. None of my applications to public job postings went anywhere. I never even made it to an interview; I was simply rejected by e-mail or ghosted. In the end, I was given an opportunity to interview for a data engineering position at a company through the recommendation of a friend. The first interview led to another and eventually a job offer, which I accepted. This episode was the first of many beatings to my ego and my strong belief in pure meritocratic justice.

After only ten months of working as a data engineer I threw in the towel. I don’t regret this period; it was eye opening and educational. However, I discovered that the job content and the new culture in which I found myself were irreconcilable with my personal goals and interests.

Most of my experiences were not advertised by the career coaches, probably because they never worked in the space themselves. If you are an about-to-be-washed-up academic who is thinking about going into data, I hope my perspective in this piece helps you make an informed decision.

Disclaimer

This is an opinion piece, compiled from personal experiences, observations, and anecdotes from others. Your findings may vary depending on which company you work for, who your colleagues are, what your role is, which country you work in, what your background is, and what your priorities in life are. I worked in the data industry for a short period of time, so my experience may not be representative of a longer career. As I’m writing from the perspective of someone with a STEM PhD, that is also my main target audience.

The good

You will earn a comfortable salary

The rumors about salaries in data and software are true. If money is a primary motivating driver for you, you should consider making the move. Given that you first went into academia, you either don’t value money that highly, or you were naive. Still, in contemporary society where a lot of our perceived self-worth is derived from income, it feels good to be well paid. If you have a family or other dependents this point will factor even more heavily in your calculations.

In the West, as a salaried developer or data professional, you will comfortably reside in the upper percentiles of earners. In some countries, you may do even better by becoming a freelancer and diligently optimizing your taxes.

I always worked in western Europe, so even in academia I could live comfortably on a post-doc salary (my heart goes out to all the starving American post-docs). Still, my effective net income jumped by about 40%, from roughly 2500 EUR to 3500 EUR per month on average, by going from a post-doc in academia to data engineering. In addition I received a company car, fuel card, and other benefits. Had I stayed in the field for a few years and been promoted, my salary would have risen much faster than it ever could have in academia.

There is a flip side to salaries in industry: a total lack of transparency.

As a post-doc I was on a fixed pay scale. I could look at a table to see all the categories and associated salaries. I knew that all other post-docs were earning exactly the same amount as me.

In industry, you will not know what your colleagues are earning. A company wants to pay you the least amount you will tolerate, so to get yourself the best deal you must become a ruthless negotiator.

Academics are not used to haggling, expect systems to be fair, and generally have a high tolerance for abuse. Since they lack a good reference point for acceptable salaries, they are in a weak negotiation position. This means they can end up leaving a lot of money on the table.

You will have a life outside of work

Coming from academia, you will find that the pace of most work in industry is eerily… chill. You have weekends and holidays to just do what you please. Maybe you can even spend time with your family! When 5 p.m. rolls around on Friday, you can close the laptop and forget about everything until Monday.

Academics could do that, but the reality is best summarized by this comic. Personally, I always felt like I was in a race in academia. Time not spent working was time wasted. I could always work on another paper, another proposal, another conference abstract. I felt a strong compulsion to do those things before my contract ran out, so that I could find the next floating sheet of ice to jump to. Looking back, I could never have started a family considering the pace at which I was working.

In many ways, academics are like business owners, where the business is their academic career. There is fierce competition in the market so you must do everything to stay ahead. The difference is that academics get no financial reward if they do manage to beat the odds. The main side effect of scientists’ race to the bottom is that society gets flooded with ever more low quality publications that sit forever unread behind a paywall.

Again, there is another side to this coin.

As an academic, I worked unhealthy hours not only because I felt compelled, but also because I was passionate about my research. I cared, because I was exploring my own ideas, and satisfying my own curiosity. My work was simultaneously my hobby.

As a salaried developer, I had to work on other people’s ideas and live by other people’s priorities. Those priorities precipitated from the objectives of shareholders, from which I felt quite detached. The loss of meaning and freedom killed all the joy I previously had for writing code and solving problems. My work was now simply a job. I was relieved to close the laptop on Friday, and reluctant to open it again on Monday.

There will always be a new opportunity

Demand for developers and data professionals is high, and job openings can be found across all industries. Finding a first job in the field is hard — why that is, I’ll leave for another article. But once you are in and acquire a keyword-friendly job title on LinkedIn, recruiters will pester you like flies. Under a barrage of spam, consisting of irrelevant job pitches and requests for “having a chat about your career”, you may find it hard to swat them away in a professional manner. Even after you leave the field, the recruiters will follow. I imagine this experience shares similarities with what women experience on swiping-based dating apps.

The recruiter tsunami is annoying when you already have a job. But at least you get constant affirmation that your skills are desirable, and it won’t be as hard the second time around to find a job. This may be a chance to develop your negotiation skills.

You will be able to work remotely

This one is partially attributable to the COVID pandemic. 100% remote positions are harder to come by these days, but 60–80% remote jobs for software and data are still the norm where I live. To fulfill the basic responsibilities of a developer, you don’t need to be in a particular location. Personally, I find being able to work from home and avoiding a commute a big boon. Some of my colleagues were even able to live in and work from another country. This is not possible for most other jobs.

One downside to remote work is that you may feel isolated and less connected to the team. Learning from others and encountering random opportunities that may be beneficial to your career are facilitated by face-to-face interaction. In academia you work and learn mostly by yourself. You will find this harder to do in industry where there is no literature to consume; to get up to speed you must extract knowledge from other people’s heads.

You will gain an alternative perspective on work

If nothing else, doing a stint in the data or software industry will give you a new perspective on work that you can only gain by participation. During my post-doc I developed complex open source software, yet my development style was transformed for the better by developing simple software in industry as part of a team. I learned about working in a more goal oriented way, by prioritizing and structuring work instead of exploring every interesting tangent. I learned about technologies I never would have in academia, because academics prefer to entertain interesting curiosities over boring but practical solutions.

In addition, industry jobs in general will give you a different perspective on how the world works. How the job market works. How to navigate organizations with a very diverse workforce. You will learn what skills are actually valued by society — for better or worse. You will recognize you are a market participant with a dollar value on your time. You will look back at your academic achievements and laugh — partly out of frustration — at how little they mean. How you sacrificed months and late nights to prepare manuscripts which you handed over for free to mafia-like for-profit publishing companies.

The lessons you will learn from an industry job are unlikely to increase your intellectual capabilities. But they are important nonetheless.

The bad

You will not work on interesting problems

The headlines, hype, and career coaches lied: very few of the data and software jobs out there are as intellectually stimulating as research.

The vast majority of these jobs exist to support a business in solving the problems it has right now. The importance of a problem is measured by its impact on the bottom line. Most of these problems are rather boring and unsatisfying to someone with an academic background.

Researchers enjoy working on difficult and open-ended questions that are concerned with uncovering mechanisms; problems where finding a solution requires inventing novel approaches. Problems with trivial solutions and practical details are left to “future research”.

By contrast, data and software problems in industry are narrow in focus and formulated in terms of achieving a specific result. That result must always be tied to the primary goals of the company: stay in business, increase revenue, decrease costs. These problems are universal across sectors, and thus society expends enormous amounts of human capital to solve nearly identical problems over and over again. From a technical and/or analytical point of view, the problems tend to be straightforward; someone — you — just has to spend the time to work them out, down to the boring details.

The main challenge with industry problems is rarely the problem itself, but how to solve it in the fastest way with the least effort. This is because the most expensive part of solving a problem is paying for the time of a data or software professional, and therefore it is the first parameter that management tries to minimize. Interestingly, the recurrent meetings and micromanagement processes aimed at maximizing worker productivity are rarely scrutinized for their own drag on productivity and waste of human capital.

Solving a problem fast means maximizing the (re)use of tried and true methods and tooling, and avoiding scenarios that require long, deep thinking. In data science, that means you won’t be developing custom ML algorithms; you will go through the catalogue of existing models and pick the one that is good enough. Most often you’ll spend your days putting them into a blender to make ensembles. In software development you won’t be designing novel, optimized systems from scratch; you will duct-tape together (“integrate”) existing components, often open source software that was written by passionate hobbyists. You will rely heavily on frameworks, and create tools that assist with slight variations on the same mundane and trivial tasks. Academics I’ve talked to who have spent some time in the data industry admit “they miss the math”. For some of them, the pain of this grief subsides with time and money.
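
To make the “model blender” concrete, here is a minimal sketch of what that good-enough workflow typically looks like, assuming scikit-learn. The dataset and model choices are placeholders for illustration, not a recipe from any particular project.

```python
# Illustrative sketch only: pick a couple of off-the-shelf models from the
# catalogue and blend them with soft voting. No custom algorithms involved.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder data standing in for whatever the business exported this week.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("logreg", LogisticRegression(max_iter=1_000)),
        ("forest", RandomForestClassifier(n_estimators=200, random_state=0)),
    ],
    voting="soft",  # average predicted probabilities of the base models
)
ensemble.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, ensemble.predict(X_test)))
```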

It may seem arrogant to dismiss problems in industry as trivial. After all, aren’t all modern companies investing heavily in AI and ML to become “data driven”? Isn’t this a transformative moment in corporate history, where the analytical skills of scientists become indispensable assets for companies to navigate the tumultuous waters of the market?

That is what the career advisers and industry influencers would have you believe, but I would argue: no.

For starters, any company worth their salt has always been data driven, way before the AI/ML hype train picked up steam. They had a “business intelligence” unit, staffed with data analysts, who would crunch the company’s numbers and distill them into reports for leadership. The job of a data analyst is the closest thing to what a research scientist does: collect data to answer a question, process this data, derive and report on insights. Yet data analyst is typically an entry-level role in companies; no PhD required.

What is the difference between a scientist and a data analyst? The kinds of data they deal with.

Scientists require specialized domain knowledge to make any sense of their data. Data may need to be processed with custom algorithms implemented in obscure software before any conclusions can be drawn. In addition, scientists diligently control the data collection effort in order to probe causal relationships.

By contrast, corporate data is a directly interpretable log of the company’s activities: what assets the business owns, what is coming in from sales, what payroll costs, etc. Data is a by-product of operations. Understanding this data calls for some basic proficiency in financial jargon and knowledge of company-specific conventions. Processing it requires Excel competence and a rudimentary understanding of arithmetic. Averages and sums, charts with a line that goes up or down, and a healthy dose of common sense are often sufficient to steer the corporate ship.

The AI/ML revolution that started in the mid 2010s did not change the nature of business, but it did create a lot of FOMO among business leaders. Companies scrambled to incorporate the technology into their products, services, and processes, and they hired armies of data scientists to make it happen. Many of those data scientists were PhDs, because at the time data science was considered a highly academic discipline. In reality, companies were looking for analysts familiar with the scikit-learn API.

While machine learning techniques are useful for data mining and forecasting in high dimensional spaces, many businesses put the cart before the horse. Most of the problems they aimed to solve were still equally mundane as before; no ML required. Even for the problems that might benefit from ML, the issues of low data quality, sparsity, low/non-existent correlations, and noise were often conveniently papered over in the belief that a shiny new algorithm would magically fix them. Because few people understand statistics, many data scientists to this day are tasked with tweaking model hyperparameters for insignificant accuracy gains. Additionally, since corporate data is collected ad-hoc, causal relationships are nearly impossible to find. This means that data science is often reduced to a glorified curve-fitting exercise.
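
For a flavor of what that curve fitting looks like in practice, here is a hypothetical sketch of the hyperparameter-tweaking loop, again assuming scikit-learn; the data and the parameter grid are invented for illustration.

```python
# Illustrative sketch: "tweaking hyperparameters for insignificant accuracy
# gains" usually amounts to a grid search over an off-the-shelf model.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV

# Placeholder data; in reality this would be whatever ad-hoc corporate data
# survived the cleaning stage.
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)

param_grid = {
    "n_estimators": [100, 200, 400],
    "max_depth": [2, 3, 4],
    "learning_rate": [0.01, 0.05, 0.1],
}
search = GridSearchCV(GradientBoostingRegressor(random_state=0), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```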

As the data science hype cooled, companies came to realize that most of the time, effort and challenge in a data project was actually spent on retrieving, cleaning, and transforming data into a format that could be fed into an off-the-shelf ML model. Additionally, they discovered they could not actually use the hacky Jupyter notebooks produced by data scientists. Hence, much more emphasis was placed on building supporting infrastructure and creating software products out of models, birthing the roles of data engineer and ML engineer.

From my own experience, these roles are the very definition of what David Graeber calls “duct tapers” in his book Bullshit Jobs. You are tasked with solving a problem that should not exist in the first place, and would not if other people did their jobs well. Yet it turns out that the demand for duct tapers is high: there are many more job openings for various engineering roles than science roles in the data space. These days, they can even command higher salaries than data scientists; another sign that companies place low value on pure analytical skills or education level if these characteristics cannot be leveraged and steered towards profit-generating endeavors.

Of course there are interesting jobs out there in industry. Research- and innovation-oriented data and software jobs exist. After all, big tech companies have research arms where the ChatGPTs of the world are developed. But these jobs are not within reach of the average “I’ve done a machine learning course on Coursera” PhD graduate. The market for research-oriented positions in industry is just as competitive as academia. Research is expensive and risky. Companies want to maximize next quarter’s profits; investing in research is diametrically opposed to this goal. Most money going to research might as well have been burned: it returns nothing to investors in the short term. That’s why nearly all fundamental research is funded by society through taxes.

In conclusion, the vast majority of data and software jobs exist to deliver a basic service to business. This involves working on mostly boring and repetitive problems. Throwing ML into the mix does not make problems more interesting, nor the associated jobs more intellectually stimulating. Instead, duct taper jobs proliferated to support a tiny kernel of ML applications.

You threw away years of your life

You might find yourself going through the five stages of grief over your dying academic career. Allow me to drop some truth bombs which will hopefully make you get to the acceptance stage faster.

Very few companies care about your academic credentials. Most will even completely ignore them. In some cases, they will count against you. You may have found your 3–7 years as a PhD scholar a transformative experience, but if you decide to get a data or software job you will start right back at the bottom of the ladder. Any post-PhD academic experience just delays the same inevitable result. You will have younger colleagues with a more senior rank than you, and your same-aged peers will be so far ahead that you will never catch up. Given all the expert knowledge and skills you know you possess, given the sacrifices you made, and given that the new job isn’t all that hard to begin with (see the previous section), you feel humiliated. By striving to be part of the frontier of human knowledge, you effectively handicapped yourself in building out a basic tech career.

Career coaches suggest you highlight all the “transferable skills” you acquired as an academic, which are also useful in industry jobs: data analysis, critical thinking, project management, leadership, effective verbal and written communication, … the list goes on. For technical jobs, one may add programming to this list.

Unfortunately, companies generally do not equate this experience with the same experience gained in an industry job, which is why you will be set back. Your “real experience” meter starts counting on day one of your first non-academic job. That is also the day you become valuable to LinkedIn recruiters.

Why is that?

Firstly, companies don’t fully understand academic jobs. To them, you are a stranger from another land, and they sense you don’t know the culture and speak in a weird dialect. “Years of experience in an industry” is a convenient and commonly used proxy to assess your worth, and you don’t fit into the box.

Secondly, the data and software world operates in a completely different way compared to academia. Academics mostly develop code by themselves and for themselves. The tech-savvy ones may contribute to open source software. By contrast, software development in industry has a strong focus on product, teamwork, maintainability, and robustness. You are not familiar with this way of developing. Therefore, you are a junior.

Finally, companies pay developers and data professionals for experience in very specific technologies, not for their eagerness to learn or their growth potential. Even if you know how to program, you will lack a lot of hard skills and hands-on experience with tools and libraries that are uncommon in academia (e.g. cloud services, PySpark, Kubernetes, or even SQL). Learning the basics of these technologies will take you a few days to weeks. Mastering them may take you months to years. Therefore, you are a junior.

Your predicament as a research scientist is that you are simultaneously highly over-credentialed and underqualified for most jobs. In some industry jobs like project management, you might get away with selling your transferable skills. But in a technical job you need to demonstrate hard skills in a very specific domain, and you just don’t have them. You don’t know “how things are done around here”. You are an electrical engineer applying to become an electrician.

To reach the acceptance stage, you must be OK with letting go of your academic decorations, be open to learning something entirely different, and look only ahead, with the aim of becoming a code craftsman. You choose a technical industry job because you want to write code, not produce documents, papers, or slides. You have proven that you can learn fast, so all you can do is run and try to catch up. The best you can hope for is that, somewhere along your journey, you can dust off a niche skill you picked up as an academic.

You are a replaceable resource

As you progress in a research career, the field thins out to a tiny number of people. At some point, all the people on this planet with comparable expertise in the subject fit into a single room, and they meet every year at a conference to listen to each other’s boring PowerPoint talks. You are unique. You are one of very few experts in the world and it feels good.

The only problem is that, except for the people in that room, almost no one else on earth cares. No one but the taxpayer will purchase your knowledge. On the contrary, you will be the one paying conference and publication fees in a desperate attempt to advertise your research. Maybe if you reach the senior professor stage and have a media-friendly personality, you will be considered worthy of answering basic science questions on the news.

By contrast, industry bends over backwards to buy your skills as a data or software professional. However, your skills will be considered a commodity. While there are too few workers to meet the demand for software duct-tapers, you will still be one of many with a functionally identical skillset.

The only distinguishing metrics that determine your market value are which tech you have touched and how many years of experience you have with them. I’ve seen a PhD in computer science receive the same rate as a BA in psychology because they were doing the same job. Of course credentials alone should not determine salary. However, the fact that the computer scientist was at least twice as productive and better at solving problems because he had a deep understanding of the technology didn’t get factored into any business equations. Differences between individuals are obvious at the level of a team, but not quantifiable or visible to the money people.

To the business, your unique personhood is to be abstracted away because it is messy, complex and difficult to capture in a spreadsheet. A company wants all of their operations to behave like a factory, with predictable, streamlined and repetitive processes. Henceforth, you shall be nothing but a “resource”, one of many, who has the ability to type keys in the right order to make the computer do things. You are a digital laborer who should be optimally utilized. Your potential is quantified in terms of your number of available man days. Your productivity is tracked via tickets that are dragged around in Jira, the dominant IT project management tool. If you drop dead from boredom and monotony on the digital factory floor, the first question that is asked is which resource will replace you.

The idea that software development should strive to be like manufacturing is perpetuated by popular IT management doctrines like Scrum and derivatives of Lean. Leadership in industry is prone to adopting various management fads in a cargo cult fashion, rather than iterate based on evidence. Scientists are allergic to dogma, but because you are now a resource at the bottom of the corporate hierarchy, you get no say in how you are managed or what you will work on. These tasks are reserved for the “thinkers” and “planners”. Which leads us to the next topic:

Your opinions and ideas do not matter

Academics are encouraged to be skeptical of everything — especially of authority and popular sentiment. They should question each and every one of their beliefs and should never assess the truthfulness of any statement without sound evidence. If you have a good idea and sufficiently good argumentation to support it, you will be free to try it out no matter what your academic rank is.

Industry does not like skeptics, especially if they are at the bottom of the hierarchy. Difficult questions and challenging the status quo are perceived as going against the grain and complicating team cohesion. You are hired to be a code craftsman. Everything beyond that is outside of your domain of influence.

As a technical resource that is part of a team, it is much more important that you believe in a common set of values, than that the values you hold are actually true. Industry is not in the business of seeking out truth. There is no scholarship or philosophy in industry, only thought leadership.

Scholars aim to study and evaluate the literature, and gain new knowledge by proposing and testing hypotheses through the scientific method.

Thought leaders perpetuate ideologies that apply an original twist to an old basic concept, and package it into a TED talk or tech blog post. Some of these ideas gain traction in the community, which promotes the thought leader to an authority figure. By virtue of popularity and authority and yet total lack of hard evidence, the new ideology becomes widely accepted.

The blind adoption of thought leadership by companies leads to all sorts of Kafkaesque situations, productivity destroying meetings, and rivers of performative bullshit which you will have to wade through as a technical resource. Scrum rituals in enterprise warrant particular mention, but going into more detail is outside the scope of this article.

You will always be told you can provide feedback, challenge things, pitch new ideas. This is only true in so far as you don’t trample the fundamental values the team subscribes to, or trespass on the domain of leadership. If you feel micromanaged by Scrum, you will not be able to challenge that. If you find unit testing is not very helpful in writing data pipelines, you may be reprimanded for undermining the team’s TDD philosophy. Counterintuitively, conformism and even sycophancy are far superior strategies for standing out. You know why LinkedIn is full of loonies throwing toxic positivity cringefests? Because it is rewarded.

The good news is you can always become a thought leader yourself to try to change the tides. No credentials, evidence, or data needed, just a nugget of originality and a healthy dose of salesmanship will do the trick. You might consider this post a piece of fluffy thought leadership, except that the bulk of corporate IT leadership won’t be happy to read it.

A technical job is a dead-end job

Humans have an insatiable desire for progress.

In theory, progress in academia means continuously pushing the boundaries of human knowledge. Most academics claim this is their primary driving ambition. However, in practice, they can only do so by staying in the academic game. This requires collecting ever more prestige points: getting the title of professor, publishing in Nature, being keynote speaker at a conference, securing a prestigious grant, receiving some kind of prize, … At some point, winning the game becomes a goal in itself, and natural selection pushes shrewd careerists up through the ranks.

In industry, progress means increasing shareholder value every quarter. Workers do not have intrinsic motivation to achieve this goal, so companies manufacture it by gamifying the work experience with career ladders and inflated job titles. The illusion of career progress serves the purpose of a metaphorical carrot hanging just out of reach in front of the worker, so that they would continue to push the profit mill. Many companies are able to indoctrinate their workers, and convince them that “adding value” is a noble cause they should sacrifice the best years of their life for. The most successful corporations convince their workers that the mission of the company is good for society.

As a worker in the data or software space you contribute to shareholder value, but usually in an indirect way. Unless you work for a tech company where the product is software, you primarily serve internal clients. That means you end up being seen as a cost; a resource that provides a service to business which enables them to perform value adding activities.

From this perspective, it is easy to see why career progress is a problematic concept in a technical career. If you are promoted up in the corporate hierarchy, you will no longer provide the same useful service to business and instead be expected to manage people. Alternatively, you can be promoted from junior, to medior, to senior, to reward you for your years of experience without going into management. Companies claim that this expert track is just as rewarding as the management track. Sure, your salary will increase, but you will remain a resource at the bottom of the hierarchy.

You may also experience the Dilbert principle and Putt’s law first hand when you go into industry. In academia, not every professor is equally intelligent, but at least they climbed up from the bottom to get where they are, and you can expect a basic level of competence and understanding of what it means to be a researcher. In industry, it’s apparently very normal to manage a software engineering team without ever having written a line of code. Working under lame duck IT management who feel the need to justify their existence by wasting your time in pointless planning and alignment meetings is soul crushing.

In IT there appears to exist a loophole: the “architect” career track. Architects don’t have to do technical work nor manage people. In an enterprise setting, this role involves becoming a professional participant in high-level meetings and discussions that have no bearing on the actual technical work being done. Architects make designs and drawings of software systems that will not be built; certainly not according to their plans. They don’t even need to know how to implement anything themselves, but they should be familiar with UML. Architects often resemble dictionary examples of the Dilbert principle.

The smartest people realize that technical work always reduces to grunt work in the enterprise, so they avoid it altogether. In order to have the biggest perceived impact — and thus the biggest pay-day — only problems at the highest level of the corporate hierarchy should be addressed. Few of those problems are technical.

Hence, for maximum perceived impact on value creation, you must work directly with the C-suite or become C-suite. The best chances of doing so are created by going into management consulting — Bain, McKinsey, or BCG are a must — and/or buying yourself an MBA. This is the short-cut career progression track. You don’t climb your way up to CEO by working your way up from developer. Certainly not after wasting the first critical years of your career in academia.

But what if you like programming and data analysis? What if you don’t care about becoming CEO, but still want to do meaningful, interesting, valuable, technical work, and be respected as a person with ideas, creativity and intellect?

Too bad, that’s not how enterprise works.

There are two ways to make the most out of a technical career.

The first is to embrace your status as a resource, forget career progress, and sell your time as a freelancer/consultant. In this scenario, you surrender to the market but become your own shareholder and CEO.

The second is to ensure that whatever you build has a direct impact on the company’s bottom line. That means working at a tech company, where software is the product and the core business. Of course, these positions are much more competitive than, say, a developer role at a bank.

The most rewarding but most risky may be trying to combine the two: create your own software and sell it to companies as a product.

In summary, unless you work at a company where the tech is the product, most tech and data work is (perceived as) a low impact and low value service to business. As long as you are an implementer, you are forever stuck at the bottom as a resource, drifting around on the whims of fools with agency to make decisions.

Conclusion

Academia is in crisis and it has been for decades. It’s a pyramid scheme that burns through ambitious young people for the benefit of a few hot shot academics at the top. It’s a dog-eat-dog world where everyone is fighting for breadcrumbs. The chances of scaling the pyramid are minuscule, but if you do beat the odds you will have sacrificed everything. If you don’t manage to climb up, your only future in academia will be post-doc purgatory. Eventually, the system forces you out with nothing to show for it.

On the other hand, work in research provides a unique combination of intellectual challenge and creative freedom. It is a vocation in which self-motivated individuals thrive, driven by the pursuit of knowledge. The starving artist and starving scientist have more in common than one would think.

Unfortunately, society as a whole can not function if everyone gets paid to play with curiosities. From cleaning, to construction, to manufacturing, to software development: there is a lot of grunt work to be done to go from a sentence that starts with “This could have applications in …” to an actual product that people can use. At least in software and data you get to work from a comfy chair at home, at a leisurely pace, while being compensated well.

There is nothing wrong with “just a job” kind of work. However, the hype and career advice around data and software is misleading many academics into thinking they will apply their expert knowledge and skills to work on ground breaking applications in the AI revolution happening in industry. The reality waiting for most will be boredom, monotony, purposelessness, micromanagement, dehumanization, low appreciation, and stagnation. Just another resource on the pile to build low value crap no one needs.

Learning to code is good advice for any academic, but doing so in the service of securing a data or software job is not for everyone.

I work as a researcher, data scientist, and scientific software developer at VITO. Previously, I also worked in enterprise as a data engineering consultant. Opinions expressed in these pieces are solely my own, and do not reflect those of my current or past employers. Check out my personal blog where I occasionally write about random things that interest me.
