Why the full-stack developer is a myth in 2022
Fullstack by backend
Since the very beginning of my career, I’ve seen position descriptions like “Fullstack developer this” or “Fullstack developer that.” Speaking a language doesn’t mean understanding everything written in it. Here is what I think about it, with many pictures too!
Webmaster of disaster
PHP — let’s not forget what the name originally stood for: Personal Home Page — was a real big deal. The HTML template with SQL placeholders gave a so-called webmaster the full power to change the whole page’s content at once. A true full-stack dev. The year was 1995, the good old days. I was 6, by the way, and Madonna was a virgin. Well, not really.
Two sides of the same coin
The technologies kept developing, and it became clear that UX/UI and working with data don’t have much in common, and that keeping the database up and running is a task in itself. Ops went their own way, CSS standards had been released and working for a while, and the devs were still working on the whole stack, MVCing the data through templates. The year was 2005.
What about the backend? The whole world is on SQL’s case because, obviously, SQL is the biggest mistake that ever happened to the programming world. Rant in the comments whether you disagree or agree! The semantic gap between the relational data model and object-oriented programming languages pushes developers to search for a solution. Some try to pretend the gap doesn’t exist and create active-record ORMs (Django ORM and PonyORM in Python) that hide it from the client. Others try to transparently project the table structure into the code, spreading the complexity of the data wrangling further up the abstraction skyscraper: the data mapper pattern (SQLAlchemy). A third camp feels very attacked by SQL and tries to subvert the thousand-year rule of the simple query language. Ehm, sorry, structured.
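The two camps can be sketched in a few lines of Python. This is a minimal, library-free illustration, not any real ORM’s API: the `User` class plays the active-record role (the object knows how to save itself), while `UserMapper` is a bare-bones data mapper (a separate object translates between rows and plain objects).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

# Active record: the domain object carries its own persistence logic.
class User:
    def __init__(self, name, id=None):
        self.id, self.name = id, name

    def save(self):
        cur = conn.execute("INSERT INTO users (name) VALUES (?)", (self.name,))
        self.id = cur.lastrowid

# Data mapper: persistence lives in a separate object; the model stays plain.
class PlainUser:
    def __init__(self, name, id=None):
        self.id, self.name = id, name

class UserMapper:
    def insert(self, user):
        cur = conn.execute("INSERT INTO users (name) VALUES (?)", (user.name,))
        user.id = cur.lastrowid

    def find(self, user_id):
        row = conn.execute(
            "SELECT id, name FROM users WHERE id = ?", (user_id,)
        ).fetchone()
        return PlainUser(row[1], row[0]) if row else None

alice = User("alice")
alice.save()                      # the object persists itself

mapper = UserMapper()
bob = PlainUser("bob")
mapper.insert(bob)                # the mapper persists the object
print(mapper.find(bob.id).name)   # -> bob
```

The active-record version is shorter but welds SQL into the domain class; the mapper version keeps `PlainUser` ignorant of the database, at the price of one more moving part. That trade-off is the whole debate in miniature.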
Mongos, Cassandras, Dynamos. We need more databases, please.
MVC is not MVCing anymore; SOAP is old as hell and annoys everyone. RPC spawned monstrous, unsupportable corporate nightmare networks, so folks came up with a simplified version of it: REST. A few simple actions, a multitude of models, and a couple of bells and whistles. Facebook reinterprets the PHP interpreter (the original one died for good afterward) because the amount of crappy code they wrote is so humongous that it’s easier to write a whole new interpreter to optimize the existing code base. It’s like looking for another planet because you fucked up the old one. Sounds familiar?
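The RPC-to-REST shift can be sketched in a few lines of Python; the endpoint names and handlers below are invented for illustration, not any real API. The point is combinatorial: RPC mints a new procedure name per operation, while REST crosses a small fixed set of actions with however many models you have.

```python
# RPC style (hypothetical names): every operation gets its own procedure,
# so the list grows as models x operations.
rpc_endpoints = {
    "getUser", "createUser", "renameUser", "deleteUser",
    "getOrder", "createOrder", "cancelOrder", "listOrders",
}

# REST style: a handful of uniform actions (HTTP verbs)
# crossed with a multitude of models (resources).
verbs = ["GET", "POST", "PUT", "DELETE"]
resources = ["users", "orders"]
rest_endpoints = {(verb, f"/{res}") for res in resources for verb in verbs}

# A toy dispatcher: handlers are looked up uniformly by (verb, path),
# instead of by a bespoke procedure name.
handlers = {
    ("GET", "/users"): lambda: ["alice", "bob"],
    ("DELETE", "/orders"): lambda: "order cancelled",
}

def dispatch(verb, path):
    return handlers[(verb, path)]()

print(dispatch("GET", "/users"))  # -> ['alice', 'bob']
print(len(rest_endpoints))        # 4 verbs x 2 resources -> 8
```

Adding a third model to the RPC set means inventing and documenting another batch of verbs; in the REST set it is one more resource name, and every existing action already applies to it.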
Antoine DevOps Exupéry
On the other side of the developer’s little planet, the code still needs to be deployed.
Much has changed since the bare-metal era. Everyone wanted to deploy virtually: the hosters, so they could fully utilize their hardware, and the devs, because they wanted repeatable environments and lower prices for operations. Intel’s virtualization technology has outgrown its youthful pimples since its first release in 2005 and is widely available to users.
Two main approaches compete. The virtualizers say: bitwise copies at any cost, fully duplicable environments for thousands of servers, quickly clonable and easily rolled back. Virtual machines from VMware and VirtualBox, and Vagrant (rest in peace), result from the debate on that side.
The provisioners counter: a full clone is a unicorn, there is no such thing; binary images are too big and expensive to store and process, and have trouble supporting all the hardware. Puppet (2005), Chef (2009), and Ansible (2012) are the result of this one.
In December 2013, Docker was first released. Nowadays, it’s a standard for partial virtualization, aka containerization, which is, in a nutshell, a combination of the two approaches. And then: the cloud, elastic environments, systems needing better maintainability and service levels, quick rollbacks, canary deployments, blue-green deployments, docker-compose, Kubernetes, AWS, GCP, Azure.
Besides that, you can’t create any serious software without quality assurance; probably data science, if you are about to create something particularly disruptive; usability analysis; and, of course, UI design for things to look good.
All of that was done by a single person once. It was a quarter-century ago. Things have become quite a bit more complicated.
Let me put it clearly: no one can handle this all alone anymore.
Envision the vision 🐘
Ball of confusion, that’s what IT is today, hey hey.
I recall my first lecture in section number 3, graph theory, of the discrete mathematics course back in 2006. The prof said: “Graph theory is a relatively new discipline, so the wording is not yet very uniform. Some books address the same things under different names and slightly different things under the same name. It’s been only 60–70 years since the first publication on the matter.” I was 16, and 60 years didn’t seem a very short period of time to me.
IT as we know it now hasn’t had even a fraction of that time. The knowledge and skills you gain and the technologies you master may become completely obsolete in a matter of a couple of years. No wonder we don’t really have strict definitions of the borders of responsibility in an engineering team. Reshape the object in the backend or reshape it in the frontend? Who is responsible for the sprites: the designer, the layout engineer, or the frontend developer? The struggle is real.
Remember the parable about the blind men and the elephant? The blind men had never seen one, so when they had to describe it, each of them had their own vision of what an elephant was. We all tend to exaggerate the meaning of our contribution to the team effort and the significance of our experience.
Imagine a car being a software product. A designer, a frontend dev, a backend dev, a manager, a QA engineer, and a DevOps engineer develop and deploy it together. Here is how they describe the elephant:
So you can work it all out as a single person, but only up to a certain scale. The contradictions, the mutually exclusive motivations, and the lack of deep knowledge in all these areas will show in the product’s quality. At the end of the day, doesn’t Conway’s law say exactly that?