Web Crap Has Taken Control

fulalas
17 min read · Jul 19, 2024


It’s no secret that web development has been problematic for a long time. Year after year, more tools are created with the pretense of being the ultimate solution to fix all web crapware once and for all. But are they accomplishing anything?

The untold history of web development (inspired by Fireship)

In 2016, Jose Aguinaga published on Hacker Noon a brilliant and hilarious article [1] about an anecdotal conversation between two web developers: one who hadn’t touched web code in years and another still working in the field. Even after 8 years, it holds up pretty well, except for the ending, where the author clearly underestimated how far the tentacles of the web would reach:

I’m just going to move back to the backend. I just can’t handle these many changes and versions and editions and compilers and transpilers. The JavaScript community is insane if it thinks anyone can keep up with this.

Introducing React

After Facebook’s social network was released to the public, it quickly became popular, grew in features and turned into a very complex and slow application. To work around that, the people behind what is now called Meta created a new JavaScript library called React, released to the public in 2013. Although many frontend frameworks are available today, the React ecosystem dominates the market, imposing all sorts of idiosyncrasies [2].

Most popular web frameworks in 2023 (source)

React is the right solution for Facebook in the same way a hammer is the right tool to drive a screw. But worse than Facebook creating and using React is that soon after it was released, most tech companies around the globe started to believe they needed it in order to be as successful as Facebook. Life is not without irony now that Facebook’s prestige has been fading away.

One of the biggest changes introduced by React is that everything should now be asynchronous to avoid locking the UI when things get updated. So if you want to change the state of something and check it later, you can’t simply do this:

let variable = "test"

if (variable === "test")
  console.log("hello")

To change the state of a variable in React, we need to call an asynchronous method. As a result, when the if statement checks the variable’s value, there’s no guarantee it has already been updated. The solution is to implement the useEffect() callback and move the if statement into it, like this:

import { useState, useEffect } from "react"

const Example = () => {
  const [variable, setVariable] = useState("")

  useEffect(() => {
    setVariable("test")
  }, [])

  useEffect(() => {
    if (variable === "test")
      console.log("hello")
  }, [variable])

  return null // a component must return something renderable
}

Before you wonder: setVariable() will never return a promise or anything. So yes, you need a callback named useEffect(), and it needs an array listing the variables you want to react to. The problem here isn’t limited to tripling the code and a much more confusing syntax; it’s the callback hell. In practice you can hardly debug your code anymore because the flow is now super winding. It’s not uncommon to see developers spreading things like console.log("passed here") all over the place.

We have been working with UIs for over 50 years without needing to implement everything asynchronously by default. Only in specific cases where the application must perform a heavy operation in the background is the async approach useful to avoid locking the UI. In vanilla JavaScript, for example, methods like setTimeout(), fetch() and even for await…of are asynchronous, so the UI doesn’t hang when they’re called. Developers also have the option to make their own methods asynchronous. This approach offers a much easier and faster development process without sacrificing the user experience.
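To illustrate (the function names below are made up for the example): in plain JavaScript, only the genuinely slow operation needs to be asynchronous, and everything that follows can stay synchronous and trivially debuggable. A minimal sketch:

```javascript
// Synchronous helper: turns parsed data into table rows in one pass.
// Easy to step through in a debugger, no callbacks involved.
function renderRows(users) {
  return users.map(u => `<tr><td>${u.name}</td></tr>`).join("")
}

// Only the network call is async, so the UI never hangs while waiting.
async function loadUsers(url) {
  const res = await fetch(url)        // async: the only slow part
  return renderRows(await res.json()) // sync: fast enough not to block
}
```

The point is that asynchrony stays confined to the one operation that actually needs it, instead of leaking into every state update.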

But React is relentless in the shit business. For it, there’s no such thing as static HTML. Every screen the user sees is the result of JavaScript code, React included, generating the HTML either on the server side (SSR) or the client side (CSR), with the latter being the default and therefore the most common. Most developers have never heard of this distinction until they realize their application is performing poorly, and by then it’s too late, because migrating to SSR in React is a whole other nightmare. Even if you accomplish that, you’re just shifting the inefficiency to the server, so global warming will still be looking at you.

The user can check whether an application uses CSR by asking the browser to show the page source. If the content is just a template, the user’s machine is running JavaScript to generate the HTML. Yes folks, your battery life is most likely being compromised for no good reason.

In React, you frequently have to deal with code in the middle of HTML-like React syntax. Here’s an example:

return (
  <Container maxWidth="lg">
    <Header />
    <div className={classes.root}>
      <Grid container direction="row" justify="center">
        <Typography variant="h4" className={classes.heading}>
          Create User
        </Typography>
        { sbOpen ? <NotificationSnackBar onClose={sbHandleClose} /> : "" }
      </Grid>
      { working ? <Spinner /> :
        <Grid container direction="column" alignContent="center">
          [...]
        </Grid>
      }
    </div>
  </Container>
)

How can someone call this mess progress? Back when there were no JavaScript frameworks, the code was separated into three types of files: HTML, CSS and JavaScript. The separation was clear, and even if the developer wanted to embed everything into the HTML, there were dedicated tags keeping each part detached from the others. I’m not saying it was pure joy in a colorful dream, but it was clearly a much better arrangement than what these frameworks impose. But bear with me, because we’re just scratching the surface.

I see dead packages

The JavaScript framework frenzy implies that applications will be developed with the help of third-party plugins, or as they call them, packages. Modern web developers are often so reliant on these packages that they forget the basics of math and algorithms, which is why packages such as is-odd [3] and left-pad [4] are frighteningly popular. It resembles factory workers on an assembly line: no clue how to build the parts, only how to plug them together.

To make things worse, many old limitations of JavaScript can still be seen in its latest version. For example, when parsing JSON, if there’s an invalid character, the language won’t reliably tell you its index, so chances are you won’t be able to pinpoint the offending character, especially in long inputs. Let’s add one more package to fix that, shall we?
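A minimal sketch of what developers end up doing (the helper name is hypothetical): JSON.parse throws a SyntaxError whose message format varies by engine and version, so the only “index” you can get portably is whatever you can scrape out of the message string:

```javascript
// Hypothetical helper: returns the failure position if the engine's
// error message contains one, or -1 when it doesn't (or when the
// input is valid). There is no structured position on the error object.
function whereDidJsonBreak(text) {
  try {
    JSON.parse(text)
    return -1 // parsed fine
  } catch (e) {
    const m = /position (\d+)/.exec(e.message) // V8-style message, if any
    return m ? Number(m[1]) : -1               // other engines: no luck
  }
}
```

Scraping error messages is exactly the kind of fragility that pushes people toward yet another package.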

The list of annoyances is so long that Microsoft decided to create TypeScript, a language built on top of JavaScript to overcome some of its limitations. But because browsers don’t understand TypeScript, your application needs a package to transpile it to JavaScript. And because JavaScript is compiled on the client side, the user’s browser may not support recent JavaScript features, so yet another step is needed to transpile modern JavaScript down to a legacy version so that old browsers can run the application properly.

As you can imagine, each iteration makes the code fatter. The result of this ever-growing inefficiency is that simple applications usually end up with dozens of direct dependencies, plus thousands of indirect ones, resulting in insanely large JavaScript files. All of this is usually sent to the client machine to be downloaded, parsed, compiled and executed just to generate the HTML, which is what matters in the end. It’s nuts.

In some cases, all it takes is one of these unknown package developers screwing up their code for countless applications around the world to fall apart. That’s precisely what happened with left-pad, as discussed in a previous article on how bullshit has dominated the tech industry [5].

With today’s optimization knowledge, it should be going the other way (source)

When a web developer checks out their project and calls npm install (or yarn install; oh boy, another package manager), it usually downloads a huge number of packages into the node_modules sub-folder. The catch: only the packages’ major versions are pinned in the project’s package.json file, so minor and patch versions can drift underneath you. Handle it with care because it’s all very fragile! This means all the pressure on users to keep their systems up-to-date doesn’t apply to the dependencies of web applications, unless developers manually check each dependency update every week, which is, of course, very unlikely.
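As a sketch (the package names and versions below are illustrative), this is what that pinning looks like in a typical package.json. A caret range fixes only the major version:

```json
{
  "dependencies": {
    "react": "^17.0.2",
    "left-pad": "^1.3.0"
  }
}
```

Here `^17.0.2` means “anything from 17.0.2 up to, but not including, 18.0.0”, which is exactly the kind of silent drift described above.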

And since each dependency may also have its own dependencies, you may find inside the package folder another ‘node_modules’ sub-folder, which can also have dependencies of its own, and so on. Depending on the order these things are loaded, you might end up with a broken project due to mixed versions of the same package running at the same time. Sure, there are packages to solve the problem with packages, and, of course, they cause other problems that will require other packages... It’s a recursive machine of chaos that literally makes a ‘hello world’ project require more than 40,000 files. They call this fun.

But wait, because it gets worse: since some dependencies are system-related, like Node.js, Java and yarn, updating the OS might break the application. That’s one of the reasons why Docker is so popular, as it freezes a very specific OS state into a container that happens to work for a given project. After all, when you have a problem, why not create a new tool to work around it instead of fixing the cause, right? That’s the modern web in a nutshell.

One remarkable example can be seen in GitHub’s blame page [6]. Because it implements this stupid React viewport ‘optimization’, if you use the browser’s search feature to look for anything outside the viewport, the result will be empty. What did those geniuses do? They created their own search feature, powered by React and garbage npm packages, resulting in almost 3 MB of minified JavaScript code. If only they knew that simple static HTML can easily render interactive tables with thousands of rows even on a mid-range phone.

GitHub’s blame page performance measured by Lighthouse tool in mobile mode (v128.0.6547.0)

Faster than a turtle, or maybe not

In a 2015 presentation, Tom Occhino, one of React’s core developers, said the following [7]:

People started playing around with React internally, and everyone had the same reaction: “Okay, I have no idea if this is going to be performant enough, but this is so fun, I don’t really care if it’s too slow — somebody will make it faster.”

They all knew that in JavaScript land, the more abstractions, the worse the performance. They were just hoping for someone to perform some sorcery, which, as we know, never happened and probably never will. Fast forward to 2018, when Netflix announced they had migrated their portal from React to vanilla JavaScript, cutting the JavaScript bundle size to a fifth, which resulted in 50% faster loading time [8].

In 2020, Jeremy Wagner wrote an amazing article in which he ran performance tests comparing React to vanilla JavaScript rendering a single component on the screen [9]. We’re not talking about vanilla JavaScript being twice as fast. No, no. We’re talking about 24 times faster:

Hoping for someone to make it faster, React continues to achieve garbage

Material UI, one of the most popular libraries for React, with 4 million downloads per week [10], is another example of the performance disaster in the React ecosystem. This is the total time spent on JavaScript, in mobile mode, loading a screen containing nothing but buttons (identical visual style):

React buttons vs MUI buttons (v5.15.21) — inefficiency at its best

Some people will try to mislead you with arguments like ‘no one will put thousands of buttons on a single page’, but they are missing the point: these popular React packages are extremely inefficient even when compared to pure React, which we already know is slow. Static HTML, in comparison, takes 0 ms in this test because it doesn’t need JavaScript to achieve the same result.

The increased entropy from Material UI affects not just the JavaScript process, but also the final HTML, which has twice the number of DOM elements and 10 times the file size! Again, from the user’s perspective, they look exactly the same in this test.

In React Native, if you have a FlatList component with more than 50 items with just text, performance will suck and React will splash this bullshit in your face:

VirtualizedList: You have a large list that is slow to update — make sure your renderItem function renders components that follow React performance best practices like PureComponent, shouldComponentUpdate, etc.

Now, if it knows what’s wrong, why doesn’t it just do the right thing by default? But even that question misses the larger point: it’s simply unacceptable that in 2024 people are still trying to convince us that something as simple as a list could be slow in any reasonable scenario. It gets even more surreal when we read React’s official documentation [11]:

Be aware: React code is expected to contain bugs

React has so little self-respect that it admits its own code can have bugs, and it even calls absolute positioning ‘complex’. Really?! If this doesn’t sound pathetic enough, let’s try a different approach then.

In 1997, id Software released Quake II, a 3D-accelerated game capable of delivering 60 frames per second on machines of that era. This means that every 16 ms the engine had to process hundreds of polygons, textures, dynamic lights, AI, physics, animations, multi-device input, multi-channel audio, I/O, network synchronization, the HUD, and so on. We are talking about an application that is miles and miles ahead in complexity of the vast majority of web applications we see nowadays.

Currently, we have tiny machines in our pockets that are hundreds of times faster than the computers of 1997, yet a website that only shows a static table takes several seconds just to have its JavaScript generate the HTML for the user. What went wrong, that we not only produced this crap but, worse, normalized it? Current web applications should be running at least a hundred times faster than Quake II used to run, and on machines from that era!

Developers seem to have no idea how fast HTML and CSS can be parsed and rendered. The slowness of the web is not caused by these two, unless the developer is doing something amazingly stupid. Even vanilla JavaScript can perform decently when used sparingly. For instance, it can parse a JSON file with a thousand entries and populate an HTML table with them in a tiny fraction of a second. It’s so fast that, after downloading the JSON, no asynchronous call is needed for the rest of the process.
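A rough sketch of that claim (the data and names are illustrative): parsing a thousand-entry JSON string and turning it into table markup is a handful of synchronous lines, with nothing asynchronous needed after the download:

```javascript
// Build a 1,000-entry JSON payload (stands in for a downloaded response).
const json = JSON.stringify(
  Array.from({ length: 1000 }, (_, i) => ({ id: i, name: `user${i}` }))
)

// Parse it and build the table body in one synchronous pass.
const rows = JSON.parse(json)
  .map(e => `<tr><td>${e.id}</td><td>${e.name}</td></tr>`)
  .join("")

const table = `<table><tbody>${rows}</tbody></table>`
// In a browser you would now attach it: document.body.innerHTML = table
```

On any modern machine this completes in a small fraction of a second, well inside a single frame budget.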

Most of the optimization developers have been hungry for to avoid poor web performance is only necessary because of the inefficiency of these bloated frameworks and their packages. It’s completely nuts that modern tools have normalized delivering crap by default, leaving it up to the developer to find a workaround if they aren’t happy with the result. Imagine a car manufacturer telling mechanics: make your client’s car work properly by optimizing the engine yourselves.

Oh, and good luck trying to find external help for the issues you’re facing, because for each situation there are dozens of answers that may or may not apply, depending on the magic combination of tools and versions behind the application; and there are so many combinations that chances are your case is unique. It’s not uncommon to read things like ‘answer A didn’t work for me, but answer B did’ [12].

JavaScript bundle therapy

There are countless tools to parse and bundle the JavaScript code of web applications, but Webpack is by far the most popular one. It’s no surprise it’s written in JavaScript, because why not, right? What is amazing, though, is how terribly it performs, not just in CPU time but especially memory-wise: 5 GB of RAM to bundle a React application with 150k lines of code (not including the dependencies). All this obscenity is seen as normal nowadays. No one bothers to question it anymore. We’re supposed to shut up and be grateful for this rubbish ecosystem.

What’s even more annoying is that Webpack doesn’t even accomplish its main mission decently. Instead of removing all unused JavaScript code (the so-called tree shaking), it insists on bundling useless code that will never be reached. For example, imagine class A:

import React from 'react'
import { myEnum } from './classB'

const classA = () => {
  if (myEnum)
    console.log("hello")
}

And class B:

import React from 'react'
import { CssBaseline } from '@mui/material'

export enum myEnum {
  item
}

const classB = () => {
  return (
    <div>
      <CssBaseline />
    </div>
  )
}

If the application never renders class B, and class A imports only this decoupled enum from class B, ignoring CssBaseline, we would expect the JavaScript bundler to exclude CssBaseline. But that’s not what happens! Instead, it recursively bundles all the imports of every module it touches. When you sum up all those mistakes, the result is something like this:
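One hedged mitigation, assuming you control the module layout: keep shared constants in their own dependency-free module (a hypothetical myEnum.js), so importing the enum never drags classB’s heavy UI imports into the module graph in the first place:

```javascript
// myEnum.js (hypothetical): no imports at all, so a bundler following
// this module can pull in nothing beyond the enum itself.
const myEnum = Object.freeze({ item: 0 })
// In a real module you would then export it: export { myEnum }
```

This doesn’t fix the bundler, but it shrinks the blast radius of its recursive-import behavior.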

Webpack in production mode: bundling more than 50% of unused code for our joy

But if the code is not being used, that shouldn’t be a problem, right? We have to remember that JavaScript code is usually downloaded, parsed and compiled on the client side, so depending on its size and the machine, it can result in very poor performance even after caching [13], not to mention shorter battery life. That’s the main reason why most applications ship their code minified. Just another workaround to optimize something flawed at its core.

Sometimes I wonder if web development is not just a prank. Just take a look at this yarn nonsense when you try to call yarn install [package]:

Ha!

The developers behind yarn simply decided to remove a well-established keyword used to install packages. And to make matters worse, they added a message to teach the user the ‘right’ way of doing it, which is more work than just installing the thing, since they know exactly what the user is trying to do! There’s clearly some sadism in these guys’ souls.

Orphan yarn files in /tmp because yes — congrats for the asymmetric number convention

How far the web crap can go

What about the server side then? It’s probably more sane, as it has to be efficient and reliable, correct? Well, since 2009, when Node.js was released, all kinds of servers have been created and ported to this JavaScript runtime. At this point, it couldn’t be more predictable that Node.js runs its server code in JavaScript, a language that is very limited, inefficient and clearly not designed for anything close to handling a server; there isn’t even native support for multithreading.

Let’s step back a bit and think about what’s happening here. In a statically compiled language like C/C++ or Rust, the developer compiles the application once, so users never have to. With a just-in-time (JIT) compiled language like JavaScript, not only does each user have to compile the code, but it has to be compiled again every single time they run the application. Not to mention that even after compilation, JavaScript will not perform as fast as a statically compiled language [14]. Finally, stack on top of that all the garbage code introduced by JavaScript frameworks. The magnitude of inefficiency is just insane.

OK, let’s avoid online applications as much as possible because most of them are clearly screwed. Let’s try desktop versions of the applications we need, like Postman, a tool for testing HTTP endpoints.

Postman for desktop — wait, create an account?

If you smell web in this desktop application, you’re not wrong. Postman is written using Electron, meaning it has a full Chromium browser embedded, including DevTools, libffmpeg, etc. And… Surprise! It’s written in React Native:

Why keep it simple if they can overcomplicate?

Yes, your desktop application is in reality a browser running React Native code that depends on countless packages just to generate mundane HTML. So what’s the price for all that? Well, version 11.3.2 for Linux has 22,000 files taking up 410 MB of storage, and it needs more than 700 MB of RAM to load the offending initial screen. They want us to believe that the newest curl 8.8.0 for Linux, while more powerful, with only 3 files taking up 1 MB of storage combined and consuming just 10 MB of RAM, is this light merely because it has no UI. Sure, sure…

Postman needs over 25 MB of minified JavaScript code just to show a login screen no one asked for

This Electron nightmare is everywhere, including many popular desktop applications such as Discord, Dropbox, GitHub, Teams, Spotify, Visual Studio Code, WhatsApp, etc. They all behave similarly to Postman. And in the mobile realm it’s no different, because most software companies don’t want to invest in a separate application for each platform. So if a web application also has a mobile or desktop version, chances are they are all written with some sluggish JavaScript framework and ship with an embedded browser.

In the past, a flexible programming language would be multi-platform, and the compiler would be responsible for generating a binary for each platform. But because the JavaScript compiler lives inside the browser, developers see no way out other than shipping their desktop and mobile applications with an embedded browser. By the way, browsers nowadays are huge monsters with more lines of code than the Linux kernel. It’s stupid to the point that I feel embarrassed just writing it down.

The only realm untouched by JavaScript is then the operating system, right? Not exactly. On Linux, the most popular graphical environment, GNOME, has 50% of its shell code written in JavaScript [15], and third-party extensions for GNOME have to be written entirely in JavaScript. PolicyKit, a popular Linux application that allows unprivileged processes to speak to privileged processes, also depends on a JavaScript engine to work.

Firefox v128.0 source code: more JavaScript than any other language

A glimpse of sanity

What’s the solution then? While I could argue that static HTMLs would be enough for most web applications, it’s understandable that a developer might wish for certain conveniences, like templates/components, automatic DOM update, support for third-party plugins, etc. Luckily the world still has a few sane people, and some of them decided to create SvelteKit, a lightweight multi-platform framework focused on performance that gets faster and smaller each major release. SvelteKit doesn’t impose any of the bullshit trends promoted by React, and out of the box it does what we expect from a tool: it helps developers while providing a decent experience to the users.
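For contrast, here is the earlier React state example sketched as a Svelte component (classic Svelte syntax; the `$:` reactive statement re-runs automatically whenever `variable` changes), with no hooks, no dependency arrays and no callback hell:

```svelte
<script>
  let variable = "test"
  // Reactive statement: re-runs whenever `variable` changes.
  $: if (variable === "test") console.log("hello")
</script>
```

A plain assignment is the state update, and the framework tracks the dependency for you at compile time.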

SvelteKit highlights — if React could only understand some of that…

However, if the web application is intrinsically complex, JavaScript is just not suitable, period. In that case, it’s better to use a proper programming language instead. One example is the Blazor framework, which uses WebAssembly to run C# code directly in the browser [16]. No bizarre syntax, no winding code flow, no infrastructure madness and, especially, no unpredictable performance.

Perhaps the most important thing is to avoid over-complexity. React is just one big example, but there are many other ways to add unnecessary entropy to your project. Do not assume you have to fight code for days just to implement something trivial. Start with the absolute simplest ecosystem possible and experiment. Avoid dependencies as much as you can. The fewer moving parts, the better. If things start to become too complicated, or you feel you don’t have much control over what’s happening under the hood, go back and challenge your team to think outside the box. It’s totally fine to give up on things if they’re not going in the right direction. At this point, we have many decades of human experience in the field and amazing hardware and tools, so there are no excuses.

Some people have been complaining that Medium suffers from the same issues discussed in this article, and that’s true. Medium is definitely bloated and its countless popups are annoying. I’m sorry about that, but I receive no money from these articles and am still considering switching to another platform; you know, most of them start out great and become enshittified over time.

Discussion:
Hacker News
