Why UX is a critical part of the stack
Whenever the words ‘User Experience (UX)’ are tossed around, developers, engineers, managers, customers, and pretty much everyone except the UX team usually imagine one thing: dark theme. Just kidding. Sort of. Usually the first thing that comes to mind is the user interface (UI), or more specifically the way the app looks. They might as well be thinking of graphic design. Pretty layouts, colour themes, rounded corners, and the logo!
If you just thought that too, well, I hope you continue reading. If you know anyone who thinks this, maybe pass them this article. And if you’re a UX designer yourself, then I’m going to try to reintroduce the field from a different angle.
The Brain as an Interface
Everything in computer science is based on the idea of interfaces as a means of connecting abstract layers. Firmware interfaces with hardware. Applications interface with operating systems. Code interfaces with code. This is the basis of the full stack, where each level gains access to, and interacts with, the levels around it. There really isn’t any kind of coding that doesn’t involve the constant use of interfaces.
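As a toy sketch of this layering (the class and method names here are invented purely for illustration), each level of a stack exposes a narrow interface that the level above consumes:

```python
# A toy sketch of stacked interfaces: each layer only talks to the
# layer below it through a small, well-defined surface.
# All names here are invented for illustration.

class Hardware:
    def read_sensor(self) -> int:
        return 42  # pretend raw sensor reading

class Firmware:
    def __init__(self, hw: Hardware):
        self.hw = hw

    def sensor_value(self) -> float:
        # Firmware's job: turn raw hardware values into calibrated ones.
        return self.hw.read_sensor() / 10.0

class Application:
    def __init__(self, fw: Firmware):
        self.fw = fw

    def display(self) -> str:
        # The application never touches Hardware directly.
        return f"Temperature: {self.fw.sensor_value():.1f} degrees"

app = Application(Firmware(Hardware()))
print(app.display())  # each call crosses exactly one interface boundary
```

The point of the sketch is that every arrow in the chain is an interface, and the chain doesn’t actually end at the screen.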
But the concept of the stack stops at the UI. Once the machine is all hooked together and there is some way of clicking buttons and typing text, all is well and good. I’ve even met some devs who stop short of the UI. They would rather live in Linux land and stay on the command line forever. And, you know what? They’re right. At least for them and their brains.
Applications need to interface with people. No matter how automated a system is, there is always a point of contact with a human being somewhere along the way. Sometimes this human has learned the holy computer language and their brain is capable of writing Bash all day long. For other devs, maybe an IDE is a better option. The computer illiterate, however, need an interface that works with their non-coding brains. When it comes to brain-computer interfaces, short of plugging into the Matrix, the best option we currently have is UX.
The Problem of Many Brains
When choosing a stack and developing for a platform, the territory is drawn very clearly and explicitly by the various compatible languages and tools. Desktop is different from mobile or web. macOS is different from Windows. Android is different from iOS. But as long as you develop for a single platform, all the target devices will work the same. Mostly.
The problem of fragmentation has been around for a while now. Android might be the first one that comes to mind lately, but Windows XP was around forever until Microsoft finally ended support for it in 2014. In the land of tech, trying to constantly keep up with new versions of operating systems, languages, libraries, etc. can lead to some teeth-grinding nights. Even Apple, with its controlled updates and closed ecosystem, still has problems with older hardware that can’t keep up.
Designing UX carries with it a similar, albeit fuzzier, problem. When identifying the core users, the number can range from dozens for smaller projects to millions at a large scale. Each person has their own brain running their own brainOS. They come from different countries, speak different languages, belong to different generations, and so on. They also all have their own unique hardware we call bodies. And to get meaningful data on them, you have to talk to them and figure out what they mean even if they aren’t entirely sure themselves. Oh, and they can lie too. It becomes an entirely different fragmentation problem.
If we threw enough time and resources at a computer-based system, it could run perfectly. The way the circuits are designed, how the electrical bits flow, and how the code works all function in a closed system with a definitive set of rules. There may be the odd environmental factor here or there that affects the hardware, but for the most part, if the hardware works as expected, the code can in theory be made bug-free. Every piece will be perfect, and every interface cleanly connected.
For the user interface, however, it is impossible.
There are so many factors and so much variability involved that there is no way of ensuring that 100% of users will use the application in exactly the right way. The best that can be done is what is usually achievable for human-based systems: the bell curve.
It’s a lot like a teacher trying to make sure all the kids pass math class. Some will fly through, easy peasy, while others will struggle to figure it out, and most will be somewhere in between. For any set of users, the hope is that most of them will be able to do most of the important things most of the time. If that can be achieved, then the UX team has done a top-tier job.
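To make the bell-curve idea concrete, here is a hedged little simulation: each user’s “fit” with an interface is modelled as a draw from a normal distribution, and we count how many clear a usability threshold. The distribution, threshold, and sample size are all invented for illustration.

```python
import random

# Model each user's "fit" with the interface as a draw from a bell
# curve, then count how many clear a usability bar.
# All numbers here are invented for illustration.
random.seed(0)

users = [random.gauss(0.0, 1.0) for _ in range(10_000)]
threshold = -1.0  # "can do most of the important things"

passing = sum(1 for fit in users if fit > threshold)
print(f"{passing / len(users):.0%} of users clear the bar")
# With a normal distribution, roughly 84% of draws sit above one
# standard deviation below the mean; some tail of users always misses.
```

The exact numbers don’t matter; the shape does. No matter where you put the bar, some slice of the tail won’t clear it, which is why “most users, most of the time” is the honest target.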
Minimum Viable UX
In tech, the MVP dominates. It’s all about coming out of the gate as fast as possible and iterating from there. I’m all for minimalism. There tends to be a lot of fluff and extra features that don’t add any real value to the product. Trimming the fat is an excellent strategy, and Occam’s razor can get the job done. What tends to happen, though, is that a little more than the fat gets cut out.
A lot of development lives in the backend. It’s the core code that allows the system to work in the first place, so it only makes sense that it sets the foundation. It makes up the heart, the organs, the meat and bones. So, yeah, it’s pretty vital. The frontend in this analogy starts to feel like the clothing we wear to make us look pretty. It also serves some practical purposes, like keeping us clean and warm. UX then becomes our personality. It’s the thing that lets us, you know, be social and interact with others. Sure, someone without a personality functions and looks alright, but you don’t really want to go out with them for dinner after work.
That’s basically what happens when an MVP is decided upon. The core functions are there with some passable UI and that’s that. The founders will then hire a bunch of full-stack developers to build it because “they can do it all”. Then off they go into startup world. Without a proper minimum UX, however, a lot of the work ends up wasted simply because the user is unable to make the most of the features available. It’s like putting all this effort into making an elaborate dinner, then forgetting to bring plates and utensils.
The Not-So-Full Stack
Some might argue that at this stage of development it’s a waste of effort to make it look pretty, and they’re completely right. It’s not about having a nice-looking UI; it’s about whether people like to use it and can use it well. There are plenty of frontend frameworks out there that make the pretty part easy (or at least easier). Yet there are still a lot of terrible UIs with pretty buttons and colours, because the UX was not considered as part of the MVP.
The title of full-stack developer is a badge of honor in the coding world. Being able to understand and work with a wide range of languages, frameworks, and tools takes a certain pride in, and dedication to, coding. Programming itself is hard, frustrating, and complicated at the best of times, so it’s understandable that adding more to the plate feels unreasonable. Yet, without including at least a minimal UX as part of the stack, the overall system is incomplete. It makes it to third base without ever making it home.
Extending the concept of the stack with UX changes the lens on the development process. Connecting the application to the user is as important as connecting the database to the logic, or the classes and modules to one another. It is called a user interface, after all. It’s time to add the user — and their brain — into the stack. Otherwise it ends up as just another broken interface.