No one cares about it. Why should they? As developers, we wrote mainframe code with dumb green screens. We went to client/server because response times were unpredictable and interactivity was lousy. We found client/server was too complex (read: expensive) both to write and to deploy. We put out new software on new mainframes with cheaper hardware and dumb colour screens (web browsers), except the screens are more unpredictable and development/test systems no longer match production (because matching them would be more expensive, and would require the skilled sysadmins who were let go when we dropped the client/server model). We found that, aside from pretty colours and fonts, web applications had the same unpredictable response times and lousy interactivity as green-screen applications did, and were less reliable on top of that. We brought in AJAX, node.js and other techniques to redo client/server in a way that solves the deployment issue (so companies don’t have to hire decent sysadmins) but makes the software even more complex to write: the environment is less predictable, the constraints are tighter and less predictable, and the language sucks, incorporating nearly every error of language design from all other languages put together. It is particularly susceptible to bugs triggered by unpredictable environmental constraints, and on top of all that it has no decent tooling.
Yeah, software sucks. Do we know how to fix it? Of course we do: when shit “has to work”, it isn’t written that way. So why don’t we? Because writing software that way is expensive, so companies won’t push for it; and it isn’t “cool”, it’s not “the newest thing” (that simultaneously has to be exactly like the oldest thing), so developers won’t push for it either.