Apple to developers: There is no escape.
Quincy Larson

This kind of thing has been Apple’s MO since 1982.

What I blame Steve Jobs for, more than anything else in the industry, is a strange and pervasive false minimalism where power and flexibility are removed in favor of one-off features. This starts with the original Macintosh.

The Macintosh team originally worked under Jef Raskin on what was then slated to be the Macintosh but eventually became the Swyft, and later the Canon Cat. It was a genuinely innovative design: a device that worked primarily as a word processor, but that could perform programming operations in place, in the context of a document, like a combination of literate programming and spreadsheet macros. Steve Jobs, having become obsessed with the Alto he saw demoed at PARC, had started the Lisa project. When it became clear that the Lisa was going to be a flop (it was underpowered, all but unusable without an aftermarket hard disk that cost as much as the computer itself, had no applications to speak of, and cost as much as a new car), Jobs got rid of Raskin and took over the Macintosh project, pushing it away from the original concept and toward being a lower-budget version of the Lisa. As a result, he pushed for lower specs: a last-generation CPU, not very much RAM, no support for TV output or external monitors, and a small monochrome (not even greyscale) display. Within a year of its debut it was competing with the Commodore Amiga 1000 and the Atari ST, both of which had double the RAM, impressive high-resolution color output on standard televisions, multitasking, and CPUs at least as fast, at half or a quarter of the price. Somehow, the Macintosh sold better.

During development, Jobs was very wary of the ‘mistakes’ made with the Lisa, and so he forbade the developers from adding new features. Famously, he didn’t allow the hardware team to add any expansion ports; someone on the hardware team secretly added one to the board design anyway, and the original Macintosh shipped with a board laid out so that an intrepid user could solder a connector onto the motherboard and get an expansion port (which couldn’t actually be used, because the OS had no support for it). Removing expansion ports is typical of Jobs’ attitude here, and typical of his later work when he returned to Apple. (He was forced to resign over the failure of both the Lisa and the Macintosh, and took most of the Macintosh team over to NeXT, which somehow doesn’t seem to have this particular problem!) His idea was that the Macintosh was a computer for “regular people”, whom he defined as people who would not only be incapable of using an expansion port but incapable of learning what one is. In the end, a great deal of the Macintosh’s cost went into fancy beveling and other purely aesthetic aspects of the outer case, advertising, and other marketing concerns. The lesson was: a mediocre product, if it looks pretty and is advertised very well, can become the basis of a successful brand.

The lessons of the original Macintosh became a strange kind of warped “worse is better” ideology that affected Apple only occasionally during Jobs’ 12-year exile and almost constantly after his return. As soon as Jobs returned, he cancelled most ongoing projects, shifted focus from the pretty serviceable beige boxes that represented Macs in the mid-90s to the technicolor form-over-function blob of the first-generation iMac, and ordered that floppy drives be excluded from all future devices (in 1998). Dropping the floppy drive probably cut some cost, but margins on Apple products have always been huge in order to allow for big advertising budgets: owning Apple products, even in the early 90s, was as much a conspicuous display of wealth as a matter of utility (outside of schools, which got deep discounts on Apple products as another marketing technique). With software developed under Jobs’ reign (like iMovie & iTunes), even the limited flexibility and configurability of earlier Mac products mostly went away or became more limited: as part of the marketing push, Apple products were configurable only in limited ways pre-screened to be in line with the style approved by Apple’s designers, and Apple software was designed to work primarily with Apple-proprietary formats even when open formats existed and were superior (mirroring Microsoft’s embrace-extend-extinguish strategy from the same period). The unofficial slogan for Apple was “it just works”, with the hidden caveat that “it” was limited to a small set of tasks Apple had chosen to focus on — while Apple made it easy to do the very specific things they wanted people to do, doing anything else was much harder than on competing platforms. This kind of shallow marketing-first strategy was very successful, even though most people more or less recognized it for what it was.

This strategy continued (even though the UNIX base inherited from NeXT made it difficult to lock down software extensibility on Macs, and systems like Homebrew became common). The original iPhone wasn’t supposed to have third-party apps at all; the App Store was added later, but it existed more as a way to funnel money into Apple than as a way to genuinely open the platform to third parties: Apple retained complete veto rights over apps, had long evaluation periods, required third-party developers to pay $100 a year just to be allowed to submit their code, and forced those developers to write everything in a language that, while not strictly Apple-specific, is an obscure early C++ competitor that only ever achieved any kind of traction at NeXT and later within Apple, for compatibility reasons. Apple used this mechanism to remove apps that criticized Apple, apps that competed with Apple’s own services, apps that violated Apple’s current design guidelines, and apps whose reviewer happened to be in a bad mood that day. The iPhone draws no distinction between the desktop and an application drawer, and originally had no task-switching capability. The iPod, unlike the MP3 players it initially competed with, only worked with Macs, only worked with iTunes, only connected over FireWire, and had extremely limited controls and configurability; while some of these limitations were lifted later, that was done only as a means of bringing this overpriced luxury object to the masses of people who still ran Windows.
Apple laptops started the push toward being thin and light, and removed useful things like disk drives and expansion ports on the excuse that thinness and lightness were legitimate goals in themselves. Of course, while laptops benefit from not being heavy, past a certain point being functional matters more than being lighter, and being thin only means the device is easier to break. Apple had essentially invented an excuse for charging more money the less they actually put into the device: whenever they removed a feature or a piece of hardware, they could add both the cost of that hardware and the manufactured perceived value of thinness to the price of the end product. More recently, Apple has removed both the function keys and the headphone jack as part of this general push toward smaller and lighter devices.

Now, this would be one thing if Apple could be ignored. My complaint isn’t just that Apple has the kind of abhorrent paternalistic attitude toward its users that used to justify the Scramble for Africa; after all, individual people and individual companies have all sorts of abhorrent attitudes and ideologies, and they largely don’t affect me directly because I can avoid putting money into those organizations. My big complaint is that other hardware and software companies, faced with Apple’s success, have tried to emulate their objectively awful hardware and software design decisions (thus perpetuating this strange design sense that users are children who need to be protected from choosing ugly color schemes by Big Daddy Ive), never realizing that the key to Apple’s success in everything since the end of the Apple II line was to produce mediocre products, sell them for many times their actual value, and spend enormous amounts of money on ad campaigns to convince people that mediocrity is amazing and that even really common things were secretly invented by Apple. In other words, Apple, since the early 80s (but especially since 1997), has been the Kim Jong Il of the tech industry, and their ideas are being gobbled up by lots of people who really should know better.

The existence of function keys on a new-model laptop is really a non-issue in the grand scheme of things. Apple has been torpedoing third-party development in much bigger ways for a long time. Function keys are only used for development by users of IDEs, and the intersection of IDE-using developers who also use brand-new Apple laptops is a group whose situation and opinions shouldn’t matter much. These people are already shooting themselves in the foot by paying four or five times what they should for an ultimately less functional machine than what they’d get if they bought a used Thinkpad on eBay and stuck Linux on it; it’s not the end of the world if they also have to use a crappy mouse to navigate ill-designed Xcode menus sometimes. Maybe these people are developing on a Mac because they are trying to develop for the iPhone; in that case, their slower rate of development may hasten the death of that device as well, which would probably be a net good.

Like a lot of people, I felt hopeful when Steve Jobs died. Lots of legitimately smart people work at Apple; maybe one of them would take charge and reverse some of the most damaging policies, the way many of Microsoft’s worst policies changed after Ballmer got ousted. I expected that once the handful of things still in the pipeline from the Jobs era got pushed through, we’d start to see some good decisions: laptops with thick protective cases, metal hinges, and locks to keep them closed; three-button mice; hardware that’s actually up to date; machines that ship with Homebrew already installed; iPhones you can run Android on. But I’ve lost that hope: I now suspect that Apple, like RIM and Oracle, will keep to its current course until it finally screws itself over enough to actually die — and, like Atari and IMSAI before it, will probably become a free-floating brand slapped on arbitrary hardware by whoever buys the rights.
