The Paradox of Thinking Differently: Apple, Authoritarianism, and Innovation
During Apple’s special event earlier this month, Phil Schiller attributed the decision to do away with the headphone jack on the iPhone 7 and 7 Plus to the company’s “tremendous courage” — “courage to move on, do something new, that betters all of us.” The senior vice president of worldwide marketing went on to justify this choice as an innovation intended to propel consumers into the future of digital and wireless audio:
‘There’s a little bit of pain in every transition, but we can’t let that stop us from making it. If we did, we’d never make any progress at all. The question we ask ourselves when making transitions like these is, have we done all the right things to mitigate it and explain it and to make what’s on the other side so good that everyone is happy with the change? We think we’ve done that.’
Early reviews of the iPhone 7, 7 Plus, and the new AirPods have generally been positive, but a broad consensus of internet users challenges the assumption that “everyone is happy with the change.”
Back in June, Nilay Patel at The Verge derided the notion of eliminating the 3.5-millimeter TRS (tip-ring-sleeve) headphone jack as “user-hostile and stupid” and accused Apple of being “so out of ideas that actively making [smartphones] shittier and more user-hostile is the only innovation left.” More recently, Nick Statt identified Schiller’s justification as an exemplary instance of the “trademark Apple arrogance, indicative of a company culture in which doing what’s logical and consumer-friendly is often conflated with doing whatever Apple executives think is best for its own product lines and for the industry, standards be damned.” The company has referred to “courage” before in justifying its approach to controlled innovation.
As Dan Frommer points out, Steve Jobs justified prohibiting the development of an Adobe Flash plugin for iPhones and iPads back in 2010 by explaining that Apple is
‘trying to make great products for people, and so, we have at least the courage of our convictions to say ‘we don’t think this is part of what makes a great product, we’re going to leave it out.’ Some people are going to not like that, they’re going to call us names, it’s not going to be in certain companies’ vested interests that we do that, but we’re going to take the heat ’cause we want to make the best product in the world for customers. And we’re going to instead focus our energy on these technologies, which we think are in their ascendancy and we think are going to be the right technologies for customers. And, you know what? They’re paying us to make those choices. That’s what a lot of customers pay us to do, is to try to make the best products we can. And if we succeed, they’ll buy ’em; and if we don’t, they won’t. And it will all work itself out.’
It is no coincidence that the heyday of Flash in web design has ended. Apple’s courageous stubbornness often proves prescient, though not entirely by accident: the company has a track record of creating narratives that depict certain changes, exclusions, and updates as market-driven, obvious, and inevitable.
Apple has a knack for casting its decisions as universally beneficial innovations. Kaveh Waddell remarks on the company’s tendency to present technical compromises and commercial objectives as necessary evolution and its assumption of a role of innovative authority tasked with determining “which technologies can stay and which must go.” Apple’s succession stories usually draw on reason and fact, but they introduce, and then recursively rely on, assumptions about inevitable change. Ian Bogost reflects that its official narratives make it “hard to imagine” alternative outcomes or developmental trajectories, particularly when real outcomes are controlled and prevailing narratives are produced by the same self-declared authority.
Inevitable Innovation
After invoking Apple’s corporate courage, Schiller went on to declare the removal of the unpatented (royalty-free) headphone jack as an inevitability: “You’ve got to do it at some point. There are just too many reasons aligned against it sticking around any longer.” For Apple, these reasons ranged from engineering priorities — the allocation of internal space in the iPhone 7 and 7 Plus given the improved camera(s) and batteries — to marketing prerogatives including the development of AirPods and the company’s acquisition of headphone maker, Beats Electronics.
John Paczkowski at BuzzFeed News reflects that Apple has presented the future of personal audio as wireless simply by asserting that “current assumptions…are not only antiquated, but worthy of immediate abandonment” regardless of consumers’ preferences and prior investments in headphones. The company must make the case that the shift from wired to wireless, and from Bluetooth to proprietary W1 chips that sync only with Apple devices running updated operating systems, is “worthwhile.” In the meantime, 3.5-millimeter TRS jacks remain on other smartphones, including the equally waterproof (albeit explosive) Samsung Galaxy S7 and Note 7, as well as on audio devices, alarm clocks, baby monitors, and airplanes. Recent exceptions include Lenovo’s Motorola Moto Z and the Chinese LeEco Le 2, Le 2 Pro, and Le Max 2, which route audio through standard USB Type-C connectors, but the lack of controversy over these releases indicates the extent of Apple’s influence over the market.
Paczkowski observes that Apple’s hardware revisions over the last twenty years succeeded because each update “delivered value orders of magnitude greater than whatever it replaced.” The floppy disk was incapable of satisfying growing storage demands by 1998. Apple prioritized the aesthetics and portability of thin and light laptops over the versatility of CD, DVD, and rewriteable drives in 2008, given that large downloads had become routine. In spite of its dated design, the headphone jack is not inadequate for present purposes; it is, however, “just a hole filled with air” and a waste of “really valuable space” inside the iPhone 7 and 7 Plus, as pointed out by Dan Riccio — Apple’s senior vice president of hardware engineering.
Marketing guru Phil Schiller assured keynote attendees that “we’ve been through this many times before”:
‘We got rid of parallel ports, the serial bus, floppy drives, physical keyboards on phones…At some point — some point soon, I think — we’re all going to look back at the furor over the headphone jack and wonder what the big deal was.’
In addition to the lengthy history of TRS jacks — the 3.5-millimeter jack dates back to the 1960s and its predecessor, the 6.35-millimeter jack, was used as an audio connection on telephone switchboards as early as 1878 — the “big deal” is that these jacks are unpatented (royalty-free) and universal, unlike Apple’s proprietary Lightning 8-pin connector or its obsolete 30-pin predecessor.
Greg Joswiak, vice president of product marketing, points out that Apple is attempting to avoid reprising the dock connector debacle of 2012, when the company phased out the 30-pin connector, rendered hundreds of peripherals obsolete, and provided a solution only in the form of a $29 adapter. The iPhone 7 and 7 Plus ship with two transitional accessories: EarPods with a Lightning connector and a Lightning-to-3.5-millimeter adapter that does not allow for simultaneous charging and listening; an adapter that offers full functionality is sold separately for $40. Carolina Milanesi at Recode remarks that AirPods cost only $119 more than this adapter and are “an important tool to show users who are embedded in the [Apple] ecosystem the power of owning multiple devices”; for these users, a price tag of $159 is considerably less than that of their other devices, even if they misplace a few pods or pairs.
Grumbling about headphone jacks is a suitable lead-up to iPhone’s 10th birthday in 2017; after all, it is a return to the device’s roots. The first-generation iPhone (2007) didn’t have a standard headphone jack either. The recessed port was narrower than most headphone plugs and required the use of either the iconic white earbuds or an adapter. The jack was standardized on the iPhone 3G (2008) and remained unchanged until late 2016. The relatively small matter of its removal raises much larger considerations regarding the influence exerted by one corporate monolith on digital culture and the global technology market.
The “Courage” to “Think Different”
Throughout 2016, Apple has showcased the quality of user-submitted photos and videos “Shot on” the iPhone 6 and 6 Plus, a notable highlight being the compilation of submissions set to a recording of the late Maya Angelou reading “The Human Family.” The “Shot on iPhone” print, outdoor, and TV spots are likely to continue given the improved camera systems in the iPhone 7 and the dual wide-angle and telephoto camera system in the 7 Plus, which allows for portrait mode and the “bokeh effect.” Apple must also summon the courage necessary to take on the consensus that wireless headsets are socially awkward and gauche, in spite of their indisputable usefulness for multitasking or for complying with hands-free driving laws in vehicles that cannot sync with a phone.
Turning an easily misplaced $160 accessory that Geoffrey Fowler at the Wall Street Journal aptly describes as “a cross between earrings and electric toothbrush heads” into an object of consumer desire is a tall order, but Fowler is confident that if “any company has the power to make something like that seem cool, it is Apple.” The absence of the iconic white cords showcased in the “Silhouette” outdoor, print, and TV ads (2004–2008) raises the question of how discreet earbuds ideal for (in)conspicuous consumption might be glamorized. The company’s well-rehearsed lines regarding innovation will likely prove useful.
Although AirPods will surely steal the spotlight, Apple’s reinvention of personal audio will also stimulate the market for other Bluetooth and Lightning-port headphones. Apple’s Beats has designed headphones with the new plug since 2014 — a privilege recently extended to authorized affiliates in the MFi (Made for i…) program. As Apple moves into the wireless future that it has declared begins in 2016, it is worthwhile to look back at the company’s ethos of innovation.
The peculiarly prognostic “1984” ad directed by Ridley Scott aired only once on national television, during Super Bowl XVIII, and played before movie previews in early 1984. The company introduced itself to mainstream America as a colorful, youthful, and radical alternative to a bleak and totalitarian technological establishment. Apple set itself apart from its competitors and suggested that its products were intended to disrupt and overcome the old order and liberate the people. This corporate vision of resistance carried the latent possibility of a new and “different” but equally oppressive order, one that would not arise for another twenty years.
After being ousted in 1985, Steve Jobs returned with the company’s acquisition of NeXT in 1997 — the same year that saw the epochal “Think Different” campaign developed by TBWA\Chiat\Day. Without missing a beat, Apple began leading the technology market away from customized technical specifications toward color preferences and content consumption on desirable and expensive devices that are remarkably easy to use. Apple computers and devices were indisputably for the people — at a price. The company reasserted the original opposition between the brand mystique surrounding its products and the establishment culture embodied by Microsoft in the series of “Get a Mac” television ads and keynote skits from 2006 through 2009.
Chris Messina observes that Apple Computer, Inc. became Apple Inc. in 2007. The company emphasized creative professionals less and instead encouraged mainstream users to create what Tim Cook refers to as “the diaries of their lives.” Apple invited these users to consume, rather than think, differently through digital content marketplaces. Since then, the company has produced sleeker laptops with fewer hardware features and mobile devices with more, all of which are constantly connected to purchasable content.
Prior to the announcement of the iPhone 7 and 7 Plus, Apple’s share of a growing smartphone market had declined over the course of 2016. Regardless, its innovations have fundamentally shaped the market, and it is reasonable to expect that the company will continue to approach technology as a luxury consumer good. The company’s orientation toward consumption was perhaps captured best by Phil Schiller, who claimed back in March that 600 million people use PCs more than five years old and declared this (uncited) statistic to be “really sad.”
Windows authorization certifications and pesky update offers aside, Apple’s shift toward digital content and software downloads has dovetailed with the implementation of Gatekeeper — an application code-signing utility intended to block malware that also — under default settings — makes it difficult to run programs that Apple has not authorized. Although Gatekeeper settings can be adjusted, its existence underscores Apple’s evolving relationship with digital rights management (DRM).
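On macOS, Gatekeeper’s assessment policy can be inspected and adjusted from the command line with Apple’s `spctl` utility; the application path below is a hypothetical example, and the exact flags available vary by macOS version. A minimal sketch:

```shell
# Check whether Gatekeeper assessments are currently enforced
# (prints "assessments enabled" or "assessments disabled")
spctl --status

# Ask Gatekeeper whether it would permit a given app to launch,
# with a verbose explanation of the verdict
spctl --assess --type execute --verbose "/Applications/SomeApp.app"

# Approve one specific app without relaxing the policy globally
sudo spctl --add --label "trusted-example" "/Applications/SomeApp.app"
```

In the System Preferences interface, the same policy surfaces as the “Allow apps downloaded from” options in Security & Privacy, which is where most users encounter Gatekeeper’s defaults.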
Steve Jobs dismissed and looked beyond DRM in his 2007 open letter “Thoughts on Music” but Nilay Patel and other commentators have suggested the potential for DRM or the rejection of unauthorized peripherals through the digital Lightning port. Schiller maintained that the shift to the Lightning connector or wireless options “has nothing to do with content management or DRM — that’s pure, paranoid conspiracy theory,” but it is certainly not unprecedented. Even as Apple has restricted user freedom in the name of protection and innovation and placed conditions on content ownership, the company has publicly dedicated itself to the protection of digital privacy rights.
Image Wars
The cult phenomenon of Apple passed prior to 2011 and, although the company has had a few outstanding ads since — see “The Human Family” — it needed a new promotional tack to maintain a trusted brand. In early 2016 the company was issued a court order under the All Writs Act of 1789 to create a version of iOS capable of disabling security features on an employee iPhone issued to Syed Rizwan Farook — one of the San Bernardino shooters — by his employer, San Bernardino County. Apple cooperated with the government until it was revealed that the FBI had ordered county officials to reset the password on the Apple ID and iCloud account associated with the phone. After the reset, no automatic iCloud backups could be obtained without either entering the passcode or hacking into the phone. Apple refused to compromise user security by creating a backdoor for the FBI, and on February 16, 2016, Tim Cook published an open letter explaining the implications of the government’s demand and the rationale for the company’s refusal.
In “A Message to Our Customers” Cook clarified that the government was demanding that Apple “take an unprecedented step which threatens the security of our customers,” the “implications” of which extended “far beyond the legal case at hand.” Apple had “no sympathy for terrorists,” but they also did not have “a backdoor to the iPhone” and considered this feature “too dangerous to create.” This exploit would function much like a physical “master key” and even though the government claimed that its use would be “limited” to the present case, Cook maintained that there was “no way to guarantee such control.” Apple’s stance appears to be the opposite of authoritarian — indeed, the company cast itself in the role of the defender of freedom and privacy. Yet, as in the 1984 ad, Apple relied on resistance to the specter of institutional authoritarianism to make a case for its own independent character. This meant protecting proprietary encryption and the rights of end-users even when the user in question was a deceased terrorist.
Cook invoked the “chilling” implications of the availability of such an exploit: the government could “reach into anyone’s device to capture their data” and would have a precedent for demanding that “Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone and camera without your knowledge.” Apple resisted this “overreach by the U.S. government” while declaring that it felt the “deepest respect for American democracy and a love of our country.” The company’s position attracted amicus briefs from legal experts, civil and digital rights organizations, and many technology companies.
On February 18, 2016 the editorial board of the Washington Post — a paper acquired in 2013 by Amazon, one of the companies that supported Apple — declared that “Apple should not be forced by the government to decrypt users’ data.” The “political,” as opposed to judicial, “branches of government” were responsible for determining the limits of “device security and law enforcement access” for counterterrorism purposes. In the meantime, the conflict raised a series of pertinent questions:
To what extent is it reasonable to force companies to write new code and harm their international reputation for data security — and, therefore, their business models — in order to help the U.S. government hack into suspects’ phones? Should this be a routine investigative tool, or reserved for extraordinary situations, or beyond the pale? Farook’s is an extreme case, but it is easy to foresee the government attempting to apply All Writs to less important investigations. What sorts of software can the government compel tech companies to write?
The editors concluded that Apple’s “role as a leading exponent of data security” carried “special responsibilities” in the context of democratic nations seeking to prevent terrorism and regimes in which “dictators would use anti-terrorism tools to crack down on dissenters.” They expressed hope that Apple would “fight as hard to safeguard its users’ privacy from authoritarian abuse.”
On February 25th, David Ignatius, an opinion writer at the Post, pointed out that Apple asserted the precedence of a “private company and the interests of its consumers…over the public’s interest as expressed by our courts.” USA Today columnist Michael Wolff inquired “Who does Apple think it is?” and proceeded to summarize the situation, imprecisely, as one in which a tech company and “an array of left-wing supporters” defended the rights of a deceased terrorist. Apple and its “liberal” and “left-wing” supporters refused to recognize the necessity of undermining encryption to fight terrorism. Tim Cook’s argument was pure “agitprop” identifying the government as the “enemy, even the operative villain, in modern life, perfidiously or mindlessly intent on taking away the privacy of its citizens.”
By April 4, Wolff concluded that the issue boiled down to public relations on both sides. The government had accused Apple of “PR grandstanding,” but it was certainly convenient that the FBI accessed Farook’s data without the company’s compliance days after the bombings in Brussels. The government “could not let Apple win” this skirmish over image; “[t]herefore, it announced victory for itself.” Apple forced the government to exploit encryption and publicly undermine user privacy. The “real point” involved neither the “hunt for terrorists nor the sanctity of encryption” but branding, prerogatives, and principles.
Apple’s legal conflict crosscut national security and revealed a fundamental disjuncture between private (corporate and consumer) and public (institutional) interests. The official stance assumed by the company on innovation and users’ rights is of a piece with their approach to technological history. Regardless of whether the public is unaware of the implications of encryption or less vexed technical topics, Apple claims to be qualified to determine what is best — at least for the company and its customers — regardless of institutional practices or public opinion.
“Free to be the greatest”
Pop songwriter and reluctant star Sia premiered “The Greatest” at last week’s keynote address with a live dance performance. The video for the single is intended to express solidarity with the victims of the Pulse nightclub shooting on June 12 but the ode to empowerment readily applies to Apple’s present situation. Nearly a decade after the first iPhone was released and five years after the death of Steve Jobs, Apple has still “got stamina.” Next year the company will open the new $5 billion, Pentagon-sized Apple Campus 2 — also known as the “spaceship” — in its hometown of Cupertino, California.
Whatever the Next Big Thing is — whether it is made by Apple, acquired by Apple, or comes from another venture altogether — some commentators, including Austin Frank on HackerNoon, suggest that Apple is unlikely to be “truly innovative” in the future because it has come to “value preservation over expansion.” Although “millions of us fell in love with Apple for improving our lives and transforming the tech world before our eyes,” Frank argues that “those days are gone.” It is impossible to disregard Apple’s influence on the meaning of innovation in the twenty-first century even if the notion of thinking differently is merely a veneer over increasingly regimented possibilities.