Apple and their dark horse called Silicon Engineering

Shashank M · Published in Mac O’Clock · 5 min read · Jun 22, 2020

Today's WWDC, while virtual and accessible to almost anyone with an internet connection, also brought some of the biggest changes in Apple's history and product line. Today's announcements are a much bigger deal than the new iPhone or new iPad events generally held just before the holiday season. Why, you may ask? Fair point. After all, the juggernaut called Apple still relies heavily on hardware sales, mostly the iPhone, to fill its coffers, and that reliance is far greater than its peers'. Yet with smartphones becoming increasingly commoditised and the global smartphone market getting saturated, you can only move so many iPhones every year.

This brings us back to today's event and the announcements made. For the first time in a decade and a half, Apple revealed a decision to change the Mac's processor architecture, this time switching to its own chips for its line of MacBooks. That, ladies and gentlemen, is a big deal! While the Mac has never been a volume business for Apple, the people who create stuff, developers and creative professionals, overwhelmingly use Macs to build content and applications for the wider public. Almost a decade back, Apple made its debut in the high-stakes silicon race with the A4 chip powering the then fresh-off-the-line iPhone 4. That chip allowed Apple to craft experiences such as shipping a camera whose quality was far ahead of its rivals, shipping Siri a year later, and, a couple of years on, introducing mobile machine learning, all on the device itself. This approach of processing data on the user's device rather than in the cloud gave Apple several key advantages over rivals who still needed a remote server for many operations. Advantages like:

  1. Increased privacy for end consumers. In a world that was becoming increasingly digitised, with big tech companies handing out data like candy on Halloween night, the backlash against such practices picked up enough steam to actually matter to your average Joe.
  2. With a device that could run machine learning algorithms onboard, you could do a lot of additional work on the device itself: processing the image data from the camera sensor, handling contextual actions and suggestions, and more (see the sketch after this list).
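
To make that concrete, here is a minimal Swift sketch of on-device image classification using Apple's Core ML and Vision frameworks. The "SceneClassifier" model name is a hypothetical placeholder for whatever image-classification model an app bundles; the point is that the photo and the prediction never leave the device.

```swift
import CoreML
import UIKit
import Vision

// Minimal sketch: classify an image entirely on the device.
// "SceneClassifier" is a hypothetical bundled model (compiled to .mlmodelc);
// swap in any image-classification model.
func classify(_ image: UIImage) {
    guard let cgImage = image.cgImage,
          let modelURL = Bundle.main.url(forResource: "SceneClassifier",
                                         withExtension: "mlmodelc"),
          let coreMLModel = try? MLModel(contentsOf: modelURL),
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else { return }

    // The request runs locally, on the Neural Engine or GPU when available;
    // no image data is sent to a remote server.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        print("Top label: \(top.identifier) (\(Int(top.confidence * 100))%)")
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

The same pattern sits behind features like searching a photo library by object or surfacing contextual suggestions: the model ships with the OS or the app, and the silicon is fast enough to run it without a round trip to the cloud.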

These advancements in silicon engineering let Apple develop custom software that ran like butter on its hardware, delivering richer experiences to customers while eliminating integration headaches with various vendors and worries about compatibility across a multitude of devices.

Apple is no stranger to low-level hardware changes. Each of the past few decades has seen at least one big architectural change in the Apple world: from PowerPC to Intel, and now to its own chips. However, unlike past decades, when multiple vendors shipped multiple parts and a dozen fragmented operating systems ran across devices, the world has since been reduced to a handful of vendors. With devices across the spectrum converging, sharing functionality and allowing interoperability, it has become even more important for companies like Apple to stake a claim in the connected world. By owning one of the only two major mobile operating systems, and with the tablet and wearable markets virtually to itself, Apple is in a fantastic position to leverage this change in market dynamics.

The quest for interoperability across a broad spectrum of devices also leads to a scramble to ensure maximum tie-ins and shared infrastructure between connected devices. While the hardware can vary greatly, for the average consumer the hooks to integrate and control these new-age devices fall back to the smartphone. Smartphones have become the de facto remotes for the connected home, and wearables such as the Apple Watch are only further cementing this paradigm shift.

All of these points converge on one opinion: whoever controls and enables the superior handling of these connected devices, with seamless connectivity for daily life from ordering food to unlocking that cool new e-bike, may eventually dominate the consumer market. To make that vision come true, you need class-leading performance in devices that can do complex tasks locally rather than requiring the internet to achieve the same. And to power these experiences, you need a compute engine that can handle those responsibilities without breaking a sweat. This is where the investment Apple has made in beefing up its hardware team has paid off handsomely over the past few years.

The MacBook was the last holdout in the Apple stable that still relied on a third-party vendor at the heart of the system, unlike the Apple Watch, the iPhone or the AirPods. By switching to an ARM-based core architecture and ditching Intel, Apple finally has the flexibility and freedom to pull off the same crazy amounts of integration magic it has achieved with its mobile devices. This should also sound a warning for Intel, which has failed to innovate at the level needed to power newer-age devices and has been resting on its laurels from past triumphs in the PC market.
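
For developers, the most visible face of this switch is that Mac apps now target a different instruction set, typically shipped as a universal binary containing both slices. A small illustrative sketch in Swift (not Apple's migration tooling) shows how the same source adapts at compile time:

```swift
// Illustrative only: Swift's built-in architecture conditions let one codebase
// build for both Intel and Apple Silicon Macs; each slice of a universal
// binary resolves this check at compile time.
func currentArchitecture() -> String {
    #if arch(arm64)
    return "arm64 (Apple Silicon)"
    #elseif arch(x86_64)
    return "x86_64 (Intel)"
    #else
    return "unknown"
    #endif
}

print("Running natively on \(currentArchitecture())")
```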

Will this move push other manufacturers who rely on Intel down the same route? Is ARM the future? Only time will tell. For the average consumer, however, it does mean a significantly more powerful device that lets them accomplish far more tasks, for far longer, at a much faster pace.

Feedback? Need some more insights into core engineering?
Connect with me on LinkedIn || Twitter
