Impact of the M1 on Computing

Scott Francis
An Idea (by Ingenious Piece)
7 min read · Apr 25, 2022

When the M1 was introduced in the MacBook Air, it was a revelation. But it was a new platform, new chipset, new drivers, and an 8GB limit on RAM (or was it 16?). So I had to be content to wait for an M1 MacBook Pro before contemplating switching to an M1 machine.

But I can hardly blame Apple. They hit their highest-volume laptops (and desktops) in one shot with the M1, constraints and all.

The M1 Experience as a User

Using the M1 MacBook Pro in practice has been kind of amazing. It is ridiculously fast. I’ve gotten used to the idea that I won’t notice changes in chip speeds when I upgrade hardware: the changes will affect some tasks, but nothing I’ll notice just, you know, browsing.

But with the M1, I noticed it right away when I opened a Word file from a Mail attachment and it felt like it opened almost before I finished double-clicking. Okay, silly example. But Word wasn’t even running as a native binary yet; it was still being translated through Rosetta 2. So I was impressed. Excel is faster too. Everything feels faster.

Better still, the battery is insanely good. I have used this laptop for hours of work on a plane and still had 75% of my battery left — and sat in a lobby waiting for my hotel room for 3 more hours of work, still north of 50% battery life. I can sit on a Zoom for hours outside on our office patio — not worrying about plugging in. Maybe that doesn’t sound like a lot? But in the Intel Mac era, I’d never have tried to do a call without plugging in — because you just never knew when your battery would suddenly plummet to zero. And you might do some work on the plane — but 3–4 hours of work was absolutely the maximum before you were shutting down with 10% battery. I used to watch movies and/or work on my iPad because of its insanely good battery life… and this M1 MacBook Pro exceeds it.

Maybe no one needs this much battery while working from home. Maybe everyone will just get M1 Minis instead — or something non-portable. But with the M1 there just doesn’t seem to be a tradeoff between your portable compute and your stationary compute — unless you’re getting a Mac Studio at a significantly higher price point.

I have the 16" M1 MacBook Pro. I love it. The only knock on it for me is the weight — which might be counterbalanced by not having to carry my power supply with me wherever I go. My previous Intel MacBook Pro was lighter. And for that reason, I can see the 14" M1 being the sweet spot of the “Pro” lineup: all the same performance, a lot less weight in the bag, and most of the battery life you’re looking for. But we’ll see! That 16" screen sure is nice.

I can also imagine these machines being game changers for students — who could expect to go a whole day at school or in class without plugging in. When the battery life is so good that you don’t have to think about it, it really does change how you use the device.

Meanwhile, you’ll find me on a lot more Zoom calls from the patio, or sitting on the couch or a comfortable chair without a plug because… why not? There’s no rush.

Nerd out on Chips

The TL;DR is that the chip designs are awesome and represent the culmination of decades of evolution in design. There are some great resources linked in this section to better understand that.

The M1 is also just fascinating from a chip-nerd point of view. When I was in college, the debate between RISC and CISC instruction sets was running hot. In theory, it was clear that RISC instruction sets would be superior, all things being equal. But for the next 10 years or more, Intel just powered through process and technology improvements that postponed the day when RISC would outperform CISC — seemingly indefinitely.

And then ARM — and later Apple’s A-series chips — came along and showed that RISC’s advantage in power consumption was real… And around the same time, TSMC started lapping Intel and Samsung in process technology… and while artificial benchmarks showed Apple’s A-series chips could compete with Intel chips, you never knew for sure how they would perform running a full-blown desktop operating system… until the M1 MacBook Airs came out.

Apple’s experience with systems on a chip (SoCs) for their phones might have led them to a different design paradigm for the Mac as well — an SoC that included all of the clever innovations that Apple had devised for the iPhone, all on one die. But why not go further, with more GPU cores than a phone needs? Why not go further still, by putting the RAM into that same package as well?

That’s just what Apple did. And then they took advantage of the chip architecture to share “unified memory” between the GPU and CPU cores — eliminating a major bottleneck in more traditional chipset designs. Perhaps the only drawback is that more RAM means a bigger package, with lower yields or lower volumes? That doesn’t seem to be holding Apple back.
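To make “unified memory” a little more concrete, here’s a minimal sketch in Swift using Apple’s Metal API (my own illustration, not Apple sample code): when a buffer is created with shared storage, the CPU and GPU work against the same physical memory, so the explicit host-to-device copy that a discrete-GPU design requires simply isn’t there.

```swift
import Metal

// Minimal sketch: on Apple silicon, a .storageModeShared buffer lives in
// the same unified memory the CPU uses, so the GPU can read it in place.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device available")
}

// Some CPU-side data we want the GPU to see.
var input: [Float] = (0..<1024).map { Float($0) }

// No blit pass, no bus transfer: the buffer is directly visible to both
// the CPU and the GPU.
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

// After GPU work completes, the CPU reads results back with no copy either.
let contents = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
print(contents[0], contents[1023])  // 0.0 1023.0
```

On a machine with a discrete GPU, the equivalent flow needs a private buffer plus a blit step to move the bytes across the bus; that copy is exactly the bottleneck unified memory removes.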

Jean-Louis Gassée addresses the announcement of the desktop-class M1 chips:

In essence, Srouji says, in the M1 Ultra, the two M1 chips are joined without paying much of the penalty associated with connecting two processors together: power consumption, bandwidth and memory management. […] In “fusing” the two M1 Max chips, Srouji and his team come up with a never-before [seen] 64-core GPU

And of course for anyone who really wants to geek out on the specs, AnandTech is still the place to go.

Now we have a chipset that powers the leading phones; an upgrade of that same design that powers the leading tablets and laptops; and, as of this spring, a further upgrade that supports high-end desktop applications. That third step was uncertain when the M1 rolled out… there were those who doubted whether the M1 could sufficiently power a pro-level laptop, which Apple subsequently proved out with the MacBook Pros. I’m sure I’m not the only one who was surprised that Apple solved the desktop-class compute problem by effectively fusing two M1 Max chips, doubling all of the CPU and GPU core counts while sharing a unified memory architecture.

Looking Ahead

From my perspective, Apple is now producing a desktop-class machine that is beyond my needs! I have more than enough power in my laptop. I don’t need that performance — but I can get a version of that same chip that offers me all the battery life I need in a portable (laptop) format. It’s a wonderful thing.

The open question for me now is whether I’ll ever upgrade my iPad — because my battery life on this machine is so good, I’ll never be reaching for the iPad “because my laptop battery is dead”. And I can watch all the movies I want on a bigger laptop screen, or get work or news reading done… I really don’t have to choose between devices based on the task I’m doing.

(I’m sure that some future release of an iPad will make me question why I would ever use my laptop again, but let’s save that for a future discussion!)

Now Apple’s future is a question of how fast it can iterate designs, how fast TSMC (its main chip fabricator) can improve its process technology, and what new form factors Apple can support as these chips evolve.

Will Apple produce machines à la the Tensorbook that are focused on machine learning? Or is an Apple machine with an M1 chipset already tuned enough for that use case? Will Apple produce products that are more finely tuned to specific industries or use cases? There’s historical data to support whichever conclusion you prefer, but I think the recent focus on “workflows” by Apple’s design and product teams hints that they may build lower-volume machines targeting more specific uses of Apple products.

If there’s one thing we should have learned from Apple’s approach to the market, it’s that their desired products drive chip design — and improvements in chip design drive their ability to produce new product offerings and new form factors. I expect to see a lot of innovation around these chips in the next decade.

One more thing: there’s a great article on the oral history of the ‘Get a Mac’ campaign. If you’re not familiar, this was a blistering (but funny) campaign by Apple to highlight both the miseries of using a Windows machine at the time and the benefits of using a Mac. Obviously, no journalistic integrity was required, as it was comedy and marketing, not journalism, so no deficiencies of the Mac were discussed.

It’s a fascinating read. At the time, it felt to me like these commercials laid bare a set of inconveniences that made you wonder why we all put up with them. I was a Windows user for 13 years, from 1994 to 2007, and each of the critiques hit home, and hit hard. And moving to the Mac eliminated so many of these inconveniences for me and for many others. The ads were cheeky — befitting an upstart, which Apple was at the time. Most of these ads aired in the early days of the iPhone, before Apple became the juggernaut.

To mark the 10-year anniversary, Campaign US asked members of the creative team, the crew and the actors to share the untold stories of how the campaign came to life. What follows is their recollections — inconsistencies, errors, biases and all — lightly edited and condensed for clarity.

It is a fun read on how a project like this can come together.

Originally published at https://sfrancisatx.substack.com on April 25, 2022.
