“Courage”, edition 2, or why Apple will release a dual-screen MacBook Pro in the next seven years

A new dimension to the Multi-Touch user interface

On September 9, 2014, Apple’s highly anticipated event showed the world the first-generation Apple Watch, the first Apple device built around pressure-sensitive Force Touch and the haptic actuator Apple calls the Taptic Engine. Since then the technology has appeared on the iPhone 6s as 3D Touch and has significantly changed the way we interact with the Multi-Touch display. But where is the next leap? How can something built into a tiny watch revolutionize laptop computers?

Multi-Touch technology hasn’t evolved significantly since the original iPhone in 2007. It has become more accurate and responsive, with better finger and palm rejection, but on the whole it has stayed the same.

One reason is that touch input was already excellent and extremely well adapted to the small screens of 2007–2013, and the technical limitations in the touch-screen world were far greater than in other areas of mobile.

With a touch screen, there was always an inevitable lack of physical interaction with the object on screen: no haptic feedback to give the user the sensation of touching a physical thing.

From iOS 1 and the original iPhone, Apple relentlessly pushed a skeuomorphic design language all the way until 2013 and the release of the “flat” iOS 7. Cupertino executives said the primary reason for the dramatic overhaul was “Retina”-grade screen quality: we no longer needed deep shadows around app icons or leather-emulating apps like Contacts. Was there another reason to ditch skeuomorphism? Undoubtedly: the development of haptic feedback and the Taptic Engine, which led to the introduction of “system haptics” in iOS 10, Apple’s term for haptic feedback across iOS.

Taptic Engine and post-skeuomorphic era of app design

Context-specific controls in applications now emulate real physical objects by providing distinct haptic sensations: video scrubbing sliders, the scroll wheels in the Timer and Clock apps, the feel of dragging the Notification Center and Control Center panels. Other phone manufacturers have tried to achieve the same thing with the traditional linear actuator found in most smartphones.

I won’t go in depth, but the key point is that a typical linear actuator needs around 10 oscillations to reach peak output, while Apple’s Taptic Engine needs only one to deliver a powerful vibration. For the end user this means very short vibrations, like the taps behind Peek and Pop in Apple’s terms, lasting only about 10 milliseconds and then stopping abruptly. That changes the user experience dramatically. On Apple Watch you can literally feel as if someone is tapping on your wrist. On iPhone 7 you get a very natural feel from 3D Touch previews of pictures, web links and message threads, and from many system-wide switches, Notification Center menus and the timer scroll wheel: the “system haptics” Apple markets. It genuinely steps up interaction with an application and imitates real-world objects, something that was buried with Scott Forstall’s iOS 6 in favor of the flat iOS 7–10 just as Taptic Engine and Force Touch development was approaching release.

Apple was able to step up multi-touch interaction, and the possibilities are yet to be fully explored by third-party developers and by Apple itself. iOS 9 was a decent start, but iOS 10 offered much deeper 3D Touch integration.
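
To make this concrete: since iOS 10, UIKit exposes feedback generator classes that let any app drive the Taptic Engine directly. Below is a minimal sketch of what that looks like. The UIImpactFeedbackGenerator and UISelectionFeedbackGenerator classes are real iOS 10 APIs; the view controller and its actions are hypothetical scaffolding for illustration.

```swift
import UIKit

// A minimal sketch of the "system haptics" APIs Apple opened to third parties in iOS 10.
// The generator classes are real UIKit; the controller and its actions are illustrative only.
class HapticsDemoViewController: UIViewController {

    // Generators talk to the Taptic Engine; prepare() spins it up to minimize latency.
    private let impact = UIImpactFeedbackGenerator(style: .medium)
    private let selection = UISelectionFeedbackGenerator()

    @objc func didTapButton() {
        impact.prepare()
        impact.impactOccurred()      // a short, crisp tap, like toggling a system switch
    }

    @objc func pickerValueChanged() {
        selection.selectionChanged() // the subtle tick used by scroll wheels such as the timer picker
    }
}
```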

iPhone 3D Touch “at scale”

How does this relate to the new MacBooks? The iPhone is, in effect, a test platform for iPads and MacBooks. The biggest challenge is making a large pressure-sensitive surface that can register different levels of pressure in small, specific areas. Apple is undoubtedly ironing out the next step in Multi-Touch evolution: an iPad with Force/3D Touch for a future release.

Now, if you can measure the capacitance changes between the cover glass and the backlight more precisely, it enables an entirely new scenario: typing on an iPad’s on-screen keyboard and actually feeling haptic feedback as you do it.

That will change dramatically with the adoption of OLED or a similar screen technology, which has no traditional backlight the way LCD does. What end users will get is a wider range of 3D Touch pressure levels as the capacitance measurements become ever more precise.
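
For context, here is roughly how an app observes those pressure levels today. UITouch’s force and maximumPossibleForce properties are real UIKit APIs available on 3D Touch hardware; the view subclass and the alpha effect are hypothetical, just to show the shape of the code. A more precise sensor stack would widen the usable resolution of the normalized value without changing this API shape.

```swift
import UIKit

// A minimal sketch of reading 3D Touch pressure. UITouch.force and
// maximumPossibleForce are real UIKit properties; the view subclass is hypothetical.
class PressureSensingView: UIView {

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, touch.maximumPossibleForce > 0 else { return }

        // Normalized pressure in 0...1; a more precise sensor stack widens the
        // usable resolution of this value rather than changing the API.
        let pressure = touch.force / touch.maximumPossibleForce
        alpha = 0.3 + 0.7 * pressure // simple visual response to how hard the user presses
    }
}
```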

Apple has big plans for the iPhone 8 involving OLED. Ming-Chi Kuo, an analyst at KGI Securities, had this to say about the iPhone 8 and the potential switch to OLED:

“The upcoming flagship device will switch from an ‘FPCB sensor to a film sensor’, offering higher sensitivity and a wider range of 3D Touch pressure levels”.

Whether or not it happens this year, I have no doubt this will drive the next generation of “system haptics”. When Apple enables an iPhone to sense even the lightest pressure while typing on the keyboard, the user will get the sensation of pressing real, physical buttons with haptic feedback on every keystroke. That will be a massive step forward.
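
A rough sketch of what such a virtual key could look like in today’s UIKit terms follows. The force reading and the impact generator are real APIs; the key view, the 0.4 threshold and the onKeyPress callback are my own assumptions, purely for illustration.

```swift
import UIKit

// A hypothetical pressure-sensitive virtual key: when the touch crosses a force
// threshold, the Taptic Engine fires a short "click". UITouch.force and
// UIImpactFeedbackGenerator are real APIs; the key view, the threshold value
// and the callback are assumptions made for this sketch.
class VirtualKeyView: UIView {

    var onKeyPress: (() -> Void)?
    private let click = UIImpactFeedbackGenerator(style: .light)
    private let pressThreshold: CGFloat = 0.4   // fraction of maximumPossibleForce
    private var isPressed = false

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        click.prepare()                          // wake the Taptic Engine to avoid lag
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, touch.maximumPossibleForce > 0 else { return }
        let pressure = touch.force / touch.maximumPossibleForce

        if pressure >= pressThreshold && !isPressed {
            isPressed = true
            click.impactOccurred()               // the "key down" sensation
            onKeyPress?()
        } else if pressure < pressThreshold {
            isPressed = false                    // re-arm once the finger eases off
        }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        isPressed = false
    }
}
```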

Android phones do this today, but because a typical linear actuator cannot deliver a powerful vibration within a couple of milliseconds, the whole experience gets watered down when you type at speed: the actuator simply lags behind your fingers.

MacBook Pro development alternatives

I wanted to focus specifically on the MacBook Pro and on iOS integration.

“On September 6, 2012, the US Patent & Trademark Office published a patent application from Apple that reveals their continuing work on virtual keyboards for both the iMac and MacBook”.

Let’s look at the 2016 MacBook Pro with Touch Bar, which runs a modified version of watchOS. It is no secret that Apple has been hard at work on a full-size virtual keyboard for more than a decade. Cupertino filed a series of patents in 2011 and 2014, with the first one going back to 2006, and has kept refining the concept ever since.

Ultimately there are three scenarios for how Apple could approach a Force Touch keyboard on future MacBook Pros.

Scenario 1. Apple introduces a MacBook Pro with a Force Touch keyboard that looks like the current Butterfly 2 design but has force sensors built into each key. These sensors, paired with a clever Taptic Engine, are likely to give a better click feel than today’s low-travel Butterfly 2 keyboard. It could also let Apple make the MacBook Pro even thinner, because keys would no longer rely on mechanical travel. More importantly, users would be able to adjust the click sensitivity and feel the way they want, and feel is exactly what has been compromised in the current Butterfly 2 design.
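
macOS already ships the haptic half of this idea for the Force Touch trackpad, which hints at how an adjustable click feel might be surfaced. The sketch below uses the real AppKit NSHapticFeedbackManager API; the ClickStrength preference and its mapping to feedback patterns are hypothetical assumptions, not anything Apple has announced for a keyboard.

```swift
import AppKit

// macOS already exposes Force Touch trackpad haptics to apps via NSHapticFeedbackManager.
// The ClickStrength preference and the trigger function are hypothetical, illustrating
// how a user-adjustable key "click" could be mapped onto the existing patterns.
enum ClickStrength {
    case light, medium, firm
}

func performSimulatedKeyClick(strength: ClickStrength) {
    // The trackpad's Taptic Engine plays one of a few fixed patterns;
    // a per-key version would presumably expose finer control than this.
    let performer = NSHapticFeedbackManager.defaultPerformer
    switch strength {
    case .light:
        performer.perform(.alignment, performanceTime: .now)
    case .medium:
        performer.perform(.generic, performanceTime: .now)
    case .firm:
        performer.perform(.levelChange, performanceTime: .now)
    }
}
```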

Scenario 2. Apple leaps straight to a MacBook Pro with a highly sensitive, iPhone 8-like display in place of the keyboard, with support for the Apple Pencil. It is entirely possible that Apple takes the wraps off such a revolutionary MacBook Pro sometime in 2023.

Scenario 3. Apple follows scenario 1 and then scenario 2. This is the most likely path given the scale of the change and how much development and transition it will involve. A keyboardless MacBook Pro is a massive departure from what Apple customers have been accustomed to for decades, and it will take time to make everybody comfortable; people have to be eased into the absence of physical keys. Force Touch sensors have to be ultra-precise on a flat surface, and haptic feedback has to be localized to a small area without vibrations bleeding into nearby keys. Apple has tried to address this in one of its patents, which describes a “haptic feedback system configured to suppress vibratory crosstalk”. The complexity of such technology is extraordinary, but ultimately it is about getting the whole experience just right. Typing on a virtual keyboard, even with advanced built-in haptics, will never fully replace physical keys, but it is a necessary transition, and Apple will get as close as possible to making it feel “real”.

We have already seen a few attempts at this route, such as the Lenovo Yoga Book, but it is a compromised device and not a typing experience anyone wants.

Haptic feedback is tough to crack on a large surface, and pressure sensing there is not yet precise enough. However, a self-emissive screen technology like OLED might make implementation easier. We have yet to see which technology replaces the trusted LCD panel: OLED, microLED or something else. The low-power microLED displays Apple gained by acquiring LuxVue Technology in 2014 may prove the better option over the next five years.

iOS and macOS, Apple Pencil built into a MacBook Pro?

Support for the Apple Pencil on a pressure-sensitive virtual display/keyboard is not a stretch to imagine. Just think of the possibilities it would open up in graphics-intensive apps like Photoshop and Illustrator, in handwritten notes, in CAD and in plenty of others. Wacom could be in trouble when this happens.
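
For a sense of what Pencil support means in code, here is roughly how a drawing app reads Pencil input on the iPad today. The force, altitudeAngle and azimuthAngle(in:) properties are real UITouch APIs; the canvas view and the logging are hypothetical placeholders for a brush engine.

```swift
import UIKit

// A minimal sketch of reading Apple Pencil input on iPad. The UITouch properties
// used here are real; the canvas view and the print statement stand in for a brush engine.
class PencilCanvasView: UIView {

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, touch.type == .pencil,
              touch.maximumPossibleForce > 0 else { return }

        let pressure = touch.force / touch.maximumPossibleForce  // 0...1 stroke weight
        let tilt = touch.altitudeAngle                           // radians above the surface
        let direction = touch.azimuthAngle(in: self)             // where the pencil points

        // A real app would feed these values into its brush engine; here we just log them.
        print("pressure: \(pressure), tilt: \(tilt), direction: \(direction)")
    }
}
```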

What also comes to mind is the right way to bring macOS and iOS together. Apple has said a merger will never happen, and I absolutely support that, but what is the perfect way to combine the two? The Touch Bar in the current MacBook Pros, running a customized version of watchOS, is a glimpse of that future: Apple’s strategy is to keep the radically different operating system experiences separate, yet build them both into a MacBook Pro that does things we could never have imagined before.

The current MacBook Pro ad illustrates this perfectly.

Apple has a great vision for the future. It really does. In an interview after the release of the MacBook Pro with Touch Bar, Apple’s chief design officer Jony Ive had this to say about the new MacBook Pro and future work:

“This was an area of combining touch and display-based inputs with a mechanical keyboard. That was the focus. We unanimously were very compelled by [the Touch Bar] as a direction, based on, one, using it, and also having the sense this is the beginning of a very interesting direction. But still just marks a beginning.”

We have a glimpse of what’s coming. Now, the big question mark for me is what this will do to the iPad. Despite iOS growing up, the iPad cannot yet be considered a true replacement for a MacBook. There are indications that new models are coming this March at an Apple event, and we might get some clues there. But for now I just hope Apple will have the courage to build an uncompromising MacBook Pro, regardless of the potential cannibalization of iOS products, because it is something only Apple can do.

Sergey Ross
