What the Touch Bar will be.

Yogev Ahuvia
12 min read · Jan 14, 2017


Apple’s MacBook Pro with Touch Bar (image: Apple)

I won’t forget how, when the first iPhone was announced, a friend of mine told me there was no chance it would sell, because people wouldn’t want to let go of their physical keyboards. (The first iPhone arrived right at the peak of an era when every smartphone had a full keyboard taking up a good half of the front of the device.)

My friend backed his opinion with facts and examples. “When you type a message while driving and you need to keep your eyes on the road, you have to feel the physical keys to be able to type without looking at your phone,” he said. Shocked, I replied: “Let’s put aside the fact that you shouldn’t use your phone while driving (!), don’t you think twice the screen real estate is worth your occasional impulsive texting-while-driving?!”

He insisted. “The iPhone would never work as a mainstream device. Touch screens lack tactile feedback.” There he got my attention. It’s true! Feedback must be a feature of anything we (humans) interact with. To our brains, no feedback means nothing actually happened.

Before we continue, a short disclaimer for the article ahead: I mention Apple, the iPhone and the new Touch Bar a lot in what follows, but I’m not really referring to the branded devices or technologies, or to Apple specifically over any other brand. I’m talking about the technology. In this case, Apple was the one to introduce the Touch Bar, so that’s what I’m discussing.

Let’s dive for a moment into that whole feedback saga. In the act of pressing a key with your finger, the brain actually initiates two commands, and for each of them it expects to receive feedback: (A) send the finger to find the desired key; (B) command the finger to press that key.

(A) When searching for a key with a finger on a physical keyboard, your sense of touch feels each key’s recognizable physical properties and uses them, time after time, to tell the keys apart and decide which one to press. The ridges, the flat surfaces, the shapes, the orientation bumps: all of it is tactile feedback the brain receives when it sends a finger to find a desired key.

(B) After the desired key is found, pressing it triggers a similar sequence, except that in this case the brain asks the finger to push the key down and waits to feel it hit its lowest point and then rise back up.

Physical keyboards (physical things in general, really) give us feedback perfectly and naturally, intuitively even. Our brains process physical feedback much better than digital, artificial feedback.

I’ve now spent two weeks with Apple’s new Touch Bar equipped MacBook Pro. I could write endlessly about its various fancy features: the glorious almost-edge-to-edge screen, the futuristic new keyboard, the gigantic trackpad, the single standard connector (USB-C) for everything, the build quality and unbelievable thinness. But that’s not why we’ve come this far in the article.

What the Touch Bar is.

Apple released a new MacBook Pro line in late 2016 featuring a new OLED screen above the keyboard called the Touch Bar. It shows contextual buttons relevant to the app currently in the foreground. The design is slick, and it almost looks like part of the keyboard (well, it actually is…). In a way, the Touch Bar is the context menu of the future: instead of right-clicking to open it, it just sits there on a fancy secondary screen below your main one.

By default, the Touch Bar shows only the collapsed Control Strip, which in that state offers just a few buttons for controlling volume and screen brightness, plus a button to summon Siri. In addition, as a replacement for the hardware Escape key that Apple removed in favor of the Touch Bar, there’s a software Escape button aligned to the left end of the bar.

That’s it. No function keys, no music control keys, no keys to toggle Launchpad or Mission Control.

Touching the left-pointing arrow expands the Control Strip to its full form, with controls for the keyboard backlight, media playback, and the familiar Launchpad and Mission Control.

I haven’t said it explicitly until now: the Touch Bar replaces the function key row that has ruled the top of basically every keyboard since keyboards began. For those who skipped a beat at the removal of their beloved function keys, worry not: holding the physical fn key on the Mac’s keyboard instantly brings back the old-fashioned function row.

I’d like to stop here and discuss this for a moment.

I’m a developer. More specifically, I specialize in building interfaces (web interfaces, that is). The industry calls me a Front-End Engineer, or a UI Engineer if you will.

Being the first in a team of highly experienced and skilled developers to lay hands on the new MacBook Pro, I had the pleasure of hearing people’s reactions to the new Touch Bar first hand. Here are just a few of the reactions I’ve heard lately:

“But it has no escape key!!!”
"How can you live without the function keys?!”
“Instead of making the big screen touchable, they gave me this narrow bar.”
“Who needs this, anyway?”

And many more. Don’t be naive, I also heard plenty of wows and amazings, but that’s not something I would go and write about…

About the Escape key: I can’t say much in response to the claim that it has no Escape key, other than that it does. It does have an Escape key. Right there on the Touch Bar there’s an Escape key. Not only is it there, right in the place it has always been, Apple has also defined a touch area around the button that maps to the same Escape action, so you can just throw your finger at it like you’re used to and still not miss, even if you don’t land exactly where the button is drawn on the bar.

But all in all, the biggest concern people voiced was around the function keys. Disclaimer: I’ve always felt the function keys were super weird. As an interface enthusiast, I love semantics and context, two things function keys have never represented.

Function keys are generic keys meant to support variable functionality across your apps. The problem with them is that they’re limited (there are only 12 of them) and that they’re not visually semantic; instead, you have to remember each one’s function, separately, in every app you use.

Say you’ve configured F10 to do action X globally in your operating system. If you open an application, it might override F10 with its own command and you’d never even know. Oh, sorry, you’d know once you pressed the key and it didn’t do what you expected it to do. If you look at the key, the label still says “F10”, but it now does something other than what you set it to do.

Even worse, why 12? Why were there only 12 function keys on any keyboard? What if I needed 14, or 20? What if I needed only 2? Then the rest would just take up keyboard space for no use. By the way, that F10 key still says “F10” even when it doesn’t do anything. Just wanted to point that out.

The Touch Bar (or any touch screen, for that matter) can give you the same functionality, but it adapts visually to the interface you’re currently using on the big screen, and it’s unlimited. Remember that F10 key you set to create a new email? On the Touch Bar it would show up with an icon and a name. It could say “Email” or “New Email”, or just show an envelope icon. Whatever it is, it’s exactly what you set it to be, and it can’t be overridden by another app.

In a similar manner, if you set an app-specific button, it will show up whenever you’re in that app. That doesn’t mean the app can’t have a dozen or two of its own functions there as well; they can all be gathered together in an intuitive, scrollable interface on the Touch Bar. That’s why it’s limitless.
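To make that concrete, here is a minimal sketch (my own illustration, not anything lifted from Apple’s documentation) of how a macOS app declares its own contextual Touch Bar buttons with AppKit’s NSTouchBar API. The MailWindowController class, the identifier string and the “New Email” button are invented for the example:

```swift
import Cocoa

final class MailWindowController: NSWindowController, NSTouchBarDelegate {

    // A hypothetical identifier for this app's "New Email" Touch Bar button.
    private static let newEmailItem = NSTouchBarItem.Identifier("com.example.newEmail")

    // The system asks the frontmost responder for its Touch Bar, which is
    // what makes the bar contextual: each app describes its own items.
    override func makeTouchBar() -> NSTouchBar? {
        let touchBar = NSTouchBar()
        touchBar.delegate = self
        touchBar.defaultItemIdentifiers = [MailWindowController.newEmailItem, .otherItemsProxy]
        return touchBar
    }

    // Build the actual item (a labeled button) when the system requests it.
    func touchBar(_ touchBar: NSTouchBar,
                  makeItemForIdentifier identifier: NSTouchBarItem.Identifier) -> NSTouchBarItem? {
        guard identifier == MailWindowController.newEmailItem else { return nil }
        let item = NSCustomTouchBarItem(identifier: identifier)
        item.view = NSButton(title: "New Email",
                             target: self,
                             action: #selector(composeNewEmail))
        return item
    }

    @objc private func composeNewEmail() {
        // The app-specific action triggered by the Touch Bar button.
        print("Compose a new email")
    }
}
```

Because every app assembles its own bar this way, and the system only shows it while that app is frontmost, one app’s buttons can’t silently hijack another’s the way overloaded F-key bindings could.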

What does it mean that the Touch Bar adapts visually to the interface you’re using at the moment on the big screen? Remember when we initially discussed the iPhone’s non-physical keyboard, and how the physical keyboards before it took up half the device’s front?

These days, smartphones show or hide the keyboard depending on whether it’s needed. That way, it only takes up its big, important space when you actually need it. The rest of the time, you have twice as much screen.

The Touch Bar solves that same problem, only in the keyboard world, where the function key row used to just sit there, dedicated to very specific actions unrelated to what’s on screen right now.

The Touch Bar is adaptive. When you’re in Finder it shows one set of controls; when you’re in your calendar, another; when you’re in the Maps app, another; when you’re in your email, another; when you’re in your photos, yet another.

You get the idea. It’s adaptive. It’s unlimited. It’s visual. It’s contextual. It’s everything the physical, hard-labeled function keys weren’t.

What it’s not.

It’s not a physical keyboard. Remember how I talked endlessly about the importance of feedback? Well, the Touch Bar does not give the user direct, physical feedback. It relies on the fact that whatever the user intended to happen on screen by touching the Touch Bar will provide on-screen feedback once it has happened.

It’s not enough. It’s not enough because it doesn’t cover the cases where nothing happened after the user touched the Touch Bar. It doesn’t cover the cases where the user missed the touch area, or didn’t lower their finger enough to actually touch the Touch Bar’s screen. How would users know they mis-touched? How would users know nothing happened? Think about it.

When an interface lacks feedback, it earns a lack of trust from its users instead. When a touch fails to lead to the result I was expecting, I’ll first suspect a technical issue with the interface itself, thinking it didn’t register my touch correctly. That forces me to redo my action, this time extra slowly, to make sure I’m doing nothing wrong. Only then can I be (more) certain that the failure wasn’t in the interface, but somewhere in the phase after the request was made.

That uncertainty, the redoing of actions and the double-checking, frustrates the user (think of all the times you clicked a button harder just to make sure the click registered) and causes trust issues between the interface and the user.

When the iPhone was released, its touch interface felt so magical that it didn’t even matter that it didn’t provide feedback the way an interface should (it provided digital feedback, not physical, tactile feedback). These days we expect more from our interfaces.

Now, don’t get me wrong, touching the Touch Bar is exciting, because it feels so weird (weird in a good way) to have a screen as part of your keyboard. Also, that OLED screen has great contrast, and it looks super slick sitting next to the other black keys with their white legends.

Another thing missing from the Touch Bar is extensibility. As a user, all I can do is wait until app developers add Touch Bar support for their useful functions. Most of the apps I use still don’t offer such support.

It also frustrates me that Apple shipped some default Touch Bar behaviors that appear only in some cases and not in others. For instance, input fields sometimes show auto-suggestions on the Touch Bar and sometimes they don’t (it depends on the type of input and on which application that input lives in).

But once I got used to that functionality being there, I want to have it always: whenever I write something in a text box, I want to see auto-suggestions on the Touch Bar. Is that too much to ask? Consistency.
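As far as I can tell, part of the inconsistency is that the suggestion (candidate) list is something each text view opts into rather than a system-wide guarantee. A rough sketch, assuming a standard AppKit NSTextView; the NotesViewController name and the outlet are invented for the example:

```swift
import Cocoa

final class NotesViewController: NSViewController {

    @IBOutlet var noteTextView: NSTextView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Opt this text view into automatic text completion, which is what
        // feeds the auto-suggestion (candidate) list on the Touch Bar.
        // Apps that leave this off, or that don't build on NSTextView at all,
        // are the ones where the suggestions never show up.
        noteTextView.isAutomaticTextCompletionEnabled = true
    }
}
```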

What it will be.

People can hate it all they want, but in three years there will be no Apple computer (or external keyboard) offering physical function keys (and soon enough, Apple’s rivals will start building their own implementations of it).

The first thing that will be added to the Touch Bar is haptic feedback, which is critically missing from the current one. With haptic feedback, the Touch Bar would let you know when a touch has been received; you would actually feel a tick on your finger every time you touch the bar. That would make for amazing feedback on that interface.

Ultimately they will add Force Touch too, so that a mere touch is not enough to make a button click. A user would actually need to “force touch”, pressing down with their finger, instead of just touching it. That way (I’m going to need your imagination here) you would be able to swipe your finger across the Touch Bar without activating any of the visible buttons.

The haptic feedback would help you feel when you’re over a button; adding a little pressure, “force touching” it, would then make the “click”. At the beginning of this article I mentioned the two actions the brain sends out in order to click a physical button. This way, we could make a digital button feel physical and make “clicking” it as intuitive as clicking a physical one. You could start using these virtual buttons without even looking, if you could just feel them.
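For a sense of what such feedback looks like to a developer today: macOS already exposes haptics for the Force Touch trackpad through NSHapticFeedbackManager, even though nothing comparable exists for the Touch Bar yet. A speculative sketch that borrows the trackpad API purely as a stand-in for the “tick” I’m imagining:

```swift
import AppKit

// Play a short, neutral haptic tap, the kind of acknowledgement a virtual
// Touch Bar button could give. Today this pattern only fires on the
// Force Touch trackpad, not on the Touch Bar itself.
func acknowledgeVirtualButtonPress() {
    NSHapticFeedbackManager.defaultPerformer.perform(.generic,
                                                     performanceTime: .now)
}
```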

I’ll conclude with extensibility. Again, reflecting on the release version of the original iPhone: it was a closed operating system and very much inextensible. There was no way for developers to create apps for it and no way to install endless possible uses for it. Only a year later, Apple introduced the App Store, and the rest is history.

I even see a future where the Touch Bar becomes a secondary screen for the iPhone too, offering an experience and functionality similar to the Apple Watch (which is, after all, also just a secondary screen for a main device).

The Touch Bar is a new platform. The day isn’t far off when every app will have its own Touch Bar support, and it will go without saying that when you install a new app, you check out what it offers on the Touch Bar too.

Also, as a web developer, it seems obvious to me that within a few months, major browsers will offer a way for websites to make use of the Touch Bar. When you visit your favorite news site, you’ll have links to its featured sections; when you visit a hotel website, you’ll have buttons pointing you directly to each of the room types on offer; and so on. It’s only a matter of time until HTML offers standardized meta tags for putting site-specific functionality on the Touch Bar.

Think about the Touch Bar as it will be a year or two from now, with haptic feedback and actual tactile differentiation between each of its contents. Think about how integral a part of our computer interaction, web surfing and professional work it will become.

Don’t judge the Touch Bar as it is today. Imagine what the Touch Bar will be.


