The Touch Bar, Cognitive Load and Usability.

I’m a PhD Candidate exploring the cognitive function of gestures in learning. A number of other subject areas orbit this one, including neuroscience, intelligence, and human cognitive architecture, but my main area of focus is Cognitive Load Theory.

Cognitive Load Theory is not a theory in the way we think of scientific theories, but an area of Educational Research that focuses on the limits of working memory (sometimes called short-term memory) and the things educators can do to make learning more efficient and effective. Most of these interventions have to do with how learning materials are designed and what learning experiences look like.

As a web designer and mobile developer, I cannot separate this area of research from Industrial Design, User Experience, or even Graphic Design because the way we experience the world and process information is the foundation of all of these fields. With that in mind, here are my thoughts on the MacBook Pro (Late 2016) Touch Bar.

Note: This is not intended as a product review. I actually loved playing with the Touch Bar in the store, and admittedly don’t have day-to-day experience with it. This article is simply an analysis of the Touch Bar from a Cognitive Science perspective. Feel free to check out the reference list below.

Cognitive Load

So what is Cognitive Load? In a nutshell, our working memory is the part of our memory that acts as a buffer for information. It processes information coming in from the world (something we see, hear, touch) and information we’ve pulled out of long-term memory (the stuff we already know). There is no single established model of what this actually looks like, but the buffered information is then processed in working memory to solve problems, whether it be a math problem or navigating to our favorite store in a city we’ve only visited once.

The crux of working memory is that it’s limited in capacity as well as in duration. In other words, we can only store a certain amount of stuff for a certain amount of time before our brains start to lose it. Much of what we can store depends on attention and our ability to focus, and this is where the Touch Bar comes in.

Saccades

A saccade, simply put, is a rapid eye movement from one point of focus to another. If you’ve ever watched someone’s eyes while spending time with them, you’ll notice our eyes don’t move in smooth arcs or sweeping motions; they jump from point to point very quickly.

Whatever we see is taken in by our brain and processed. We decide, sometimes consciously, sometimes subconsciously, what needs more attention. For example, the color of a wall might not be remembered, but the spider on the wall would be, because the spider might be a threat.

Though there’s no definitive scientific proof of this yet, current research is starting to support the idea that the length of a saccade (the distance our focus shifts as we move from one point to another) is related to our ability to retain and therefore process information. In other words, the distance between two related pieces of information affects our ability to integrate and process them.

Split Attention Effect

Research in Cognitive Load Theory over the years has identified a number of observed effects that can influence how we design certain learning materials, and by extension, user experiences.

The Split Attention Effect goes like this: whenever attention is split between two points, cognitive focus and retention are negatively impacted. It was discovered in the late 80s that if learning materials are integrated (information is embedded to reduce saccades), students retain information and solve problems better than students who study materials that aren’t integrated (information is separated).

Example of Integrated (right) and non-integrated (left) materials related to Split Attention. Tarmizi, R.A. and Sweller, J. (1988). Guidance during mathematical problem solving. Journal of Educational Psychology, 80 (4) 424–436

The Touch Bar and Split Attention

If we extend this idea of attention, focus, and integration of information to the Touch Bar, the very existence of another screen evokes the classic Split Attention Effect. If we’re looking at the screen and then decide to use the Touch Bar to perform a task or manipulate something we’ve created, the saccade between the screen and the Touch Bar is longer than the saccade between two points on the screen itself.

This will lead to a loss of information in working memory when compared to simply performing the same task on the main display itself.

http://www.theverge.com/2016/10/28/13454052/apple-macbook-pro-touch-bar-apple-watch-features

Redundancy Effect

Another effect identified in Cognitive Load Theory research is pretty simple. The Redundancy Effect is basically what it sounds like: when redundant information is presented to us, it negatively impacts our ability to recall information or solve problems. In effect, redundancy creates unnecessary distraction that our brains have to spend time working out.

“I saw this thing here, but I also see it there. Which one should I focus on, which one should I give my attention to?”

This is essentially what your brain is doing when it’s presented with redundant information. Ever since the GUI (Graphical User Interface) arrived back in the early 80s, using a computer has meant menu bars and buttons, and almost every task we perform in most desktop apps has more than one way to accomplish the same goal.

  • File > Save, click the Save button, or Command-S.
  • Edit > Paste, click the Paste button, or Command-V.

Part of designing good user experiences is to reduce complexity and increase usability.

There is always a learning curve to using any tool. Many of us find keyboard shortcuts preferable to clicking a button on screen because we find them faster. Many usability issues come down to personal preference and ease of use.

Distraction

Many cognitive ability tests rely heavily on distraction to measure a person’s ability to focus and solve problems. By presenting information, like a math problem, then showing something completely unrelated, our brains have to work extra hard to hold what we just saw in memory, and we all have varying degrees of ability. These tests don’t mean we’re less smart; they simply measure differences in our working memory capacity and ability to process information.

Distractions are known across disciplines in cognitive science as a way to ‘trip up’ our brains so that we have to work harder on the task at hand, which is why they’ve been used for so long.

Touch Bar and Distraction

I’ve only briefly played with the Touch Bar in the Apple Store, but what we can learn from the above is that if the Touch Bar constantly ‘adapts’ to what we’re working on, it could negatively impact our ability to focus on the task at hand. There is a setting in System Preferences that lets the user revert the Touch Bar to the traditional function row (not F keys, but brightness, volume, etc.) and hold Fn to bring up app-specific controls. For users who find the Touch Bar too distracting, this is an option to reduce those constant changes (thanks to @ScottSmith95 for pointing this out!).

System Preferences > Keyboard

But what about Gestures?

The Touch Bar does add the ability for basic multi-touch gestures, which can be beneficial. Recent research, including my own, focuses on the impact of making gestures on learning, and by extension focus and attention. Research conducted over the last 10 years or so has found gestures to be beneficial for learning and problem solving, but there is still much work to do.

At this time we cannot say definitively whether the types of gestures performed on the Touch Bar benefit task completion, but we always have to look at the experience as a whole. We know that Split Attention and Redundancy don’t help our brains process related information, so adding gestures would at best offset the deficits already inherent in using the Touch Bar.

Another related branch of research focuses on attention near the hands. It’s been found that we tend to focus on information faster and longer when the object of attention is close to the hands, so in this regard the Touch Bar is beneficial, but only when it presents a faster means to complete a task, such as doing something buried in a File or Edit menu. If there is a button on screen close to what you are working on, this benefit may be less prominent, and when combined with the other issues, proximity to the hands may only partially offset the costs of Split Attention and Redundancy.

The Conclusion

From a cognitive perspective, the Touch Bar by its very design places a cognitive burden on the user. Regardless of how ‘delighted’ users might be, or how it may speed up workflows, when it comes to our brain, looking away from what we’re focusing on, even to something related, leads to a loss of information, making that information harder to integrate and process.

It’s true that looking for a button or menu item and then moving the cursor with a mouse or trackpad may in fact take more cognitive resources than looking close to where your hands are and tapping. But with all these attentional and gesture effects mixing together, we can’t say for sure at this point, except that when it comes to split attention and redundancy, the Touch Bar may not be the perfect solution, especially when so many of us use shortcuts that don’t require us to look anywhere at all, having already been stored in ‘muscle memory’.

I personally won’t be getting a MacBook Pro with Touch Bar for this reason, unless I have the ability to make it less distracting or turn it off completely. Given that it was marketed as a breakout feature, it seems like it’s here for the long run, and it may become the norm in the next few years. Knowing what I know about how my brain works on a fundamental level, I may have to keep holding onto older technologies, or do my best to minimize my use of the Touch Bar, when I make the choice to upgrade.

Overall, the more we learn about cognition (how our brains experience the world, the relationship between our bodies and our brains, and how it all works together to process and store information), the more this knowledge should be embedded into the design of products and software.

User profiles, flow design, case studies, pilot testing, and other existing strategies are incredibly useful, but an understanding of human cognition should also be integrated into practice. Simple is better, but we have to understand WHY simple is better.

References

If I included a reference list, this article would double in length, so references are located here.
