Rethinking Modern Tools: The Nearby Hand Effect
The development of the radically different Siempo user experience began with a discussion of how smartphones hijack our time and attention. Siempo wanted to see how much of that time and attention we could give back to users, to help them live lives aligned with their true intentions.
In the process we came across a small pocket of the cognitive neuroscience research literature that showed the importance of good design specifically for the small screen that fits in your hand.
Much of the research in cognitive neuroscience has involved study participants looking at screens and doing simple tasks. The participants may be searching for a particular visual target and pressing a button on a keyboard as soon as it is located, or they may be reading words and answering comprehension questions. The critical measurements in such experiments are often accuracy rate or reaction time (captured in hundredths of a second). Remarkable progress has been made in inferring the wiring diagram of the brain from these behavioral tasks by analyzing error patterns and speed.
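For readers who like to see the arithmetic, here is a minimal sketch (the trial data and function are hypothetical, not from any study cited here) of how accuracy rate and mean reaction time are typically summarized per condition:

```python
# Illustrative only: trial data below is invented for demonstration.

def summarize(trials):
    """Return (accuracy rate, mean reaction time in ms) for a list of
    (correct: bool, rt_ms: float) trials."""
    correct_rts = [rt for ok, rt in trials if ok]
    accuracy = len(correct_rts) / len(trials)
    mean_rt = sum(correct_rts) / len(correct_rts)  # mean RT over correct trials only
    return accuracy, mean_rt

# Hypothetical trials: the same visual-search task on two displays
monitor_trials = [(True, 520.0), (True, 540.0), (False, 610.0), (True, 500.0)]
handheld_trials = [(True, 480.0), (True, 470.0), (True, 505.0), (False, 590.0)]

print(summarize(monitor_trials))   # accuracy and mean RT, monitor condition
print(summarize(handheld_trials))  # accuracy and mean RT, hand-held condition
```

Researchers then compare these per-condition summaries statistically; a reliable difference between the hand-held and monitor conditions is the kind of evidence the nearby-hand studies report.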
But not all screens are created equal
A relatively new line of research points to the conclusion that not all screens are created equal. The smartphone screen may get different processing in the brain because it is a “tool” that is being held in the hand.
Research is ongoing into this so-called “nearby-hand effect.” The basic research approach is to take classic tasks used in cognitive neuroscience experiments and run them while the participants view and interact with a phone.
Some tasks show improved performance on a smartphone screen and others are superior on a computer monitor, so more research is needed to get a clearer picture. But in essence, the nearby-hand effect says that our brains process information presented in our hands in a special way, and there is good reason why.
Space in the real world is represented in multiple, somewhat redundant maps in the brain, each using a different coordinate system relative to the body. This is, in essence, the math that represents objects in 3D space and converts perception into action. For example, space may be represented in a coordinate system centered on the head, or it may be locked into more complex, retina-centric coordinates, which means that all the math calculating distances between objects has to be almost instantly updated every time the eyes move (about four times per second).
The closer an object is to the body, the more relevant it is to our next possible action. It may even be relevant to survival: it could be an animal that wants to have you for lunch, or it might be lunch itself (evolutionarily speaking, of course).
Similarly, the closer a tool is to the body, the more relevant, and usable, it is. It amps up movement-related parts of the cerebral cortex because the brain represents it in the coordinate system reserved for the surface of the body.
The brain gives special attention to a tool in our hand
Now, in 2017, the most common tools we hold in our hands are our smartphones. And the research literature on the nearby-hand effect points to how our brains give high-octane attention to this tool in our hands.
The nearby-hand effect refers to the difference in behavioral measurements (like speed and accuracy of decisions) when the same information is viewed on a smartphone vs. a full-screen computer monitor. There is quite literally a spatial gradient in these research findings, all beautifully summarized in a review article by Tseng, Bridgeman, and Juan (2012).
Here are the important variables underlying the nearby-hand effect, each of which reliably produces different behavior (sometimes better, sometimes worse) when the tool (screen) being tested is in the hand:
- There is a gradient where the shorter the distance between the tool and the body, the stronger the effect.
- Maximum effect occurs when the tool is held in the hand.
- The effect occurs when the tool is in contact with the back of the hand, but it is more pronounced when the tool is held in the hand and facing the user, just like a smartphone. It taps into the coordinate system that represents the body surface, which gets the highest-priority attention in the brain.
- Attention in the hand seems more “bottom up” and prone to distraction, rather than being “top down” and guided by intentions and long-term goals.
More work is needed in this field, but it points to a potentially scary conclusion: a smartphone receives unusual cognitive processing precisely because it is a tool we are holding in our hands.
Smartphones may be hijacking the attentional circuits of the brain, specifically in ways that make attention easier to capture and harder to disengage. The current results suggest that attention becomes sticky, unwilling to move on to other things, even shiny novel things, when the eyes are gazing at a hand-held screen.
If the smartphone truly engages attention differently, designers must treat their work with respect for the users’ wellbeing. This notion is foundational to Siempo’s mission.
Tseng, P., Bridgeman, B., & Juan, C.H. (2012). Take the matter into your own hands: A brief review of the effect of nearby-hands on visual processing. Vision Research, 72(1), 74–77.
More about Siempo
We’re creating the phone for humans — learn more about Siempo here.
Stephanie M. Shorter, PhD is a neuroscientist and behavior designer focused on helping mission-driven Silicon Valley startups design with brain science in their product DNA to drive greater engagement and healthy behavior change. Her focus is on tech x sustainability and designing for good.