🙌 Designing for Reachability

Shivansh Singh
3 min read · Feb 4, 2018


Huge phones + bigger screens = more stress on the 👍

Note: These are solely my own thoughts, which came to me while I was working on a product at Razorpay; they are not borrowed from anywhere else.

Okay. Let’s build some context

Image acquired from Alex Kirhenstein’s article on Reachability

A study states that most people navigate and take actions on their smartphones with just one hand. I do too. 🤔
With bigger and bigger phones ☎️ coming onto the market, it has become difficult for people to control their devices with only one hand.

We are trained to use our thumb to scroll and take actions.
Why should smartphones interrupt or change users’ habits by making them uncomfortable?

This article revolves around two ideas: the ‘Gestalt Principle’ and the ‘Thumb Zone Rule’.

  • What is the Gestalt Principle? It is the concept of grouping and separating items of the same or similar family/visual language. It attempts to describe how people perceive individual visual elements as organized wholes.
  • What is the Thumb Zone Rule? It is the idea that actionable elements should sit within comfortable reach of one’s thumb.
Image courtesy: google.com

Both of these principles apply to flows, or to the individual steps within a flow, be it adding a new shipping address, selecting an offer from a list, or adding filters to a food menu.
But the two contradict each other in interactions where the action is taken at one point/area of the screen and the result appears from a different point/area (for example, from the bottom of the screen as an overlay).

Have you ever come across the address-selection interaction in Amazon's mobile web app?

On tapping the ‘Address strip’, the resulting action sheet appears from the bottom.

From a UX standpoint, what’s wrong with this is that the response is not in cohesion with the action. The good part is that the user can act on it quickly, because the sheet appears right in the thumb’s comfort zone.
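To make the pattern concrete, here is a minimal sketch of that kind of interaction in TypeScript. The element ids and styling are my own assumptions for illustration, not Amazon’s actual implementation: a trigger near the top of the page opens a sheet pinned to the bottom edge, so the resulting actions land in the thumb zone.

```typescript
// Minimal sketch, with hypothetical ids "address-strip" and "address-sheet".
// The trigger sits near the top of the page; the response slides in from
// the bottom so that its actions stay within thumb reach.

const strip = document.getElementById("address-strip");
const sheet = document.getElementById("address-sheet");

if (strip && sheet) {
  // Pin the sheet to the bottom edge, where the resting thumb is.
  Object.assign(sheet.style, {
    position: "fixed",
    left: "0",
    right: "0",
    bottom: "0",
    transform: "translateY(100%)",          // start hidden below the viewport
    transition: "transform 200ms ease-out", // motion helps connect action and result
  });

  strip.addEventListener("click", () => {
    // Action taken at the top of the screen; result appears from the bottom.
    sheet.style.transform = "translateY(0)";
  });
}
```

The point of the sketch is simply that the tap target can sit anywhere on the page, while the response is always anchored to the bottom edge, which is what keeps it reachable.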

People tend to recognize objects/elements even when parts of them are missing or misaligned. Our brain matches what we see against familiar patterns stored in memory and fills in the gaps.
This interaction does feel a little broken, but IMO it solves the problem. As a user scans a phone screen, their 👀 are drawn to elements that are in motion.

Most apps now follow this pattern, as it takes less effort for micro-interactions and can be allowed to break the proximity principle once in a while. Take a look at the interaction in the VSCO app.

The image-actions interaction in the VSCO app

This is not the best possible solution to the problem, but I believe people tend to connect the dots between things, so it works fine for now. This is also where motion design comes into the picture: it helps users understand and register how layers and elements move within the visual design.

Both principles can be aligned to create a much better experience for both the user and her thumb.

What are your thoughts on this?

Have you come across interactions like these, or are you unsure about using them when designing your next product?
Feel free to drop a comment down there ⤵ 😃

Also, If you have time, do read this amazing article by Alex Kirhenstein on how reachability can be handled in an elegant way: https://medium.com/@Draward/mobile-reachability-rules-of-thumb-ce37dd0cd3ad

Follow me 👆 for more insights/thoughts on design topics and ideas.
