Part 1 — Target Size

Darren Blum
Jun 5, 2018 · 6 min read

Based on a true story:

The crash happened in an instant. On a cool fall morning in 2008, Jason was driving to work on a quiet back road in Santa Rosa, California. Traffic was sparse and moving fast. Ahead, he saw a car drift off the road into a ditch. He was the first one on the scene. Adrenaline made his hands shake, and it took several tries to enter his password and swipe the slider to unlock his phone. He got locked out. Fortunately, another driver arrived minutes later and was able to dial 911.

It was a first-generation iPhone. No Touch ID or Face ID. Unlocking required two separate actions, and there was no emergency call access. In this extreme situation, invaluable minutes were lost.

Device technology and UX design have evolved a great deal since then, and this scenario would play out differently today. As UX designers, we have a thorough grasp of what’s happening on the screen, but few of us consider what’s going on in front of and behind it.

Fingers touching screens are the core of how users interact with mobile devices. When you understand how touch technology works, you’ll have a more powerful toolset for designing great apps.

It’s all about capacitance?!

Capacitance is the ability to store an electrical charge, and it’s at the heart of how we interact with our mobile devices. The human body stores a small electrical charge, and through our digits this energy is conducted to interact with touch screens.

Behind the glass of a screen, there is a thin conductive layer powered by electrodes. When we touch a screen, the energy from our body completes a circuit at the location of the touch.

Illustration by Darren Blum

The Finger

The human body typically presents a capacitance of 180–200 picofarads (a picofarad is a trillionth of a farad, the unit of capacitance). When we touch a mobile screen with a finger — or any appendage, really — it conducts a minuscule amount of this stored energy. The matrix of capacitance sensors behind the screen detects the touch location. This is how Joe Sausage Fingers is able to accurately select the perfect lovey-eye emoji to send to his wife.


This also explains why other objects don’t work on our screens, unless they:

  • Are conductive and pass along our energy,
  • Are roughly the surface area of a finger,
  • Or have capacitance of their own, like a stylus with a battery.

A car key won’t do it, but the back of a spoon will. Go ahead, I know you’re dying to try it!

The Screen

Even the smallest finger is hundreds of times larger than the hair-like sensing wires behind the screen. So how does the screen know precisely where you are touching?

Upon touch, the computational power of our device does three things instantaneously:

  • Filters out any background noise.
  • Maps the center of the change-in-capacitance on a grid.
  • Relays the grid coordinates to the processor so it knows precisely where we are touching.
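The three steps above can be sketched in Python. This is an illustrative toy, not real touch-controller firmware; the grid of readings and the noise floor are made-up numbers.

```python
# Illustrative sketch (not real firmware): locating a touch on a small
# grid of capacitance readings. Values are hypothetical delta-counts.

def locate_touch(deltas, noise_floor=5):
    """Return the (row, col) centroid of above-noise capacitance changes,
    or None if nothing exceeds the noise floor."""
    total = row_sum = col_sum = 0.0
    for r, row in enumerate(deltas):
        for c, value in enumerate(row):
            if value <= noise_floor:   # step 1: filter background noise
                continue
            total += value             # step 2: weight each cell by its delta
            row_sum += r * value
            col_sum += c * value
    if total == 0:
        return None
    return (row_sum / total, col_sum / total)  # step 3: grid coordinates

# A touch centered between cells (1, 1) and (1, 2):
readings = [
    [0,  2,  1,  0],
    [1, 40, 40,  2],
    [0,  3,  4,  1],
]
print(locate_touch(readings))  # → (1.0, 1.5)
```

Weighting each sensor cell by its change in capacitance is what lets the controller resolve a touch to a point far finer than the sensor spacing itself.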

Fun side story: this is why your screen won’t work when you have gloves on, unless they have touchscreen-compatible tips. As I was skiing with some of these gloves and happily texting on the lift, I wondered why they don’t just make the whole glove out of this magic capacitive material. After I took a spill, I knew why. Because it’s conductive! Which meant that the part of my touchscreen-friendly gloves — the tips — that touched the snow got wet and cold right away.

Size Matters

Even with this sophisticated technology we still get plenty of mis-taps, and material for autocorrect fails. This is where good UX design can help. Touch and tap target sizes are well known to most mobile UX designers, but what do they really mean?

Material Design defines 48 x 48 dp (density-independent pixels) as the minimum size of a touch target:

Touch targets should be at least 48 x 48 dp … resulting in a physical size of about 9mm, regardless of screen size.

Material Design

Apple recommends 44 x 44 pt (points) for all controls. A point, in this case, refers to a DTP (desktop publishing) point, defined as ​1⁄72 of an international inch (about 0.353 mm).

Try to maintain a minimum tappable area of 44 x 44pt for all controls.

Apple Human Interface Guidelines

In a very convoluted way, these numbers translate to the physical size of a touch target when viewed on a device. It’s complicated because these “dimensions” vary a lot depending on the device. If you’d like to know more, check out this article by Luke W.
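To see where the variation comes from, here’s a hedged Python sketch of the dp-to-millimeters math: Android renders 1 dp as (density bucket / 160) physical pixels, so the physical size depends on how closely a device’s real pixel density matches its assigned bucket. The device figures below are illustrative, not exact specs.

```python
# A rough sketch of why "48 dp" isn't one fixed physical size. Android
# renders 1 dp as (density_bucket / 160) physical pixels, so the result
# depends on how close the panel's real ppi is to its density bucket.
# The device numbers below are illustrative, not exact specs.

def dp_to_mm(dp, density_bucket, actual_ppi):
    px = dp * density_bucket / 160   # dp -> physical pixels
    inches = px / actual_ppi         # pixels -> inches on this panel
    return inches * 25.4             # inches -> millimeters

for name, bucket, ppi in [
    ("baseline mdpi panel (160 bucket, 160 real ppi)", 160, 160),
    ("xxhdpi phone (480 bucket, 441 real ppi)", 480, 441),
    ("xhdpi tablet (320 bucket, 287 real ppi)", 320, 287),
]:
    print(f"{name}: 48 dp ≈ {dp_to_mm(48, bucket, ppi):.1f} mm")
```

Run it and the same 48 dp target lands at different physical sizes on each hypothetical panel — which is exactly why the guidelines can only promise an approximate real-world size.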

With all of these varying resolutions, pixel densities, and units of measure, it’s pretty easy to get confused. So how can one easily make sense of this when designing?

Making sense of pixels in the physical world.

My index finger measures about 0.67" (17 mm) wide. I’m average in this department, so I’ll use it as a benchmark to simplify things. If you need more definitive data, this MIT paper confirms my exhaustive N=1 study.


But in Sketch, the units are pixels, not inches or millimeters. To simplify things, I select an artboard sized to match the target device’s screen, then add a rectangle representing an average fingertip. What this means is that for a tappable element, nothing else tappable should lie within this area.
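As a rough sketch of that workflow, here’s a hypothetical Python helper that converts a 17 mm fingertip into 1x design units for a given device. The ppi and scale factor are example assumptions, not authoritative specs.

```python
# Hypothetical helper for the workflow above: size a 17 mm "average
# fingertip" rectangle in 1x design units for a given device. The ppi
# and scale factor below are example values, not authoritative specs.

def fingertip_in_design_units(finger_mm=17, ppi=326, scale=2):
    physical_px = finger_mm / 25.4 * ppi   # mm -> physical pixels
    return physical_px / scale             # physical px -> 1x design units

# e.g. on a ~326 ppi, @2x phone screen:
print(round(fingertip_in_design_units()))  # → 109
```

Drawing that square on the artboard gives a quick visual check: any two tappable elements whose centers fall inside it are competing for the same fingertip.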

These guidelines are average sizes to help UX designers. They ensure the touch targets we design are just big enough to keep us from tapping the wrong button, yet not so big that they use up precious screen real estate. They are solid guidelines for the majority of finger sizes; if we designed for Joe Sausage Fingers all the time, we would unnecessarily limit information density on the screen.

More than just touch

Beyond simply tapping a screen, there are several dimensions that can translate touch into what we call micro-interactions. As users, we usually don’t even think of these interactions. Designed well, gestures like pinch and swipe become natural to us almost instantly.

These interactive dimensions and combinations of them are what we have to work with as mobile UX designers.

  • Position — where on the screen?
  • Duration — how long is the touch?
  • Motion — from where to where?
  • Quantity — how many contact points?
  • Velocity — how fast?
  • Pressure — how hard are we pressing?
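As a toy illustration, the dimensions above can be modeled and combined into a naive gesture classifier. This is a Python sketch with made-up thresholds, not any platform’s actual gesture recognizer.

```python
# A toy model (illustrative only) of the touch dimensions listed above,
# plus a naive classifier showing how combining them distinguishes basic
# gestures. All thresholds are made up for the example.
from dataclasses import dataclass
from math import hypot

@dataclass
class Touch:
    start: tuple           # position — where on the screen? (x, y)
    end: tuple             # motion — from where to where?
    duration: float        # duration — how long is the touch? (seconds)
    contacts: int = 1      # quantity — how many contact points?
    pressure: float = 0.5  # pressure — how hard are we pressing? (0..1)

def classify(t: Touch) -> str:
    distance = hypot(t.end[0] - t.start[0], t.end[1] - t.start[1])
    velocity = distance / t.duration if t.duration else 0  # velocity — how fast?
    if t.contacts == 2 and distance > 20:
        return "pinch/zoom"
    if distance > 50 and velocity > 300:
        return "swipe"
    if distance < 10 and t.duration >= 0.5:
        return "long-press"
    if distance < 10:
        return "tap"
    return "drag"

print(classify(Touch((10, 10), (12, 11), 0.08)))  # → tap
print(classify(Touch((10, 10), (300, 12), 0.2)))  # → swipe
print(classify(Touch((10, 10), (11, 10), 0.8)))   # → long-press
```

Notice that no single dimension defines a gesture: a swipe and a long-press can start at the same position, and only duration, motion, and velocity together tell them apart.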

By understanding what these dimensions are, and how they work, we can create apps with a deeper level of usability.

Next up in this series, I’ll unpack each of these ways to interact with the screen, along with examples and design tips.

Stay tuned!

Tradecraft

Stories about startups, technology, traction, and design from Tradecraft members

Darren Blum

Written by

UX Designer based in San Mateo, California. Designer of products that get out of the way and let users get jobs done. Fish taco enthusiast.
