Cover image by Virginia Poltrack

The previous post marks the end of us talking about drawing to the edges. In this third post we're going to cover how to handle any gesture conflicts between your app and the new system gestures in Android 10.

What do we mean by gesture conflicts? Let’s take a look at an example. Here we have a music player app, which allows the user to scrub through the current song by dragging a SeekBar.

Unfortunately, the SeekBar is too close to the home gesture area, resulting in the system quick-switch gesture taking over and confusing the user.

The same thing can happen on any of the screen edges with gesture areas. There are plenty of common examples which can cause conflicts, such as: navigation drawers (DrawerLayout), carousels (ViewPager), sliders (SeekBar), and swipe actions on lists.

Which brings us to the question: 'how can we fix this?'. To help answer it, we've created a flow chart to guide you to one of the solutions.

You can find a printable PDF version of the flow chart here.

Hopefully the questions are self-explanatory, but in case you're not sure about any of them, let's go through each:

1. App required to hide navigation and status bars?

The first question is asking if your app's main use case requires hiding of the navigation and/or status bars. By hiding we mean that those system bars are not visible at all. It does not mean that you've made your app go edge-to-edge, or similar.

Some common examples of apps which answer yes to this are games, video players, photo viewers, and drawing apps.

2. Primary UI use case requires swipes in/near gesture areas?

This question is asking whether the UI contains any elements in or near the gesture zones (both back and home) which require the user to swipe on them.

Games will commonly answer yes here due to:

  • On-screen controls tend to be near the left/right edges, and near the bottom of the screen.
  • Some games require swiping on on-screen elements which can be anywhere on the screen, such as board game apps.

Outside of games, common UI examples which will answer yes here:

  • Photo cropping UI, where the drag handles are near the left/right screen edges.
  • Drawing app, where the user can draw on a canvas which covers the screen.

3. Commonly used views in/near gesture areas?

A hopefully simple question. This also includes views which cover the gesture area, and then extend over more of the screen, such as a DrawerLayout or large ViewPager.

4. View requires user to swipe / drag?

We change tack here a little and start looking at individual views. For any view where you answered yes to #3, does it require the user to swipe or drag on it?

There are a lot of examples where you would answer yes here: SeekBars, bottom sheets, or even a PopupMenu (can be dragged to open).

5. View is fully/mostly laid out under gesture areas?

Following on from question 4, we’re now asking if the view is either fully, or mostly laid out under a gesture area.

If your view is in a scrollable container, such as a RecyclerView, think of this question slightly differently: is the view fully/mostly laid out under gesture areas at all scroll positions? If the user can scroll the view out of the gesture area, there's nothing for you to do.

You might have looked at the chart above, seen the example of full-width carousels (ViewPager) answering no here, and wondered why it leads to no handling. This is because the left/right gesture zones are narrow (default: 20dp each) compared to the width of the view. A typical phone screen is ~360dp wide in portrait, leaving ~320dp of visible screen where a user's swipe is unhindered (that's nearly 90%). Even with internal padding/margins, the user will still be able to swipe the carousel as normal.

6. View bounds overlap any mandatory gesture zones?

The final question asks whether the view is laid out under any of the mandatory gesture zones. If you think back to our previous blog post, you’ll remember that mandatory system gesture zones are the areas of the screen where the system gestures always take priority.

Android 10 has just one mandatory gesture zone, which is at the bottom of the screen, allowing the user to either go home or bring up their recent apps. This may change in future platform releases, but for now we only need to think about views at the bottom of the screen.

Common examples here would be:

  • Non-modal bottom sheets, since they tend to collapse to a small draggable view at the bottom of the screen.
  • A horizontally scrolling carousel at the bottom of the screen, such as a sticker pack UI.

Now that we’ve covered the questions, hopefully you have arrived at one of the solutions, so let’s go through each in more detail.

No conflicts to handle

Let’s start with the easiest ‘solution’, simply do… nothing! 🙌

There may still be optimizations which you can make (see the section below), but hopefully there are no major issues when using your app with gesture navigation mode enabled.

If the chart led you here but you still feel that there’s an issue, please let us know. We may have missed something.

Moving views out of gesture areas

As we learnt in our previous blog post, insets are dispatched to tell your app where the system gesture zones are on the screen. One method we can use to resolve gesture conflicts is to move any conflicting views out of the gesture zones. This is especially important for views near the bottom of the screen, since that area is a mandatory gesture zone, and apps can’t use the exclusion APIs there.

Let's take a look at an example: the music player UI we showed above. It contains a SeekBar positioned at the bottom of the screen, allowing the user to scrub through the song.

Music player UI with a SeekBar at the bottom of the screen

But when the user tries to scrub the song, this happens:

Recording of the system gesture conflicting with the SeekBar

This happens because the bottom gesture zone overlaps the SeekBar, therefore the home gesture takes priority. Here are the gesture zones visually:

Simple solution

The simplest solution here is to add additional margin/padding so that the SeekBar is pushed up, out of the gesture zone. Something like this:

If we drag the SeekBar with this example, you’ll see that we no longer trigger the home gesture:

Demo showing the SeekBar no longer conflicting with the bottom system gesture

To implement this, we need to use the new system gesture insets, available in API 29 and the Jetpack Core library v1.2.0 (currently in alpha). In the example, we increase the bottom padding of the SeekBar to match the bottom gesture inset value:
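As a minimal sketch of that listener (assuming `seekBar` is a reference to the SeekBar, and that AndroidX Core 1.2.0+ is on the classpath, which backports the `systemGestureInsets` accessor):

```kotlin
import androidx.core.view.ViewCompat
import androidx.core.view.updatePadding

// Capture the padding declared in the layout, so repeated inset
// dispatches don't keep accumulating extra padding.
val initialBottomPadding = seekBar.paddingBottom

ViewCompat.setOnApplyWindowInsetsListener(seekBar) { view, insets ->
    // systemGestureInsets reports how far each gesture zone intrudes
    // from the corresponding screen edge. Adding the bottom value to
    // the padding pushes the SeekBar up, out of the home gesture zone.
    view.updatePadding(
        bottom = initialBottomPadding + insets.systemGestureInsets.bottom
    )
    insets
}
```

On devices running below API 29 the gesture insets are all zero, so the listener safely leaves the original padding in place.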

You might be interested in reading another blog post we published which explores some of the ways you can make WindowInsets easier to use:

Going further

You may at this point be thinking "job done", and for some layouts that might well be the final solution. But in our example, the UI has now visually regressed, with a lot of wasted space under the SeekBar. So instead of simply padding the view up, we can rework the layout to avoid the wasted space:

Demo showing the SeekBar moved to the top of the playback bar

Here we’ve moved the SeekBar to the top of the playback bar, completely out of the gesture zone. This means we no longer need to pad/increase the height of the bar to accommodate the SeekBar.

We should, however, pad/increase the bar height by the system bar height, so that the text is not obscured by the navigation bar. This was covered in our second blog post on 'Handling visual overlaps'.
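As a reminder of the pattern from that post, a sketch (assuming `playbackBar` is the bar's container view):

```kotlin
import androidx.core.view.ViewCompat
import androidx.core.view.updatePadding

// Capture the layout-declared padding so inset dispatches don't accumulate
val initialBottomPadding = playbackBar.paddingBottom

ViewCompat.setOnApplyWindowInsetsListener(playbackBar) { view, insets ->
    // systemWindowInsets.bottom is the height of the navigation bar,
    // so the bar's content sits above it rather than behind it
    view.updatePadding(
        bottom = initialBottomPadding + insets.systemWindowInsets.bottom
    )
    insets
}
```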

Use gesture exclusion APIs

In our previous blog post we mentioned that “apps have the ability to exclude the system gestures for certain parts of the screen”. The way apps do that is via the system gesture exclusion APIs, new in Android 10.

There are two different functions which the system provides to exclude areas: View.setSystemGestureExclusionRects() and Window.setSystemGestureExclusionRects(). Which you use depends on your app: if you're using Android views, prefer the View API; otherwise use the Window API.

The key difference between the two APIs is that the Window API expects any rectangles to be in the window coordinate space. If you’re using views, you will typically be working in the view’s coordinate space instead. The View API takes care of the conversion between coordinate spaces, meaning you only need to think in terms of the view contents.
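To make that difference concrete, here's a sketch of what the Window API involves: offsetting a view-space Rect into window coordinates yourself (`viewRect` here is a hypothetical rectangle in the view's own coordinate space):

```kotlin
import android.graphics.Rect

// Convert a Rect from the view's coordinate space into the window's,
// which is what Window.setSystemGestureExclusionRects() expects (API 29+)
val location = IntArray(2)
view.getLocationInWindow(location)
val windowRect = Rect(viewRect).apply {
    offset(location[0], location[1])
}
activity.window.systemGestureExclusionRects = listOf(windowRect)
```

The View API does this conversion (and re-dispatching on layout changes) for you, which is why it's preferred when you have a view to attach to.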

Let's take a look at an example. We're going to use our music player example again, with its SeekBar laid out across the whole screen width. We fixed the SeekBar triggering the home gesture in the previous section, but we still have the left and right gesture zones to think about.

Let’s take a look at what happens when the user tries to scrub the song while the SeekBar ‘thumb’ (circular dragger) is positioned near one of the edges:

Demo showing the SeekBar conflicting with the back gesture area

Since the thumb is under the right gesture area, the system thinks that the user is gesturing to go back, so shows the back arrow. This is confusing for the user, since they probably didn't mean to actually go back. We can fix this by using the gesture exclusion APIs mentioned above, to exclude the bounds of the thumb.

The gesture exclusion APIs are typically called from two places: onLayout() when your view is laid out, and onDraw() when your view is drawn. Your view passes in a List<Rect>, containing all of the rectangles which should be excluded. As mentioned earlier, these rectangles need to be in the view’s own coordinate system.

Typically you’d create a function similar to this, which would be called from onLayout() and/or onDraw():
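Inside a custom SeekBar-like view, such a function might look something like this (a sketch; `thumb` is assumed to be the drawable for the draggable circle, and the names are illustrative):

```kotlin
import android.graphics.Rect
import android.os.Build

// Reused list to avoid allocating on every layout/draw pass
private val gestureExclusionRects = mutableListOf<Rect>()

private fun updateGestureExclusion() {
    // The exclusion API only exists on Android 10 (API 29) and above
    if (Build.VERSION.SDK_INT < 29) return

    gestureExclusionRects.clear()
    // copyBounds() gives us the thumb's bounds in this view's own
    // coordinate space, which is exactly what the View API expects
    thumb?.let { drawable ->
        val rect = Rect()
        drawable.copyBounds(rect)
        gestureExclusionRects.add(rect)
    }
    systemGestureExclusionRects = gestureExclusionRects
}
```

Calling this from onLayout() and onDraw() keeps the excluded rectangle in sync as the thumb moves.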

A full example can be found here.

Once we’ve added this, scrubbing near the edges works as expected:

Demo showing the SeekBar working with the back gesture area

A note about the example above: SeekBar actually does this automatically for you in Android 10, so there's no need to do it yourself. It's shown here just to demonstrate the general pattern.

Restrictions

While the gesture exclusion APIs may seem like the perfect solution for fixing all gesture conflicts, they're not. By using the gesture exclusion APIs, you are declaring that your app's gesture is more important than the system action of going back. That is a strong statement to make, which is why this API is meant to be an escape hatch for when you can't do anything else.


Since the behavior which the API enables is disruptive to the user, the system limits its usage: apps can only exclude up to 200dp per edge.

Some common questions from devs when they hear this are:

Why have a limit? Hopefully the explanation above has given you an inkling. We think it's very important that users can consistently go back with an edge swipe; consistent across the entire device, not just within a single app. The limit may seem restrictive, but it only takes a single app excluding an entire edge of the screen to confuse a user, leading to app uninstalls or something more drastic.

Put another way: the system navigation needs to be always consistent and usable.

Why 200dp? The thinking behind 200dp is pretty simple. As we mentioned earlier, the gesture exclusion APIs are meant as an escape hatch, so the limit was calculated as a multiple of important touch targets. The minimum recommended size for a touch target is 48dp. 4 touch targets × 48dp = 192dp. Add a bit of padding and we’ve got our value of 200dp.

What if I request to exclude more than 200dp on an edge? The system will only honor the bottom-most 200dp which you requested.

The system honors requests totalling 200dp in height, counted from the bottom edge

My view is off screen, does it count towards the limit? No, the system only counts excluded rectangles which are within the screen bounds. Similarly if the view is partially on-screen, only the visible part of the requested rectangle is counted.

Immerse in the next post

You might have got to here and are wondering why we haven't covered the right side of the flow chart 🤔. Those solutions are specifically for apps which need to draw across the entire screen. We'll cover those in the next blog post, which is already available 👇

Android Developers

The official Android Developers publication on Medium

Written by Chris Banes

Work @Google on #Android
