Peek & Pan: Extending 3D Touch

Last September, Apple released a new feature in iOS 9 called 3D Touch. It introduced a new form of interaction that tracks how hard or lightly a user presses on the display. Apple also added an API that uses this feature to let the user take a ‘Peek’ at content and ‘Pop’ into it.

Nicolas Dehghani — NicolasD_ : “Personnal illustrations_part SIX” is licensed under CC_BY 4.0

It was pretty nifty.

Ideas

While experimenting with different ways to use Peek and Pop, we found that it was possible to track the user’s touch location while Peeking. And since we can also change the size of the preview, this opens up many possibilities for the way Peek can be used.

Here are different experiments we’ve set up to test this out:

NOTE: Panning vertically is reserved for displaying action items, so we settled on handling only the user’s horizontal movements.

Alberto Seveso — indiffident : “Quinteassential” is licensed under CC_BY 4.0

Display a preview of an image effect while adjusting its intensity. The user starts with an image and can Peek on a UISlider, which shows a preview of the image with an effect applied. Panning left and right varies the intensity of the effect.

Hakob Minasian — HakobDesigns : “Concept Art Portfolio” is licensed under CC_BY 4.0

Scroll through the contents of a target. Each thumbnail, when tapped, displays a table view containing multiple images. The user can Peek a thumbnail and pan from left to right to step through each cell.

Toros Köse — toroskose : “Iceland 2015” is licensed under CC_BY 4.0

Navigate through previews of on-screen content. Starting from whichever thumbnail the user Peeks, panning left and right shows previews of its neighboring cells.


For our purposes, we focused on two use cases:

  • Being able to scroll through the contents of a target
  • Navigating through previews of on-screen content

Mechanics

Gestures

Before implementing the use cases, we subclassed a gesture recognizer that begins tracking the user’s touch location once the preview is displayed.

Since we were making this in Swift, we had to import <UIKit/UIGestureRecognizerSubclass.h> in a bridging header to override the methods.
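Here’s a condensed sketch of what that subclass could look like. The class name and the isTracking flag are illustrative; the full implementation lives in the PeekPan repo.

import UIKit
import UIKit.UIGestureRecognizerSubclass // exposes the touch-handling overrides to Swift (or use the bridging-header import mentioned above)

// Rough sketch of a recognizer that reports the touch location while a preview is displayed.
class PeekPanGestureRecognizer: UIGestureRecognizer {

    // Set to true once the preview appears so tracking can begin.
    var isTracking = false

    // Most recent touch location, read by the target-action handler.
    private(set) var touchLocation: CGPoint = .zero

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
        guard isTracking, let touch = touches.first else { return }
        touchLocation = touch.location(in: view)
        // Moving to .began/.changed fires the registered target-action pair.
        state = (state == .possible) ? .began : .changed
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent) {
        state = (state == .possible) ? .failed : .ended
        isTracking = false
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent) {
        state = (state == .possible) ? .failed : .cancelled
        isTracking = false
    }
}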

NOTE: We recommend removing preview action items, because the gesture recognizer stops tracking once the user’s touch is released. Doing so also prevents the preview from bobbing up and down while panning.

Now, onto the two use cases.


Scrolling through the contents of a target

We want the user to start at the beginning of a range of indices once the preview is displayed and then be able to pan through the rest of the range. It was important to make sure the user can get from the beginning all the way to the end.

To do this, we need a point that represents the start of the range and a point that defines the end of the range. We can simply define the location where the user starts Peeking as the starting point (minimumPoint) and the right edge of the screen as the ending point (maximumPoint). The area between these points is then segmented to map onto the range of indices.

A problem arises, however, when the user begins the Peek near the right side of the screen: there is little room left to pan through all of the indices accurately. We could increase the space by starting minimumPoint towards the left side of the screen, but then the user would start somewhere in the middle of the range.

Making matters worse, we need margins at the edges because it’s somewhat difficult to pan all the way to the very edge of the screen.

The solution:

if currentPoint.x < minimumPoint.x {
    minimumPoint.x = currentPoint.x
}

Panning to the left of minimumPoint increases the space available for panning through the range of indices. This fixes the problem: the user still starts at the beginning of the range and can widen the pan space whenever it’s needed.
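Put together, mapping the current touch point to an index could look roughly like this. The type and property names are illustrative, not the ones used in PeekPan.

import UIKit

// Sketch of mapping the current touch point to an index between minimumIndex and maximumIndex.
final class PeekPanIndexMapper {
    var minimumPoint: CGPoint    // where the Peek began; moves left if the user pans past it
    let maximumPoint: CGPoint    // near the right edge of the screen, inset by a margin
    let minimumIndex: Int
    let maximumIndex: Int

    init(startPoint: CGPoint, endPoint: CGPoint, indices: ClosedRange<Int>) {
        minimumPoint = startPoint
        maximumPoint = endPoint
        minimumIndex = indices.lowerBound
        maximumIndex = indices.upperBound
    }

    func index(at currentPoint: CGPoint) -> Int {
        // Panning to the left of minimumPoint widens the usable pan space.
        if currentPoint.x < minimumPoint.x {
            minimumPoint.x = currentPoint.x
        }
        let span = max(maximumPoint.x - minimumPoint.x, 1)
        let fraction = max(0, min(1, (currentPoint.x - minimumPoint.x) / span))
        let segmentCount = maximumIndex - minimumIndex + 1
        let offset = Int(fraction * CGFloat(segmentCount))
        return min(minimumIndex + offset, maximumIndex)
    }
}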

We also want to define a subrange that caps the number of indices the user can pan through at once. Otherwise, we would run into the same problem of not having enough space to pan through the indices accurately.

In action:

Vivien Bertin — vivienbertin : “Line-up #3” is licensed under CC_BY 4.0

Navigating through previews of on-screen content

For this situation, we wanted the user to be able to start at any specified index in a range of indices and navigate through its neighboring indices.

To do this, we can simply measure how far the user pans away from the starting point and determine how many indices that change accounts for. We can segment the width of the screen to determine how far the user would have to pan before jumping to the next index. We would also need to store the specified index and the user’s touch point when the tracking begins. Keep in mind that the user will start in the middle of a segment as well.

However, a problem occurs when the user pans past either end of the range and goes out of bounds. To solve this, we can shift the starting point by the amount the user overshoots, similar to how the first use case handles it.
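A rough sketch of that bookkeeping, again with illustrative names:

import UIKit

// Sketch of tracking a relative index while panning away from the Peeked cell.
final class PeekPanRelativeMapper {
    private var startPoint: CGPoint      // touch location when tracking began
    private let startIndex: Int          // index of the cell that was Peeked
    private let indices: ClosedRange<Int>
    private let segmentWidth: CGFloat    // screen width divided into equal segments

    init(startPoint: CGPoint, startIndex: Int, indices: ClosedRange<Int>, segmentWidth: CGFloat) {
        self.startPoint = startPoint
        self.startIndex = startIndex
        self.indices = indices
        self.segmentWidth = segmentWidth
    }

    func index(at currentPoint: CGPoint) -> Int {
        // The user starts in the middle of a segment, so round to the nearest step.
        let delta = currentPoint.x - startPoint.x
        var proposed = startIndex + Int((delta / segmentWidth).rounded())
        // If the user overshoots either end of the range, shift the starting
        // point by the overshoot so panning back moves through the range again.
        if proposed < indices.lowerBound {
            startPoint.x -= CGFloat(indices.lowerBound - proposed) * segmentWidth
            proposed = indices.lowerBound
        } else if proposed > indices.upperBound {
            startPoint.x += CGFloat(proposed - indices.upperBound) * segmentWidth
            proposed = indices.upperBound
        }
        return proposed
    }
}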

In action:

Toros Köse — toroskose : “Iceland 2015” is licensed under CC_BY 4.0

Integration

Creating this in the form of a delegate/data source object allows for easy integration into a project. The data source provides information like the minimum/maximum indices, the size of the subrange, and the margins. The delegate receives callbacks when the gesture recognizer begins/updates/ends tracking or when the index changes.
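As a hypothetical shape for those two protocols (the actual names and requirements in PeekPan may differ):

import UIKit

protocol PeekPanDataSource: AnyObject {
    var minimumIndex: Int { get }
    var maximumIndex: Int { get }
    var maximumIndexCount: Int { get }    // size of the subrange the user can pan through
    var panMargins: UIEdgeInsets { get }  // keeps the pan area away from the screen edges
}

protocol PeekPanDelegate: AnyObject {
    func peekPanDidBeginTracking(at point: CGPoint)
    func peekPanDidUpdateTracking(to point: CGPoint)
    func peekPanDidEndTracking()
    func peekPanDidChange(to index: Int)
}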

Here’s how it would be integrated for the two use cases:

Scrolling through the contents of a target: We can set the delegate/data source to the view controller returned from the previewingContext(_:viewControllerForLocation:) method and handle changes to the index in that view controller’s class.
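For example, the previewed controller might adopt the two hypothetical protocols from above like this:

import UIKit

// Previewed view controller for the first use case, acting as both delegate and data source.
final class PreviewTableViewController: UITableViewController, PeekPanDataSource, PeekPanDelegate {
    var imageCount = 0

    // PeekPanDataSource
    var minimumIndex: Int { return 0 }
    var maximumIndex: Int { return max(imageCount - 1, 0) }
    var maximumIndexCount: Int { return 8 }
    var panMargins: UIEdgeInsets { return UIEdgeInsets(top: 0, left: 20, bottom: 0, right: 20) }

    // PeekPanDelegate
    func peekPanDidBeginTracking(at point: CGPoint) {}
    func peekPanDidUpdateTracking(to point: CGPoint) {}
    func peekPanDidEndTracking() {}
    func peekPanDidChange(to index: Int) {
        guard index < imageCount else { return }
        // Scroll the previewed table to the cell that matches the pan position.
        tableView.scrollToRow(at: IndexPath(row: index, section: 0), at: .top, animated: false)
    }
}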

Navigating through previews of on-screen content: We can set the delegate/data source to the current view controller, pass the starting index in the previewingContext(_:viewControllerForLocation:) method, and handle index changes in the class.

Conclusion

The code examples shown are rough implementations of the cases at hand. To see everything in its entirety and more, you can find it here: PeekPan. It’s open source as well!

Also, check out our Behance for iOS app to see all of this in action!

Varya Kolesnikova — paskamarja: “Meoof!”(left) is licensed under CC_BY 4.0