hitTest(_:with:) vs point(inside:with:)
Which is the best for your needs?
It seems easy to detect a touch on a view. It is, if you want the easy way. If you want the fantastic way you might have to put in a little more work.
In the latter case, you would do well to read on.
Difficulty: Easy | Normal | Challenging
- Know something about touches in a Swift app
hitTest: A function which returns the furthest descendant of the receiver in the view hierarchy
point: A function which returns a Boolean indicating whether the receiver contains the specified point
UIGestureRecognizer: The base class for gesture recognisers, which include tap and pinch recognisers
UIPanGestureRecognizer: A gesture recogniser that responds to pan (dragging) gestures
UIView: An object that manages the content for a rectangular area on the screen
This basically answers “is the point inside the view?”
This means that if you have a view and only want to respond to touches that are within the view, you can override point(inside:with:) and return false for anything outside the target view.
This basically answers the question “What is the first view that would be touched at this point?”
When a touch event is detected, it is delivered to the window, which calls hitTest(_:with:) on itself. The subviews are then checked (front to back) to find the foremost view that contains the touched point, with hitTest(_:with:) being called recursively at each step.
Not only that…
hitTest(_:with:) uses point(inside:with:) under the hood, and (as we will see) this means the two functions interact when overridden together. This is actually a good thing, and as we become aware of the power of hit testing returning the touched view, we should be aware of the ability to use this to increase the hit area.
A quick experiment
We set up a subclass of UIView, with a rather fetching background which is blue to make it visible on the cold white of the view controller's background.
The rather excellent “Reset View” button resets the position of the subclassed UIView.
By using UIPanGestureRecognizer(target:action:) and then calling addGestureRecognizer(_:), we can allow the view to be moved when dragged.
Now, for this to work we need to make sure that the view has a parent view (which will be set in the parent view controller), and we have to make sure our SubclassedView records the initial location so the system knows the initial position of the view.
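A sketch of such a subclass (a minimal version; names like detectPan are illustrative, and the code in the linked repo may differ):

```swift
import UIKit

class SubclassedView: UIView {
    // The centre of the view when the current drag began
    var lastLocation = CGPoint.zero

    override init(frame: CGRect) {
        super.init(frame: frame)
        backgroundColor = .blue
        // Attach the pan recogniser so the view can be dragged around
        let pan = UIPanGestureRecognizer(target: self, action: #selector(detectPan(_:)))
        addGestureRecognizer(pan)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    @objc func detectPan(_ recognizer: UIPanGestureRecognizer) {
        // Move the view relative to where it was when the drag started,
        // using the translation in the parent view's coordinate space
        let translation = recognizer.translation(in: superview)
        center = CGPoint(x: lastLocation.x + translation.x,
                         y: lastLocation.y + translation.y)
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Record the initial position before the pan begins
        lastLocation = center
    }
}
```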
A quick experiment: point(inside:with:) -> Bool
We would expect the following code:
For any point on the screen, a Boolean is returned (from within our subclassed view).
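A sketch of the override (the original listing is not reproduced here; the print is purely for the experiment):

```swift
import UIKit

class SubclassedView: UIView {
    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        // true when the point lies within this view's bounds, false otherwise
        let inside = super.point(inside: point, with: event)
        print(point, inside)
        return inside
    }
}
```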
A quick experiment: hitTest(_:with:) -> UIView?
We would expect the following code:
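Something along these lines (a sketch, with a print added so we can watch the result in the console):

```swift
import UIKit

class SubclassedView: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        // The furthest descendant (or nil) that contains the touched point
        let view = super.hitTest(point, with: event)
        print(point, view as Any)
        return view
    }
}
```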
What if we press around the rectangle at certain points? Well, the result of this experiment is featured in the diagram below:
So hitTest can detect the point that is touched, and returns the view if we press inside this view.
However, for each touch the same thing is printed twice to the console. What’s going on?
UIResponder methods (i.e. touchesBegan(_:with:)) are not called until the last hitTest has completed. Since hitTest should behave as a pure function, there is no reason for its result to change between calls, but calling it more than once gives the system the opportunity to tweak the points in between calls.
Why might this be useful?
We can see that the point is returned from within the view that is foremost. In order to test this we require a subview.
We need to create an experiment with two views to demonstrate this.
A quick experiment with two views: hitTest(_:with:) -> UIView?
First we need to understand that there are two views here. There is the added blue SubclassedView (here called specialview), and the background self.view (which is white). Both of these can be returned from hitTest, depending on the situation.
If we add print(self) inside hitTest, the same view is printed twice for each touch. If we instead print the result of super.hitTest(point, with: event), something is only printed when we have pressed the blue portion of the screen.
Something like this:
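A sketch of what that override might look like:

```swift
import UIKit

class SubclassedView: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        let view = super.hitTest(point, with: event)
        // print(self) would fire on every hit-test pass; printing the result
        // of super.hitTest only shows a view when the touch lands inside one
        print(view as Any)
        return view
    }
}
```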
In order to tell the views apart, I have set one with a backgroundColor of red, and one with a backgroundColor of blue, and a tag for each view (1 for the blue, and 2 for the red view). In order to do this, I have changed how the views are set up in the view controller. Like this:
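Something along these lines (the frames are illustrative; blue is tagged 1 and red is tagged 2, the red view being a subview of the blue one):

```swift
import UIKit

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // The blue rectangle, tagged 1, sits on the main view
        let blueView = SubclassedView(frame: CGRect(x: 50, y: 100, width: 250, height: 250))
        blueView.backgroundColor = .blue
        blueView.tag = 1
        view.addSubview(blueView)

        // The red rectangle, tagged 2, is a subview of the blue rectangle
        let redView = SubclassedView(frame: CGRect(x: 50, y: 50, width: 100, height: 100))
        redView.backgroundColor = .red
        redView.tag = 2
        blueView.addSubview(redView)
    }
}
```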
Now when we press a view we can print the point, the view and the tag of the view with print(point, super.hitTest(point, with: event), self.tag) whenever the user presses.
Now it gets interesting.
When we touch the blue rectangle:
When we touch the red rectangle:
So what is going on?
The red rectangle (tag 2) is a subview of the blue rectangle (tag 1).
When we touch the red rectangle both rectangles respond to the touch. We expect the foremost view to respond first, and it is clear that it does, before passing the touch through the responder chain (that is, through to the blue rectangle). Each view responds more than once to our touch, and that is fine since the overrides behave as pure functions.
NOTE: If you hit-test a view with a zero frame it will not respond. This can be important if you are using a CALayer-backed view (as layers don't need a frame size to display content). Be careful.
A quick experiment: hitTest(_:with:) -> UIView? + point(inside:with:) -> Bool TOGETHER
Drum roll; hitTest responds first. Now is this a surprise?
Pressing the red rectangle means that hitTest responds, calling point(inside:with:) recursively on the views (the result being that it returns the red rectangle, tag 2).
Whenever point inside is called, it returns true if the touch is registered within that rectangle. In this case, the touch is within both rectangles!
What we want to achieve
We can add a gesture recogniser into a rectangle that allows the UIView to move downwards
Now, if an object is smaller than a certain size it will not be easy for a user to pan that particular object. In this case, wouldn’t it be great to respond to a gesture outside the usual touch area of a rectangle?
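The geometry behind a larger hit area can be sketched without UIKit at all (a minimal sketch, assuming a 100x100 view and a 20-point margin): insetting a rectangle by a negative amount grows it, and containment can then be tested against the grown rectangle.

```swift
import Foundation

// Inset the bounds by a negative margin to grow the rectangle,
// then test containment against the expanded region
let bounds = CGRect(x: 0, y: 0, width: 100, height: 100)
let expanded = bounds.insetBy(dx: -20, dy: -20)

// A point 10 points to the right of the view misses the original
// bounds but lands inside the expanded hit area
let point = CGPoint(x: 110, y: 50)
print(bounds.contains(point))    // false
print(expanded.contains(point))  // true
```

This is exactly the check we will place inside hitTest in the final section.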
I've set up a subclass of UIView, with a rather fetching background which is blue to make it visible on the cold white of the view controller's background.
The rather excellent “Reset View” button helps us by, well, resetting the views that have been dragged.
We can then implement a pan gesture in the SubclassedView, which references the selector that drags the view.
Which works fine, but only when we touch the rectangle
What if we want to have a larger area to touch, so that a touch outside the rectangle still has the gesture recognised?
Well, hitTest to the rescue!
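One way of doing this (a sketch: the 20-point margin is an assumption, and returning self directly skips any subviews, which is fine for this single draggable view):

```swift
import UIKit

class SubclassedView: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        // Grow the touchable region by 20 points on every side,
        // so touches just outside the frame still reach this view
        let expandedBounds = bounds.insetBy(dx: -20, dy: -20)
        return expandedBounds.contains(point) ? self : nil
    }
}
```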
You can now press outside the rectangle, and it moves just as if you pressed inside the rectangle!
hitTest ignores views that are hidden, have their alpha set to less than 0.01, or have user interaction disabled.
Equally, a view with transparent or partially transparent content will respond to hitTest the same way as an opaque one. That is, your space rocket will respond to a touch anywhere in the rectangular sprite that you have created (no free per-pixel hit detection here!).
The tutorial link:
The repo link:
Want to get in contact? Try the link here: