UIGestureRecognizer and UIButtons: Reconciling Compatibility

This blog post explains why UIGestureRecognizers don't work as expected on UIButtons, and how we can work around the problem. I'm publishing it after running into an abundance of unhelpful explanations, in the hope of saving somebody the time and effort of finding an effective work-around, as well as providing a greater level of understanding of GestureRecognizers in general.

My research into this topic stemmed from the desire to have a UIButton that performed one function when tapped, and another when the button was held down with a LongPressGestureRecognizer. The concept appeared simple on the surface; however, after several failed attempts to add the GestureRecognizer to the UIButton, I decided to consult the documentation.

Apple’s documentation for the UIGestureRecognizer class states:

A gesture recognizer operates on touches hit-tested to a specific view and all of that view’s subviews. It thus must be associated with that view. To make that association you must call the UIView method addGestureRecognizer(_:).

Given this explicitly stated information, it would seem that it is indeed impossible to add a GestureRecognizer to a UIButton. But how could this be so if there are plenty of apps where buttons appear to have several functions depending on how they are touched?

I scoured the internet for information on how to accomplish this, but kept running into the same responses: "You can't do that." "This isn't possible." Thanks for this elucidating reply, but can anyone explain why, or how it can be bypassed?!

Eventually, relying on my own logic, I accepted that you can only add GestureRecognizers to UIViews, not to UIButtons. So now what? Well, is it possible to add a UIButton to a UIView? Of course! But will both the view and the button be responsive to the proper stimuli? This we will find out.

In the following code, I created a view with a simple UIView and a UIButton, labeled padView1 and padButton1 respectively, and constrained them with the following properties:
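The original post showed this setup as a screenshot, so here is a minimal sketch of what it might look like. The names padView1 and padButton1 come from the article; the specific constraint values and colors are my own assumptions:

```swift
import UIKit

class PadContainerView: UIView {
    // Names from the article; sizes and colors are illustrative guesses.
    let padView1 = UIView()
    let padButton1 = UIButton(type: .system)

    override init(frame: CGRect) {
        super.init(frame: frame)

        padView1.backgroundColor = .orange
        padView1.translatesAutoresizingMaskIntoConstraints = false
        addSubview(padView1)

        padButton1.setTitle("Pad 1", for: .normal)
        padButton1.translatesAutoresizingMaskIntoConstraints = false
        // The button lives inside the pad view, so touches near the
        // edges still reach the view (and its gesture recognizer).
        padView1.addSubview(padButton1)

        NSLayoutConstraint.activate([
            padView1.centerXAnchor.constraint(equalTo: centerXAnchor),
            padView1.centerYAnchor.constraint(equalTo: centerYAnchor),
            padView1.widthAnchor.constraint(equalToConstant: 100),
            padView1.heightAnchor.constraint(equalToConstant: 100),
            padButton1.centerXAnchor.constraint(equalTo: padView1.centerXAnchor),
            padButton1.centerYAnchor.constraint(equalTo: padView1.centerYAnchor)
        ])
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}
```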

I also added insets to the UIButton to make it easier to see when it was tapped or pressed.
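Assuming the pre-iOS 15 API the post's era suggests, the insets might be set like this (newer code would use UIButton.Configuration instead):

```swift
// Shrinks the button's content area so it reads as a distinct
// tappable region inside the orange pad view. Values are illustrative.
padButton1.contentEdgeInsets = UIEdgeInsets(top: 10, left: 10, bottom: 10, right: 10)
```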

Next, in the ViewController where I initialized the view, I assigned a GestureRecognizer to padView1 and a target to padButton1:
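A sketch of that wiring, with the recognizer on the plain view and the target/action on the button (the handler names here are my own placeholders, not necessarily those from the original project):

```swift
import UIKit

class ViewController: UIViewController, UIGestureRecognizerDelegate {
    // Assumed to be created elsewhere, e.g. in the custom pad view.
    var padView1: UIView!
    var padButton1: UIButton!

    override func viewDidLoad() {
        super.viewDidLoad()

        // The long press goes on the UIView...
        let longPress = UILongPressGestureRecognizer(target: self,
                                                     action: #selector(handleLongPress(_:)))
        longPress.delegate = self
        padView1.addGestureRecognizer(longPress)

        // ...while the tap uses the button's normal target/action.
        padButton1.addTarget(self, action: #selector(padTapped(_:)), for: .touchUpInside)
    }

    @objc func handleLongPress(_ sender: UILongPressGestureRecognizer) { /* record */ }
    @objc func padTapped(_ sender: UIButton) { /* play back */ }
}
```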

Keep in mind that the ViewController must adopt and conform to the UIGestureRecognizerDelegate!
All the buttons were placed within an array to make them easier to format. This did not work for the GestureRecognizers, because each recognizer may only be attached to a single view; in that case, I had to make several recognizers with the same functionality but different names.
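To illustrate the asymmetry: one target/action can be shared across every button, but each view needs its own recognizer instance. A loop that creates a fresh recognizer per view achieves the same result as the individually named recognizers the post describes (the pad names here are assumed):

```swift
// One shared action works for all buttons.
for button in [padButton1, padButton2, padButton3] {
    button.addTarget(self, action: #selector(padTapped(_:)), for: .touchUpInside)
}

// A gesture recognizer belongs to exactly one view, so each pad view
// gets its own instance, even though they all call the same handler.
for padView in [padView1, padView2, padView3] {
    let longPress = UILongPressGestureRecognizer(target: self,
                                                 action: #selector(handleLongPress(_:)))
    longPress.delegate = self
    padView.addGestureRecognizer(longPress)
}
```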

The functions that I applied to these views and buttons would enable recording for that view/button if it was held down with the LongPressGestureRecognizer, or play the recording (if it existed) when tapped. The view would also turn from orange to green if recording was enabled.
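The long-press handler described above might look something like this sketch; startRecording and stopRecording are hypothetical helpers standing in for the post's recording logic:

```swift
@objc func handleLongPress(_ sender: UILongPressGestureRecognizer) {
    guard let padView = sender.view else { return }
    switch sender.state {
    case .began:
        padView.backgroundColor = .green      // recording enabled
        startRecording(forPad: padView.tag)   // hypothetical helper
    case .ended, .cancelled:
        padView.backgroundColor = .orange     // back to idle
        stopRecording()                       // hypothetical helper
    default:
        break
    }
}
```

Checking sender.state matters because a long-press recognizer fires continuously; without it, the recording would restart on every state change.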

With the finished product I have an app that can record and play back audio much akin to an MPC player, the original concept that was proposed in my previous article!

When I hold down the padView, it tells my recordingSetUp method to record to a filepath with that specific padView’s tag number. If I hold down another padView, that filepath changes so that multiple recordings are enabled.

When the padButtons are tapped, they will play back the recording for that specific filepath, if there indeed is a recording at that path.
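Using AVFoundation, the tag-based file paths described above could be sketched as follows. The recordingSetUp name comes from the post; the file-naming scheme and recorder settings are my assumptions:

```swift
import AVFoundation

var audioRecorder: AVAudioRecorder?
var audioPlayer: AVAudioPlayer?

// Each pad records to its own file, keyed by the view's tag.
func fileURL(forPad tag: Int) -> URL {
    let docs = FileManager.default.urls(for: .documentDirectory,
                                        in: .userDomainMask)[0]
    return docs.appendingPathComponent("pad\(tag).m4a")
}

func recordingSetUp(forPad tag: Int) {
    let settings: [String: Any] = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 44100,
        AVNumberOfChannelsKey: 1
    ]
    audioRecorder = try? AVAudioRecorder(url: fileURL(forPad: tag),
                                         settings: settings)
    audioRecorder?.record()
}

func play(pad tag: Int) {
    let url = fileURL(forPad: tag)
    // Only play if a recording actually exists at that path.
    guard FileManager.default.fileExists(atPath: url.path) else { return }
    audioPlayer = try? AVAudioPlayer(contentsOf: url)
    audioPlayer?.play()
}
```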

So there you have it! Now your buttons and views have far more functionality and can invoke different methods depending on how they are interacted with! Thanks for the read, and I hope this post has helped shed light on the subtleties of GestureRecognizers!