Simultaneous gesture recognizers

Touch screens put direct interface manipulation at the center of the user's interaction with the device. Complex gestures that go far beyond a simple button press can be detected and translated by an app into meaningful actions. An entire gesture language has developed since touch screens became popular, based on conventions that every developer must know and use correctly in their apps.

Although most often used in games, gesture interactions are not limited to that type of app. Used cleverly, they add another dimension to any kind of app and help set it apart from the competition. Users generally love interacting with the screen in various ways and have come to expect at least basic gesture support from any modern app.

In this article I demonstrate how to handle simultaneous gestures that occur on the same UIView instance.

Detecting touches

Single and multiple touches are detected by instances of UIResponder subclasses such as UIViewController, UIView or even UIApplication. They can be handled at any level of the responder chain, depending on the custom behavior implemented by each UIResponder instance. Each instance can choose to handle a specific touch event or to pass it to the next responder in the chain.

The following methods can be implemented by any UIResponder subclass to handle the touch event life cycle:

  • touchesBegan:withEvent:
  • touchesMoved:withEvent:
  • touchesEnded:withEvent:
  • touchesCancelled:withEvent:
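
As a rough sketch of what these overrides look like in a UIResponder subclass (the method bodies here are purely illustrative, not taken from the article's project):

```objc
// In any UIResponder subclass, e.g. a custom UIView
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint location = [touch locationInView:self];
    NSLog(@"Touch began at %@", NSStringFromCGPoint(location));
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    // Track the finger(s) as they move across the screen
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    // Finger lifted; optionally pass the event to the next responder
    [super touchesEnded:touches withEvent:event];
}
```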

Even though it’s technically possible to use these methods to identify specific gestures such as pinching, panning or rotating, the easiest way to detect and respond to them is the gesture recognizer API, available in the iOS SDK since version 3.2.

For each common gesture there is a dedicated class: UITapGestureRecognizer, UIPanGestureRecognizer, UIRotationGestureRecognizer, etc. All of them extend the UIGestureRecognizer class and expose properties and methods specific to the corresponding gesture.

Adding the gesture recognizers

In the simple example I implemented for this article, I created a UIView subclass and attached the gesture recognizers to it. I wanted the custom view to respond to the following gestures:

  • UILongPressGestureRecognizer with two fingers: enables the response to the other gestures. As long as this gesture is not detected, the custom view ignores all gestures.
  • UIPanGestureRecognizer: moves the custom view inside its super view. Only two-finger translations are allowed.
  • UIRotationGestureRecognizer: rotates the custom view.

The gesture recognizers are attached to the custom view in the initializer and the custom view is the delegate for each of them.
They are created in two steps:

  • instantiation and definition of the selector to be executed when the gesture is recognized
  • attachment to the custom view
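
These two steps could look like the following in the custom view's initializer. This is a hypothetical reconstruction, not the author's original code: the handler selector names (handleLongPress:, handlePan:, handleRotation:) are assumptions, and the custom view is assumed to declare conformance to UIGestureRecognizerDelegate.

```objc
- (instancetype)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        // Two-finger long press that "captures" the view
        UILongPressGestureRecognizer *longPress =
            [[UILongPressGestureRecognizer alloc] initWithTarget:self
                                                          action:@selector(handleLongPress:)];
        longPress.numberOfTouchesRequired = 2;
        longPress.delegate = self;
        [self addGestureRecognizer:longPress];

        // Pan restricted to exactly two fingers
        UIPanGestureRecognizer *pan =
            [[UIPanGestureRecognizer alloc] initWithTarget:self
                                                    action:@selector(handlePan:)];
        pan.minimumNumberOfTouches = 2;
        pan.maximumNumberOfTouches = 2;
        pan.delegate = self;
        [self addGestureRecognizer:pan];

        // Rotation
        UIRotationGestureRecognizer *rotation =
            [[UIRotationGestureRecognizer alloc] initWithTarget:self
                                                         action:@selector(handleRotation:)];
        rotation.delegate = self;
        [self addGestureRecognizer:rotation];
    }
    return self;
}
```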

Enable simultaneous gesture recognition

Even if multiple gesture recognizers are attached to the custom view, by default only one of them is detected at a given time. To enable recognition for multiple gestures that occur at the same time, the UIGestureRecognizerDelegate method -gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer: must be implemented and return YES.
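
In the simplest case the delegate method unconditionally allows every combination:

```objc
#pragma mark - UIGestureRecognizerDelegate

- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer
    shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer {
    // Allow the long press, pan and rotation to be recognized together
    return YES;
}
```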

Handle individual gestures

At the instantiation of each gesture recognizer, I specified the message to be sent when the gesture is detected.

When the two-finger long press is detected, the panning and rotation gestures are enabled by setting the isCaptured property to YES. As long as this gesture is active (its state is neither UIGestureRecognizerStateEnded nor UIGestureRecognizerStateCancelled) the custom view can be dragged and rotated. Note that the gesture ends as soon as even one of the two fingers is lifted from the screen.
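
A minimal sketch of such a handler, assuming a BOOL property named isCaptured (the property name comes from the article; the method name is an assumption):

```objc
- (void)handleLongPress:(UILongPressGestureRecognizer *)recognizer {
    if (recognizer.state == UIGestureRecognizerStateBegan) {
        // The view is now "captured": pan and rotation become effective
        self.isCaptured = YES;
    } else if (recognizer.state == UIGestureRecognizerStateEnded ||
               recognizer.state == UIGestureRecognizerStateCancelled) {
        // Lifting a finger ends the long press and releases the view
        self.isCaptured = NO;
    }
}
```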

When the pan gesture is detected, the custom view follows the movement of the two touching fingers. A pan gesture is triggered regardless of the number of fingers on the screen, but the custom view only responds to two-finger panning.
The gesture handling method is called many times per second, and the translation along the horizontal and vertical axes is performed by updating the transform property so that the custom view smoothly follows the movement of the fingers.
Because the panning and the rotation can occur simultaneously, the transform property is also updated with the current rotation angle.
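
A pan handler along these lines might look as follows. Only isCaptured is named in the article; handlePan:, initialX, initialY and rotationAngle are assumed property and method names used for illustration:

```objc
- (void)handlePan:(UIPanGestureRecognizer *)recognizer {
    if (!self.isCaptured) {
        return; // ignore panning until the two-finger long press captures the view
    }
    // Translation relative to the super view since the gesture began
    CGPoint translation = [recognizer translationInView:self.superview];

    // Start from the memorized position, apply the translation,
    // then re-apply the current rotation angle so that a simultaneous
    // rotation gesture is not lost when the transform is rebuilt
    CGAffineTransform transform =
        CGAffineTransformMakeTranslation(self.initialX + translation.x,
                                         self.initialY + translation.y);
    self.transform = CGAffineTransformRotate(transform, self.rotationAngle);
}
```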

The current rotation angle is memorized in a private property of the custom view and it is updated by the UIRotationGestureRecognizer handler. If no panning gesture is detected at the same time, the custom view is only rotated by updating its transform property.
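
The rotation handler could be sketched like this, under the same naming assumptions as above (rotationAngle, initialX and initialY are hypothetical property names):

```objc
- (void)handleRotation:(UIRotationGestureRecognizer *)recognizer {
    if (!self.isCaptured) {
        return;
    }
    // Memorize the angle so the pan handler can re-apply it
    self.rotationAngle = recognizer.rotation;

    // If no pan is in progress, only the rotation component changes;
    // the translation part stays at the memorized position
    CGAffineTransform transform =
        CGAffineTransformMakeTranslation(self.initialX, self.initialY);
    self.transform = CGAffineTransformRotate(transform, self.rotationAngle);
}
```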

When the first touch is detected, the current position of the view along the horizontal and vertical axis is memorized in two private properties. These values are used as the starting position for the translation performed when the next panning gesture is recognized.
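
One way to memorize that starting position is in the custom view's touchesBegan:withEvent: override, reading the translation components of the current transform. Again, initialX and initialY are assumed property names:

```objc
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // Remember where the view currently sits so the next pan
    // gesture can translate relative to this starting point
    self.initialX = self.transform.tx;
    self.initialY = self.transform.ty;
    [super touchesBegan:touches withEvent:event];
}
```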

Conclusion

Using gesture recognizers simplifies the process of detecting and responding to interactions with the elements on screen. They should be preferred to direct touch detection, although this might not always be possible for complex gestures that require custom catching and handling code. Drawing apps, for example, cannot rely on the standard gesture recognizers alone to deal with all the complex manipulations they must support.

 

Catalin Rosioru