Mobile Design Patterns: Interaction Models
This design pattern is part of the Mobile Design Patterns series.
A model describing the method of user interaction with a device and its UI. Mobile devices typically use one of two models: direct or indirect manipulation. More recently, devices have also been designed that respond to gestural interactions.
1. Indirect Manipulation
Indirect manipulation is the model most commonly used on mobile devices. On indirect manipulation devices, interaction is not achieved directly (by pressing or physically manipulating UI elements) but through an intermediary set of controls.
On a mobile device, these intermediary controls typically include the 5-way navigation keys or joystick, softkeys and alphanumeric keys. These are mapped to specific controls or actions on screen, and although their use can be quite intuitive, some learning may be required on the part of the user to determine the appropriate key mappings.
There is also a heavy reliance on good design, and consequently a risk of mis-mapping. For example, would softkeys be as intuitive to use if the mapping were reversed, or if the softkeys themselves were placed vertically on the side of the device? Is there a drop in softkey usability when the device is used in landscape orientation?
Figure: Softkeys and equivalent on-screen controls must be carefully mapped to reduce the amount of learning required.
Certain devices also include additional physical controls such as bespoke keys, toggles or sliders that are mapped to a single function, for example launching the camera or adjusting the volume.
These can be quite useful but do have disadvantages, including:
- Lack of discoverability, unless intuitively marked by iconography or form factor.
- These controls can be modal, requiring the user to learn the context in which they can be used; for example, volume adjustment keys may unexpectedly only work in the music player.
- As these keys are often quite prominent on the device, pressing them accidentally can unexpectedly change views or launch features.
2. Direct Manipulation/Touch/Haptic interaction
Direct manipulation (or touch) devices allow the user to navigate and interact with the UI through actual manipulation of on-screen controls, for example pressing, clicking or dragging. Manipulation can occur with a thumb, finger, stylus or, in the case of the Nokia 5800 XpressMusic, a bespoke manipulation instrument such as a plectrum. All the user must do to perform a task is decide what to interact with, move their finger or instrument to the correct spot, and perform the required touch action.
S60 Touch
In S60 touch, there are two main interaction strategies in use: focus and select, and direct selection. In focus and select, the user taps an item to move the focus to it, then taps the focused item a second time to initiate the action. In direct selection, the action happens on the first contact.
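The difference between the two strategies can be illustrated with a small event-handling sketch. This is illustrative Python only, not the actual S60 API; the class and method names are invented for this example:

```python
class FocusAndSelectList:
    """Focus-and-select strategy sketch: the first tap on an item only
    moves focus; a second tap on the same item activates it."""

    def __init__(self, items):
        self.items = items
        self.focused = None   # index of the currently focused item
        self.activated = []   # record of activated items (for the demo)

    def tap(self, index):
        if self.focused == index:
            self.activated.append(self.items[index])  # second tap: activate
        else:
            self.focused = index                      # first tap: focus only


class DirectSelectionList:
    """Direct-selection strategy sketch: a single tap activates."""

    def __init__(self, items):
        self.items = items
        self.activated = []

    def tap(self, index):
        self.activated.append(self.items[index])      # first contact activates
```

With focus and select, tapping an item such as Messaging twice is needed to open it; with direct selection, one tap suffices.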
See Touch Strategies within the Nokia Developer Design and User Experience library for more information.
While touch interfaces may at first glance seem quite intuitive, touch does not simply consist of tapping the screen. Many advanced touch events consist of sequences, combinations or variations on simple touch actions. Strong design is crucial to ensure users understand how to manipulate the UI and what result can be expected once they do. Knowing which actions exist and in which context to use them may not be immediately obvious.
The most common custom touch events are:
Long and short taps
Simple taps are the most common and intuitive touch action, and also mimic the mouse-click behaviour we are used to on the desktop. Taps of different durations may, however, be used to prompt different actions. Once an object has been selected (i.e. focussed) using a simple (short) tap, a longer tap (a sustained press) can then be used to reveal contextual options.
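Distinguishing the two usually comes down to a duration threshold. The sketch below is illustrative Python, not an S60 API, and the 0.8-second threshold is an assumed value that a real UI would tune:

```python
LONG_TAP_THRESHOLD = 0.8  # seconds; an assumed value, real UIs tune this

def classify_tap(press_time, release_time, threshold=LONG_TAP_THRESHOLD):
    """Classify a press/release pair as a 'short' or 'long' tap
    based on how long the finger stayed down."""
    duration = release_time - press_time
    return "long" if duration >= threshold else "short"
```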
Figure: In list view on the Nokia 5800 XpressMusic, executing a long tap on a focussed list item reveals a contextual menu.
Dragging
Dragging is most often used within the context of moving or reordering objects within a view. To accomplish this, the object is pressed, then dragged and 'dropped' in its desired location.
Figure: The Nokia 5800 XpressMusic enables moving or reordering of Applications through drag and drop.
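The reordering behaviour itself reduces to a simple list operation once the drag has identified a source and a target position. A minimal sketch (illustrative Python; a real implementation would also track pointer coordinates during the drag):

```python
def reorder(items, source, target):
    """Move items[source] to position target, as a drag-and-drop
    reorder would."""
    items = list(items)            # work on a copy
    dragged = items.pop(source)    # 'pick up' the dragged object
    items.insert(target, dragged)  # 'drop' it at the new position
    return items
```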
Sliding or Swiping
A slide action is similar to a drag, but affects an entire view, thereby performing a scrolling action.
Figure: In the Nokia 5800 XpressMusic image viewer, the user can swipe the screen with the stylus horizontally to go to the next or previous image.
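Detecting such a horizontal stroke typically means comparing the start and end points of the contact against a minimum distance. The sketch below is illustrative Python; the threshold and the left-equals-next mapping are assumptions, not documented S60 behaviour:

```python
SWIPE_MIN_DISTANCE = 40  # pixels; an assumed threshold

def detect_swipe(start, end, min_distance=SWIPE_MIN_DISTANCE):
    """Return 'next', 'previous', or None for a horizontal stroke,
    given (x, y) start and end points of the contact."""
    dx = end[0] - start[0]
    if dx <= -min_distance:
        return "next"       # swipe left -> next image (an assumption)
    if dx >= min_distance:
        return "previous"   # swipe right -> previous image
    return None             # too short to count as a swipe
```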
See Utilizing Strokes within the Nokia Developer Design and User Experience library for more information about custom S60 5th Edition strokes.
Rotating
A rotating stroke is in effect a circular scrolling action and, to be intuitive, should therefore be paired with a circular interface or physical control.
Figure: An experimental Qt for S60 widget dial.
Pinching and expanding
Most often used within the context of photography or mapping, this action consists of pressing the display while moving the thumb and index finger closer or farther apart. Doing so causes the object below the finger to contract or enlarge.
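The amount of zoom is usually derived from how the distance between the two contact points changes. A minimal sketch (illustrative Python, not a platform API):

```python
import math

def pinch_scale(p1_start, p2_start, p1_end, p2_end):
    """Return the zoom factor implied by a two-finger pinch/expand:
    the ratio of the final finger separation to the initial one.
    A result > 1 means expand (zoom in), < 1 means pinch (zoom out)."""
    def distance(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return distance(p1_end, p2_end) / distance(p1_start, p2_start)
```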
Double taps
While popular on the web, double taps can be problematic on mobile devices, as they can decrease the efficiency of single taps. Every time the user taps the screen, the device has to wait a few moments to see whether a second tap will follow. If the second tap does not materialise, the system continues with the action, but to the user this unnecessary delay can be troubling, especially on a device that may already be prone to delays due to network latency.
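The cost is visible in the disambiguation logic itself: a tap cannot be confirmed as a single tap until the double-tap window has elapsed. The sketch below is illustrative Python, and the 0.3-second window is an assumed value:

```python
DOUBLE_TAP_WINDOW = 0.3  # seconds; an assumed value

def classify_taps(tap_times, window=DOUBLE_TAP_WINDOW):
    """Group a sorted list of tap timestamps into single and double taps.
    Note that a 'single' verdict is only available `window` seconds after
    the tap itself, which is the latency penalty described above."""
    events = []
    i = 0
    while i < len(tap_times):
        if i + 1 < len(tap_times) and tap_times[i + 1] - tap_times[i] <= window:
            events.append(("double", tap_times[i]))
            i += 2
        else:
            events.append(("single", tap_times[i]))  # confirmed only after the window
            i += 1
    return events
```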
Reminder: While these custom touch events can be quite powerful they are not always obvious to the user, and must either be easily discoverable, or communicated clearly.
See Touch Strategies within the Nokia Developer Design and User Experience Library for more information about S60 5th Edition touch actions.
3. Gestural Interaction
In addition to the basic touch events and customised strokes, many newer devices include sensors that enable the use of gestures. Instead of simply interacting with the on-screen user interface or physical controls, gestures involve interacting with the whole device, for example by tilting, blowing on or shaking it. Gestures are not obvious to the user, and must be either easily discoverable or clearly communicated.
The following types of interactions can be enabled using the sensors within S60 5th Edition devices:
- Changing the screen orientation on the device from portrait to landscape as the device is rotated.
- Silencing an incoming call when the device orientation is changed from screen down (for example on a table) to screen up and back again.
- Changing application settings based on the ambient light conditions.
- Changing the orientation of a map based on the device's compass orientation.
- Allowing movements such as device rotation to trigger an action.
- Triggering an action when the device comes in close proximity to the user's hand or head.
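The first interaction in the list, orientation switching, can be sketched from accelerometer readings: gravity dominates the axis the device is held along. This is illustrative Python; the axis conventions and the portrait/landscape mapping are assumptions, as real devices differ:

```python
def screen_orientation(ax, ay):
    """Infer 'portrait' or 'landscape' from the accelerometer's x and y
    components (in m/s^2): gravity pulls mostly along the axis with the
    larger magnitude. Axis conventions vary by device; this mapping is
    an assumption for illustration."""
    if abs(ay) >= abs(ax):
        return "portrait"   # gravity mostly along the long (y) axis
    return "landscape"      # gravity mostly along the short (x) axis
```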
See Utilizing Gestures within the Nokia Developer Design and User Experience Library for additional information.
Tip: Nokia’s recently released Point and Find service enables users to point their device at a real world object to receive related information.
Tip: The recently published Designing Gestural Interfaces provides a solid overview of the important things to consider when designing for touchscreens and motion-sensitive controllers.