Touch-based interaction problems and possible solutions

Imprecise positioning, fingers that are too large to point and tap accurately, fast movements, and poor screen visibility caused by dust, glare, low light or occlusion by the user's own finger are among the main considerations in interaction design for touchscreen applications. Speed control and accuracy are vital for efficiency of use, and researchers have therefore focused on precise selection techniques that use virtual menu tools and touch events to give the user finer control.
In 2006, Hrvoje Benko, Andrew D. Wilson and Patrick Baudisch, researchers in the Natural Interaction Research Group at Microsoft Research, presented a set of dual-finger interaction techniques to help users select small targets (Benko, Wilson and Baudisch, 2006). These techniques made precise selection possible and allowed users to reliably click targets as small as 8 pixels using two-handed input: a right-handed user would move the pointer with the right hand, while the left hand assisted by controlling the display ratio and the pointer speed. A virtual menu tool called the "X-Menu" offered different pointer-speed modes and a magnifier that displayed an enlarged view of the selection area at the side of the screen. In another technique, the cursor was placed at the midpoint between two fingers, which gave the user very effective control over its position (see the sketch below).

Benko continued this research and, a few years later, helped create Microsoft's Touch Mouse. The Touch Mouse responds to multi-touch gestures such as "swipe up" and "swipe left", which are familiar not only to Windows 7 users but also to users of iOS 5 and OS X Lion (Touch, 2011). The "pinch", for example, is a multi-touch gesture that users now expect on every touchscreen and that has become natural and intuitive. Virtual multi-touch menus still exist, with unique gestures that enable precise selection and can enhance user satisfaction, but the same gestures do not deliver the intended user experience on every platform. In other words, although virtual menus are a possible solution to imprecise positioning, the time needed to learn them may be counterproductive. Apple and the W3C working group are working on touch event specifications in order to establish a universal common framework.

It is therefore increasingly important to focus on common controls that are familiar to users, have clear and visible affordances and provide the intended experience. To reduce problems with precise selection, a designer should use common controls that are large enough to touch comfortably and should keep applications lightweight so that they do not become unresponsive. Choosing appropriate default values also eliminates unnecessary interaction and so supports error prevention. Basic touch design principles use Fitts' Law to estimate how target size and distance affect selection time and to disclose the areas that need improvement (see the worked example below). Buttons should have a minimum size and spacing for reliable finger operation, and bright background colors help to hide fingerprints and glare. Related menu items should be grouped together in a simple, minimal structure that guides the user as much as possible, and the Gestalt laws of grouping should help the designer arrange elements so that everything is easily identified.
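As an illustration of the two-finger geometry behind the midpoint technique and the pinch gesture, the following sketch uses the W3C Touch Events API to compute the midpoint of two touches and the change in distance between them. It is a minimal, hypothetical example rather than code from the cited paper; the element id "touch-area" and the console logging are assumptions made for the illustration.

// Minimal sketch of two-finger geometry with the W3C Touch Events API.
// The element id "touch-area" is an assumption made for this example.
const area = document.getElementById("touch-area") as HTMLElement;

let startDistance = 0;

function fingerDistance(a: Touch, b: Touch): number {
  return Math.hypot(a.clientX - b.clientX, a.clientY - b.clientY);
}

area.addEventListener("touchstart", (e: TouchEvent) => {
  if (e.touches.length === 2) {
    startDistance = fingerDistance(e.touches[0], e.touches[1]);
  }
});

area.addEventListener("touchmove", (e: TouchEvent) => {
  if (e.touches.length !== 2 || startDistance === 0) return;
  e.preventDefault(); // keep the browser from panning or zooming on its own

  const first = e.touches[0];
  const second = e.touches[1];

  // Midpoint between the two fingers: where a midpoint-style cursor would be placed.
  const cursorX = (first.clientX + second.clientX) / 2;
  const cursorY = (first.clientY + second.clientY) / 2;

  // Ratio of current to initial finger distance: > 1 means the fingers spread apart,
  // < 1 means a pinch; this ratio is the basis of pinch-to-zoom.
  const pinchScale = fingerDistance(first, second) / startDistance;

  console.log(`cursor (${cursorX.toFixed(0)}, ${cursorY.toFixed(0)}), scale ${pinchScale.toFixed(2)}`);
}, { passive: false });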
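Fitts' Law itself is simple to apply: in the common Shannon formulation, the predicted movement time is MT = a + b * log2(D/W + 1), where D is the distance to the target and W is its width along the movement axis. The small calculation below is only an illustration; the constants a and b are device-specific, and the values used here are placeholder assumptions rather than measured data.

// Illustrative Fitts' Law calculation (Shannon formulation): MT = a + b * log2(D/W + 1).
// The constants a and b must be fitted to a particular device and user population;
// the values below are placeholders, not measured data.
const interceptSeconds = 0.2;    // a (assumed)
const slopeSecondsPerBit = 0.1;  // b (assumed)

function indexOfDifficulty(distance: number, width: number): number {
  // Distance to the target center and target width, in the same units (e.g. pixels).
  return Math.log2(distance / width + 1);
}

function movementTime(distance: number, width: number): number {
  return interceptSeconds + slopeSecondsPerBit * indexOfDifficulty(distance, width);
}

// Doubling a button's width at the same distance lowers the predicted selection time,
// which is why minimum touch-target sizes and sensible spacing matter.
console.log(movementTime(400, 40).toFixed(2)); // ≈ 0.55 s
console.log(movementTime(400, 80).toFixed(2)); // ≈ 0.46 s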


References:
Benko, H., Wilson, A. D. and Baudisch, P., 2006. Precise Selection Techniques for Multi-Touch Screens. Microsoft Research, [online]. Available at: <http://research.microsoft.com/en-us/um/people/awilson/publications/benkochi2006/benko-chi06_final.pdf>
Touch, 2011. [online]. Available at: <http://msdn.microsoft.com/en-us/library/windows/desktop/cc872774.aspx>