Interpreting a multitouch interaction can be as simple as mapping a set of touch events to the code that handles them and the actions that code triggers. As more touch events are added to an input, however, the input grows more complex, and there are multiple approaches to interpreting the inputs that pass between users and touchscreens. Researchers in this field address common problems by providing developers with tools that make multitouch interactions easier to describe and to incorporate into their systems. These tools create gestures through two main approaches: by demonstration and by declaration. This paper describes and compares these researchers' tools.
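To make the declarative approach concrete, the sketch below shows what declaring a gesture as a rule over touch state might look like. This is a minimal illustration, not a tool from the paper; the `Touch` type, `is_pinch` predicate, and `threshold` parameter are all assumptions introduced here.

```python
from dataclasses import dataclass

@dataclass
class Touch:
    """Hypothetical touch point; real multitouch APIs carry more state."""
    x: float
    y: float

def is_pinch(start, end, threshold=0.8):
    """Declarative rule: two touches whose separation shrinks below
    `threshold` times its starting value count as a pinch."""
    if len(start) != 2 or len(end) != 2:
        return False
    def dist(pair):
        a, b = pair
        return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5
    return dist(end) < threshold * dist(start)

# Two fingers moving toward each other satisfy the rule.
start = [Touch(0, 0), Touch(10, 0)]
end = [Touch(3, 0), Touch(7, 0)]
print(is_pinch(start, end))  # prints True
```

A demonstration-based tool would instead learn such a rule from recorded example gestures rather than having the developer state the predicate explicitly.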
"Interpreting Multitouch Gestures,"
Scholarly Horizons: University of Minnesota, Morris Undergraduate Journal: Vol. 2:
1, Article 1.
Available at: https://digitalcommons.morris.umn.edu/horizons/vol2/iss1/1