TY - GEN
T1 - Including multi-stroke gesture-based interaction in user interfaces using a model-driven method
AU - González, Otto Parra
AU - España, Sergio
AU - Pastor, Oscar
N1 - Publisher Copyright:
© 2015 ACM.
PY - 2015/9/7
Y1 - 2015/9/7
N2 - Technological advances in touch-based devices now allow users to interact with information systems in new ways, with gesture-based interaction emerging as a popular option. Many daily tasks can be performed on mobile devices and desktop computers by applying multi-stroke gestures. Scaling up this type of interaction to larger information systems and software tools entails difficulties: gesture definitions are platform-specific, and the interaction is often hard-coded in the source code, which hinders its analysis, validation and reuse. To address this problem, we propose gestUI, a model-driven approach to multi-stroke gesture-based user interface development. It supports modelling gestures, automatically generating gesture catalogues for different gesture-recognition platforms, and user-testing the gestures. A model transformation automatically generates the user interface components that support this type of interaction in desktop applications (further transformations are under development). We applied our proposal to two cases: a form-based information system and a CASE tool. We include details of the underlying software technology in order to pave the way for other research endeavours in this area.
AB - Technological advances in touch-based devices now allow users to interact with information systems in new ways, with gesture-based interaction emerging as a popular option. Many daily tasks can be performed on mobile devices and desktop computers by applying multi-stroke gestures. Scaling up this type of interaction to larger information systems and software tools entails difficulties: gesture definitions are platform-specific, and the interaction is often hard-coded in the source code, which hinders its analysis, validation and reuse. To address this problem, we propose gestUI, a model-driven approach to multi-stroke gesture-based user interface development. It supports modelling gestures, automatically generating gesture catalogues for different gesture-recognition platforms, and user-testing the gestures. A model transformation automatically generates the user interface components that support this type of interaction in desktop applications (further transformations are under development). We applied our proposal to two cases: a form-based information system and a CASE tool. We include details of the underlying software technology in order to pave the way for other research endeavours in this area.
KW - Customised gesture
KW - Gesture-based interaction
KW - Model-driven engineering
KW - User interface
UR - https://www.scopus.com/pages/publications/84960101148
U2 - 10.1145/2829875.2829931
DO - 10.1145/2829875.2829931
M3 - Conference contribution
AN - SCOPUS:84960101148
T3 - ACM International Conference Proceeding Series
BT - Proceedings of the 16th International Conference on Human Computer Interaction, INTERACCION 2015
PB - Association for Computing Machinery
T2 - 16th International Conference on Human Computer Interaction, INTERACCION 2015
Y2 - 7 September 2015 through 9 September 2015
ER -