TY - GEN
T1 - Tailoring user interfaces to include gesture-based interaction with gestUI
AU - Parra, Otto
AU - España, Sergio
AU - Pastor, Oscar
N1 - Publisher Copyright:
© Springer International Publishing AG 2016.
PY - 2016
Y1 - 2016
AB - The development of custom gesture-based user interfaces requires software engineers to be skillful in the use of the tools and languages needed to implement them. gestUI, a model-driven method, supports this task by allowing them to define custom gestures and to include gesture-based interaction in existing user interfaces. Up to now, gestUI has used the same gesture catalogue for all software users, with gestures that could not be subsequently redefined. In this paper, we extend gestUI by including a user profile in the metamodel that permits individual users to define custom gestures and to include gesture-based interaction in user interfaces. Using tailoring mechanisms, each user can redefine their custom gestures at software runtime. Although both features are supported by models, the gestUI tool hides this technical complexity from users. We validated these gestUI features through technical action research in an industrial context. The results showed that the features were perceived as both useful and easy to use for defining and redefining custom gestures and including them in a user interface.
KW - Custom gesture
KW - Gesture-based interaction
KW - Human-computer interaction
KW - Model-driven development
KW - Technical action research
KW - User interface
UR - https://www.scopus.com/pages/publications/84997327184
U2 - 10.1007/978-3-319-46397-1_38
DO - 10.1007/978-3-319-46397-1_38
M3 - Conference contribution
AN - SCOPUS:84997327184
SN - 9783319463964
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 496
EP - 504
BT - Conceptual Modeling - 35th International Conference, ER 2016, Proceedings
A2 - Comyn-Wattiau, Isabelle
A2 - Song, Il-Yeol
A2 - Yamamoto, Shuichiro
A2 - Saeki, Motoshi
A2 - Tanaka, Katsumi
PB - Springer Verlag
T2 - 35th International Conference on Conceptual Modeling, ER 2016 held in conjunction with Workshops on AHA, MoBiD, MORE-BI, MReBA, QMMQ, SCME and WM2SP, 2016
Y2 - 14 November 2016 through 17 November 2016
ER -