Appears in Collections: postgraduate thesis: Attentive gestural user interface for touch screens
Title | Attentive gestural user interface for touch screens |
---|---|
Authors | Li, Sirui (李思锐) |
Advisors | Lau, FCM |
Issue Date | 2013 |
Publisher | The University of Hong Kong (Pokfulam, Hong Kong) |
Citation | Li, S. [李思锐]. (2013). Attentive gestural user interface for touch screens. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR. Retrieved from http://dx.doi.org/10.5353/th_b5090008 |
Abstract | Gestural interfaces are user interfaces controlled by users’ gestures, such as taps, flicks and swipes, without the aid of a conventional pointing device, such as a mouse or a touchpad. The development of touch screen technology has resulted in an increasing number of inventive gestural interfaces. However, recent studies have shown that well-established interaction design principles are generally not followed, or even violated, by gestural interfaces. As a result, severe usability issues have started to surface: the absence of signifiers for operative gestures, the weakening of visual feedback, the inability to discover every possible action in the interface, as well as the lack of consistency, all of which undermine the user experience of these interfaces and thus need to be addressed.
Further analysis of existing gestural interfaces suggests that the sole dependence on gestural input makes interface design unnecessarily complicated, which in turn makes it challenging to establish a standard. Therefore, an approach to supplement gestures with user attention is proposed. By incorporating eye gaze as a new input modality into gestural interactions, this novel type of interface can interact with users in a more intelligent and natural way by collecting input that reflects the users’ interest and intention, making the interfaces attentive.
To demonstrate the viability of this approach, a system was built to utilise eye-tracking techniques to detect visual attention of users and deliver input data to the applications on a mobile device. A paradigm for attentive gestural interfaces was introduced to provide insights into how such interfaces can be designed. A software prototype with attentive gestural interfaces was created according to the paradigm.
An experiment found that the new type of interface helped users learn a new application faster and modestly increased their accuracy when completing tasks. This provided evidence that attentive gestural interfaces can improve usability in terms of learnability and effectiveness.
This study focuses on interfaces of mobile devices whose major input mechanism is a touch screen, which are commonly seen and widely adopted. Although eye-tracking capability is not generally available on these devices, this study demonstrates that eye tracking has great potential to facilitate interfaces that are both gestural and attentive, and that it can enable new possibilities for future user interfaces. |
Degree | Master of Philosophy |
Subject | User interfaces (Computer science); Human-computer interaction; Touch screens |
Dept/Program | Computer Science |
Persistent Identifier | http://hdl.handle.net/10722/192863 |
HKU Library Item ID | b5090008 |
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Lau, FCM | - |
dc.contributor.author | Li, Sirui. | - |
dc.contributor.author | 李思锐. | - |
dc.date.accessioned | 2013-11-24T02:01:16Z | - |
dc.date.available | 2013-11-24T02:01:16Z | - |
dc.date.issued | 2013 | - |
dc.identifier.citation | Li, S. [李思锐]. (2013). Attentive gestural user interface for touch screens. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR. Retrieved from http://dx.doi.org/10.5353/th_b5090008 | - |
dc.identifier.uri | http://hdl.handle.net/10722/192863 | - |
dc.description.abstract | Gestural interfaces are user interfaces controlled by users’ gestures, such as taps, flicks and swipes, without the aid of a conventional pointing device, such as a mouse or a touchpad. The development of touch screen technology has resulted in an increasing number of inventive gestural interfaces. However, recent studies have shown that well-established interaction design principles are generally not followed, or even violated, by gestural interfaces. As a result, severe usability issues have started to surface: the absence of signifiers for operative gestures, the weakening of visual feedback, the inability to discover every possible action in the interface, as well as the lack of consistency, all of which undermine the user experience of these interfaces and thus need to be addressed. Further analysis of existing gestural interfaces suggests that the sole dependence on gestural input makes interface design unnecessarily complicated, which in turn makes it challenging to establish a standard. Therefore, an approach to supplement gestures with user attention is proposed. By incorporating eye gaze as a new input modality into gestural interactions, this novel type of interface can interact with users in a more intelligent and natural way by collecting input that reflects the users’ interest and intention, making the interfaces attentive. To demonstrate the viability of this approach, a system was built to utilise eye-tracking techniques to detect users’ visual attention and deliver input data to the applications on a mobile device. A paradigm for attentive gestural interfaces was introduced to provide insights into how such interfaces can be designed. A software prototype with attentive gestural interfaces was created according to the paradigm. An experiment found that the new type of interface helped users learn a new application faster and modestly increased their accuracy when completing tasks. This provided evidence that attentive gestural interfaces can improve usability in terms of learnability and effectiveness. This study focuses on interfaces of mobile devices whose major input mechanism is a touch screen, which are commonly seen and widely adopted. Although eye-tracking capability is not generally available on these devices, this study demonstrates that eye tracking has great potential to facilitate interfaces that are both gestural and attentive, and that it can enable new possibilities for future user interfaces. | - |
dc.language | eng | - |
dc.publisher | The University of Hong Kong (Pokfulam, Hong Kong) | - |
dc.relation.ispartof | HKU Theses Online (HKUTO) | - |
dc.rights | The author retains all proprietary rights (such as patent rights) and the right to use in future works. | - |
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
dc.source.uri | http://hub.hku.hk/bib/B50900080 | - |
dc.subject.lcsh | User interfaces (Computer science) | - |
dc.subject.lcsh | Human-computer interaction | - |
dc.subject.lcsh | Touch screens | - |
dc.title | Attentive gestural user interface for touch screens | - |
dc.type | PG_Thesis | - |
dc.identifier.hkul | b5090008 | - |
dc.description.thesisname | Master of Philosophy | - |
dc.description.thesislevel | Master | - |
dc.description.thesisdiscipline | Computer Science | - |
dc.description.nature | published_or_final_version | - |
dc.identifier.doi | 10.5353/th_b5090008 | - |
dc.date.hkucongregation | 2013 | - |
dc.identifier.mmsid | 991035826949703414 | - |