
postgraduate thesis: Attentive gestural user interface for touch screens

Title: Attentive gestural user interface for touch screens
Author: Li, Sirui (李思锐)
Advisor(s): Lau, FCM
Issue Date: 2013
Publisher: The University of Hong Kong (Pokfulam, Hong Kong)
Citation: Li, S. [李思锐]. (2013). Attentive gestural user interface for touch screens. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR. Retrieved from http://dx.doi.org/10.5353/th_b5090008
Abstract: Gestural interfaces are user interfaces controlled by users’ gestures, such as taps, flicks and swipes, without the aid of a conventional pointing device such as a mouse or a touchpad. The development of touch screen technology has resulted in an increasing number of inventive gestural interfaces. However, recent studies have shown that well-established interaction design principles are generally not followed, or are even violated, by gestural interfaces. As a result, severe usability issues have started to surface: the absence of signifiers for operative gestures, the weakening of visual feedback, the inability to discover every possible action in the interface, and the lack of consistency, all of which undermine the user experience of these interfaces and thus need to be addressed. Further analysis of existing gestural interfaces suggests that sole dependence on gestural input makes interface design unnecessarily complicated, which in turn makes it challenging to establish a standard. Therefore, an approach that supplements gestures with user attention is proposed. By incorporating eye gaze as a new input modality into gestural interactions, this novel type of interface can interact with users in a more intelligent and natural way, collecting input that reflects the users’ interest and intention; this is what makes the interfaces attentive. To demonstrate the viability of this approach, a system was built that utilises eye-tracking techniques to detect the visual attention of users and deliver input data to applications on a mobile device. A paradigm for attentive gestural interfaces was introduced to provide insights into how such interfaces can be designed, and a software prototype with attentive gestural interfaces was created according to the paradigm. An experiment found that the new type of interface helped users learn a new application faster and modestly increased their accuracy when completing tasks. This provides evidence that attentive gestural interfaces can improve usability in terms of learnability and effectiveness. This study focuses on interfaces of mobile devices whose major input mechanism is a touch screen, which are commonly seen and widely adopted. Although eye-tracking capability is not generally available on these devices, this study demonstrates that it has great potential to facilitate interfaces that are both gestural and attentive, and that it can enable new possibilities for future user interfaces.
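The abstract describes combining eye gaze with touch gestures so that the user's attention, not the touch location, determines the target of an action. The thesis itself does not publish source code, so the sketch below is only a minimal, hypothetical illustration of that idea: a gesture is dispatched to whichever on-screen widget the user's gaze fixation falls on. All names (`Widget`, `dispatch_gesture`) and the coordinate layout are invented for this example.

```python
from dataclasses import dataclass

@dataclass
class Widget:
    # Hypothetical on-screen element: a name plus a bounding box
    name: str
    x: float
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        return (self.x <= px <= self.x + self.width
                and self.y <= py <= self.y + self.height)

def dispatch_gesture(widgets, gaze_point, gesture):
    """Attentive dispatch: the gaze fixation, rather than the touch
    location, decides which widget the gesture applies to, so the
    gesture itself can be performed anywhere on the screen."""
    gx, gy = gaze_point
    for widget in widgets:
        if widget.contains(gx, gy):
            return widget.name, gesture
    return None, gesture  # no widget under the user's gaze

# Example: a swipe performed while the user is looking at the photo area
screen = [Widget("photo", 0, 0, 320, 400), Widget("toolbar", 0, 400, 320, 80)]
print(dispatch_gesture(screen, gaze_point=(160, 200), gesture="swipe-left"))
# → ('photo', 'swipe-left')
```

In a real system the gaze point would come from an eye tracker and would need filtering for fixation detection; here it is passed in directly to keep the decoupling of gesture and target visible.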
Degree: Master of Philosophy
Subjects: User interfaces (Computer science); Human-computer interaction; Touch screens
Dept/Program: Computer Science
Persistent Identifier: http://hdl.handle.net/10722/192863
HKU Library Item ID: b5090008


DC Field: Value
dc.contributor.advisor: Lau, FCM
dc.contributor.author: Li, Sirui
dc.contributor.author: 李思锐
dc.date.accessioned: 2013-11-24T02:01:16Z
dc.date.available: 2013-11-24T02:01:16Z
dc.date.issued: 2013
dc.identifier.citation: Li, S. [李思锐]. (2013). Attentive gestural user interface for touch screens. (Thesis). University of Hong Kong, Pokfulam, Hong Kong SAR. Retrieved from http://dx.doi.org/10.5353/th_b5090008
dc.identifier.uri: http://hdl.handle.net/10722/192863
dc.language: eng
dc.publisher: The University of Hong Kong (Pokfulam, Hong Kong)
dc.relation.ispartof: HKU Theses Online (HKUTO)
dc.rights: The author retains all proprietary rights (such as patent rights) and the right to use in future works.
dc.rights: This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
dc.source.uri: http://hub.hku.hk/bib/B50900080
dc.subject.lcsh: User interfaces (Computer science)
dc.subject.lcsh: Human-computer interaction
dc.subject.lcsh: Touch screens
dc.title: Attentive gestural user interface for touch screens
dc.type: PG_Thesis
dc.identifier.hkul: b5090008
dc.description.thesisname: Master of Philosophy
dc.description.thesislevel: Master
dc.description.thesisdiscipline: Computer Science
dc.description.nature: published_or_final_version
dc.identifier.doi: 10.5353/th_b5090008
dc.date.hkucongregation: 2013
dc.identifier.mmsid: 991035826949703414
