Item metadata

dc.contributor.author: Lei, Yaxiong
dc.contributor.author: Wang, Yuheng
dc.contributor.author: Caslin, Tyler
dc.contributor.author: Wisowaty, Alexander
dc.contributor.author: Zhu, Xu
dc.contributor.author: Khamis, Mohamed
dc.contributor.author: Ye, Juan
dc.date.accessioned: 2023-05-25T14:30:05Z
dc.date.available: 2023-05-25T14:30:05Z
dc.date.issued: 2023-05-18
dc.identifier: 286260006
dc.identifier: 9f1ddf19-dbdd-4026-9288-c6027f404cdc
dc.identifier: 85160424436
dc.identifier.citation: Lei, Y., Wang, Y., Caslin, T., Wisowaty, A., Zhu, X., Khamis, M. & Ye, J. 2023, 'DynamicRead: exploring robust gaze interaction methods for reading on handheld mobile devices under dynamic conditions', Proceedings of the ACM on Human-Computer Interaction, vol. 7, no. ETRA, 158. https://doi.org/10.1145/3591127
dc.identifier.issn: 2573-0142
dc.identifier.other: ORCID: /0000-0002-0697-7942/work/135388094
dc.identifier.other: ORCID: /0000-0002-2838-6836/work/135455079
dc.identifier.uri: https://hdl.handle.net/10023/27675
dc.description: Funding: Lei, Y. and Wang, Y. acknowledge financial support from the University of St Andrews and China Scholarship Council Joint Scholarship.
dc.description.abstract: Enabling gaze interaction in real time on handheld mobile devices has attracted significant attention in recent years. An increasing number of research projects have focused on sophisticated appearance-based deep learning models to enhance the precision of gaze estimation on smartphones. This raises important research questions, including how gaze can be used in a real-time application and what types of gaze interaction methods are preferable under dynamic conditions, in terms of both user acceptance and reliable performance. To address these questions, we design four gaze scrolling techniques: three explicit techniques, based on Gaze Gesture, Dwell time, and Pursuit, and one implicit technique based on reading speed, to support touch-free page scrolling in a reading application. We conduct a 20-participant user study under both sitting and walking settings. Our results reveal that the Gaze Gesture and Dwell time-based interfaces are more robust while walking, and that Gaze Gesture achieves consistently good usability scores without imposing a high cognitive workload.
dc.format.extent: 17
dc.format.extent: 5322606
dc.language.iso: eng
dc.relation.ispartof: Proceedings of the ACM on Human-Computer Interaction
dc.subject: Eye Tracking
dc.subject: Mobile devices
dc.subject: Smartphones
dc.subject: Gaze-based Interaction
dc.subject: Dwell
dc.subject: Pursuit
dc.subject: Gaze Gesture
dc.subject: Scrolling Techniques
dc.subject: Reading
dc.subject: QA75 Electronic computers. Computer science
dc.subject: 3rd-DAS
dc.subject: MCC
dc.subject: AC
dc.subject.lcc: QA75
dc.title: DynamicRead: exploring robust gaze interaction methods for reading on handheld mobile devices under dynamic conditions
dc.type: Journal article
dc.contributor.institution: University of St Andrews. School of Computer Science
dc.contributor.institution: University of St Andrews. Statistics
dc.identifier.doi: 10.1145/3591127
dc.description.status: Peer reviewed
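
For illustration only, here is a minimal sketch of how the Dwell time-based scrolling technique described in the abstract might work: gaze resting in a screen-edge band for long enough triggers a scroll. The GazeSample and DwellScroller names, the 0.8 s dwell threshold, and the 15% trigger bands are assumptions made for this sketch, not the authors' published implementation.

# Illustrative sketch of a dwell-time scroll trigger (assumed design,
# not the DynamicRead implementation).
from dataclasses import dataclass

@dataclass
class GazeSample:
    t: float  # timestamp in seconds
    x: float  # normalised screen coordinates, 0.0 (left/top) to 1.0
    y: float

class DwellScroller:
    """Fires a scroll event when gaze dwells in a screen-edge band."""

    def __init__(self, dwell_s: float = 0.8, band_h: float = 0.15):
        self.dwell_s = dwell_s    # how long gaze must stay in a band (assumed)
        self.band_h = band_h      # height of the top/bottom trigger bands (assumed)
        self._region = None       # band currently being dwelled on
        self._since = 0.0         # when the current dwell started

    def _region_of(self, s: GazeSample):
        if s.y < self.band_h:
            return "up"           # top band scrolls the page up
        if s.y > 1.0 - self.band_h:
            return "down"         # bottom band scrolls the page down
        return None

    def update(self, s: GazeSample):
        """Feed one gaze sample; returns 'up'/'down' when a dwell completes."""
        region = self._region_of(s)
        if region != self._region:   # gaze entered a new region: restart the timer
            self._region, self._since = region, s.t
            return None
        if region and s.t - self._since >= self.dwell_s:
            self._since = s.t        # re-arm so a sustained dwell keeps scrolling
            return region
        return None

# Usage: simulate gaze resting on the bottom band, sampled at 30 Hz.
scroller = DwellScroller()
for i in range(60):
    event = scroller.update(GazeSample(t=i / 30, x=0.5, y=0.95))
    if event:
        print(f"scroll {event} at t={i / 30:.2f}s")

Re-arming the timer after each trigger lets a sustained dwell produce repeated scroll events, which suits continuous reading better than a one-shot trigger.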

