Show simple item record

Item metadata

dc.contributor.author: Demšar, Urška
dc.contributor.author: Çöltekin, Arzu
dc.date.accessioned: 2017-08-07T09:30:15Z
dc.date.available: 2017-08-07T09:30:15Z
dc.date.issued: 2017-08-04
dc.identifier.citation: Demšar, U & Çöltekin, A 2017, 'Quantifying gaze and mouse interactions on spatial visual interfaces with a new movement analytics methodology', PLoS One, vol. 12, no. 8. https://doi.org/10.1371/journal.pone.0181818
dc.identifier.issn: 1932-6203
dc.identifier.other: PURE: 250537402
dc.identifier.other: PURE UUID: a532183c-53cd-40b1-80bc-c521358c95eb
dc.identifier.other: Scopus: 85026812809
dc.identifier.other: ORCID: /0000-0001-7791-2807/work/48516829
dc.identifier.uri: http://hdl.handle.net/10023/11395
dc.description: This research was supported by the Royal Society International Exchange Programme (grant no. IE120643).
dc.description.abstract: Eye movements provide insights into what people pay attention to, and are therefore commonly included in a variety of human-computer interaction studies. Eye movement recording devices (eye trackers) produce gaze trajectories, that is, sequences of gaze locations on the screen. Despite recent technological developments that have made the hardware more affordable, gaze data remain costly and time-consuming to collect, so some researchers propose using mouse movements instead, as these are easy to collect automatically and on a large scale. Whether and how these two movement types are linked, however, is less clear and highly debated. We address this problem in two ways. First, we introduce a new movement analytics methodology to quantify the level of dynamic interaction between the gaze and the mouse pointer on the screen. Our method uses a volumetric representation of movement, the space-time density, which allows us to calculate interaction levels between two physically different types of movement. We describe the method and compare the results with existing dynamic interaction methods from movement ecology. Sensitivity to method parameters is evaluated on simulated trajectories where we can control interaction levels. Second, we perform an experiment with eye and mouse tracking to generate real data with real levels of interaction, in order to apply and test our new methodology on a real case. Further, because our experimental task mimics route tracing on a map, it is more than a data collection exercise: it simultaneously allows us to investigate the actual connection between the eye and the mouse. We find that there seems to be a natural coupling when the eyes are not under conscious control, but that this coupling breaks down when participants are instructed to move them intentionally. Based on these observations, we tentatively suggest that for natural tracing tasks, mouse tracking could provide information similar to eye tracking and could therefore be used as a proxy for attention. However, more research is needed to confirm this.
dc.format.extent: 36
dc.language.iso: eng
dc.relation.ispartof: PLoS One
dc.rights: © 2017 Demšar, Çöltekin. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
dc.subject: Human-Computer Interaction
dc.subject: Eye Tracking
dc.subject: Mouse Tracking
dc.subject: Hand-Eye coordination
dc.subject: Movement Analytics
dc.subject: Movement Visualization
dc.subject: Geographic Information Science
dc.subject: RC0321 Neuroscience. Biological psychiatry. Neuropsychiatry
dc.subject: DAS
dc.subject.lcc: RC0321
dc.title: Quantifying gaze and mouse interactions on spatial visual interfaces with a new movement analytics methodology
dc.type: Journal article
dc.description.version: Publisher PDF
dc.contributor.institution: University of St Andrews. School of Geography & Sustainable Development
dc.contributor.institution: University of St Andrews. Bell-Edwards Geographic Data Institute
dc.identifier.doi: https://doi.org/10.1371/journal.pone.0181818
dc.description.status: Peer reviewed
dc.identifier.url: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0181818#sec032
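The abstract mentions comparing the new space-time-density method against existing dynamic interaction measures from movement ecology. As a rough illustration only (not the authors' method), the sketch below computes one such simpler measure, a proximity index: the fraction of simultaneous fixes at which two trajectories lie within a chosen distance threshold. The trajectories, the threshold value, and the function name are all illustrative assumptions.

```python
import numpy as np

def proximity_index(traj_a, traj_b, d_threshold):
    """Fraction of simultaneous fixes where two trajectories are within
    d_threshold of each other (a simple proximity-based interaction
    measure from movement ecology; NOT the paper's space-time-density
    method, just an illustrative baseline)."""
    # traj_a, traj_b: arrays of shape (n, 2), rows aligned by timestamp
    dists = np.linalg.norm(traj_a - traj_b, axis=1)
    return float(np.mean(dists <= d_threshold))

# Toy example: hypothetical gaze and mouse positions at 5 shared time steps
gaze  = np.array([[0, 0], [1, 1], [2, 2], [3, 3], [4, 4]], dtype=float)
mouse = np.array([[0, 0], [1, 2], [5, 5], [3, 3], [4, 5]], dtype=float)

print(proximity_index(gaze, mouse, d_threshold=1.5))  # 0.8
```

High values indicate the two pointers track each other closely in both space and time; a threshold-based index like this is sensitive to the choice of d_threshold, which is one motivation the abstract gives for a density-based alternative.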

