St Andrews Research Repository


User-defined interface gestures: dataset and analysis

Files
AuxiliaryMaterialCompressed.zip (97.63 MB)
GrijincuNacentaKristensson_UserDefinedInterfaceGesturesDataset.pdf (1.906 MB)
Date
16/11/2014
Author
Grijincu, Daniela
Nacenta, Miguel
Kristensson, Per Ola
Keywords
Gesture design
User-defined gestures
Gesture elicitation
Gesture analysis methodology
Gesture annotation
Gesture memorability
Gestures
Gesture datasets
Crowdsourcing
QA75 Electronic computers. Computer science
Abstract
We present a video-based gesture dataset and a methodology for annotating video-based gesture datasets. Our dataset consists of user-defined gestures generated by 18 participants from a previous investigation of gesture memorability. We design and use a crowd-sourced classification task to annotate the videos. The results are made available through a web-based visualization that allows researchers and designers to explore the dataset. Finally, we perform an additional descriptive analysis and quantitative modeling exercise that provides further insight into the results of the original study. To facilitate the use of the presented methodology by other researchers we share the data, the source of the human intelligence tasks for crowdsourcing, a new taxonomy that integrates previous work, and the source code of the visualization tool.
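
As a rough illustration of the kind of crowd-sourced annotation pipeline the abstract describes, the sketch below shows one common way to aggregate multiple workers' labels per video: majority vote with an agreement threshold. This is a minimal, hypothetical example in Python; the label vocabulary, video identifiers, and threshold are invented for illustration and are not taken from the dataset or the paper, which defines its own taxonomy and task design (see the PDF above).

    from collections import Counter

    # Hypothetical worker annotations: video_id -> labels assigned by
    # different crowd workers. All names and values here are invented.
    annotations = {
        "video_01": ["swipe", "swipe", "flick"],
        "video_02": ["pinch", "pinch", "pinch"],
        "video_03": ["tap", "circle", "tap", "tap"],
    }

    def majority_label(labels, min_agreement=0.5):
        """Return the most frequent label, the fraction of workers who
        chose it, and whether that fraction clears the threshold."""
        label, count = Counter(labels).most_common(1)[0]
        agreement = count / len(labels)
        return label, agreement, agreement >= min_agreement

    for video_id, labels in annotations.items():
        label, agreement, accepted = majority_label(labels)
        status = "accepted" if accepted else "needs review"
        print(f"{video_id}: {label} ({agreement:.0%} agreement, {status})")

Videos whose top label falls below the threshold would typically be flagged for further annotation rather than discarded, which is the usual trade-off between annotation cost and label reliability in crowdsourcing pipelines.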
Citation
Grijincu, D., Nacenta, M. & Kristensson, P. O. 2014, User-defined interface gestures: dataset and analysis. In Proceedings of the 9th ACM International Conference on Interactive Tabletops and Surfaces (ITS 2014). ACM, New York, NY, pp. 25-34. https://doi.org/10.1145/2669485.2669511
Publication
Proceedings of the 9th ACM International Conference on Interactive Tabletops and Surfaces (ITS 2014)
DOI
https://doi.org/10.1145/2669485.2669511
Type
Conference item
Rights
© Authors/owners 2014. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in the Proceedings of the 2014 ACM Interactive Tabletops and Surfaces Conference (ITS '14): http://dx.doi.org/10.1145/2669485.2669511
 
This material is open for use by anyone. We would appreciate it if you cite the original paper when using the data or the videos in your own work.
Collections
  • University of St Andrews Research
URL
http://udigesturesdataset.cs.st-andrews.ac.uk/
URI
http://hdl.handle.net/10023/5841

Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.

Related items

Showing items related by title, author, creator and subject.

  • Great ape gestures: intentional communication with a rich set of innate signals

    Byrne, R. W.; Cartmill, E.; Genty, E.; Graham, K. E.; Hobaiter, C.; Tanner, J. (2017-07) - Journal item
    Great apes give gestures deliberately and voluntarily, in order to influence particular target audiences, whose direction of attention they take into account when choosing which type of gesture to use. These facts make the ...
  • Memorability of pre-designed and user-defined gesture sets 

    Nacenta, Miguel; Kamber, Yemliha; Qiang, Yizhou; Kristensson, Per Ola (ACM, 2013-04-27) - Conference item
    We studied the memorability of free-form gesture sets for invoking actions. We compared three types of gesture sets: user-defined gesture sets, gesture sets designed by the authors, and random gesture sets in three studies ...
  • Itchy Nose: discreet gesture interaction using EOG sensors in smart eyewear

    Lee, Juyoung; Yeo, Hui Shyong; Dhuliawala, Murtaza; Akano, Jedidiah; Shimizu, Junichi; Starner, Thad; Quigley, Aaron John; Woo, Woontack; Kunze, Kai (ACM, 2017-09-11) - Conference item
    We propose a sensing technique for detecting finger movements on the nose, using EOG sensors embedded in the frame of a pair of eyeglasses. Eyeglasses wearers can use their fingers to exert different types of movement on ...