Multi-scale gestural interaction for augmented reality
Abstract
We present a multi-scale gestural interface for augmented reality applications. Gestural interactions with virtual objects, such as pointing and grasping, can be convenient and intuitive; however, they are imprecise, socially awkward, and susceptible to fatigue. Our prototype application uses multiple sensors to detect gestures from both arm and hand motions (macro-scale) and finger movements (micro-scale). Microgestures can provide precise input through a belt-worn sensor configuration, with the hand in a relaxed posture. We present an application that combines direct manipulation with microgestures for precise interaction, beyond the capabilities of direct manipulation alone.
Citation
Ens, B, Quigley, AJ, Yeo, HS, Irani, P & Billinghurst, M 2017, 'Multi-scale gestural interaction for augmented reality', in SA '17: SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications, 11, ACM, New York. 10th ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia, Bangkok, Thailand, 27/11/17. https://doi.org/10.1145/3132787.3132808
Publication
SA '17 SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications
Type
Conference item
Rights
© 2017, the Owner(s) / the Author(s). This work has been made available online in accordance with the publisher's policies. This is the author-created, accepted manuscript following peer review and may differ slightly from the final published version. The final published version of this work is available at https://doi.org/10.1145/3132787.3132808