Multi-scale gestural interaction for augmented reality
We present a multi-scale gestural interface for augmented reality applications. With virtual objects, gestural interactions such as pointing and grasping can be convenient and intuitive; however, they are imprecise, socially awkward, and susceptible to fatigue. Our prototype application uses multiple sensors to detect gestures from both arm and hand motions (macro-scale) and finger gestures (micro-scale). Micro-gestures can provide precise input through a belt-worn sensor configuration, with the hand in a relaxed posture. We present an application that combines direct manipulation with micro-gestures for precise interaction, beyond the capabilities of direct manipulation alone.
Ens, B, Quigley, A J, Yeo, H S, Irani, P & Billinghurst, M 2017, 'Multi-scale gestural interaction for augmented reality', in SA '17: SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications, 11, ACM, New York, 10th ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia, Bangkok, Thailand, 27/11/17. https://doi.org/10.1145/3132787.3132808
© 2017, the Owner(s) / the Author(s). This work has been made available online in accordance with the publisher's policies. This is the author-created, accepted manuscript following peer review and may differ slightly from the final published version. The final published version of this work is available at https://doi.org/10.1145/3132787.3132808
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.