SWAG demo: smart watch assisted gesture interaction for mixed reality head-mounted displays
In this demonstration, we show a prototype system that uses a sensor-fusion approach to robustly track the six degrees of freedom of hand movement and to support intuitive hand-gesture interaction and 3D object manipulation for Mixed Reality head-mounted displays. Robust tracking of the hand and fingers with an egocentric camera remains a challenging problem, especially under self-occlusion – for example, when the user tries to grab a virtual object in midair by closing the palm. Our approach leverages a common smart watch worn on the wrist to provide more reliable palm and wrist orientation data, and fuses these data with the camera's estimates to achieve robust hand motion and orientation tracking for interaction.
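The abstract does not give the authors' fusion algorithm; as an illustration only, the idea of weighting the watch's IMU orientation against the camera's estimate can be sketched as a confidence-weighted complementary filter. The function name, angle representation, and confidence parameter below are all assumptions, not details from the paper.

```python
# Hypothetical sketch (not the authors' implementation): blend a wrist
# orientation from a smartwatch IMU with one from the HMD's egocentric
# camera, weighted by how confident the camera-based tracker is.

def fuse_orientation(watch_angles, camera_angles, camera_confidence):
    """Blend two (roll, pitch, yaw) estimates, given in degrees.

    camera_confidence lies in [0, 1] and would drop toward 0 under
    self-occlusion (e.g. a closed fist), so the watch then dominates.
    """
    w = max(0.0, min(1.0, camera_confidence))
    return tuple(w * cam + (1.0 - w) * watch
                 for watch, cam in zip(watch_angles, camera_angles))

# Camera clearly sees the hand: its estimate dominates the blend.
open_hand = fuse_orientation((10.0, 0.0, 90.0), (12.0, 2.0, 94.0), 0.9)
# Hand fully self-occluded: fall back entirely to the watch.
closed_fist = fuse_orientation((10.0, 0.0, 90.0), (12.0, 2.0, 94.0), 0.0)
print(open_hand, closed_fist)
```

A real system would interpolate quaternions (e.g. slerp) rather than linearly blending Euler angles, which wrap around at ±180°; the linear blend here is only to keep the sketch short.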
Kim, H, Lee, J, Yeo, H S, Quigley, A J & Woo, W 2018, SWAG demo: smart watch assisted gesture interaction for mixed reality head-mounted displays. In 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), IEEE Computer Society, pp. 428-429. 2018 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Munich, Germany, 16/12/18. https://doi.org/10.1109/ISMAR-Adjunct.2018.0013
© 2018, IEEE. This work has been made available online in accordance with the publisher's policies. This is the author-created accepted manuscript following peer review and as such may differ slightly from the final published version. The final published version of this work is available at https://doi.org/10.1109/ISMAR-Adjunct.2018.0013
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.