Files in this item
SWAG demo : smart watch assisted gesture interaction for mixed reality head-mounted displays
Item metadata
dc.contributor.author | Kim, Hyung-il | |
dc.contributor.author | Lee, Juyoung | |
dc.contributor.author | Yeo, Hui Shyong | |
dc.contributor.author | Quigley, Aaron John | |
dc.contributor.author | Woo, Woontack | |
dc.date.accessioned | 2019-02-08T10:30:04Z | |
dc.date.available | 2019-02-08T10:30:04Z | |
dc.date.issued | 2019-04-25 | |
dc.identifier.citation | Kim, H., Lee, J., Yeo, H. S., Quigley, A. J. & Woo, W. 2019, SWAG demo: smart watch assisted gesture interaction for mixed reality head-mounted displays. in Adjunct Proceedings - 2018 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2018, 8699201, Institute of Electrical and Electronics Engineers Inc., pp. 428-429, 17th IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2018, Munich, Germany, 16/10/18. https://doi.org/10.1109/ISMAR-Adjunct.2018.00130 | en |
dc.identifier.citation | conference | en |
dc.identifier.isbn | 9781538675922 | |
dc.identifier.other | PURE: 257369125 | |
dc.identifier.other | PURE UUID: e6662490-bb20-4dec-8e2e-25ca8b30b991 | |
dc.identifier.other | Scopus: 85065528667 | |
dc.identifier.other | ORCID: /0000-0002-5274-6889/work/51943797 | |
dc.identifier.other | WOS: 000487013100111 | |
dc.identifier.uri | http://hdl.handle.net/10023/17018 | |
dc.description.abstract | In this demonstration, we will show a prototype system that uses a sensor-fusion approach to robustly track hand movement in six degrees of freedom and to support intuitive hand gesture interaction and 3D object manipulation for Mixed Reality head-mounted displays. Robust tracking of the hand and fingers with an egocentric camera remains a challenging problem, especially under self-occlusion – for example, when the user tries to grab a virtual object in midair by closing the palm. Our approach leverages a common smart watch worn on the wrist to provide more reliable palm and wrist orientation data, fusing these data with the camera's to achieve robust hand motion and orientation tracking for interaction. | |
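The fusion idea in the abstract – trusting the smart watch's orientation estimate when the egocentric camera loses the hand to self-occlusion – can be sketched as a simple confidence-weighted blend. This is an illustrative assumption, not the authors' implementation; the function name, the weighting scheme, and the `alpha` parameter are all hypothetical.

```python
def fuse_wrist_yaw(camera_yaw, camera_confidence, imu_yaw, alpha=0.8):
    """Blend a camera-derived wrist yaw with a smartwatch IMU yaw (degrees).

    When the camera's tracking confidence drops (e.g. the palm closes
    around a virtual object and occludes the fingers), the weight on
    the camera estimate shrinks and the IMU reading dominates.
    Hypothetical sketch; not the SWAG authors' actual algorithm.
    """
    w = alpha * camera_confidence  # weight given to the camera estimate
    return w * camera_yaw + (1.0 - w) * imu_yaw

# Fully confident camera: blended estimate leans toward the camera reading.
print(fuse_wrist_yaw(30.0, 1.0, 40.0))  # 0.8*30 + 0.2*40 = 32.0
# Occluded hand (confidence 0): fall back entirely to the watch IMU.
print(fuse_wrist_yaw(30.0, 0.0, 40.0))  # 40.0
```

A real system would fuse full 3-DoF orientation (e.g. quaternions) and hand position as well, but the same principle applies: the watch supplies the channel the camera cannot observe reliably.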
dc.format.extent | 2 | |
dc.language.iso | eng | |
dc.publisher | Institute of Electrical and Electronics Engineers Inc. | |
dc.relation.ispartof | Adjunct Proceedings - 2018 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2018 | en |
dc.rights | © 2018, IEEE. This work has been made available online in accordance with the publisher's policies. This is the author created accepted version manuscript following peer review and as such may differ slightly from the final published version. The final published version of this work is available at https://doi.org/10.1109/ISMAR-Adjunct.2018.00130 | en |
dc.subject | Augmented reality | en |
dc.subject | Wearable computing | en |
dc.subject | 3D user interfaces | en |
dc.subject | Hand interaction | en |
dc.subject | Virtual 3D object manipulation | en |
dc.subject | QA75 Electronic computers. Computer science | en |
dc.subject | T Technology | en |
dc.subject | Electrical and Electronic Engineering | en |
dc.subject | Artificial Intelligence | en |
dc.subject | NDAS | en |
dc.subject.lcc | QA75 | en |
dc.subject.lcc | T | en |
dc.title | SWAG demo : smart watch assisted gesture interaction for mixed reality head-mounted displays | en |
dc.type | Conference item | en |
dc.description.version | Postprint | en |
dc.contributor.institution | University of St Andrews. School of Computer Science | en |
dc.identifier.doi | https://doi.org/10.1109/ISMAR-Adjunct.2018.00130 |
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.