Tangible UI by object and material classification with radar
Abstract
Radar signals penetrate, scatter, absorb and reflect energy into proximate objects, and ground-penetrating and aerial radar systems are well established. We describe a highly accurate system based on a combination of a monostatic radar (Google Soli) and supervised machine learning to support object- and material-classification-based UIs. Building on RadarCat techniques, we explore the development of tangible user interfaces that require no modification of the objects and no complex infrastructure. This affords new forms of interaction with digital devices, proximate objects and micro-gestures.
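To make the classification stage described above more concrete, the following is a minimal sketch of how per-frame radar feature vectors might be labelled by material and fed to a supervised classifier. It is illustrative only: the synthetic data, the 64-dimensional feature vectors, the example material labels and the random-forest model are assumptions for demonstration, not the RadarCat implementation described in the full paper.

```python
# Hypothetical sketch of supervised object/material classification on radar
# features. All data and model choices here are illustrative stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in data: 200 radar "frames", each summarised as a 64-dimensional
# feature vector, captured from four example materials.
materials = ["air", "wood", "glass", "steel"]
X = rng.normal(size=(200, 64))
y = rng.integers(0, len(materials), size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# A random forest is one common choice for low-latency, per-frame
# classification; any supervised classifier could stand in here.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Per-frame prediction: the predicted label indexes into the material list,
# which an interface layer could then map to a tangible-UI action.
pred = clf.predict(X_test)
print("held-out accuracy:", (pred == y_test).mean())
print("example prediction:", materials[pred[0]])
```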
Citation
Yeo, H. S., Ens, B., & Quigley, A. J. (2017). Tangible UI by object and material classification with radar. In SA '17 SIGGRAPH Asia 2017 Emerging Technologies (Article 14). ACM, New York. 10th ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia, Bangkok, Thailand, 27/11/17. https://doi.org/10.1145/3132818.3132824
Publication
SA '17 SIGGRAPH Asia 2017 Emerging Technologies
Type
Conference item
Rights
© 2017, the Owner(s) / the Author(s). This work has been made available online in accordance with the publisher’s policies. This is the author-created, accepted manuscript following peer review and may differ slightly from the final published version. The final published version of this work is available at https://doi.org/10.1145/3132818.3132824
Collections
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.