Item metadata

dc.contributor.author: Yeo, Hui Shyong
dc.contributor.author: Flamich, Gergely
dc.contributor.author: Schrempf, Patrick Maurice
dc.contributor.author: Harris-Birtill, David Cameron Christopher
dc.contributor.author: Quigley, Aaron John
dc.date.accessioned: 2016-10-16T23:34:17Z
dc.date.available: 2016-10-16T23:34:17Z
dc.date.issued: 2016-10-16
dc.identifier: 245678240
dc.identifier: c267eae2-2848-4520-ac0a-db13fd9068a3
dc.identifier: 84995792510
dc.identifier: 000387605000076
dc.identifier.citation: Yeo, H. S., Flamich, G., Schrempf, P. M., Harris-Birtill, D. C. C. & Quigley, A. J. 2016, RadarCat: Radar Categorization for input & interaction. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology (UIST '16), ACM, pp. 833-841. 29th ACM User Interface Software and Technology Symposium, Tokyo, Japan, 16/10/16. https://doi.org/10.1145/2984511.2984515 [en]
dc.identifier.citation: conference [en]
dc.identifier.isbn: 9781450341899
dc.identifier.other: ORCID: /0000-0002-5274-6889/work/34040080
dc.identifier.other: ORCID: /0000-0002-0740-3668/work/43832018
dc.identifier.other: ORCID: /0000-0003-2484-6855/work/48516901
dc.identifier.uri: https://hdl.handle.net/10023/9672
dc.description: The research described here was supported by the University of St Andrews and the Scottish Informatics and Computer Science Alliance (SICSA). [en]
dc.description.abstract: In RadarCat we present a small, versatile radar-based system for material and object classification which enables new forms of everyday proximate interaction with digital devices. We demonstrate that we can train and classify different types of materials and objects which we can then recognize in real time. Based on established research designs, we report on the results of three studies: first with 26 materials (including complex composite objects), next with 16 transparent materials (with different thicknesses and varying dyes), and finally with 10 body parts from 6 participants. Both leave-one-out and 10-fold cross-validation demonstrate that our approach of classifying radar signals with a random forest classifier is robust and accurate. We further demonstrate four working examples built on RadarCat, including a physical object dictionary, a painting and photo editing application, body shortcuts and automatic refill. We conclude with a discussion of our results, limitations and an outline of future directions. [en]
dc.format.extent: 9
dc.format.extent: 7739493
dc.language.iso: eng
dc.publisher: ACM
dc.relation.ispartof: Proceedings of the 29th Annual Symposium on User Interface Software and Technology (UIST '16) [en]
dc.subject: Context-aware interaction [en]
dc.subject: Machine learning [en]
dc.subject: Material classification [en]
dc.subject: Object recognition [en]
dc.subject: Ubiquitous computing [en]
dc.subject: QA76 Computer software [en]
dc.subject: T Technology [en]
dc.subject: NDAS [en]
dc.subject: BDC [en]
dc.subject: R2C [en]
dc.subject: ~DC~ [en]
dc.subject.lcc: QA76 [en]
dc.subject.lcc: T [en]
dc.title: RadarCat: Radar Categorization for input & interaction [en]
dc.type: Conference item [en]
dc.contributor.institution: University of St Andrews. School of Computer Science [en]
dc.identifier.doi: 10.1145/2984511.2984515
dc.date.embargoedUntil: 2016-10-16
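
The abstract above describes classifying radar signals with a random forest classifier, evaluated by both leave-one-out and 10-fold cross-validation. Below is a minimal illustrative sketch of that evaluation setup in Python with scikit-learn; it is not the authors' code, and the feature dimensions, sample counts and group assignments are placeholders rather than values from the paper.

# Minimal sketch (not the authors' implementation): random forest
# classification of pre-extracted radar features, evaluated with
# 10-fold and leave-one-group-out cross-validation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, LeaveOneGroupOut

rng = np.random.default_rng(0)
X = rng.normal(size=(260, 64))      # placeholder: 260 radar captures x 64 features
y = np.repeat(np.arange(26), 10)    # placeholder: 26 material classes, 10 captures each

clf = RandomForestClassifier(n_estimators=100, random_state=0)

# 10-fold cross-validation accuracy.
scores_10fold = cross_val_score(clf, X, y, cv=10)
print("10-fold accuracy: %.3f +/- %.3f" % (scores_10fold.mean(), scores_10fold.std()))

# Leave-one-group-out: hold out one capture group (e.g. one object
# instance or session) at a time and train on the rest.
groups = np.tile(np.arange(10), 26)  # placeholder group ids
scores_logo = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print("leave-one-group-out accuracy: %.3f" % scores_logo.mean())

With real data, X would hold feature vectors derived from the radar signal and groups would identify the captures to be held out together; the random data here only demonstrates the evaluation mechanics.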

