SLearn: shared learning human activity labels across multiple datasets
Abstract
Research on sensor-based human activity recognition has attracted increasing attention over the years, as it plays an important role in applications that benefit people, such as ambient assisted living, health monitoring, and behaviour change. The advancement of sensing and communication technologies now makes it possible to collect large amounts of sensor data; however, building a reliable computational model that accurately recognises human activities still requires annotations on that data, and acquiring high-quality, detailed, continuous annotations is challenging. In this paper, we explore the solution space of sharing annotated activities across different datasets in order to enhance recognition accuracy. We design and develop two approaches to address this challenge: sharing training data and sharing classifiers. We validate both approaches on three datasets and demonstrate their effectiveness in recognising activities with annotations on as little as 0.1% of each dataset.
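The paper's algorithm is not described in this record, but the core idea of the first approach (pooling a small labelled fraction from every dataset to train one shared classifier) can be illustrated with a minimal, self-contained sketch. Everything here is an assumption for illustration only: the synthetic two-activity data, the 1% labelled fraction (the paper reports results down to 0.1%), and the simple nearest-centroid classifier are stand-ins, not the authors' method.

```python
# Illustrative sketch (NOT the paper's algorithm): pool sparse labels from
# several datasets and train a single shared classifier on the pooled set.
import random

random.seed(0)

def make_dataset(shift, n=500):
    """Synthetic stand-in for one sensor dataset: 2-D features, two
    activities; `shift` mimics a dataset-specific sensor bias."""
    data = []
    for _ in range(n):
        label = random.choice(["walk", "sit"])
        cx, cy = (2.0, 2.0) if label == "walk" else (-2.0, -2.0)
        point = (cx + shift + random.gauss(0, 0.5),
                 cy + shift + random.gauss(0, 0.5))
        data.append((point, label))
    return data

def fit_centroids(labelled):
    """Nearest-centroid classifier trained on the pooled labelled samples."""
    sums, counts = {}, {}
    for (x, y), lab in labelled:
        sx, sy = sums.get(lab, (0.0, 0.0))
        sums[lab] = (sx + x, sy + y)
        counts[lab] = counts.get(lab, 0) + 1
    return {lab: (sx / counts[lab], sy / counts[lab])
            for lab, (sx, sy) in sums.items()}

def predict(centroids, point):
    """Assign the label of the closest class centroid."""
    x, y = point
    return min(centroids,
               key=lambda lab: (x - centroids[lab][0]) ** 2
                             + (y - centroids[lab][1]) ** 2)

# Three datasets with slightly different sensor biases.
datasets = [make_dataset(shift) for shift in (0.0, 0.3, -0.3)]

# "Share" a tiny labelled fraction (1% here) from every dataset ...
shared = [s for d in datasets for s in d[: len(d) // 100]]
centroids = fit_centroids(shared)

# ... and evaluate the shared classifier on all remaining samples.
test = [s for d in datasets for s in d[len(d) // 100:]]
acc = sum(predict(centroids, p) == lab for p, lab in test) / len(test)
print(f"shared-label accuracy: {acc:.2f}")
```

With well-separated synthetic activities, 15 pooled labels (5 per dataset) are enough for high accuracy; the paper's contribution is making this kind of sharing work on real, heterogeneous sensor datasets.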
Citation
Ye, J 2018, 'SLearn: shared learning human activity labels across multiple datasets', in 2018 IEEE International Conference on Pervasive Computing and Communications, 8444594, IEEE Computer Society, IEEE International Conference on Pervasive Computing and Communications (PerCom), Athens, Greece, 19/03/18. https://doi.org/10.1109/PERCOM.2018.8444594
Publication
2018 IEEE International Conference on Pervasive Computing and Communications
Type
Conference item
Rights
© 2018, IEEE. This work has been made available online in accordance with the publisher's policies. This is the author-created, accepted manuscript following peer review, and it may differ slightly from the final published version. The final published version of this work is available at https://doi.org/10.1109/PERCOM.2018.8444594