Shared learning activity labels across heterogeneous datasets
Abstract
Advances in sensing and communication technologies make it possible to collect large amounts of sensor data; however, building a reliable computational model that accurately recognises human activities still requires annotations on that data. Acquiring high-quality, detailed, continuous annotations is a challenging task. In this paper, we explore the solution space of sharing annotated activities across different datasets in order to improve recognition accuracy. The main challenge is resolving the heterogeneity in feature and activity space between datasets: each dataset can have a different number of sensors, use heterogeneous sensing technologies, be deployed in diverse environments, and record different activities performed by different users. To address this challenge, we have designed and developed sharing-data and sharing-classifier algorithms that use a knowledge model to enable computationally efficient feature space remapping, and uncertainty reasoning to enable effective classifier fusion. We have validated the algorithms on three third-party real-world datasets and demonstrated their effectiveness in recognising activities with annotations from as little as 0.1% of each dataset.
Citation
Ye, J 2021, 'Shared learning activity labels across heterogeneous datasets', Journal of Ambient Intelligence and Smart Environments, vol. Pre-press, pp. 1-18. https://doi.org/10.3233/AIS-210590
Publication
Journal of Ambient Intelligence and Smart Environments
Status
Peer reviewed
ISSN
1876-1364
Type
Journal article
Rights
Copyright © 2021 IOS Press and the Author(s). All rights reserved. This work has been made available online in accordance with publisher policies or with permission. Permission for further reuse of this content should be sought from the publisher or the rights holder. This is the author created accepted manuscript following peer review and may differ slightly from the final published version. The final published version of this work is available at https://doi.org/10.3233/AIS-210590.
Collections
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.