Unsupervised domain adaptation for activity recognition across heterogeneous datasets
Sensor-based human activity recognition aims to recognise human daily activities through a collection of ambient and wearable sensors. It is a key enabler for many healthcare applications, especially in ambient assisted living. Advances in sensing and communication technologies have driven the deployment of sensors in many residential and care-home settings. However, a challenge still resides in the lack of sufficient, high-quality activity annotations on sensor data, on which most existing activity recognition algorithms rely. In this paper, we propose an Unsupervised Domain adaptation technique for Activity Recognition, called UDAR, which supports sharing and transferring activity models from one dataset to another heterogeneous dataset without the need for activity labels on the latter. This approach combines knowledge- and data-driven techniques to achieve coarse- and fine-grained feature alignment. We have evaluated UDAR on five third-party, real-world datasets and have demonstrated high recognition accuracy and robustness against sensor noise compared to state-of-the-art domain adaptation techniques.
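To make the idea of data-driven feature alignment concrete, the sketch below shows a common second-order alignment baseline (correlation alignment): source-domain features are whitened and then re-coloured with the target-domain covariance so that both domains share the same feature statistics. This is an illustrative baseline only, not the UDAR method itself; the function names and parameters are assumptions for the example.

```python
import numpy as np

def _matrix_power(m, p):
    """Fractional power of a symmetric positive-definite matrix via eigendecomposition."""
    vals, vecs = np.linalg.eigh(m)
    vals = np.clip(vals, 1e-12, None)  # guard against tiny negative eigenvalues
    return vecs @ np.diag(vals ** p) @ vecs.T

def coral_align(source, target, eps=1e-6):
    """Align source features to the target domain (correlation-alignment style).

    Whitens the centred source features with the inverse square root of the
    source covariance, re-colours them with the square root of the target
    covariance, then shifts them to the target mean.
    """
    d = source.shape[1]
    cs = np.cov(source, rowvar=False) + eps * np.eye(d)  # regularised source covariance
    ct = np.cov(target, rowvar=False) + eps * np.eye(d)  # regularised target covariance
    centred = source - source.mean(axis=0)
    aligned = centred @ _matrix_power(cs, -0.5) @ _matrix_power(ct, 0.5)
    return aligned + target.mean(axis=0)
```

After alignment, the source features' mean and covariance match those of the target domain, so a classifier trained on the aligned source data transfers more readily to unlabelled target data.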
Rosales Sanabria, A & Ye, J 2020, 'Unsupervised domain adaptation for activity recognition across heterogeneous datasets', Pervasive and Mobile Computing, vol. In press, 101147. https://doi.org/10.1016/j.pmcj.2020.101147
Copyright © 2020 Published by Elsevier B.V. This work has been made available online in accordance with publisher policies or with permission. Permission for further reuse of this content should be sought from the publisher or the rights holder. This is the author-created accepted manuscript following peer review and may differ slightly from the final published version. The final published version of this work is available at https://doi.org/10.1016/j.pmcj.2020.101147
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.