ContrasGAN: unsupervised domain adaptation in Human Activity Recognition via adversarial and contrastive learning
Date
06/11/2021
Abstract
Human Activity Recognition (HAR) makes it possible to drive applications directly from embedded and wearable sensors. Machine learning, and especially deep learning, has made significant progress in learning sensor features from raw sensing signals with high recognition accuracy. However, most techniques need to be trained on a large labelled dataset, which is often difficult to acquire. In this paper, we present ContrasGAN, an unsupervised domain adaptation technique that addresses this labelling challenge by transferring an activity model from one labelled domain to other unlabelled domains. ContrasGAN uses bi-directional generative adversarial networks for heterogeneous feature transfer and contrastive learning to capture distinctive features between classes. We evaluate ContrasGAN on three commonly used HAR datasets under conditions of cross-body, cross-user, and cross-sensor transfer learning. Experimental results show that ContrasGAN outperforms a number of state-of-the-art techniques on all these tasks, with relatively low computational cost.
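The abstract does not give the loss formulation, so as a rough illustration of the class-aware contrastive idea it describes (pulling embeddings of the same activity class together and pushing different classes apart), the following PyTorch sketch computes a supervised contrastive loss over a batch of activity embeddings. The function name, temperature value, and exact loss form are assumptions for illustration only, not the paper's implementation.

import torch
import torch.nn.functional as F

def supervised_contrastive_loss(features, labels, temperature=0.1):
    # features: (N, D) activity embeddings; labels: (N,) activity class ids.
    # Hypothetical sketch; ContrasGAN's actual contrastive objective may differ.
    z = F.normalize(features, dim=1)                      # unit-norm embeddings
    sim = z @ z.t() / temperature                         # scaled cosine similarities
    n = z.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(eye, float('-inf'))             # exclude self-similarity
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye   # same-class pairs
    pos_counts = pos_mask.sum(dim=1)
    valid = pos_counts > 0                                # anchors with at least one positive
    # Negated average log-probability over same-class pairs for each valid anchor.
    loss = -(log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1)[valid] / pos_counts[valid])
    return loss.mean()

# Toy usage on random embeddings:
feats = torch.randn(8, 64)
labels = torch.tensor([0, 0, 1, 1, 2, 2, 3, 3])
print(supervised_contrastive_loss(feats, labels))

In a domain-adaptation setting of the kind described, such a loss would typically be applied to labelled source-domain embeddings (and to target embeddings via pseudo-labels) alongside the adversarial feature-transfer objective.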
Citation
Rosales Sanabria, A., Zambonelli, F., Dobson, S. A. & Ye, J. 2021, 'ContrasGAN: unsupervised domain adaptation in Human Activity Recognition via adversarial and contrastive learning', Pervasive and Mobile Computing, vol. In Press, 101477, pp. 1-34. https://doi.org/10.1016/j.pmcj.2021.101477
Publication
Pervasive and Mobile Computing
Status
Peer reviewed
ISSN
1574-1192
Type
Journal article
Rights
Copyright © 2021 Elsevier B.V. All rights reserved. This work has been made available online in accordance with publisher policies or with permission. Permission for further reuse of this content should be sought from the publisher or the rights holder. This is the author-created accepted manuscript following peer review and may differ slightly from the final published version. The final published version of this work is available at https://doi.org/10.1016/j.pmcj.2021.101477.
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.