Item metadata

dc.contributor.author: Rosales Sanabria, Andrea
dc.contributor.author: Zambonelli, Franco
dc.contributor.author: Dobson, Simon Andrew
dc.contributor.author: Ye, Juan
dc.date.accessioned: 2022-11-06T00:44:32Z
dc.date.available: 2022-11-06T00:44:32Z
dc.date.issued: 2021-11-06
dc.identifier: 276192008
dc.identifier: cbfb37bc-0d93-4733-9ef5-b4e01fe2c5f0
dc.identifier: 85119333188
dc.identifier: 000724992900003
dc.identifier.citation: Rosales Sanabria, A., Zambonelli, F., Dobson, S. A. & Ye, J. 2021, 'ContrasGAN: unsupervised domain adaptation in Human Activity Recognition via adversarial and contrastive learning', Pervasive and Mobile Computing, vol. In Press, 101477, pp. 1-34. https://doi.org/10.1016/j.pmcj.2021.101477
dc.identifier.issn: 1574-1192
dc.identifier.other: ORCID: /0000-0002-2838-6836/work/103137333
dc.identifier.other: ORCID: /0000-0001-9633-2103/work/103137494
dc.identifier.uri: https://hdl.handle.net/10023/26305
dc.description.abstract: Human Activity Recognition (HAR) makes it possible to drive applications directly from embedded and wearable sensors. Machine learning, and especially deep learning, has made significant progress in learning sensor features from raw sensing signals with high recognition accuracy. However, most techniques need to be trained on a large labelled dataset, which is often difficult to acquire. In this paper, we present ContrasGAN, an unsupervised domain adaptation technique that addresses this labelling challenge by transferring an activity model from one labelled domain to other unlabelled domains. ContrasGAN uses bi-directional generative adversarial networks for heterogeneous feature transfer and contrastive learning to capture distinctive features between classes. We evaluate ContrasGAN on three commonly used HAR datasets under conditions of cross-body, cross-user, and cross-sensor transfer learning. Experimental results show that ContrasGAN outperforms a number of state-of-the-art techniques on all of these tasks, with relatively low computational cost.
dc.format.extent: 34
dc.format.extent: 5332212
dc.language.iso: eng
dc.relation.ispartof: Pervasive and Mobile Computing
dc.subject: Human activity recognition
dc.subject: Unsupervised domain adaptation
dc.subject: GAN
dc.subject: Contrastive loss
dc.subject: QA75 Electronic computers. Computer science
dc.subject: 3rd-DAS
dc.subject: AC
dc.subject.lcc: QA75
dc.title: ContrasGAN: unsupervised domain adaptation in Human Activity Recognition via adversarial and contrastive learning
dc.type: Journal article
dc.contributor.institution: University of St Andrews. School of Computer Science
dc.contributor.institution: University of St Andrews. Sir James Mackenzie Institute for Early Diagnosis
dc.identifier.doi: 10.1016/j.pmcj.2021.101477
dc.description.status: Peer reviewed
dc.date.embargoedUntil: 2022-11-06
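
The abstract above describes a contrastive-learning component that pulls embeddings of the same activity class together and pushes different classes apart. The following Python sketch illustrates a generic supervised contrastive loss of that kind only; it is not the authors' ContrasGAN implementation, and the function name, temperature value, embedding size, and toy data are illustrative assumptions.

# Generic supervised contrastive loss sketch (assumption: PyTorch-style
# embeddings and integer activity labels). Not the ContrasGAN code.
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """features: (N, D) embeddings; labels: (N,) activity class ids."""
    z = F.normalize(features, dim=1)                      # unit-norm embeddings
    sim = z @ z.t() / temperature                         # pairwise similarities
    n = z.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float('-inf'))       # drop self-similarity
    # Positives are pairs sharing an activity label (excluding self-pairs).
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # Average log-probability over each anchor's positives, where any exist.
    sum_pos = log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1)
    pos_counts = pos_mask.sum(dim=1)
    valid = pos_counts > 0
    return (-sum_pos[valid] / pos_counts[valid]).mean()

if __name__ == "__main__":
    # Toy example: 8 embeddings from 3 hypothetical activity classes.
    feats = torch.randn(8, 64)
    labs = torch.tensor([0, 0, 1, 1, 2, 2, 0, 1])
    print(supervised_contrastive_loss(feats, labs).item())

In an adversarial adaptation setting such as the one the abstract outlines, a loss of this shape would typically be applied to the adapted feature representations alongside the GAN objectives; the exact combination used in the paper is not reproduced here.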

