Short and long range relation based spatio-temporal transformer for micro-expression recognition
Abstract
Being spontaneous, micro-expressions are useful in the inference of a person's true emotions even if an attempt is made to conceal them. Due to their short duration and low intensity, the recognition of micro-expressions is a difficult task in affective computing. Early work based on handcrafted spatio-temporal features showed some promise, but has recently been superseded by different deep learning approaches which now compete for state-of-the-art performance. Nevertheless, the problem of capturing both local and global spatio-temporal patterns remains challenging. To this end, herein we propose a novel spatio-temporal transformer architecture – to the best of our knowledge, the first purely transformer based approach (i.e. void of any convolutional network use) for micro-expression recognition. The architecture comprises a spatial encoder which learns spatial patterns, a temporal aggregator for temporal dimension analysis, and a classification head. A comprehensive evaluation on three widely used spontaneous micro-expression data sets, namely SMIC-HS, CASME II and SAMM, shows that the proposed approach consistently outperforms the state of the art, and is the first framework in the published literature on micro-expression recognition to achieve an unweighted F1-score greater than 0.9 on any of the aforementioned data sets.
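The three-stage pipeline the abstract describes (spatial encoder over per-frame patch tokens, temporal aggregator across frames, classification head) can be sketched in miniature as follows. This is an illustrative outline only, not the authors' implementation: the class name, pooling choices, and single-head attention are all simplifying assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, wq, wk, wv):
    """Single-head scaled dot-product attention over tokens; x: (tokens, dim)."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = softmax(q @ k.T / np.sqrt(q.shape[-1]))
    return scores @ v

class SpatioTemporalSketch:
    """Hypothetical outline of the abstract's pipeline:
    spatial encoder -> temporal aggregator -> classification head."""
    def __init__(self, dim, num_classes, seed=0):
        rng = np.random.default_rng(seed)
        self.wq = rng.standard_normal((dim, dim)) * 0.02
        self.wk = rng.standard_normal((dim, dim)) * 0.02
        self.wv = rng.standard_normal((dim, dim)) * 0.02
        self.w_head = rng.standard_normal((dim, num_classes)) * 0.02

    def forward(self, frames):
        # frames: (T, tokens, dim) -- patch embeddings of each video frame
        # 1) spatial encoder: self-attention within each frame
        spatial = np.stack([self_attention(f, self.wq, self.wk, self.wv)
                            for f in frames])
        # 2) temporal aggregator: pool tokens per frame, then across time
        frame_feats = spatial.mean(axis=1)    # (T, dim)
        clip_feat = frame_feats.mean(axis=0)  # (dim,)
        # 3) classification head: linear layer + softmax over emotion classes
        return softmax(clip_feat @ self.w_head)

model = SpatioTemporalSketch(dim=8, num_classes=3)
probs = model.forward(np.random.default_rng(1).standard_normal((5, 4, 8)))
```

Here mean pooling stands in for the paper's temporal aggregator purely for brevity; the actual architecture models short- and long-range relations with transformer layers rather than averaging.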
Citation
Zhang, L., Hong, X., Arandjelovic, O. & Zhao, G. 2022, 'Short and long range relation based spatio-temporal transformer for micro-expression recognition', IEEE Transactions on Affective Computing, vol. 13, no. 4, pp. 1973-1985. https://doi.org/10.1109/TAFFC.2022.3213509
Publication
IEEE Transactions on Affective Computing
Status
Peer reviewed
ISSN
1949-3045
Type
Journal article
Description
Funding: The authors would like to thank the China Scholarship Council – University of St Andrews Scholarships (No. 201908060250) for funding L. Zhang's PhD. This work is funded by the National Key Research and Development Project of China under Grant No. 2019YFB1312000, the National Natural Science Foundation of China under Grant No. 62076195, and the Fundamental Research Funds for the Central Universities under Grant No. AUGA5710011522.
Collections
Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.