Item metadata

dc.contributor.advisor: Arandelovic, Ognjen
dc.contributor.author: Zhang, Liangfei
dc.coverage.spatial: 143
dc.date.accessioned: 2023-11-03T12:19:41Z
dc.date.available: 2023-11-03T12:19:41Z
dc.date.issued: 2023-11-28
dc.identifier.uri: https://hdl.handle.net/10023/28628
dc.description.abstract: Emotional states exert a profound influence on individuals' overall well-being, impacting them both physically and psychologically. Accurate recognition and comprehension of human emotions represent a crucial area of scientific exploration. Facial expressions, vocal cues, body language, and physiological responses provide valuable insights into an individual's emotional state, with facial expressions being universally recognised as dependable indicators of emotions. This thesis centres around three vital research aspects concerning the automated inference of latent emotions from spontaneous facial micro-expressions, seeking to enhance and refine our understanding of this complex domain. Firstly, the research aims to detect and analyse activated Action Units (AUs) during the occurrence of micro-expressions. AUs correspond to facial muscle movements. Although previous studies have established links between AUs and conventional facial expressions, no such connections have been explored for micro-expressions. Therefore, this thesis develops computer vision techniques to automatically detect activated AUs in micro-expressions, bridging a gap in existing studies. Secondly, the study explores the evolution of micro-expression recognition techniques, ranging from early handcrafted feature-based approaches to modern deep-learning methods. These approaches have significantly contributed to the field of automatic emotion recognition. However, existing methods primarily focus on capturing local spatial relationships, neglecting global relationships between different facial regions. To address this limitation, a novel third-generation architecture is proposed. This architecture can concurrently capture both short- and long-range spatiotemporal relationships in micro-expression data, aiming to enhance the accuracy of automatic emotion recognition and improve our understanding of micro-expressions. Lastly, the thesis investigates the integration of multimodal signals to enhance emotion recognition accuracy. Depth information complements conventional RGB data by providing enhanced spatial features for analysis, while the integration of physiological signals with facial micro-expressions improves emotion discrimination. By incorporating multimodal data, the objective is to enhance machines' understanding of latent emotions and improve latent emotion recognition accuracy in spontaneous micro-expression analysis.
dc.language.iso: en
dc.relation: Zhang, L., & Arandjelović, O. (2021). Review of automatic microexpression recognition in the past decade. Machine Learning and Knowledge Extraction, 3(2), 414-434. https://doi.org/10.3390/make3020021 [http://hdl.handle.net/10023/23112 : Open Access version]
dc.relation: Zhang, L., Arandjelović, O., & Hong, X. (2021). Facial action unit detection with local key facial sub-region based multi-label classification for micro-expression analysis. In FME'21: Proceedings of the 1st Workshop on Facial Micro-expression: Advanced Techniques for Facial Expressions Generation and Spotting (pp. 11-18). ACM. https://doi.org/10.1145/3476100.3484462 [http://hdl.handle.net/10023/24308 : Open Access version]
dc.relation: Zhang, L., Hong, X., Arandjelović, O., & Zhao, G. (2022). Short and long range relation based spatio-temporal transformer for micro-expression recognition. IEEE Transactions on Affective Computing, Early Access, 1-13. https://doi.org/10.1109/TAFFC.2022.3213509 [http://hdl.handle.net/10023/26276 : Open Access version]
dc.relation: Zhang, L., Arandjelović, O., Dewar, S., Astell, A., Doherty, G., & Ellis, M. (2020). Quantification of advanced dementia patients' engagement in therapeutic sessions: an automatic video based approach using computer vision and machine learning. In 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC) (pp. 5785-5788). IEEE. https://doi.org/10.1109/EMBC44109.2020.9176632
dc.relation.uri: http://hdl.handle.net/10023/23112
dc.relation.uri: http://hdl.handle.net/10023/24308
dc.relation.uri: http://hdl.handle.net/10023/26276
dc.relation.uri: https://doi.org/10.1109/EMBC44109.2020.9176632
dc.subject: Latent emotion recognition
dc.subject: Spontaneous micro-expression analysis
dc.subject: Affective computing
dc.subject: Computer vision
dc.subject: Multi-modal learning
dc.title: Automatic inference of latent emotion from spontaneous facial micro-expressions
dc.type: Thesis
dc.contributor.sponsor: China Scholarship Council (CSC)
dc.contributor.sponsor: University of St Andrews
dc.type.qualificationlevel: Doctoral
dc.type.qualificationname: PhD Doctor of Philosophy
dc.publisher.institution: The University of St Andrews
dc.identifier.doi: https://doi.org/10.17630/sta/649
dc.identifier.grantnumber: 201908060250

