Item metadata

dc.contributor.author: Mansouri Benssassi, Esma
dc.contributor.author: Ye, Juan
dc.date.accessioned: 2020-02-11T13:30:03Z
dc.date.available: 2020-02-11T13:30:03Z
dc.date.issued: 2020-04-03
dc.identifier: 264038993
dc.identifier: 4cd7621a-4a04-48fd-ae52-41aff22a1092
dc.identifier: 85099878650
dc.identifier: 000667722801052
dc.identifier.citation: Mansouri Benssassi, E & Ye, J 2020, Synch-Graph: multisensory emotion recognition through neural synchrony via graph convolutional networks. In Proceedings of the AAAI Conference on Artificial Intelligence (AAAI-20), AAAI Press, pp. 1351-1358, Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20), New York, New York, United States, 7/02/20. https://doi.org/10.1609/aaai.v34i02.5491
dc.identifier.citation: conference
dc.identifier.isbn: 9781577358350
dc.identifier.issn: 2159-5399
dc.identifier.other: ORCID: /0000-0002-2838-6836/work/69029067
dc.identifier.uri: https://hdl.handle.net/10023/19444
dc.description.abstract: Human emotions are essentially multisensory: emotional states are conveyed through multiple modalities such as facial expression, body language, and non-verbal and verbal signals. Multimodal or multisensory learning is therefore crucial for recognising emotions and interpreting social signals. Existing multisensory emotion recognition approaches focus on extracting features from each modality separately, ignoring the importance of constant interaction and co-learning between modalities. In this paper, we present Synch-Graph, a novel bio-inspired approach based on neural synchrony in audio-visual multisensory integration in the brain. We model multisensory interaction using spiking neural networks (SNN) and explore the use of graph convolutional networks (GCN) to represent and learn neural synchrony patterns. We hypothesise that modelling interactions between modalities will improve the accuracy of emotion recognition. We have evaluated Synch-Graph on two state-of-the-art datasets, achieving overall accuracies of 98.3% and 96.82%, significantly higher than those of existing techniques.
dc.format.extent: 8
dc.format.extent: 1038578
dc.language.iso: eng
dc.publisher: AAAI Press
dc.relation.ispartof: Proceedings of the AAAI Conference on Artificial Intelligence (AAAI-20)
dc.relation.ispartofseries: Proceedings of the AAAI Conference on Artificial Intelligence
dc.subject: QA75 Electronic computers. Computer science
dc.subject: T Technology
dc.subject: 3rd-DAS
dc.subject: BDC
dc.subject: R2C
dc.subject: ~DC~
dc.subject.lcc: QA75
dc.subject.lcc: T
dc.title: Synch-Graph: multisensory emotion recognition through neural synchrony via graph convolutional networks
dc.type: Conference item
dc.contributor.institution: University of St Andrews. School of Computer Science
dc.identifier.doi: 10.1609/aaai.v34i02.5491
dc.identifier.url: https://aaai.org/Library/conferences-library.php
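The abstract describes learning neural synchrony patterns with graph convolutional networks. As a rough illustration only (not the paper's implementation), a single standard GCN propagation step over a synchrony-weighted adjacency matrix can be sketched as follows; the node count, feature sizes, and edge weights below are illustrative assumptions.

```python
import numpy as np

def gcn_layer(adj, features, weights):
    """One graph-convolution step: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W).

    adj      : (n, n) adjacency matrix; here edge weights stand in for
               pairwise synchrony strengths between neuron groups.
    features : (n, f) node feature matrix H.
    weights  : (f, d) learnable projection W.
    """
    a_hat = adj + np.eye(adj.shape[0])            # add self-loops
    deg = a_hat.sum(axis=1)                       # node degrees
    d_inv_sqrt = np.diag(deg ** -0.5)             # D^-1/2
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt      # symmetric normalisation
    return np.maximum(a_norm @ features @ weights, 0.0)  # ReLU

# Toy graph: 4 nodes (hypothetically, audio/visual neuron groups) with
# binary "synchrony" edges; in practice these would be learned/measured.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(0)
features = rng.random((4, 8))   # 8-dimensional node features
weights = rng.random((8, 4))    # project to 4 dimensions
out = gcn_layer(adj, features, weights)
print(out.shape)                # (4, 4)
```

In Synch-Graph these graph inputs are derived from spiking-neural-network activity rather than random features; the sketch shows only the generic GCN propagation rule.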

