St Andrews Research Repository


Synch-Graph: multisensory emotion recognition through neural synchrony via graph convolutional networks

View/Open
AAAI_MansouriBenssassiE.7493.pdf (1014 KB)
Date
03/04/2020
Author
Mansouri Benssassi, Esma
Ye, Juan
Keywords
QA75 Electronic computers. Computer science
T Technology
Abstract
Human emotions are essentially multisensory: emotional states are conveyed through multiple modalities such as facial expression, body language, and non-verbal and verbal signals. Multimodal or multisensory learning is therefore crucial for recognising emotions and interpreting social signals. Existing multisensory emotion recognition approaches focus on extracting features from each modality while ignoring the importance of constant interaction and co-learning between modalities. In this paper, we present Synch-Graph, a novel bio-inspired approach based on neural synchrony in audio-visual multisensory integration in the brain. We model multisensory interaction using spiking neural networks (SNN) and explore the use of Graph Convolutional Networks (GCN) to represent and learn neural synchrony patterns. We hypothesise that modelling interactions between modalities will improve the accuracy of emotion recognition. We have evaluated Synch-Graph on two state-of-the-art datasets and achieved overall accuracies of 98.3% and 96.82%, significantly higher than existing techniques.
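The abstract describes a pipeline in which spiking activity is converted into a synchrony graph that a GCN then classifies. The sketch below is a minimal illustration of that general idea, not the authors' implementation: it assumes binary spike trains as input, uses pairwise spike-train correlation as a stand-in synchrony measure, and implements a basic Kipf-and-Welling-style GCN layer in PyTorch. All function names, shapes, and hyperparameters here are hypothetical.

```python
import torch
import torch.nn as nn

def synchrony_adjacency(spikes: torch.Tensor) -> torch.Tensor:
    """spikes: (num_neurons, num_timesteps) binary spike trains.
    Returns an (N, N) adjacency whose edge weights are the positive part
    of the pairwise spike-train correlation, one simple proxy for synchrony."""
    x = spikes - spikes.mean(dim=1, keepdim=True)
    x = x / x.norm(dim=1, keepdim=True).clamp(min=1e-8)
    adj = (x @ x.t()).clamp(min=0.0)   # keep positively synchronised pairs only
    adj.fill_diagonal_(1.0)            # self-loops, as in standard GCNs
    return adj

class GCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(D^-1/2 A D^-1/2 H W)."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        d_inv_sqrt = adj.sum(dim=1).clamp(min=1e-8).pow(-0.5)
        a_norm = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
        return torch.relu(self.lin(a_norm @ h))

class SynchGraphClassifier(nn.Module):
    """Two GCN layers over the synchrony graph, mean-pooled to emotion logits."""
    def __init__(self, in_dim: int, hidden: int, num_classes: int):
        super().__init__()
        self.g1 = GCNLayer(in_dim, hidden)
        self.g2 = GCNLayer(hidden, hidden)
        self.out = nn.Linear(hidden, num_classes)

    def forward(self, node_feats: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        h = self.g2(self.g1(node_feats, adj), adj)
        return self.out(h.mean(dim=0))  # graph-level emotion prediction

# Toy usage: 32 "neurons" (e.g. 16 audio + 16 visual), 200 timesteps, 6 emotions.
spikes = (torch.rand(32, 200) < 0.1).float()
adj = synchrony_adjacency(spikes)
model = SynchGraphClassifier(in_dim=200, hidden=64, num_classes=6)
logits = model(spikes, adj)            # node features = raw spike trains
```

Correlation is only one of several possible coupling measures; the paper's own synchrony formulation should be consulted for the exact definition it uses.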
Citation
Mansouri Benssassi, E & Ye, J 2020, 'Synch-Graph: multisensory emotion recognition through neural synchrony via graph convolutional networks', in Proceedings of the AAAI Conference on Artificial Intelligence (AAAI-20), AAAI Press, pp. 1351-1358, Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20), New York, United States, 7/02/20. https://doi.org/10.1609/aaai.v34i02.5491
 
 
Publication
Proceedings of the AAAI Conference on Artificial Intelligence (AAAI-20)
DOI
https://doi.org/10.1609/aaai.v34i02.5491
ISSN
2159-5399
Type
Conference item
Rights
Copyright © 2020 Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved. This work has been made available online in accordance with publisher policies or with permission. Permission for further reuse of this content should be sought from the publisher or the rights holder. This is the author-created accepted manuscript following peer review and may differ slightly from the final published version. The final published version of this work is available at https://doi.org/10.1609/aaai.v34i02.5491.
Collections
  • University of St Andrews Research
URL
https://aaai.org/Library/conferences-library.php
URI
http://hdl.handle.net/10023/19444

Items in the St Andrews Research Repository are protected by copyright, with all rights reserved, unless otherwise indicated.
