AnxietyDecoder: An EEG-based anxiety predictor using a 3-D convolutional neural network

Yi Wang, Brendan McCane, Neil McNaughton, Zhiyi Huang, Shabah Shadli, Phoebe Neo

Research output: Book chapter/Published conference paper › Conference paper › peer-review

9 Citations (Scopus)

Abstract

In this paper, we propose and implement an EEG-based three-dimensional Convolutional Neural Network architecture, 'AnxietyDecoder', to predict anxious personality and decode its potential biomarkers from participants' EEG. Since Goal-Conflict-Specific Rhythmicity (GCSR) in the EEG is a sign that an anxiety-related system is at work, we first propose a two-dimensional Conflict-focused CNN (2-D CNN). It simulates the GCSR extraction process, but with the advantages of automatic frequency-band selection and optimized functional-contrast calculation, and thus provides more comprehensive trait-anxiety predictions. Then, to generate more targeted hierarchical features from the local spatio-temporal scale up to the global scale, we propose a three-dimensional Conflict-focused CNN (3-D CNN), which simultaneously integrates information in the temporal and brain-topology-related spatial dimensions. In addition, we embed Layer-wise Relevance Propagation (LRP) into our model to reveal the essential brain areas that are correlated with anxious personality. The experimental results show that the percentage variance accounted for by our three-dimensional Conflict-focused CNN is 33%, almost four times higher than the 7% achieved by the previous theoretically derived GCSR contrast. It also outperforms the 2-D model (26%), and the difference between the 3-D and 2-D models is statistically significant (t(4) = 5.4962, p = 0.0053). Moreover, the reverse-engineering results provide an interpretable way to understand the model's prediction decisions and participants' anxious personality. Our proposed AnxietyDecoder not only sets a new benchmark for EEG-based anxiety prediction but also reveals the essential EEG components that contribute to its decisions, and thus sheds light on anxiety biomarker research.
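The full architecture is specified in the paper itself; as a rough illustration of the general idea in the abstract, the sketch below builds a small 3-D convolutional network over EEG epochs arranged as a time-by-electrode-grid volume, so that the kernels span the temporal and brain-topology-related spatial dimensions simultaneously. The choice of PyTorch, the 9x9 electrode grid, the number of time samples, and all layer and kernel sizes are illustrative assumptions, not the configuration used in AnxietyDecoder.

# Minimal sketch of a 3-D CNN for EEG epochs, assuming input tensors of shape
# (batch, 1, time, grid_h, grid_w), where grid_h x grid_w is a 2-D layout of
# electrode positions. Layer counts and kernel sizes are illustrative only.
import torch
import torch.nn as nn

class EEG3DCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            # local spatio-temporal features: kernels cover a few time samples
            # and a 3x3 neighbourhood of electrodes at once
            nn.Conv3d(1, 16, kernel_size=(7, 3, 3), padding=(3, 1, 1)),
            nn.ReLU(),
            nn.MaxPool3d(kernel_size=(2, 1, 1)),
            # deeper layer summarises towards more global features
            nn.Conv3d(16, 32, kernel_size=(5, 3, 3), padding=(2, 1, 1)),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d((4, 3, 3)),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 3 * 3, 64),
            nn.ReLU(),
            nn.Linear(64, 1),  # single continuous trait-anxiety score
        )

    def forward(self, x):
        return self.regressor(self.features(x))

# Example: a batch of 8 epochs, 128 time samples, 9x9 electrode grid.
model = EEG3DCNN()
scores = model(torch.randn(8, 1, 128, 9, 9))
print(scores.shape)  # torch.Size([8, 1])

Mapping the electrodes onto a 2-D grid is what lets the 3-D kernels mix neighbouring channels and time samples in a single operation, which is the property the abstract contrasts with the purely temporal/frequency view of the 2-D model.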

Original language: English
Title of host publication: 2019 International Joint Conference on Neural Networks (IJCNN)
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Number of pages: 8
ISBN (Electronic): 9781728119854
ISBN (Print): 9781728119861 (Print on demand)
DOIs
Publication status: Published - 05 Jul 2019
Event: The International Joint Conference on Neural Networks: IJCNN 2019 - InterContinental Budapest Hotel, Budapest, Hungary
Duration: 14 Jul 2019 - 19 Jul 2019
https://ieeexplore-ieee-org.ezproxy.csu.edu.au/xpl/conhome/8840768/proceeding (Proceedings)
https://www.ijcnn.org/assets/docs/ijcnn2019-program-Jun29-largefont-v2%281%29.pdf (Program)
https://web.archive.org/web/20190725190028/https://www.ijcnn.org/ (Conference website on Wayback Machine)

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Publisher: IEEE
Volume: 2019-July
ISSN (Print): 2161-4393
ISSN (Electronic): 2161-4407

Conference

Conference: The International Joint Conference on Neural Networks
Country/Territory: Hungary
City: Budapest
Period: 14/07/19 - 19/07/19
Other: The 2019 International Joint Conference on Neural Networks (IJCNN) will be held at the InterContinental Budapest Hotel in Budapest, Hungary on July 14-19, 2019. The conference is organized by the International Neural Network Society (INNS) in cooperation with the IEEE Computational Intelligence Society, and is the premier international meeting for researchers and other professionals in neural networks and related areas. It will feature invited plenary talks by world-renowned speakers in the areas of neural network theory and applications, computational neuroscience, robotics, and distributed intelligence. In addition to regular technical sessions with oral and poster presentations, the conference program will include special sessions, competitions, tutorials and workshops on topics of current interest.
