Research Article | Open Access

Transformer-based ensemble deep learning model for EEG-based emotion recognition

Xiaopeng Si1,2,§,*, Dong Huang1,2,§, Yulin Sun1,2,§, Shudi Huang1,2, He Huang1,2, Dong Ming1,2,* (*Corresponding authors)
1 Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin 300072, China
2 Tianjin Key Laboratory of Brain Science and Neural Engineering, Tianjin University, Tianjin 300072, China

§ These authors contributed equally to this work.


Abstract

Emotion recognition is one of the most important research directions in the field of brain–computer interfaces (BCIs). However, EEG-based emotion recognition faces two main difficulties: processing the electroencephalogram (EEG) signals themselves is challenging, and the performance of existing classification models is limited. To address these issues, the 2022 World Robot Contest successfully held an affective BCI competition, promoting innovation in EEG-based emotion recognition. In this paper, we propose the Transformer-based ensemble (TBEM) deep learning model. TBEM comprises two models: a pure convolutional neural network (CNN) model and a cascaded CNN-Transformer hybrid model. The proposed model won the final championship of the affective BCI competition at the 2022 World Robot Contest, demonstrating the effectiveness of the proposed TBEM deep learning model for EEG-based emotion recognition.
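The abstract describes an ensemble of two classifiers: a pure CNN and a cascaded CNN-Transformer hybrid. As a minimal sketch of how such an ensemble can fuse its members' outputs, the snippet below averages the two models' class probabilities and takes the argmax. The function names, logit values, number of emotion classes, and the 50/50 weighting are illustrative assumptions, not the authors' published implementation.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of raw logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def ensemble_predict(cnn_logits, hybrid_logits, w_cnn=0.5, w_hybrid=0.5):
    """Fuse two models' outputs by weighted-averaging their class
    probabilities (soft voting) and return the predicted class index.
    The weights here are illustrative, not from the paper."""
    p_cnn = softmax(cnn_logits)
    p_hybrid = softmax(hybrid_logits)
    fused = [w_cnn * a + w_hybrid * b for a, b in zip(p_cnn, p_hybrid)]
    return max(range(len(fused)), key=fused.__getitem__)

# Hypothetical three-class example (e.g., negative/neutral/positive):
pred = ensemble_predict([0.2, 1.5, -0.3], [0.1, 2.0, 0.0])
print(pred)  # class index 1
```

Averaging probabilities rather than hard labels (soft voting) lets a confident model outweigh an uncertain one, which is one common reason ensembles of heterogeneous architectures outperform their individual members.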

Brain Science Advances
Pages 210-223
Cite this article:
Si X, Huang D, Sun Y, et al. Transformer-based ensemble deep learning model for EEG-based emotion recognition. Brain Science Advances, 2023, 9(3): 210-223. https://doi.org/10.26599/BSA.2023.9050016


Received: 29 March 2023
Revised: 08 May 2023
Accepted: 24 May 2023
Published: 05 September 2023
© The authors 2023.

This article is published with open access at journals.sagepub.com/home/BSA

Creative Commons Non Commercial CC BY-NC: This article is distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 License (http://www.creativecommons.org/licenses/by-nc/4.0/) which permits non-commercial use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/open-access-at-sage).
