
Comparison of cross-subject EEG emotion recognition algorithms in the BCI Controlled Robot Contest in World Robot Contest 2021

Chao Tang§, Yunhuan Li§, Badong Chen (corresponding author)
Institute of Artificial Intelligence and Robotics, Xi'an Jiaotong University, Xi'an 710049, China

§ These authors contributed equally to this work.

Abstract

Electroencephalogram (EEG) data reflect brain activity and can depict various emotional states. There has been increasing interest in EEG emotion recognition in brain-computer interface (BCI) systems. In the World Robot Contest (WRC), the BCI Controlled Robot Contest successfully staged an emotion recognition technology competition, in which three types of emotions (happy, sad, and neutral) were modeled from EEG signals. In this study, five methods employed by different teams are compared. The results reveal that classical machine learning approaches and deep learning methods perform similarly in offline recognition, whereas deep learning methods perform better in online cross-subject decoding.
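A typical classical pipeline of the kind compared here extracts band-wise differential entropy (DE) features from each EEG channel and feeds them to a support vector machine. The following is a minimal illustrative sketch, not the contest code: the sampling rate, channel count, band definitions, and the synthetic data are all assumptions for demonstration only.

```python
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
FS = 250  # sampling rate in Hz (assumed)
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def de_features(trial):
    """Approximate differential entropy per channel and band.

    For a band-limited Gaussian signal, DE = 0.5 * log(2*pi*e*variance);
    here the variance is estimated from the band power of the Welch PSD.
    `trial` has shape (channels, samples).
    """
    feats = []
    for ch in trial:
        f, psd = welch(ch, fs=FS, nperseg=FS)
        df = f[1] - f[0]
        for lo, hi in BANDS.values():
            band_power = psd[(f >= lo) & (f < hi)].sum() * df
            feats.append(0.5 * np.log(2 * np.pi * np.e * band_power + 1e-12))
    return np.array(feats)

# Synthetic stand-in data: 60 trials, 8 channels, 2 s each, 3 emotion labels
# (label coding 0 = neutral, 1 = happy, 2 = sad is assumed for illustration)
X = np.stack([de_features(rng.standard_normal((8, 2 * FS))) for _ in range(60)])
y = rng.integers(0, 3, size=60)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X[:45], y[:45])
print("features per trial:", X.shape[1])  # 8 channels x 5 bands = 40
print("predictions:", clf.predict(X[45:50]))
```

Cross-subject decoding would train such a model on data from some subjects and evaluate on held-out subjects, which is where the abstract reports deep models gaining an advantage.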

Keywords: electroencephalography, emotion recognition, brain-computer interface, online decoding, cross-subject


Publication history

Received: 18 April 2022
Revised: 20 May 2022
Accepted: 30 May 2022
Published: 29 June 2022
Issue date: June 2022

Copyright

© The authors 2022.

Acknowledgements

This work was supported in part by the National Natural Science Foundation of China (Grant Nos. U21A20485, 61976175).

Rights and permissions

This article is published with open access at journals.sagepub.com/home/BSA

Creative Commons Non Commercial CC BY-NC: This article is distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 License (http://www.creativecommons.org/licenses/by-nc/4.0/), which permits non-commercial use, reproduction, and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/open-access-at-sage).
