Walking, as a unique biometric, conveys important information for emotion recognition: individuals in different emotional states exhibit distinct walking patterns. To this end, this paper proposes a novel approach to recognizing emotion during walking using electroencephalogram (EEG) and inertial signals. Accurate recognition is achieved by training the system in an end-to-end deep learning fashion and by exploiting multi-modal fusion. Subjects wear a virtual reality head-mounted display (VR-HMD) to immerse themselves in strong emotions while walking; the VR environment provides a highly realistic, immersive experience that plays an important role in evoking and modulating emotions. The multi-modal signals acquired from the EEG and inertial sensors are separately represented as virtual emotion images by the discrete wavelet transform (DWT), and these images serve as input to an attention-based convolutional neural network (CNN) fusion model. The network structure is simple and lightweight, yet integrates a channel attention mechanism to extract and enhance features. To further improve recognition performance, the proposed decision fusion algorithm combines the CRITIC method with a majority voting strategy to determine the weights that shape the final decision. An investigation into the effect of different mother wavelet types and wavelet decomposition levels on model performance indicates that the 2.2-order reverse biorthogonal (rbio2.2) wavelet with two-level decomposition performs best. Comparative experiments show that the proposed method outperforms existing state-of-the-art methods, achieving an accuracy of 98.73%.
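To make the DWT step concrete, the sketch below shows one way a multi-channel signal segment could be turned into a 2-D "virtual emotion image" using PyWavelets with the rbio2.2 wavelet at two decomposition levels, as the abstract describes. The function name signal_to_emotion_image and the row-stacking layout of coefficients are hypothetical illustrations, not the authors' construction.

```python
import numpy as np
import pywt

def signal_to_emotion_image(channels: np.ndarray, wavelet: str = "rbio2.2",
                            level: int = 2) -> np.ndarray:
    """Stack per-channel DWT coefficients into a 2-D 'virtual image'.

    channels: array of shape (n_channels, n_samples), e.g. EEG channels
    or accelerometer/gyroscope axes from one walking segment.
    """
    rows = []
    for sig in channels:
        # Two-level decomposition yields [cA2, cD2, cD1].
        coeffs = pywt.wavedec(sig, wavelet, level=level)
        rows.append(np.concatenate(coeffs))
    # Pad rows to equal length so they stack into a rectangular image.
    width = max(len(r) for r in rows)
    image = np.zeros((len(rows), width))
    for i, r in enumerate(rows):
        image[i, :len(r)] = r
    return image

# Example: 8 inertial channels, 256 samples each.
img = signal_to_emotion_image(np.random.randn(8, 256))
print(img.shape)  # one image per segment, fed to the CNN
```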
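The channel attention mechanism mentioned in the abstract is commonly realized as a squeeze-and-excitation style block. The PyTorch sketch below shows one such reading; the reduction ratio and layer sizes are assumptions, not the paper's reported architecture.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention for a lightweight CNN."""

    def __init__(self, channels: int, reduction: int = 8):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # squeeze: global spatial context
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),  # per-channel gate in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # re-weight feature maps to enhance informative channels
```

Such a block adds only two small linear layers per stage, which is consistent with the abstract's emphasis on a simple, lightweight network.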
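The decision fusion step weights each modality's classifier by a CRITIC-derived coefficient before voting. The NumPy sketch below shows the standard CRITIC computation (contrast intensity times inter-criteria conflict) followed by a weighted vote; the exact combination rule used in the paper is an assumption here.

```python
import numpy as np

def critic_weights(scores: np.ndarray) -> np.ndarray:
    """CRITIC objective weighting.

    scores: (n_samples, n_classifiers) matrix of per-classifier
    confidence or performance values.
    """
    # Min-max normalize each classifier's column.
    lo, hi = scores.min(axis=0), scores.max(axis=0)
    norm = (scores - lo) / np.where(hi > lo, hi - lo, 1.0)
    std = norm.std(axis=0, ddof=1)           # contrast intensity
    corr = np.corrcoef(norm, rowvar=False)   # inter-criteria correlation
    conflict = (1.0 - corr).sum(axis=0)      # conflict with other criteria
    info = std * conflict                    # information content C_j
    return info / info.sum()

def weighted_vote(preds: np.ndarray, weights: np.ndarray, n_classes: int) -> int:
    """Majority vote where each classifier's ballot carries its CRITIC weight."""
    tally = np.zeros(n_classes)
    for p, w in zip(preds, weights):
        tally[int(p)] += w
    return int(tally.argmax())
```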
The articles published in this open access journal are distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).