Citation: ZHANG Han, ZHENG Xiaomin, DIAO Xiaolei, CAI Dongli. Attention Routing-Based Capsule Networks for Emotion Recognition on Multi-Channel EEG[J]. Journal of South China Normal University (Natural Science Edition), 2023, 55(5): 103-110. DOI: 10.6054/j.jscnun.2023069
In recent years, deep neural networks have been applied to EEG emotion recognition and have demonstrated superior performance compared to traditional algorithms. However, convolutional neural networks struggle to recognize spatial relationships between objects and to identify features after object rotation; they also lose valuable information through pooling operations and fail to describe the inherent connections among different EEG signal channels. To address these shortcomings, a multi-channel EEG emotion recognition model based on the Attention Routing Capsule Network (AR-CapsNet) is proposed, which introduces attention routing and capsule activation into the EEG emotion recognition model. Compared to traditional capsule-network EEG emotion models, the AR-CapsNet model maintains spatial information while performing forward propagation quickly. Experiments on the DEAP dataset compare the emotion recognition accuracy of the AR-CapsNet model with that of machine learning models and other deep learning models (dynamic graph convolutional neural networks, 4D convolutional recurrent neural networks, traditional capsule networks, etc.). A comparison was also conducted with the multi-level features guided capsule network for multi-channel EEG-based emotion recognition in terms of parameter count and training time. Experimental results indicate that: (1) the AR-CapsNet model achieves higher recognition accuracy than the other models, with average recognition accuracies of 99.46%, 98.45% and 99.54% for valence, arousal and dominance, respectively; (2) compared with the multi-level feature-guided capsule network, currently a high-performing capsule-network model for EEG-based emotion recognition, the AR-CapsNet model uses a lower total parameter count, thereby reducing the complexity of EEG signal emotion recognition.
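The key idea the abstract names — replacing the iterative dynamic routing of a traditional capsule network with a single-pass attention routing step — can be sketched as follows. This is a minimal NumPy illustration of attention-style routing between a lower and an upper capsule layer, not the authors' AR-CapsNet implementation: the agreement measure (dot product against the mean prediction) and all array shapes are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_routing(u_hat):
    """Single-pass attention routing between capsule layers (sketch).

    u_hat: (num_lower, num_upper, dim) prediction vectors, i.e. each lower
           capsule's vote for each upper capsule.
    Returns upper-capsule output vectors of shape (num_upper, dim).
    """
    # Attention logits: agreement of each vote with the mean vote
    # (illustrative choice; one forward pass, no routing iterations).
    mean_vote = u_hat.mean(axis=0, keepdims=True)      # (1, num_upper, dim)
    logits = (u_hat * mean_vote).sum(axis=-1)          # (num_lower, num_upper)
    c = softmax(logits, axis=1)                        # routing coefficients
    s = (c[..., None] * u_hat).sum(axis=0)             # (num_upper, dim)
    # Squash non-linearity: keeps each output vector's length below 1
    # so it can be read as an activation probability.
    norm2 = (s ** 2).sum(axis=-1, keepdims=True)
    return (norm2 / (1.0 + norm2)) * s / np.sqrt(norm2 + 1e-9)
```

Because the coefficients come from one softmax over attention scores rather than from several routing iterations, forward propagation stays fast while the vector-valued capsules still carry the spatial information that pooling would discard.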