EEG Emotion Recognition Based on Hybrid Neural Networks

CAI Dongli, ZHONG Qinghua, ZHU Yongsheng, ZHANG Han

Citation: CAI Dongli, ZHONG Qinghua, ZHU Yongsheng, ZHANG Han. EEG Emotion Recognition Based on Hybrid Neural Networks[J]. Journal of South China Normal University (Natural Science Edition), 2021, 53(1): 109-118. DOI: 10.6054/j.jscnun.2021017

Funding: 

National Natural Science Foundation of China (61871433)

Natural Science Foundation of Guangdong Province (2019A1515011940)

Science and Technology Program of Guangdong Province (2017B030308009)

Science and Technology Program of Guangzhou (202002030353)

    Corresponding author:

    ZHONG Qinghua, Email: zhongqinghua@m.scnu.edu.cn

  • CLC number: TP391


  • Abstract: A hybrid neural network (3DCNN-BLSTM), combining a 3-Dimensional Convolutional Neural Network (3D-CNN) with a Bidirectional Long Short-Term Memory network (BLSTM), is proposed to preserve the spatial information of the EEG while fully exploiting its temporal information. Emotion recognition experiments on the DEAP and SEED datasets were carried out to evaluate the classification performance of the model. The experimental results show that the 3DCNN-BLSTM model can effectively learn the correlation among EEG channels together with the temporal information and improve emotion classification performance: in the binary classification experiments on the DEAP dataset, the average accuracies of emotion recognition for arousal and valence are 93.56% and 93.21%, respectively; in the four-class experiments on the DEAP dataset, the average accuracy is 90.97%; and in the three-class experiments on the SEED dataset, the average accuracy is 98.90%.
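
    To make the architecture described in the abstract concrete, the following is a minimal sketch of a 3D-CNN feeding a bidirectional LSTM, written with TensorFlow/Keras. The input shape (a short sequence of 9×9 electrode grids with one slice per frequency band), the layer sizes and the training settings are illustrative assumptions, not the exact configuration reported in the paper.

```python
# Minimal, illustrative sketch of a 3D-CNN + BLSTM hybrid classifier.
# Input shape, layer sizes and hyperparameters are assumptions for
# demonstration only, not the configuration reported in the paper.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_3dcnn_blstm(time_steps=6, height=9, width=9, bands=4, n_classes=2):
    # Each sample: a short sequence of 3D EEG feature volumes
    # (height x width electrode grid, one slice per frequency band).
    inputs = layers.Input(shape=(time_steps, height, width, bands))

    # 3D convolutions over (time, height, width) learn spatio-temporal filters.
    x = layers.Conv3D(32, kernel_size=(3, 3, 3), padding='same', activation='relu')(inputs)
    x = layers.Conv3D(64, kernel_size=(3, 3, 3), padding='same', activation='relu')(x)
    x = layers.MaxPooling3D(pool_size=(1, 2, 2))(x)

    # Flatten the spatial dimensions of each time step but keep the time
    # axis, so the recurrent part can model temporal dependencies.
    x = layers.Reshape((time_steps, -1))(x)

    # Bidirectional LSTM reads the sequence forwards and backwards.
    x = layers.Bidirectional(layers.LSTM(64))(x)
    x = layers.Dropout(0.5)(x)
    outputs = layers.Dense(n_classes, activation='softmax')(x)

    model = models.Model(inputs, outputs)
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    return model

model = build_3dcnn_blstm()
model.summary()
```

    The key design point is that the 3D convolution leaves the time axis intact, so the recurrent layers can still model dependencies across the segment after the spatial features have been extracted.
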
  • Figure 1.   The framework of the hybrid neural network

    Figure 2.   The schematic diagram of the EEG signal frequency-division operation
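
    As a rough illustration of the frequency-division step in Figure 2, the sketch below splits one EEG channel into conventional frequency bands with Butterworth band-pass filters (SciPy). The band boundaries, the filter order and the 128 Hz sampling rate (the rate of the preprocessed DEAP recordings) are common choices assumed here, not necessarily the paper's exact settings.

```python
# Illustrative band decomposition of an EEG signal with Butterworth
# band-pass filters (SciPy). Band edges and filter order are common
# conventions assumed here, not necessarily those used in the paper.
import numpy as np
from scipy.signal import butter, filtfilt

# Typical EEG band boundaries in Hz (assumed for illustration).
BANDS = {'theta': (4, 8), 'alpha': (8, 13), 'beta': (13, 30), 'gamma': (30, 45)}

def split_into_bands(signal, fs=128, order=4):
    """Return a dict mapping band name -> band-pass filtered copy of `signal`."""
    out = {}
    for name, (lo, hi) in BANDS.items():
        b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype='band')
        out[name] = filtfilt(b, a, signal)   # zero-phase filtering
    return out

# Example: one second of synthetic data at the DEAP sampling rate (128 Hz).
x = np.random.randn(128)
bands = split_into_bands(x)
print({k: v.shape for k, v in bands.items()})
```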

    Figure 3.   The schematic diagram of the 2D EEG feature matrix in the DEAP dataset[18]

    Note: the yellow electrodes mark the recording sites used in the DEAP dataset; the white electrodes are unused sites.

    Figure 4.   The schematic diagram of the 2D EEG feature matrix in the SEED dataset[18]
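
    The construction in Figures 3 and 4 places each channel's feature value at the grid cell that mirrors its scalp position and leaves unused cells at zero, in the spirit of reference [18]. The sketch below shows the idea on a 9×9 grid; the channel coordinates listed are hypothetical placeholders for the layouts drawn in the figures.

```python
# Illustrative construction of a 2D EEG feature matrix: each channel's
# scalar feature is placed at a grid cell that roughly mirrors its scalp
# position; unused cells stay zero. Grid size and coordinates below are
# hypothetical, not the exact layout of Figures 3 and 4.
import numpy as np

# Hypothetical (row, col) positions for a few DEAP channels on a 9 x 9 grid.
CHANNEL_POS = {
    'Fp1': (0, 3), 'Fp2': (0, 5),
    'F3':  (2, 2), 'Fz':  (2, 4), 'F4': (2, 6),
    'C3':  (4, 2), 'Cz':  (4, 4), 'C4': (4, 6),
    'P3':  (6, 2), 'Pz':  (6, 4), 'P4': (6, 6),
    'O1':  (8, 3), 'O2':  (8, 5),
}

def to_2d_matrix(features, grid=(9, 9)):
    """Map a dict {channel_name: feature_value} onto a sparse 2D grid."""
    mat = np.zeros(grid)
    for ch, value in features.items():
        r, c = CHANNEL_POS[ch]
        mat[r, c] = value
    return mat

# Example: random feature values for the channels listed above.
feats = {ch: np.random.rand() for ch in CHANNEL_POS}
print(to_2d_matrix(feats))
```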

    Table 1   The average accuracy of emotion recognition for six features (%)

    Feature    Binary (arousal)    Binary (valence)    Four-class
    PSD        77.36               76.09               63.71
    SvdEn      87.91               86.62               84.31
    PeEn       88.07               87.93               83.97
    DE         88.62               87.60               84.13
    SampEn     90.60               89.89               86.80
    ApEn       93.56               93.21               90.97
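
    Since ApEn (approximate entropy) scores highest in Table 1, a compact reference implementation is sketched below. The embedding dimension m = 2 and tolerance r = 0.2 times the signal's standard deviation are conventional defaults assumed for illustration; the paper may use different parameters.

```python
# Compact implementation of approximate entropy (ApEn), the feature that
# scores highest in Table 1. Embedding dimension m = 2 and tolerance
# r = 0.2 * std are conventional defaults assumed here.
import numpy as np

def approximate_entropy(x, m=2, r=None):
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * np.std(x)

    def phi(m):
        # All overlapping subsequences of length m.
        emb = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of subsequences.
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        # Fraction of subsequences within tolerance r (self-matches included).
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

# Example: ApEn of one second of synthetic EEG-like data at 128 Hz.
print(approximate_entropy(np.random.randn(128)))
```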

    Table 2   The average accuracy of emotion recognition with five models (%)

    Model          Binary (arousal)    Binary (valence)    Four-class
    2D-CNN         74.49               74.21               60.25
    LSTM           91.54               90.91               87.99
    3D-CNN         92.74               92.22               89.10
    BLSTM          92.37               91.72               89.12
    3DCNN-BLSTM    93.56               93.21               90.97

    Table 3   The average accuracy of 8 methods on the arousal and valence dimensions

    Method                  Arousal/%    Valence/%    Source
    Gaussian naive Bayes    62.00        57.60        Ref. [5]
    Bayes classifier        66.40        66.60        Ref. [19]
    CapsNet                 68.28        66.73        Ref. [20]
    EMD+SVM                 69.10        71.99        Ref. [21]
    SAE+LSTM                74.38        81.10        Ref. [8]
    3D-CNN                  88.49        87.44        Ref. [6]
    Multi-Column CNN        90.65        90.01        Ref. [22]
    3DCNN-BLSTM             93.56        93.21        This paper

    Table 4   The average accuracy of 7 methods in the four-class experiments

    Method         Average accuracy/%    Source
    DWT+SVM        45.40                 Ref. [23]
    GLCM+KNN       46.25                 Ref. [24]
    SOM            55.15                 Ref. [25]
    TFBSS+SVM      71.62                 Ref. [26]
    QSE+SVM        72.50                 Ref. [27]
    CNN            73.10                 Ref. [1]
    3DCNN-BLSTM    90.97                 This paper

    Table 5   The average accuracy of 5 methods in the three-class experiments

    Method                   Average accuracy/%    Source
    DBN                      86.08                 Ref. [14]
    HCNN                     88.20                 Ref. [28]
    Weighted+Ensemble CNN    93.12                 Ref. [29]
    1D-CNN+RF                94.70                 Ref. [30]
    3DCNN-BLSTM              98.90                 This paper
  • [1] MEI H, XU X M. EEG based emotion classification using convolutional neural network[C]//Proceedings of the IEEE International Conference on Security, Pattern Analysis, and Cybernetics. Shenzhen: IEEE, 2017: 130-135.

    [2] TRIPATHI S, ACHARYA S, SHARMA R D, et al. Using deep and convolutional neural networks for accurate emotion classification on DEAP dataset[C]//Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence. Palo Alto, CA: AAAI Press, 2017: 4746-4752.

    [3] MOON S, JANG S, LEE J, et al. Convolutional neural network approach for EEG-based emotion recognition using brain connectivity and its spatial information[C]//Proceedings of 2018 IEEE International Conference on Acoustics, Speech and Signal Processing. Calgary, Canada: IEEE, 2018: 2556-2560.

    [4] KWON Y H, SHIN S B, KIM S, et al. Electroencephalography based fusion two-dimensional (2D)-convolution neural networks (CNN) model for emotion recognition system[J]. Sensors, 2018, 18(5): 1383/1-13. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC5982398/

    [5] KOELSTRA S, MUHL C, SOLEYMANI M, et al. DEAP: a database for emotion analysis using physiological signals[J]. IEEE Transactions on Affective Computing, 2012, 3(1): 18-31. doi: 10.1109/T-AFFC.2011.15

    [6] SALAMA E S, ELKHORIBI R A, SHOMAN M, et al. EEG-based emotion recognition using 3D convolutional neural networks[J]. International Journal of Advanced Computer Science and Applications, 2018, 9(8): 329-337.

    [7] ALHAGRY S, FAHMY A A, ELKHORIBI R A, et al. Emotion recognition based on EEG using LSTM recurrent neural network[J]. International Journal of Advanced Computer Science and Applications, 2017, 8(10): 355-358. http://www.researchgate.net/publication/320802497_Emotion_Recognition_based_on_EEG_using_LSTM_Recurrent_Neural_Network

    [8] XING X F, LI Z Q, XU T Y, et al. SAE+LSTM: a new framework for emotion recognition from multi-channel EEG[J]. Frontiers in Neurorobotics, 2019, 13: 37/1-14. http://www.researchgate.net/publication/333730290_SAELSTM_A_New_Framework_for_Emotion_Recognition_From_Multi-Channel_EEG

    [9] WANG Y X, QIU S, LI J P, et al. EEG-based emotion recognition with similarity learning network[C]//Proceedings of 2019 41st Annual International Conference of the IEEE Engineering in Medicine & Biology Society. Berlin: IEEE, 2019: 1209-1212.

    [10] YANG Y L, WU Q F, QIU M, et al. Emotion recognition from multi-channel EEG through parallel convolutional recurrent neural network[C]//Proceedings of 2018 International Joint Conference on Neural Networks. Rio de Janeiro, Brazil: IEEE, 2018: 1-7.

    [11] SHEN F Y, DAI G J, LIN G, et al. EEG-based emotion recognition using 4D convolutional recurrent neural network[J]. Cognitive Neurodynamics, 2020, 14(6): 815-828. doi: 10.1007/s11571-020-09634-1

    [12] SHEYKHIVAND S, MOUSAVI Z, REZAII T Y, et al. Recognizing emotions evoked by music using CNN-LSTM networks on EEG signals[J]. IEEE Access, 2020, 8: 139332-139345. doi: 10.1109/ACCESS.2020.3011882

    [13] YIN Z, ZHAO M Y, WANG Y X, et al. Recognition of emotions using multimodal physiological signals and an ensemble deep learning model[J]. Computer Methods and Programs in Biomedicine, 2017, 140: 93-110. doi: 10.1016/j.cmpb.2016.12.005

    [14] ZHENG W L, LU B L. Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks[J]. IEEE Transactions on Autonomous Mental Development, 2015, 7(3): 162-175. doi: 10.1109/TAMD.2015.2431497

    [15] HOCHREITER S, SCHMIDHUBER J. Long short-term memory[J]. Neural Computation, 1997, 9(8): 1735-1780. doi: 10.1162/neco.1997.9.8.1735

    [16] GRAVES A, SCHMIDHUBER J. Framewise phoneme classification with bidirectional LSTM and other neural network architectures[J]. Neural Networks, 2005, 18(5/6): 602-610. http://www.sciencedirect.com/science/article/pii/S0893608005001206

    [17] MATURANA D, SCHERER S. VoxNet: a 3D convolutional neural network for real-time object recognition[C]//Proceedings of 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Hamburg, Germany: IEEE, 2015: 922-928.

    [18] YANG Y L, WU Q F, FU Y, et al. Continuous convolutional neural network with 3D input for EEG-based emotion recognition[C]//International Conference on Neural Information Processing. Cham, Switzerland: Springer, 2018: 433-443.

    [19] CHUNG S Y, YOON H J. Affective classification using Bayesian classifier and supervised learning[C]//Proceedings of 2012 12th International Conference on Control, Automation and Systems. JeJu Island, South Korea: IEEE, 2012: 1768-1771.

    [20] CHAO H, DONG L, LIU Y L, et al. Emotion recognition from multiband EEG signals using CapsNet[J]. Sensors, 2019, 19(9): 2212/1-16. http://www.ncbi.nlm.nih.gov/pubmed/31086110

    [21] ZHUANG N, ZENG Y, TONG L, et al. Emotion recognition from EEG signals using multidimensional information in EMD domain[J]. BioMed Research International, 2017, 2017: 8317357/1-9. http://www.ncbi.nlm.nih.gov/pubmed/28900626

    [22] YANG H, HAN J, MIN K, et al. A multi-column CNN model for emotion recognition from EEG signals[J]. Sensors, 2019, 19(21): 4736/1-12. http://www.ncbi.nlm.nih.gov/pubmed/31683608

    [23] ZUBAIR M, YOON C. EEG based classification of human emotions using discrete wavelet transform[M]//IT Convergence and Security 2017. Singapore: Springer, 2018: 21-28.

    [24] JADHAV N, MANTHALKAR R, JOSHI Y. Electroencephalography based emotion recognition using gray-level co-occurrence matrix features[C]//RAMAN B, KUMAR S, ROY P, et al, ed. Proceedings of the International Conference on Computer Vision and Image Processing. Singapore: Springer, 2016: 335-343.

    [25] HATAMIKIA S, NASRABADI A M. Recognition of emotional states induced by music videos based on nonlinear feature extraction and SOM classification[C]//Proceedings of 2014 21st Iranian Conference on Biomedical Engineering (ICBME). Tehran, Iran: IEEE, 2014: 333-337.

    [26] ZHANG X Y, WANG W R, SHEN C Y, et al. Extraction of EEG components based on time-frequency blind source separation[C]//Proceedings of the International Conference on Intelligent Information Hiding and Multimedia Signal Processing. Cham, Switzerland: Springer, 2017: 3-10.

    [27] MARTÍNEZ-RODRIGO A, GARCÍA-MARTÍNEZ B, ALCARAZ R, et al. Study of electroencephalographic signal regularity for automatic emotion recognition[C]//Proceedings of the International Conference on Ubiquitous Computing and Ambient Intelligence. Cham, Switzerland: Springer, 2017: 766-777.

    [28] LI J P, ZHANG Z X, HE H G. Hierarchical convolutional neural networks for EEG-based emotion recognition[J]. Cognitive Computation, 2018, 10(2): 368-380. doi: 10.1007/s12559-017-9533-x

    [29] WEI C, CHEN L L, ZHANG A. Emotion recognition of EEG based on ensemble convolutional neural networks[J]. Journal of East China University of Science and Technology, 2019, 45(4): 614-622. https://www.cnki.com.cn/Article/CJFDTOTAL-HLDX201904014.htm

    [30] TIAN L L, ZOU J Z, ZHANG J, et al. Emotion recognition of EEG signal based on improved convolutional neural network[J]. Computer Engineering and Applications, 2019, 55(22): 99-105. https://www.cnki.com.cn/Article/CJFDTOTAL-JSGG201922015.htm


Article history
  • Received:  2020-08-01
  • Published online:  2021-03-23
  • Issue date:  2021-02-24
