SparseDGCNN: Recognizing Emotion from Multichannel EEG Signals
Guanhua Zhang, Minjing Yu, Yong-Jin Liu, Guozhen Zhao, Dan Zhang
IEEE Transactions on Affective Computing (TAFFC), pp. 1–12, 2021.
Abstract
Emotion recognition from EEG signals has attracted much attention in affective computing. Recently, a novel dynamic graph convolutional neural network (DGCNN) model was proposed, which simultaneously optimizes the network parameters and a weighted graph G characterizing the strength of the functional relation between each pair of electrodes in the EEG recording equipment. In this paper, we propose a sparse DGCNN model that improves on DGCNN by imposing a sparseness constraint on G. Our work is based on an important observation: tomography studies reveal that different brain regions sampled by EEG electrodes may be related to different functions of the brain, so the functional relations among electrodes are likely highly localized and sparse. However, introducing the sparseness constraint on the graph G makes the loss function of sparse DGCNN non-differentiable at some singular points. To ensure that the training process of sparse DGCNN converges, we apply the forward-backward splitting method. To evaluate the performance of sparse DGCNN, we compare it with four representative recognition methods as well as different features and spectral bands. The results show that (1) sparse DGCNN has consistently better accuracy than representative methods and has good scalability, and (2) DE, PSD and ASM features on γ bands convey the most discriminative emotional information, and fusion of separate features and frequency bands can improve recognition performance.
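The forward-backward splitting mentioned in the abstract handles the non-differentiability introduced by the sparseness constraint by alternating a gradient step on the smooth part of the loss with a proximal step on the non-smooth regularizer. As a hedged illustration only (not the authors' implementation), a minimal NumPy sketch of one such update, assuming an L1-type sparseness penalty on the adjacency weights `W` whose proximal operator is element-wise soft-thresholding:

```python
import numpy as np

def soft_threshold(W, tau):
    """Proximal operator of tau * ||W||_1: element-wise soft-thresholding.

    Shrinks each entry toward zero by tau; entries with |w| <= tau
    become exactly zero, which is what produces a sparse graph.
    """
    return np.sign(W) * np.maximum(np.abs(W) - tau, 0.0)

def forward_backward_step(W, grad_W, lr, lam):
    """One forward-backward (proximal gradient) update on the graph weights.

    Hypothetical names: W is the weighted adjacency matrix, grad_W the
    gradient of the smooth (differentiable) part of the loss, lr the
    learning rate, lam the sparseness regularization strength.
    """
    W_forward = W - lr * grad_W            # forward step: smooth loss
    return soft_threshold(W_forward, lr * lam)  # backward step: prox of L1 term
```

The key point the sketch shows is that the non-smooth term is never differentiated; it is handled exactly through its proximal operator, which is what lets the overall training iteration converge despite the singular points.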
BibTeX
@article{zhang21_taffc,
title = {SparseDGCNN: Recognizing Emotion from Multichannel EEG Signals},
author = {{Zhang}, Guanhua and {Yu}, Minjing and {Liu}, Yong-Jin and {Zhao}, Guozhen and {Zhang}, Dan},
journal = {IEEE Transactions on Affective Computing (TAFFC)},
doi = {10.1109/TAFFC.2021.3051332},
year = {2021},
pages = {1--12}
}