Please use this identifier to cite or link to this item: http://hdl.handle.net/10553/73270
Title: Unsupervised learning in reservoir computing for EEG-based emotion recognition
Authors: Fourati, Rahma
Ammar, Boudour
Sánchez Medina, Javier Jesús 
Alimi, Adel M.
UNESCO Classification: 120304 Artificial intelligence
Keywords: Brain modeling
Echo state network
Electroencephalogram
Electroencephalography
Emotion recognition, et al
Issue Date: 2022
Journal: IEEE Transactions on Affective Computing 
Abstract: In real-world applications such as emotion recognition from recorded brain activity, data are captured from electrodes over time, and these signals constitute a multidimensional time series. In this paper, the Echo State Network (ESN), a recurrent neural network with great success in time series prediction and classification, is optimized with different neural plasticity rules to classify emotions from electroencephalogram (EEG) time series. The developed network automatically extracts valid features from EEG signals: we feed the filtered signals directly into the network and apply no separate feature extraction methods. Evaluated on two well-known benchmarks, the DEAP and SEED datasets, the ESN with intrinsic plasticity greatly outperforms feature-based methods and shows clear advantages over other existing methods. Thus, the proposed network forms a more complete and efficient representation while retaining advantages such as faster learning and more reliable performance.
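
The mechanism named in the abstract, a reservoir whose neurons adapt without supervision via intrinsic plasticity before any readout is trained, can be sketched briefly. The Python below is a minimal illustration only: it assumes tanh reservoir units and the Gaussian-target intrinsic plasticity rule common in the ESN literature; the class name, reservoir size, and all hyperparameters are placeholders rather than the authors' actual configuration.

import numpy as np

rng = np.random.default_rng(0)

class IPReservoir:
    """Minimal ESN reservoir with intrinsic plasticity (IP): each tanh
    neuron adapts a gain a and bias b so its output distribution moves
    toward a Gaussian with mean mu and variance sigma^2.
    Illustrative sketch; not the paper's exact architecture."""

    def __init__(self, n_in, n_res, spectral_radius=0.9,
                 eta=1e-4, mu=0.0, sigma=0.2):
        self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        # Rescale the recurrent weights so the echo state property plausibly holds.
        W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
        self.W = W
        self.a = np.ones(n_res)       # per-neuron gain, adapted by IP
        self.b = np.zeros(n_res)      # per-neuron bias, adapted by IP
        self.eta, self.mu, self.s2 = eta, mu, sigma ** 2

    def step(self, u, x, adapt=True):
        net = self.W_in @ u + self.W @ x          # total input to each neuron
        y = np.tanh(self.a * net + self.b)        # new reservoir state
        if adapt:
            # Gaussian-target IP update for tanh units.
            db = -self.eta * (-self.mu / self.s2
                              + (y / self.s2) * (2 * self.s2 + 1 - y ** 2 + self.mu * y))
            self.b += db
            self.a += self.eta / self.a + db * net
        return y

# Hypothetical usage on one filtered EEG segment (32 channels, as in DEAP).
esn = IPReservoir(n_in=32, n_res=300)
eeg = rng.standard_normal((1000, 32))   # stand-in for a real filtered EEG epoch
x = np.zeros(300)
for u in eeg:                           # unsupervised IP pre-training pass
    x = esn.step(u, x, adapt=True)
x, states = np.zeros(300), []
for u in eeg:                           # frozen pass: collect states for a readout
    x = esn.step(u, x, adapt=False)
    states.append(x)
features = np.mean(states, axis=0)      # time-averaged state fed to a classifier

A trained readout or another classifier would then map such state features to emotion labels, which is the setting the paper evaluates on DEAP and SEED.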
URI: http://hdl.handle.net/10553/73270
ISSN: 1949-3045
DOI: 10.1109/TAFFC.2020.2982143
Source: IEEE Transactions on Affective Computing [EISSN 1949-3045], v. 13(2), p. 972-984 (2022)
Appears in Collections: Artículos

Scopus citations: 34 (checked on Dec 15, 2024)

Web of Science citations: 33 (checked on Dec 15, 2024)

Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.