Please use this identifier to cite or link to this item: http://hdl.handle.net/10553/132169
Title: Towards Bi-Hemispheric Emotion Mapping Through EEG: A Dual-Stream Neural Network Approach
Authors: Freire Obregón, David Sebastián 
Hernández Sosa, José Daniel 
Santana Jaria, Oliverio Jesús 
Lorenzo Navarro, José Javier 
Castrillón Santana, Modesto Fernando 
UNESCO Classification: 120304 Artificial intelligence
Issue Date: 2024
Conference: 18th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2024)
Abstract: Emotion classification through EEG signals plays a significant role in psychology, neuroscience, and human-computer interaction. This paper addresses the challenge of mapping human emotions using EEG data in the Mapping Human Emotions through EEG Signals FG24 competition. Subjects mimic the facial expressions of an avatar, displaying fear, joy, anger, sadness, disgust, and surprise in a VR setting. EEG data is captured using a multi-channel sensor system to discern brain activity patterns. We propose a novel two-stream neural network employing a Bi-Hemispheric approach for emotion inference, surpassing baseline methods and enhancing emotion recognition accuracy. Additionally, we conduct a temporal analysis revealing that specific signal intervals at the beginning and end of the emotion stimulus sequence contribute significantly to improving accuracy. Leveraging insights gained from this temporal analysis, our approach offers enhanced performance in capturing subtle variations in emotional states.
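The abstract describes a two-stream network in which each hemisphere's EEG channels feed a separate stream before the streams are combined for emotion inference. The record gives no implementation details, so the following is a minimal NumPy sketch of that bi-hemispheric idea only: the channel split, per-stream encoder sizes, and fusion by concatenation are all illustrative assumptions, not the authors' actual architecture.

```python
import numpy as np

# Illustrative sketch (NOT the authors' implementation): one encoder per
# hemisphere, fused by concatenation, followed by a 6-class emotion head.
rng = np.random.default_rng(0)

N_LEFT, N_RIGHT = 16, 16   # assumed channels per hemisphere
T = 128                    # assumed samples per analysis window
HID, N_CLASSES = 32, 6     # fear, joy, anger, sadness, disgust, surprise


def relu(x):
    return np.maximum(x, 0.0)


class Stream:
    """One hemisphere's encoder: flatten the window, apply a dense layer."""

    def __init__(self, n_ch):
        self.W = rng.standard_normal((n_ch * T, HID)) * 0.01
        self.b = np.zeros(HID)

    def __call__(self, x):  # x: (n_ch, T)
        return relu(x.reshape(-1) @ self.W + self.b)


left_enc, right_enc = Stream(N_LEFT), Stream(N_RIGHT)
W_out = rng.standard_normal((2 * HID, N_CLASSES)) * 0.01
b_out = np.zeros(N_CLASSES)


def predict(eeg):
    """eeg: (N_LEFT + N_RIGHT, T) window; returns the predicted class index."""
    left, right = eeg[:N_LEFT], eeg[N_LEFT:]
    fused = np.concatenate([left_enc(left), right_enc(right)])  # fuse streams
    logits = fused @ W_out + b_out
    return int(np.argmax(logits))


window = rng.standard_normal((N_LEFT + N_RIGHT, T))
label = predict(window)  # an integer in [0, 6)
```

The two-stream structure lets each encoder learn hemisphere-specific activity patterns before fusion, which is the core motivation of a bi-hemispheric design; with untrained random weights this sketch only demonstrates the data flow, not recognition accuracy.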
URI: http://hdl.handle.net/10553/132169
ISBN: 979-8-3503-9494-8
ISSN: 2326-5396
DOI: 10.1109/FG59268.2024.10581965
Source: 2024 IEEE 18th International Conference on Automatic Face and Gesture Recognition, FG 2024 (May 2024)
Appears in Collections: Actas de congresos
PDF (433.14 kB)

Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.