Please use this identifier to cite or link to this item: http://hdl.handle.net/10553/37179
Title: Adaptive segmentation of multimodal polysomnography data for sleep stages detection
Authors: Procházka, A.
Kuchyňka, J.
Yadollahi, M.
Suárez Araujo, Carmen Paz 
Vyšata, O.
UNESCO Classification: 120304 Artificial intelligence
Keywords: Depth sensors
Classification
Kinect
Image
Issue Date: 2017
Journal: International Conference on Digital Signal Processing proceedings 
Conference: 2017 22nd International Conference on Digital Signal Processing, DSP 2017 
Abstract: The paper presents a new algorithm for adaptive classification of sleep stages using multimodal data recorded in the sleep laboratory during overnight polysomnography. The proposed method includes a learning process applied to a set of individuals whose sleep stages were classified by an experienced neurologist. Features evaluated over 30 s time windows of selected multimodal signals are used to construct and optimize the proposed two-layer neural network model. The resulting computational system, based upon breathing, EEG, and EOG features, is used to analyse new individuals and detect their sleep stages. Results include classification accuracies higher than 80% and 90% for the Wake and REM stages, respectively. The proposed method can adaptively modify model coefficients to detect sleep stages and sleep disorders using man-machine interaction.
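
A minimal sketch of the workflow described in the abstract is given below, assuming Python with NumPy and scikit-learn: it segments synchronized EEG, EOG, and breathing signals into 30 s windows, computes a few illustrative spectral features, and trains a two-layer neural network (one hidden layer plus the output layer). The sampling rate, feature choices, and stage labels here are assumptions made for illustration, not the authors' implementation.

# Minimal sketch (not the authors' code) of 30 s segmentation, feature
# extraction, and two-layer neural network classification of sleep stages.
# Signal names, sampling rate, and feature choices are assumptions.

import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

FS = 100          # assumed common sampling rate (Hz) after resampling
WIN = 30 * FS     # 30-second analysis window

def band_power(segment, fs, low, high):
    # Mean spectral power of one window in a given frequency band.
    freqs = np.fft.rfftfreq(len(segment), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(segment)) ** 2
    mask = (freqs >= low) & (freqs < high)
    return psd[mask].mean()

def window_features(eeg, eog, breathing, fs=FS):
    # Feature vector for a single 30 s window of the three modalities.
    return np.array([
        band_power(eeg, fs, 0.5, 4.0),    # EEG delta power
        band_power(eeg, fs, 8.0, 12.0),   # EEG alpha power
        band_power(eog, fs, 0.5, 4.0),    # slow eye-movement activity
        breathing.std(),                  # breathing variability
    ])

def segment_record(eeg, eog, breathing, fs=FS):
    # Split synchronized signals into 30 s windows and extract features.
    n_windows = len(eeg) // WIN
    feats = [window_features(eeg[i*WIN:(i+1)*WIN],
                             eog[i*WIN:(i+1)*WIN],
                             breathing[i*WIN:(i+1)*WIN], fs)
             for i in range(n_windows)]
    return np.vstack(feats)

# Synthetic stand-in data; real use would load expert-scored PSG records.
rng = np.random.default_rng(0)
n_samples = 2 * 3600 * FS                       # two hours of signal
eeg, eog, breathing = (rng.standard_normal(n_samples) for _ in range(3))
X = segment_record(eeg, eog, breathing)
y = rng.integers(0, 3, size=len(X))             # placeholder stage labels

# Two-layer network: one hidden layer plus the output layer.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500, random_state=0)
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))

In practice, the learning step would use windows scored by the neurologist as training labels, and the trained model would then be applied to the records of new individuals, as described in the abstract.
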
URI: http://hdl.handle.net/10553/37179
ISSN: 2165-3577
DOI: 10.1109/ICDSP.2017.8096108
Source: 2017 22nd International Conference on Digital Signal Processing (DSP) [ISSN 1546-1874], (2017)
Appears in Collections: Actas de congresos (conference proceedings)
Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.