Please use this identifier to cite or link to this item:
https://accedacris.ulpgc.es/jspui/handle/10553/158571
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Cornejo, Diego Rodrigo | en_US |
| dc.contributor.author | Ravelo-García, Antonio G. | en_US |
| dc.contributor.author | Rodríguez, María Fernanda | en_US |
| dc.contributor.author | Díaz, Luz Alexandra | en_US |
| dc.contributor.author | Cabrera-Caso, Victor | en_US |
| dc.contributor.author | Condori-Merma, Dante | en_US |
| dc.contributor.author | Cornejo, Miguel Vizcardo | en_US |
| dc.date.accessioned | 2026-02-20T09:48:00Z | - |
| dc.date.available | 2026-02-20T09:48:00Z | - |
| dc.date.issued | 2024 | en_US |
| dc.identifier.issn | 2325-8861 | en_US |
| dc.identifier.other | Scopus | - |
| dc.identifier.uri | https://accedacris.ulpgc.es/jspui/handle/10553/158571 | - |
| dc.description.abstract | Due to its rapid propagation and enormous number of infected people, COVID-19 is the greatest pandemic of the past 100 years, with millions of deaths. The need for accessible, quick, and non-invasive diagnostic techniques persists despite the recent decline in cases. For this reason, in the present work we develop a densely connected neural network that uses heart rate data to discriminate between COVID patients and healthy individuals. The Stanford University database was used, from which features were extracted using approximate entropy. With an accuracy of 93% and an AUC of 0.956, the results demonstrate strong classification performance, supporting the usefulness of this approach for the accurate identification of COVID cases. | en_US |
| dc.language | eng | en_US |
| dc.relation.ispartof | Computing in Cardiology | en_US |
| dc.source | Computing in Cardiology [ISSN 2325-8861], v. 51 (January 2024) | en_US |
| dc.subject | 3314 Medical technology | en_US |
| dc.title | Analysis of COVID Patients Employing Approximate Entropy and Deep Learning for Classification and Early Diagnosis | en_US |
| dc.type | info:eu-repo/semantics/conferenceObject | en_US |
| dc.type | ConferenceObject | en_US |
| dc.relation.conference | 51st International Computing in Cardiology, CinC 2024 | en_US |
| dc.identifier.doi | 10.22489/CinC.2024.173 | en_US |
| dc.identifier.scopus | 105028368416 | - |
| dc.contributor.authorscopusid | 57222005271 | - |
| dc.contributor.authorscopusid | 9634135600 | - |
| dc.contributor.authorscopusid | 58189068000 | - |
| dc.contributor.authorscopusid | 58147880900 | - |
| dc.contributor.authorscopusid | 58189751900 | - |
| dc.contributor.authorscopusid | 57207622703 | - |
| dc.contributor.authorscopusid | 60347702400 | - |
| dc.identifier.eissn | 2325-887X | - |
| dc.relation.volume | 51 | en_US |
| dc.investigacion | Engineering and Architecture | en_US |
| dc.type2 | Conference proceedings | en_US |
| dc.utils.revision | Yes | en_US |
| dc.date.coverdate | January 2024 | en_US |
| dc.identifier.conferenceid | events156154 | - |
| dc.identifier.ulpgc | Yes | en_US |
| dc.contributor.buulpgc | BU-TEL | en_US |
| item.grantfulltext | open | - |
| item.fulltext | With full text | - |
| crisitem.event.eventsstartdate | 14-05-2024 | - |
| crisitem.event.eventsenddate | 16-05-2024 | - |
| crisitem.author.dept | GIR IDeTIC: División de Procesado Digital de Señales | - |
| crisitem.author.dept | IU para el Desarrollo Tecnológico y la Innovación en Comunicaciones (IDeTIC) | - |
| crisitem.author.dept | Departamento de Señales y Comunicaciones | - |
| crisitem.author.orcid | 0000-0002-8512-965X | - |
| crisitem.author.parentorg | IU para el Desarrollo Tecnológico y la Innovación en Comunicaciones (IDeTIC) | - |
| crisitem.author.fullName | Ravelo García, Antonio Gabriel | - |
| Appears in Collections: | Conference proceedings | |
Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.
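The abstract describes extracting approximate entropy (ApEn) features from heart rate series before classification. A minimal sketch of the standard Pincus ApEn computation is shown below; this is an illustration of the general technique, not the authors' implementation, and the parameter defaults (`m=2`, tolerance `r` set to 0.2 times the series standard deviation) are common conventions assumed here, not values taken from the paper.

```python
import math


def approx_entropy(series, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D sequence.

    Lower values indicate more regular, predictable signals;
    higher values indicate more irregularity.
    """
    n = len(series)
    if r is None:
        # Common convention: tolerance = 0.2 * standard deviation.
        mean = sum(series) / n
        r = 0.2 * math.sqrt(sum((x - mean) ** 2 for x in series) / n)

    def phi(m):
        # All length-m template vectors (overlapping windows).
        templates = [series[i:i + m] for i in range(n - m + 1)]
        total = 0.0
        for t1 in templates:
            # Count templates within Chebyshev distance r (self-match included).
            count = sum(
                1 for t2 in templates
                if max(abs(a - b) for a, b in zip(t1, t2)) <= r
            )
            total += math.log(count / (n - m + 1))
        return total / (n - m + 1)

    # ApEn is the difference in average log-regularity between m and m+1.
    return phi(m) - phi(m + 1)
```

A quick sanity check of the measure: a smooth periodic series (e.g. a sampled sine wave) yields a lower ApEn than a random series of the same length, which is the property that makes ApEn useful as a regularity feature for physiological signals such as heart rate.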