Persistent identifier to cite or link this item: http://hdl.handle.net/10553/47707
DC Field | Value | Language
dc.contributor.author | Munteanu, Cristian | en_US
dc.contributor.author | Morales, Francisco Cabrera | en_US
dc.contributor.author | Fernández, Javier González | en_US
dc.contributor.author | Rosa, Agostinho | en_US
dc.contributor.author | Déniz, Luís Gómez | en_US
dc.date.accessioned | 2018-11-23T15:45:15Z | -
dc.date.available | 2018-11-23T15:45:15Z | -
dc.date.issued | 2008 | en_US
dc.identifier.issn | 0933-3657 | en_US
dc.identifier.uri | http://hdl.handle.net/10553/47707 | -
dc.description.abstract | Objective: So far there is no ideal speckle reduction filtering technique that is capable of enhancing and reducing the level of noise in medical ultrasound (US) images, while efficiently responding to medical experts' validation criteria, which quite often include a subjective component. This paper presents an interactive tool, called evolutionary speckle reducing anisotropic diffusion filter (EVOSRAD), that performs adaptive speckle filtering on ultrasound B-mode still images. The medical expert runs the algorithm interactively, having permanent control over the output and guiding the filtering process towards obtaining enhanced images that agree with his/her subjective quality criteria. Methods and material: We employ an interactive genetic algorithm (IGA) to adapt on-line the parameters of a speckle reducing anisotropic diffusion (SRAD) filter. For a given input US image, the algorithm evolves the parameters of the SRAD filter according to the subjective criteria of the medical expert who runs the interactive algorithm. The method and its validation are applied to a test bed comprising both real and simulated obstetrics and gynecology (OB/GYN) ultrasound images. Results: The potential of the method is analyzed in comparison to other speckle reduction filters: the original SRAD filter, the anisotropic diffusion, offset and median filters. Results obtained show the good potential of the method on several classes of OB/GYN ultrasound images, as well as on a synthetic image simulating a real fetal US image. Quality criteria for the evaluation and validation of the method include subjective scoring given by the medical expert who runs the interactive method, as well as objective global and local quality criteria. Conclusions: The presented method allows the medical expert to design his/her own filters according to the degree of medical expertise as well as to particular and often subjective assessment criteria. A filter is designed for a given class of ultrasound images and for a given medical expert who will later use the respective filter in clinical practice. The process of designing a filter is simple and employs an interactive visualization and scoring stage that does not require image processing knowledge. Results show that filters tailored using the presented method achieve better quality scores than other, more generic speckle filtering techniques. (C) 2008 Elsevier B.V. All rights reserved. | en_US
dc.language | eng | en_US
dc.publisher | 0933-3657 | -
dc.relation.ispartof | Artificial Intelligence in Medicine | en_US
dc.source | Artificial Intelligence in Medicine [ISSN 0933-3657], v. 43, p. 223-242 | en_US
dc.subject | 3314 Tecnología médica | en_US
dc.subject.other | Neonatal Spine | en_US
dc.subject.other | Reduction | en_US
dc.subject.other | Sonography | en_US
dc.title | Enhancing obstetric and gynecology ultrasound images by adaptation of the speckle reducing anisotropic diffusion filter | en_US
dc.type | info:eu-repo/semantics/Article | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1016/j.artmed.2008.04.001 | en_US
dc.identifier.scopus | 46549086705 | -
dc.identifier.isi | 000257686600005 | -
dc.contributor.authorscopusid | 55511749163 | -
dc.contributor.authorscopusid | 7101807075 | -
dc.contributor.authorscopusid | 55455063900 | -
dc.contributor.authorscopusid | 7201498661 | -
dc.contributor.authorscopusid | 57190033385 | -
dc.description.lastpage | 242 | en_US
dc.description.firstpage | 223 | en_US
dc.relation.volume | 43 | en_US
dc.investigacion | Ingeniería y Arquitectura | en_US
dc.type2 | Artículo | en_US
dc.contributor.daisngid | 392799 | -
dc.contributor.daisngid | 2258643 | -
dc.contributor.daisngid | 34953761 | -
dc.contributor.daisngid | 439450 | -
dc.contributor.daisngid | 7172625 | -
dc.utils.revision | | en_US
dc.contributor.wosstandard | WOS:Munteanu, C | -
dc.contributor.wosstandard | WOS:Morales, FC | -
dc.contributor.wosstandard | WOS:Fernandez, JG | -
dc.contributor.wosstandard | WOS:Rosa, A | -
dc.contributor.wosstandard | WOS:Deniz, LG | -
dc.date.coverdate | Julio 2008 | en_US
dc.identifier.ulpgc | | es
dc.description.jcr | 1,96 |
dc.description.jcrq | Q2 |
dc.description.scie | SCIE |
item.grantfulltext | none | -
item.fulltext | Sin texto completo | -
crisitem.author.dept | GIR IUCES: Centro de Tecnologías de la Imagen | -
crisitem.author.dept | IU de Cibernética, Empresa y Sociedad (IUCES) | -
crisitem.author.dept | Departamento de Ingeniería Electrónica y Automática | -
crisitem.author.orcid | 0000-0003-0667-2302 | -
crisitem.author.parentorg | IU de Cibernética, Empresa y Sociedad (IUCES) | -
crisitem.author.fullName | Gómez Déniz, Luis | -
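
The abstract above describes the mechanism in enough detail to sketch it: an SRAD diffusion filter whose free parameters are tuned by an interactive evolutionary loop in which the expert's subjective rating serves as the fitness function. The Python sketch below is only an illustration of that idea under stated assumptions, not the authors' EVOSRAD implementation: the SRAD step follows Yu and Acton's published discretisation (with periodic borders for brevity), while the parameter set (n_iter, dt, q0, rho), the (1+lambda) mutation scheme, and the expert_score callback are hypothetical names introduced here for the example.

import numpy as np

def srad(image, n_iter=50, dt=0.05, q0=1.0, rho=0.1):
    """Simplified Speckle Reducing Anisotropic Diffusion (after Yu & Acton, 2002)."""
    I = image.astype(np.float64) + 1e-8                 # avoid division by zero
    for t in range(n_iter):
        # one-sided neighbour differences (periodic borders, for brevity)
        dN = np.roll(I, 1, axis=0) - I
        dS = np.roll(I, -1, axis=0) - I
        dW = np.roll(I, 1, axis=1) - I
        dE = np.roll(I, -1, axis=1) - I
        grad2 = (dN**2 + dS**2 + dW**2 + dE**2) / I**2   # normalised gradient magnitude
        lap = (dN + dS + dW + dE) / I                    # normalised 5-point Laplacian
        # instantaneous coefficient of variation q^2
        q2 = np.maximum((0.5 * grad2 - lap**2 / 16.0) / ((1.0 + 0.25 * lap) ** 2 + 1e-12), 0.0)
        # speckle scale function q0(t) and diffusion coefficient c(q)
        q0t2 = (q0 * np.exp(-rho * t)) ** 2
        c = np.clip(1.0 / (1.0 + (q2 - q0t2) / (q0t2 * (1.0 + q0t2))), 0.0, 1.0)
        # divergence of c * grad(I) and explicit update
        div = (np.roll(c, -1, axis=0) * dS + c * dN +
               np.roll(c, -1, axis=1) * dE + c * dW)
        I = I + (dt / 4.0) * div
    return I

def evolve_srad_parameters(image, expert_score, generations=10, offspring=4, rng=None):
    """(1+lambda) evolutionary tuning of SRAD parameters driven by an expert's score."""
    rng = np.random.default_rng() if rng is None else rng
    best = {"n_iter": 50, "dt": 0.05, "q0": 1.0, "rho": 0.1}
    best_fit = expert_score(srad(image, **best))
    for _ in range(generations):
        for _ in range(offspring):
            # log-normal mutation of the parent parameters, kept inside sane bounds
            cand = {
                "n_iter": max(1, int(best["n_iter"] * rng.lognormal(0.0, 0.2))),
                "dt": float(np.clip(best["dt"] * rng.lognormal(0.0, 0.2), 0.01, 0.25)),
                "q0": float(np.clip(best["q0"] * rng.lognormal(0.0, 0.2), 0.1, 2.0)),
                "rho": float(np.clip(best["rho"] * rng.lognormal(0.0, 0.2), 0.01, 1.0)),
            }
            fit = expert_score(srad(image, **cand))      # expert rates the filtered image
            if fit > best_fit:
                best, best_fit = cand, fit
    return best, best_fit

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.ones((64, 64)); clean[16:48, 16:48] = 2.0                 # synthetic phantom
    noisy = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)   # multiplicative speckle
    # stand-in "expert": prefers low variance in the homogeneous background corner
    score = lambda img: -float(np.var(img[:16, :16]))
    print(evolve_srad_parameters(noisy, score, generations=3, rng=rng))

In the actual interactive system the expert_score callback would display each candidate result and record the clinician's subjective rating; for the sketch, any callable that maps an image to a number can stand in for it, as in the placeholder above.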
Collection: Artículos

Citations (Scopus™): 8 (updated 17-Nov-2024)
Citations (Web of Science™): 7 (updated 17-Nov-2024)
Visits: 108 (updated 01-Nov-2024)

Items in ULPGC accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.