Persistent identifier to cite or link this item: https://accedacris.ulpgc.es/handle/10553/141828
DC Field | Value | Language
dc.contributor.author | Vitale, Sergio | en_US
dc.contributor.author | Ferraioli, Giampaolo | en_US
dc.contributor.author | Pascazio, Vito | en_US
dc.contributor.author | Deniz, Luis Gomez | en_US
dc.date.accessioned | 2025-07-01T09:25:23Z | -
dc.date.available | 2025-07-01T09:25:23Z | -
dc.date.issued | 2025 | en_US
dc.identifier.issn | 1545-598X | en_US
dc.identifier.other | WoS | -
dc.identifier.uri | https://accedacris.ulpgc.es/handle/10553/141828 | -
dc.description.abstract | Deep learning (DL) solutions for synthetic aperture radar (SAR) image despeckling have recently become widespread. Such solutions have mainly been designed from a DL perspective, with the training and validation stages relying on typical norm-based cost functions. To go beyond the DL perspective, in this letter we propose an SAR-based validation stage that uses SAR assessing metrics in the design and hyperparameter selection of the neural networks. In a first phase, SAR assessing metrics may be used purely as validation metrics, to highlight critical issues that cannot be spotted with standard image-processing quality metrics. In a second phase, the same SAR assessing metrics may be used directly to enhance the DL solution by addressing the specific issues raised during the previous SAR-based validation stage. To this aim, three different DL SAR despeckling solutions and four different SAR assessing metrics have been considered. The outcome of this analysis shows the importance of including SAR knowledge in the training and validation stages of the design of a DL solution for SAR image despeckling. | en_US
dc.language | eng | en_US
dc.relation.ispartof | IEEE Geoscience and Remote Sensing Letters | en_US
dc.source | IEEE Geoscience and Remote Sensing Letters [ISSN 1545-598X], v. 22, (2025) | en_US
dc.subject | 25 Ciencias de la tierra y del espacio | en_US
dc.subject.other | Measurement | en_US
dc.subject.other | Training | en_US
dc.subject.other | Cost Function | en_US
dc.subject.other | Radar Polarimetry | en_US
dc.subject.other | Speckle | en_US
dc.subject.other | Information Filters | en_US
dc.subject.other | Noise | en_US
dc.subject.other | Radiometry | en_US
dc.subject.other | Assessment | en_US
dc.subject.other | Convolutional Neural Networks (CNNs) | en_US
dc.subject.other | Deep Learning (DL) | en_US
dc.subject.other | Despeckling | en_US
dc.subject.other | Image Restoration | en_US
dc.subject.other | Synthetic Aperture Radar (SAR) | en_US
dc.title | Enhanced Deep Learning SAR Despeckling Networks Based on SAR Assessing Metrics | en_US
dc.type | info:eu-repo/semantics/Article | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.1109/LGRS.2025.3577907 | en_US
dc.identifier.isi | 001510911000006 | -
dc.identifier.eissn | 1558-0571 | -
dc.relation.volume | 22 | en_US
dc.investigacion | Ingeniería y Arquitectura | en_US
dc.type2 | Artículo | en_US
dc.contributor.daisngid | No ID | -
dc.contributor.daisngid | No ID | -
dc.contributor.daisngid | No ID | -
dc.contributor.daisngid | No ID | -
dc.description.numberofpages | 5 | en_US
dc.utils.revision |  | en_US
dc.contributor.wosstandard | WOS:Vitale, S | -
dc.contributor.wosstandard | WOS:Ferraioli, G | -
dc.contributor.wosstandard | WOS:Pascazio, V | -
dc.contributor.wosstandard | WOS:Deniz, LG | -
dc.date.coverdate | 2025 | en_US
dc.identifier.ulpgc |  | en_US
dc.contributor.buulpgc | BU-TEL | en_US
dc.description.sjr | 1,248
dc.description.jcr | 4,0
dc.description.sjrq | Q1
dc.description.jcrq | Q1
dc.description.scie | SCIE
dc.description.miaricds | 10,7
item.grantfulltext | open | -
item.fulltext | Con texto completo | -
crisitem.author.dept | GIR IUCES: Centro de Tecnologías de la Imagen | -
crisitem.author.dept | IU de Cibernética, Empresa y Sociedad (IUCES) | -
crisitem.author.dept | Departamento de Ingeniería Electrónica y Automática | -
crisitem.author.orcid | 0000-0003-0667-2302 | -
crisitem.author.parentorg | IU de Cibernética, Empresa y Sociedad (IUCES) | -
crisitem.author.fullName | Gómez Déniz, Luis | -
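As a rough illustration of the SAR-based validation stage described in the abstract above (the letter's specific four assessing metrics are not listed in this record), the sketch below scores candidate despeckled outputs with two metrics commonly used to assess SAR despeckling: the equivalent number of looks (ENL) over a homogeneous region and the mean of the ratio image. The toy data, checkpoint names, and selection logic are hypothetical and not taken from the paper.

import numpy as np

def equivalent_number_of_looks(region):
    # ENL = mean^2 / variance over a homogeneous intensity region;
    # higher values indicate stronger speckle suppression.
    return float(region.mean() ** 2 / region.var())

def ratio_image_mean(noisy, despeckled, eps=1e-12):
    # Mean of the ratio image noisy / despeckled; an unbiased filter on
    # fully developed speckle keeps this close to 1.
    return float((noisy / (despeckled + eps)).mean())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.full((64, 64), 100.0)                   # homogeneous reflectivity
    noisy = clean * rng.gamma(1.0, 1.0, clean.shape)   # 1-look intensity speckle
    candidates = {                                     # stand-ins for network outputs
        "checkpoint_a": 0.5 * noisy + 0.5 * clean,
        "checkpoint_b": 0.1 * noisy + 0.9 * clean,
    }
    for name, out in candidates.items():
        print(name,
              "ENL = %.1f" % equivalent_number_of_looks(out),
              "ratio mean = %.3f" % ratio_image_mean(noisy, out))

In a training pipeline, such scores would be computed on held-out SAR validation patches after each epoch, and checkpoints or hyperparameters would be selected on them alongside (or instead of) the norm-based validation loss, in the spirit of the two-phase procedure outlined in the abstract.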
Collection: Artículos
Adobe PDF (1,8 MB)