Please use this identifier to cite or link to this item: http://hdl.handle.net/10553/132179
Title: Design of a labeled dataset for despeckling SAR imagery
Authors: Vasquez Salazar, Ruben Dario 
Cardona Mesa, Ahmed Alejandro 
Gómez, Luis 
Travieso-González, Carlos M. 
Garavito-González, Andrés F.
Vásquez-Cano, Esteban
UNESCO Classification: 33 Technological Sciences
Issue Date: 2024
Journal: Proceedings of the European Conference on Synthetic Aperture Radar, EUSAR 
Conference: 15th European Conference on Synthetic Aperture Radar, EUSAR 2024
Abstract: In the field of computer vision, Deep Learning (DL) has emerged as an extremely useful and powerful tool for image processing. When training DL models under the supervised learning paradigm, labeled datasets must contain both input and output data. For DL models designed specifically for filtering tasks, a labeled dataset should ideally contain two subsets: one comprising noisy images as inputs and another comprising noiseless images as outputs. In this way, the model updates its inner filters and weights as it learns to restore the image. SAR (Synthetic Aperture Radar) images contain speckle inherent to the sensor, making it impossible to obtain a valid reference (ground truth) image. The traditional approach is to corrupt a clean image with a speckle model, so that a ground truth image is available for evaluating image-quality indices, and therefore despeckling filters and DL models can be properly designed. In a more realistic approach, the ground truth is crafted from actual SAR images. In this paper, a multitemporal fusion strategy was used to design a dataset from SAR images, and data augmentation techniques were then applied to improve its quality. Nineteen datasets were designed and evaluated with different metrics to obtain the best version of the dataset. Additionally, we employed a transformer called SwinIR to restore the images and enhance the details and edges in the output images, resulting in a high-quality dataset that can be used to train models with supervised learning.
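The two dataset-design approaches contrasted in the abstract, simulating speckle on a clean image versus averaging repeated acquisitions of the same scene (multitemporal fusion), can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; the gamma-distributed multiplicative noise model, the number of acquisitions, and the pixel-wise mean fusion are common simplifying assumptions used here for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_speckle(clean, looks=1):
    """Corrupt a clean image with multiplicative gamma-distributed speckle
    (unit mean), the traditional way to obtain noisy/ground-truth pairs."""
    noise = rng.gamma(shape=looks, scale=1.0 / looks, size=clean.shape)
    return clean * noise

def multitemporal_fusion(stack):
    """Pixel-wise mean over co-registered acquisitions of the same scene;
    averaging suppresses speckle and approximates a ground truth image."""
    return np.mean(stack, axis=0)

# Toy example: 32 simulated single-look acquisitions of a flat scene.
clean = np.full((64, 64), 100.0)
stack = np.stack([add_speckle(clean) for _ in range(32)])
fused = multitemporal_fusion(stack)
```

Averaging N independent single-look images reduces the speckle standard deviation by roughly a factor of sqrt(N), which is why the fused image can serve as a reference while each individual acquisition serves as a noisy input.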
URI: http://hdl.handle.net/10553/132179
ISBN: 9783800762873
ISSN: 2197-4403
Source: Proceedings of the European Conference on Synthetic Aperture Radar, EUSAR [ISSN 2197-4403], p. 509-512, (January 2024)
Appears in Collections:Actas de congresos
Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.