Title: Novel Spectral Loss Function for Unsupervised Hyperspectral Image Segmentation
Authors: Pérez García, Ámbar 
Paoletti, Mercedes E.
Haut, Juan M.
López Feliciano, José Francisco 
UNESCO Classification: 220990 Digital processing. Images
Keywords: Autoencoder (AE)
hyperspectral images (HSIs)
semantic segmentation
unsupervised learning
Issue Date: 2023
Journal: IEEE Geoscience and Remote Sensing Letters 
Abstract: Neural networks (NNs) have gained importance in hyperspectral image (HSI) segmentation for Earth observation (EO) due to their unparalleled data-driven feature extraction capability. However, in many real-life situations ground truth is not available, and the performance of unsupervised NNs still leaves room for improvement. To overcome this challenge, this letter presents a new loss function that improves the performance of unsupervised HSI segmentation models. The spectral loss function, $Sl$, which can be incorporated into different models, is based on the purity of the unmixing endmembers and on the spectral similarity of the clusters produced by the NN to determine the classes. It is integrated into a 3-D convolutional autoencoder (AE) to validate its performance on four standard HSI benchmarks. Furthermore, its performance has been qualitatively examined in a real case study: an oil spill without ground truth. The results show that $Sl$ is a breakthrough in unsupervised HSI segmentation, obtaining the best overall performance and highlighting the importance of spectral signatures. Additionally, dimensionality reduction proves vital in compacting the spectral information, which facilitates segmentation. The source code is available at https://github.com/mhaut/HSI-3DSpLoss.
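The abstract describes a loss built on the spectral similarity of the clusters a network produces. As a rough illustration only, and not the authors' $Sl$ implementation (which is available in the linked repository), a NumPy sketch of one commonly used similarity term, the spectral angle between each pixel and its cluster's mean spectrum, might look like this; the function names and the reduction to a mean are assumptions for the example:

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral angle (radians) between two spectra; 0 means
    identical spectral shape regardless of magnitude."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def intra_cluster_spectral_loss(pixels, labels):
    """Mean spectral angle between each pixel spectrum and the mean
    spectrum of its assigned cluster. Lower values indicate that the
    clusters are spectrally purer (hypothetical purity term, not the
    paper's exact $Sl$ formulation)."""
    total, count = 0.0, 0
    for c in np.unique(labels):
        members = pixels[labels == c]          # (n_c, bands)
        centroid = members.mean(axis=0)        # cluster mean spectrum
        for p in members:
            total += spectral_angle(p, centroid)
            count += 1
    return total / count
```

Because the spectral angle ignores magnitude, two pixels that differ only by illumination scale contribute zero loss, which is the usual motivation for angle-based similarity in HSI work.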
URI: http://hdl.handle.net/10553/130721
ISSN: 1545-598X
DOI: 10.1109/LGRS.2023.3288809
Appears in Collections:Artículos
Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.