Please use this identifier to cite or link to this item:
http://hdl.handle.net/10553/127432
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Barra, Paola | en_US |
dc.contributor.author | Orefice, Giosuè | en_US |
dc.contributor.author | Auriemma Citarella, Alessia | en_US |
dc.contributor.author | Castrillón Santana, Modesto Fernando | en_US |
dc.contributor.author | Ciaramella, Angelo | en_US |
dc.date.accessioned | 2023-10-30T15:30:05Z | - |
dc.date.available | 2023-10-30T15:30:05Z | - |
dc.date.issued | 2023 | en_US |
dc.identifier.isbn | 979-8-4007-0116-0 | en_US |
dc.identifier.other | Scopus | - |
dc.identifier.uri | http://hdl.handle.net/10553/127432 | - |
dc.description.abstract | The marine ecosystem faces a significant threat due to the release of human waste into the sea. One of the most challenging issues is identifying and removing small particles that settle on the sand. These particles can be ingested by local fauna or cause harm to the marine ecosystem. Distinguishing these particles from natural materials like shells and stones is difficult, as they blend in with the surroundings. To address this problem, we utilized the Litter On The Sand (LOTS) dataset, which comprises images of clean, dirty, and wavy sand from three different beaches. We established an initial benchmark on this dataset by employing state-of-the-art Deep Learning segmentation techniques. The evaluated models included MultiResU-Net, Half MultiResU-Net, and Quarter MultiResU-Net. The results revealed that the Half MultiResU-Net model outperformed the others for most types of sand analyzed, providing valuable insights for future efforts in combating marine litter and preserving the health of our marine ecosystems. | en_US |
dc.language | eng | en_US |
dc.source | GoodIT '23: ACM International Conference on Information Technology for Social Good, 2023, p. 1-5, Lisbon, Portugal (September 2023) | en_US |
dc.subject | 33 Technological sciences | en_US |
dc.subject.other | Computer vision | en_US |
dc.subject.other | Dataset | en_US |
dc.subject.other | Litter detection | en_US |
dc.subject.other | Machine learning | en_US |
dc.subject.other | Segmentation | en_US |
dc.title | Litter segmentation with LOTS dataset | en_US |
dc.type | info:eu-repo/semantics/conferenceObject | en_US |
dc.type | ConferenceObject | en_US |
dc.relation.conference | ACM 3rd International Conference on Information Technology for Social Good (GoodIT 2023) | en_US |
dc.identifier.doi | 10.1145/3582515.3609511 | en_US |
dc.identifier.scopus | 85174295182 | - |
dc.contributor.orcid | 0000-0002-7692-0626 | - |
dc.contributor.orcid | 0009-0003-1464-4173 | - |
dc.contributor.orcid | 0000-0002-6525-0217 | - |
dc.contributor.orcid | 0000-0002-8673-2725 | - |
dc.contributor.orcid | 0000-0001-5592-7995 | - |
dc.contributor.authorscopusid | 57205195650 | - |
dc.contributor.authorscopusid | 58571613300 | - |
dc.contributor.authorscopusid | 57226113587 | - |
dc.contributor.authorscopusid | 57218418238 | - |
dc.contributor.authorscopusid | 7003470719 | - |
dc.description.lastpage | 5 | en_US |
dc.description.firstpage | 1 | en_US |
dc.investigacion | Engineering and Architecture | en_US |
dc.type2 | Conference proceedings | en_US |
dc.description.numberofpages | 5 | en_US |
dc.utils.revision | Yes | en_US |
dc.date.coverdate | September 2023 | en_US |
dc.identifier.conferenceid | events150445 | - |
dc.identifier.ulpgc | Yes | en_US |
dc.contributor.buulpgc | BU-INF | en_US |
item.grantfulltext | none | - |
item.fulltext | No full text available | - |
crisitem.author.dept | GIR SIANI: Inteligencia Artificial, Robótica y Oceanografía Computacional | - |
crisitem.author.dept | IU Sistemas Inteligentes y Aplicaciones Numéricas | - |
crisitem.author.dept | Departamento de Informática y Sistemas | - |
crisitem.author.orcid | 0000-0002-8673-2725 | - |
crisitem.author.parentorg | IU Sistemas Inteligentes y Aplicaciones Numéricas | - |
crisitem.author.fullName | Castrillón Santana, Modesto Fernando | - |
Appears in Collections: | Conference proceedings |
Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.