Persistent identifier for citing or linking this item: http://hdl.handle.net/10553/37111
DC field: value (language)
dc.contributor.author: Alonso, Jesús B. (en_US)
dc.contributor.author: Cabrera, Josué (en_US)
dc.contributor.author: Shyamnani, Rohit (en_US)
dc.contributor.author: Travieso González, Carlos Manuel (en_US)
dc.contributor.author: Bolaños, Federico (en_US)
dc.contributor.author: García, Adrián (en_US)
dc.contributor.author: Villegas, Alexander (en_US)
dc.contributor.author: Wainwright, Mark (en_US)
dc.contributor.other: Alonso-Hernandez, Jesus B.
dc.date.accessioned: 2018-05-21T07:58:45Z
dc.date.available: 2018-05-21T07:58:45Z
dc.date.issued: 2017 (en_US)
dc.identifier.issn: 0957-4174 (en_US)
dc.identifier.uri: http://hdl.handle.net/10553/37111
dc.description.abstract: The use of bioacoustics to identify animal species has huge potential in biology and conservation research. Fields that could be greatly enhanced by bioacoustical techniques include the study of animal behavior, soundscape ecology, species diversity assessment, and long-term monitoring, for example to further our understanding of the conservation status of numerous species and their vulnerability to different threats. In this study we focus primarily, but not exclusively, on the identification of anuran vocalizations. We chose anurans both because they tend to be quite vocal and because they are considered indicators of environmental health. We present a system for semi-automated segmentation of anuran calls, based on a sound enhancement method that uses the Minimum Mean-Square Error (MMSE) Short-Time Spectral Amplitude (STSA) estimator and a noise suppression algorithm based on Spectral Subtraction (SS), together with an automated classification system for 17 anuran species based on Mel-Frequency Cepstrum Coefficients (MFCC) and the Gaussian Mixture Model (GMM). To our knowledge this is the first study to apply this combination of methods to animal identification. The technique achieves accuracies of between 96.1% and 100% per species. Experimental results show that the semi-automated segmentation technique outperforms automated segmentation systems, improving the average success rate to 98.61%. The effectiveness of the proposed anuran identification system in natural environments is thus verified. This work is a first step toward future tools that could significantly advance the semi-automatic, or even automatic, analysis of indicators of environmental health based on expert and intelligent systems. (en_US)
dc.language: eng (en_US)
dc.relation.ispartof: Expert Systems with Applications (en_US)
dc.source: Expert Systems with Applications [ISSN 0957-4174], v. 72, p. 83-92 (en_US)
dc.subject: 240601 Bioacoustics (en_US)
dc.subject: 120325 Sensor systems design (en_US)
dc.subject.other: Bioacoustical identification (en_US)
dc.subject.other: Biodiversity monitoring (en_US)
dc.subject.other: Species richness (en_US)
dc.subject.other: Ecological indices (en_US)
dc.subject.other: Environmental audio (en_US)
dc.subject.other: Frog identification (en_US)
dc.title: Automatic anuran identification using noise removal and audio activity detection (en_US)
dc.type: info:eu-repo/semantics/Article (en_US)
dc.type: Article (en_US)
dc.identifier.doi: 10.1016/j.eswa.2016.12.019 (en_US)
dc.identifier.scopus: 85006256316
dc.identifier.isi: 000392770900007
dcterms.isPartOf: Expert Systems With Applications
dcterms.source: Expert Systems With Applications [ISSN 0957-4174], v. 72, p. 83-92
dc.contributor.authorscopusid: 24774957200
dc.contributor.authorscopusid: 56501436400
dc.contributor.authorscopusid: 57192418643
dc.contributor.authorscopusid: 6602376272
dc.contributor.authorscopusid: 8509701600
dc.contributor.authorscopusid: 57192415239
dc.contributor.authorscopusid: 57192417590
dc.contributor.authorscopusid: 57192413836
dc.contributor.authorscopusid: 57195934602
dc.identifier.eissn: 1873-6793
dc.description.lastpage: 92 (en_US)
dc.description.firstpage: 83 (en_US)
dc.relation.volume: 72 (en_US)
dc.investigacion: Engineering and Architecture (en_US)
dc.type2: Article (en_US)
dc.identifier.wos: WOS:000392770900007
dc.contributor.daisngid: 418703
dc.contributor.daisngid: 4468790
dc.contributor.daisngid: 33491179
dc.contributor.daisngid: 26163786
dc.contributor.daisngid: 265761
dc.contributor.daisngid: 1625309
dc.contributor.daisngid: 736412
dc.contributor.daisngid: 26597360
dc.contributor.daisngid: 29565815
dc.contributor.daisngid: 19268289
dc.contributor.daisngid: 25360703
dc.contributor.daisngid: 15945
dc.identifier.investigatorRID: N-5977-2014
dc.contributor.wosstandard: WOS:Alonso, JB
dc.contributor.wosstandard: WOS:Cabrera, J
dc.contributor.wosstandard: WOS:Shyamnani, R
dc.contributor.wosstandard: WOS:Travieso, CM
dc.contributor.wosstandard: WOS:Bolanos, F
dc.contributor.wosstandard: WOS:Garcia, A
dc.contributor.wosstandard: WOS:Villegas, A
dc.contributor.wosstandard: WOS:Wainwright, M
dc.date.coverdate: April 2017 (en_US)
dc.identifier.ulpgc: es
dc.description.sjr: 1.271
dc.description.jcr: 3.768
dc.description.sjrq: Q1
dc.description.jcrq: Q1
dc.description.scie: SCIE
item.grantfulltext: none
item.fulltext: No full text
crisitem.author.dept: GIR IDeTIC: División de Procesado Digital de Señales
crisitem.author.dept: IU para el Desarrollo Tecnológico y la Innovación
crisitem.author.dept: Departamento de Señales y Comunicaciones
crisitem.author.dept: GIR IDeTIC: División de Procesado Digital de Señales
crisitem.author.dept: IU para el Desarrollo Tecnológico y la Innovación
crisitem.author.dept: Departamento de Señales y Comunicaciones
crisitem.author.orcid: 0000-0002-7866-585X
crisitem.author.orcid: 0000-0002-4621-2768
crisitem.author.parentorg: IU para el Desarrollo Tecnológico y la Innovación
crisitem.author.parentorg: IU para el Desarrollo Tecnológico y la Innovación
crisitem.author.fullName: Alonso Hernández, Jesús Bernardino
crisitem.author.fullName: Travieso González, Carlos Manuel
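The noise-removal step named in the abstract above (Spectral Subtraction) can be illustrated with a minimal NumPy sketch: estimate a noise magnitude spectrum from leading noise-only frames, subtract it from each frame's magnitude spectrum, and floor the result at zero. This is a generic illustration, not the paper's implementation; the MMSE-STSA estimator and the MFCC/GMM classifier are omitted, and all names and parameters here are illustrative.

```python
import numpy as np

def spectral_subtraction(signal, frame_len=512, noise_frames=10):
    """Suppress stationary noise by subtracting a noise magnitude
    spectrum estimated from the first `noise_frames` frames."""
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    spectra = np.fft.rfft(frames * np.hanning(frame_len), axis=1)
    mag, phase = np.abs(spectra), np.angle(spectra)
    noise_mag = mag[:noise_frames].mean(axis=0)   # noise spectrum estimate
    clean_mag = np.maximum(mag - noise_mag, 0.0)  # subtract, floor at zero
    clean = np.fft.irfft(clean_mag * np.exp(1j * phase), n=frame_len, axis=1)
    return clean.reshape(-1)

# Toy example: a tone buried in white noise, with a leading noise-only region.
rng = np.random.default_rng(0)
fs, frame_len = 16000, 512
noise = 0.05 * rng.standard_normal(fs)
tone = np.sin(2 * np.pi * 1000 * np.arange(fs) / fs)
lead = 10 * frame_len
tone[:lead] = 0.0                                 # first 10 frames: noise only
noisy = noise + tone
denoised = spectral_subtraction(noisy, frame_len, noise_frames=10)

# Energy in the noise-only region should drop after subtraction.
print(np.sum(denoised[:lead] ** 2) < np.sum(noisy[:lead] ** 2))
```

In a full pipeline along the lines the abstract describes, a step like this would precede segmentation and MFCC feature extraction, with per-species GMMs scoring the resulting features.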
Collection: Articles
Items in ULPGC accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.