Please use this identifier to cite or link to this item: http://hdl.handle.net/10553/43951
Title: Automatic Identification of Botanical Samples of leaves using Computer Vision
Authors: Yadav, Anjali
Dutta, Malay Kishore
Travieso, Carlos M. 
Alonso, Jesus B. 
UNESCO Classification: 3307 Electronic technology (Tecnología electrónica)
Keywords: Support vector machines, Feature extraction, Image segmentation, Classification algorithms, Shape, Gray-scale, Image Processing, Features extraction, GLCM features, Classification, Multi-SVM
Issue Date: 2017
Journal: 2017 International Work Conference on Bio-Inspired Intelligence: Intelligent Systems for Biodiversity Conservation, IWOBI 2017 - Proceedings
Conference: 5th IEEE International Work Conference on Bio-Inspired Intelligence, IWOBI 2017 
Abstract: The leaf is one of the many characteristics by which a plant can be uniquely identified. Many plant species are on the verge of extinction and can be protected if they are identified correctly. The proposed method describes an automated image processing system for leaf classification. The leaf pixels are segmented from the image and treated as the region of interest (ROI). A set of geometric, textural and statistical features is extracted from each input sample and classified with a multi-class SVM classifier. The proposed system achieves an accuracy of 97% with a sensitivity of 98.32%. The results are encouraging for a dataset of 10 different leaf classes and could support the development of a real-time application.
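The abstract outlines a segment-then-classify pipeline: isolate the leaf ROI, compute geometric, GLCM texture and statistical features, and feed them to a multi-class SVM. The sketch below illustrates one plausible realization of that pipeline in Python with scikit-image and scikit-learn; the Otsu thresholding step, the exact feature set, and the SVM parameters are assumptions for illustration, not the configuration published in the paper.

    # Minimal sketch of a leaf-classification pipeline of the kind described
    # in the abstract (assumed details, not the authors' exact method).
    import numpy as np
    from skimage.filters import threshold_otsu
    from skimage.measure import label, regionprops
    from skimage.feature import graycomatrix, graycoprops
    from sklearn.svm import SVC

    def extract_features(gray):
        """Segment the leaf (ROI) and return geometric + GLCM + statistical features.

        `gray` is assumed to be a uint8 grayscale image with the leaf darker
        than the background.
        """
        # Segmentation: Otsu threshold, keep the largest connected component as the ROI.
        mask = gray < threshold_otsu(gray)
        region = max(regionprops(label(mask)), key=lambda r: r.area)

        # Geometric / shape features of the ROI.
        geom = [region.area, region.perimeter, region.eccentricity,
                region.solidity, region.extent]

        # GLCM texture features computed on the ROI bounding box.
        minr, minc, maxr, maxc = region.bbox
        roi = gray[minr:maxr, minc:maxc]
        glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        texture = [graycoprops(glcm, p).mean()
                   for p in ("contrast", "homogeneity", "energy", "correlation")]

        # Simple gray-scale statistical features.
        stats = [roi.mean(), roi.std()]
        return np.array(geom + texture + stats)

    # Multi-class SVM (one-vs-rest) trained on feature vectors from labeled leaves.
    # X_train / y_train would come from the 10-class leaf dataset used in the paper.
    # clf = SVC(kernel="rbf", decision_function_shape="ovr").fit(X_train, y_train)
    # prediction = clf.predict([extract_features(new_leaf_image)])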
URI: http://hdl.handle.net/10553/43951
ISBN: 9781538608500
DOI: 10.1109/IWOBI.2017.7985531
Source: 2017 International Work Conference on Bio-Inspired Intelligence: Intelligent Systems for Biodiversity Conservation, IWOBI 2017 - Proceedings (7985531)
Appears in Collections: Actas de congresos (conference proceedings)
Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.