Please use this identifier to cite or link to this item: http://hdl.handle.net/10553/46166
Title: Improving generalization ability of HMM/NNs based classifiers
Authors: Ferrer, Miguel A. 
Alonso, Itziar G.
Travieso, Carlos M. 
Figueiras-Vidal, Anibal R.
UNESCO Classification: 3307 Electronic technology
Keywords: Handwriting recognition
Hidden Markov models
Speech recognition
Artificial neural networks
Standards
Issue Date: 2000
Journal: European Signal Processing Conference
Conference: 2000 10th European Signal Processing Conference, EUSIPCO 2000 
Abstract: Standard Hidden Markov Models (HMMs) have proved to be a very useful tool for temporal sequence pattern recognition, although they exhibit poor discriminative power. Conversely, Neural Networks (NNs) are recognized as powerful tools for classification tasks, but they are less effective than HMMs at modelling temporal variation. To obtain the advantages of both HMMs and NNs, different hybrid structures have been proposed. In this paper we propose an HMM/NN hybrid in which the NN classifies from the HMM scores. As the NN we use a committee of networks, comprising a Multilayer Perceptron (MLP, a global classifier) and Radial Basis Function (RBF, a local classifier) nets, which draw conceptually different interclass borders. The combining algorithm is the TopNSeg scoring method, which sums the normalized outputs of the top N ranked networks for each class. Tests of the above architecture on speech recognition, handwritten numeral classification, and signature verification problems show that it works significantly better than the isolated networks.
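The TopNSeg combining rule described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the paper does not specify the normalization or the ranking criterion, so this sketch assumes each network's class scores are normalized to sum to one and networks are ranked by their highest normalized output.

```python
# Sketch of the TopNSeg scoring method: normalize each committee
# network's outputs, rank the networks, and sum the normalized
# outputs of the top N networks for each class.

def top_n_seg(outputs, n):
    """Combine committee outputs and return the winning class index.

    outputs: list of per-network score vectors (one score per class).
    n: number of top-ranked networks whose outputs are summed.
    """
    # Normalize each network's scores so they sum to one
    # (assumed normalization; the paper does not detail it).
    normalized = []
    for scores in outputs:
        total = sum(scores)
        normalized.append([s / total for s in scores])

    # Rank networks by their most confident normalized output and
    # keep the top n (the ranking criterion is our assumption).
    ranked = sorted(normalized, key=max, reverse=True)[:n]

    # Sum the selected networks' normalized outputs per class and
    # pick the class with the largest combined score.
    num_classes = len(outputs[0])
    combined = [sum(net[c] for net in ranked) for c in range(num_classes)]
    return combined.index(max(combined))
```

With a committee of, say, an MLP and an RBF net, each contributing a normalized score vector over the classes, `top_n_seg([mlp_scores, rbf_scores], n=2)` sums both vectors class-by-class and returns the class with the largest total.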
URI: http://hdl.handle.net/10553/46166
ISSN: 2219-5491
Source: European Signal Processing Conference[ISSN 2219-5491],v. 2015-March (7075437)
Appears in Collections: Conference proceedings (Actas de congresos)
Adobe PDF (89,8 kB)
Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.