Optimal approximation of discrete probability distribution with kth-order dependency and its application to combining multiple classifiers

Cited 29 times in Web of Science; cited 34 times in Scopus
To probabilistically combine the decisions of K classifiers obtained from samples, a (K + 1)st-order probability distribution is needed. It is well known that storing and estimating such a distribution is exponentially complex and becomes unmanageable even for small K. Chow and Liu (1968), as well as Lewis (1959), proposed approximating an nth-order probability distribution by a product of second-order distributions under a first-order tree dependency. However, we often face cases in which a decision depends on more than two other classifiers; in such cases, first-order dependency is not sufficient to estimate a high-order distribution properly. In this paper, a new method is proposed to optimally approximate a high-order distribution with a product of kth-order dependencies, i.e., (k + 1)st-order distributions, where 1 ≤ k ≤ K. The authors also propose a way to identify high-order dependencies from training samples. The superior performance of the new method is demonstrated by experiments on the recognition of standardized CENPARMI handwritten numerals and KAIST on-line handwritten numerals. (C) 1997 Published by Elsevier Science B.V.
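The abstract's baseline (the k = 1 case) is the Chow-Liu construction: estimate pairwise mutual information between variables, build a maximum spanning tree over those weights, and approximate the joint distribution as a product of second-order conditionals along tree edges. The sketch below, a minimal illustration assuming discrete variables given as tuples of sample values (all function names here are hypothetical, not from the paper), shows that first-order dependency case; the paper generalizes the dependency structure to kth order.

```python
import math
from collections import Counter

def mutual_information(samples, i, j):
    """Empirical mutual information between variables i and j."""
    n = len(samples)
    ci = Counter(s[i] for s in samples)
    cj = Counter(s[j] for s in samples)
    cij = Counter((s[i], s[j]) for s in samples)
    mi = 0.0
    for (a, b), c in cij.items():
        p_ab = c / n
        mi += p_ab * math.log(p_ab / ((ci[a] / n) * (cj[b] / n)))
    return mi

def chow_liu_tree(samples, num_vars):
    """Maximum spanning tree over mutual-information weights (Prim's).

    Returns a list of directed edges (parent, child), rooted at variable 0.
    """
    in_tree = {0}
    edges = []
    while len(in_tree) < num_vars:
        _, u, v = max((mutual_information(samples, u, v), u, v)
                      for u in in_tree
                      for v in range(num_vars) if v not in in_tree)
        edges.append((u, v))
        in_tree.add(v)
    return edges

def tree_probability(samples, edges, x):
    """Approximate P(x) as P(x_root) * prod over edges of P(x_child | x_parent)."""
    n = len(samples)
    p = sum(1 for s in samples if s[0] == x[0]) / n
    for u, v in edges:
        joint = sum(1 for s in samples if s[u] == x[u] and s[v] == x[v])
        marg = sum(1 for s in samples if s[u] == x[u])
        p *= joint / marg if marg else 0.0
    return p
```

For example, with samples where variable 1 copies variable 0 and variable 2 is independent, the tree links 0 and 1 (high mutual information) and the product of second-order terms reproduces the empirical joint. Storing the tree approximation needs only O(K) pairwise tables instead of one exponential-size joint table, which is exactly the saving the paper extends to (k + 1)st-order factors.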
Publisher
ELSEVIER SCIENCE BV
Issue Date
1997
Language
English
Article Type
Article
Keywords

RECOGNITION

Citation

PATTERN RECOGNITION LETTERS, v.18, pp.515 - 523

ISSN
0167-8655
DOI
10.1016/S0167-8655(97)00041-X
URI
http://hdl.handle.net/10203/10353
Appears in Collection
CS-Journal Papers (Journal Papers)