A framework for probabilistic combination of multiple classifiers at an abstract level

Cited 16 times in Web of Science; cited 18 times in Scopus
  • Hits: 332
  • Downloads: 0
DC Field: Value (Language)
dc.contributor.author: Kang, HJ (ko)
dc.contributor.author: Kim, KW (ko)
dc.contributor.author: Kim, JinHyung (ko)
dc.date.accessioned: 2013-03-03T05:58:56Z
dc.date.available: 2013-03-03T05:58:56Z
dc.date.created: 2012-02-06
dc.date.issued: 1997-12
dc.identifier.citation: ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, v.10, no.4, pp.379 - 385
dc.identifier.issn: 0952-1976
dc.identifier.uri: http://hdl.handle.net/10203/77527
dc.description.abstract: In previous approaches, the combination of multiple classifiers depends heavily on one of three classification results: measurement scores (measurement level), rankings (rank level), or the top choice (abstract level). For a more general combination of multiple classifiers, it is desirable that combination methods be developed at the abstract level. In combining multiple classifiers at this level, most studies have assumed that classifiers behave independently. Such an assumption degrades and biases the classification performance when highly dependent classifiers are added. In order to overcome such weaknesses, it should be possible to combine multiple classifiers in a probabilistic framework, using a Bayesian formalism. A probabilistic combination of the multiple decisions of K classifiers requires a (K + 1)st-order probability distribution. However, it is well known that such a distribution becomes unmanageable to store and estimate, even for small K. In this paper, a framework is proposed to optimally identify a product set of kth-order dependencies, where 1 ≤ k ≤ K, for the product approximation of the (K + 1)st-order probability distribution from training samples, and to probabilistically combine multiple decisions using the identified product set and the Bayesian formalism. This framework was tested and evaluated using a standardized CENPARMI database. The results showed superior performance over other combination methods. (C) 1997 Published by Elsevier Science Ltd. All rights reserved.
dc.language: English
dc.publisher: PERGAMON-ELSEVIER SCIENCE LTD
dc.title: A framework for probabilistic combination of multiple classifiers at an abstract level
dc.type: Article
dc.identifier.wosid: A1997YE66500006
dc.identifier.scopusid: 2-s2.0-0031198818
dc.type.rims: ART
dc.citation.volume: 10
dc.citation.issue: 4
dc.citation.beginningpage: 379
dc.citation.endingpage: 385
dc.citation.publicationname: ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE
dc.identifier.doi: 10.1016/S0952-1976(97)00020-1
dc.contributor.localauthor: Kim, JinHyung
dc.contributor.nonIdAuthor: Kang, HJ
dc.contributor.nonIdAuthor: Kim, KW
dc.type.journalArticle: Article
dc.subject.keywordAuthor: probabilistic combination
dc.subject.keywordAuthor: combining multiple classifiers
dc.subject.keywordAuthor: kth-order dependency
dc.subject.keywordAuthor: high-order probability distributions
dc.subject.keywordAuthor: product approximation
dc.subject.keywordAuthor: Bayesian formalism
dc.subject.keywordPlus: RECOGNITION
Appears in Collection
CS-Journal Papers(저널논문)
Files in This Item
There are no files associated with this item.
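The abstract describes combining the top-choice decisions of K classifiers in a Bayesian framework, using a product approximation of a high-order probability distribution. As an illustration only, the following is a minimal Python sketch of the simplest case the paper generalizes: abstract-level combination under the full-independence (first-order) assumption, where the posterior of each class is proportional to its prior times the product of each classifier's confusion probabilities. All class names, data, and function names here are hypothetical, not from the paper.

```python
from collections import Counter

# Hypothetical toy setup: classifiers are observed only through their
# top-choice decisions (abstract level), and per-classifier confusion
# probabilities P(decision | true class) are estimated from training data.
CLASSES = ["a", "b", "c"]

def estimate_confusions(true_labels, decisions, smoothing=1.0):
    """Estimate P(decision | true class) for one classifier from
    training samples, with additive smoothing for unseen pairs."""
    counts = Counter(zip(true_labels, decisions))
    totals = Counter(true_labels)
    return {(t, d): (counts[(t, d)] + smoothing)
                    / (totals[t] + smoothing * len(CLASSES))
            for t in CLASSES for d in CLASSES}

def combine(decisions, confusions, prior):
    """Abstract-level Bayesian combination under independence:
    posterior(c) is proportional to P(c) * prod_i P(d_i | c)."""
    scores = {}
    for c in CLASSES:
        p = prior[c]
        for d, conf in zip(decisions, confusions):
            p *= conf[(c, d)]
        scores[c] = p
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}
```

The paper's contribution replaces the independence assumption above with an optimally identified product set of kth-order dependencies (1 ≤ k ≤ K), so that highly dependent classifiers no longer bias the combined decision.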
This item is cited by other documents in WoS.
