Calibration of Few-Shot Classification Tasks: Mitigating Misconfidence From Distribution Mismatch

Cited 1 time in Web of Science; cited 0 times in Scopus
DC Field | Value | Language
dc.contributor.author | Kim, Sungnyun | ko
dc.contributor.author | YUN, SE-YOUNG | ko
dc.date.accessioned | 2022-06-14T05:00:09Z | -
dc.date.available | 2022-06-14T05:00:09Z | -
dc.date.created | 2022-06-13 | -
dc.date.issued | 2022 | -
dc.identifier.citation | IEEE ACCESS, v.10, pp.53894 - 53908 | -
dc.identifier.issn | 2169-3536 | -
dc.identifier.uri | http://hdl.handle.net/10203/296933 | -
dc.description.abstract | As many meta-learning algorithms improve performance on few-shot classification problems in practical applications, accurate prediction of uncertainty is considered essential. In meta-training, the algorithm treats all generated tasks equally and updates the model to perform well on the training tasks. During training, some tasks may make it difficult for the model to infer the query examples from the support examples, especially when there is a large mismatch between the support set and the query set. This distribution mismatch causes the model to be incorrectly confident, which leads to a calibration problem. In this study, we propose a novel meta-training method that measures the distribution mismatch and enables the model to predict with more precise confidence. Moreover, our method is algorithm-agnostic and can readily be extended to a range of meta-learning models. Through extensive experiments, including dataset shift, we show that our training strategy prevents the model from becoming indiscriminately confident and thereby helps it produce calibrated classification results without loss of accuracy. (An illustrative sketch of mismatch-aware episode weighting appears after the metadata fields below.) | -
dc.language | English | -
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | -
dc.title | Calibration of Few-Shot Classification Tasks: Mitigating Misconfidence From Distribution Mismatch | -
dc.type | Article | -
dc.identifier.wosid | 000802025000001 | -
dc.identifier.scopusid | 2-s2.0-85130503471 | -
dc.type.rims | ART | -
dc.citation.volume | 10 | -
dc.citation.beginningpage | 53894 | -
dc.citation.endingpage | 53908 | -
dc.citation.publicationname | IEEE ACCESS | -
dc.identifier.doi | 10.1109/ACCESS.2022.3176090 | -
dc.contributor.localauthor | YUN, SE-YOUNG | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | Task analysis | -
dc.subject.keywordAuthor | Calibration | -
dc.subject.keywordAuthor | Training | -
dc.subject.keywordAuthor | Power capacitors | -
dc.subject.keywordAuthor | Uncertainty | -
dc.subject.keywordAuthor | Adaptation models | -
dc.subject.keywordAuthor | Prediction algorithms | -
dc.subject.keywordAuthor | Deep learning | -
dc.subject.keywordAuthor | distribution mismatch | -
dc.subject.keywordAuthor | few-shot image classification | -
dc.subject.keywordAuthor | task calibration | -
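
The abstract above describes measuring the mismatch between a task's support and query sets during episodic meta-training so that ill-posed tasks do not push the model toward indiscriminate confidence. The Python sketch below illustrates one way such mismatch-aware episode weighting could be wired into a prototypical-network-style loss. It is only a sketch under assumptions: the encoder "embed", the prototype-to-query-mean mismatch score, and the exponential down-weighting are hypothetical choices, not the exact procedure of the paper.

import torch
import torch.nn.functional as F

def mismatch_weighted_episode_loss(embed, support_x, support_y, query_x, query_y, n_way):
    # Embed support and query examples with a shared encoder.
    s = embed(support_x)   # [n_support, d]
    q = embed(query_x)     # [n_query, d]

    # Class prototypes from the support set (prototypical-network style);
    # assumes every class 0..n_way-1 appears in both support and query sets.
    protos = torch.stack([s[support_y == c].mean(dim=0) for c in range(n_way)])  # [n_way, d]

    # Query logits: negative squared Euclidean distance to each prototype.
    logits = -torch.cdist(q, protos) ** 2   # [n_query, n_way]
    ce = F.cross_entropy(logits, query_y)

    # Hypothetical support-query mismatch score: average squared distance
    # between each class's support prototype and its query-mean embedding.
    q_means = torch.stack([q[query_y == c].mean(dim=0) for c in range(n_way)])
    mismatch = (protos - q_means).pow(2).sum(dim=-1).mean()

    # Down-weight episodes with large mismatch so they contribute less to the
    # meta-update and do not push the model toward overconfident predictions.
    weight = torch.exp(-mismatch.detach())
    return weight * ce

In a standard episodic training loop, this loss would be accumulated over a batch of sampled tasks before the meta-update; the mismatch score is detached so it only rescales the loss for a given episode rather than being minimized directly.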
Appears in Collection
AI-Journal Papers(저널논문)
Files in This Item
There are no files associated with this item.