Distribution Aware Active Learning via Gaussian Mixtures

Abstract
In this paper, we propose a distribution-aware active learning strategy that captures and mitigates the distribution discrepancy between the labeled and unlabeled sets in order to reduce overfitting. Leveraging Gaussian mixture models (GMMs) and the Wasserstein distance, we first design a distribution-aware training strategy that improves model performance. We then introduce a hybrid informativeness metric for active learning that considers likelihood-based and model-based information simultaneously. Experimental results on four datasets demonstrate the effectiveness of our method against existing active learning baselines.
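The paper itself gives the full method; as a rough, non-authoritative sketch of the kind of pipeline the abstract describes, the example below fits Gaussian mixtures to labeled and unlabeled feature pools, uses a per-dimension 1-D Wasserstein distance as a crude proxy for the discrepancy between them, and ranks unlabeled points with a hybrid score that mixes GMM log-likelihood with predictive entropy. The toy feature embeddings, the stand-in classifier probabilities, the normalization, and the mixing weight `alpha` are all assumptions made for this illustration, not the authors' definitions.

```python
# Illustrative sketch only: the paper's exact formulation is not reproduced here.
import numpy as np
from sklearn.mixture import GaussianMixture
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)

# Toy feature embeddings for a labeled pool and a (larger) unlabeled pool.
X_labeled = rng.normal(loc=0.0, scale=1.0, size=(200, 8))
X_unlabeled = rng.normal(loc=0.5, scale=1.2, size=(2000, 8))

# Fit a GMM on each pool to model its feature distribution.
gmm_labeled = GaussianMixture(n_components=5, random_state=0).fit(X_labeled)
gmm_unlabeled = GaussianMixture(n_components=5, random_state=0).fit(X_unlabeled)

# Crude proxy for the labeled/unlabeled discrepancy: average per-dimension
# 1-D Wasserstein distance between samples drawn from the two GMMs
# (the paper's distance term is presumably defined differently).
s_l, _ = gmm_labeled.sample(1000)
s_u, _ = gmm_unlabeled.sample(1000)
discrepancy = np.mean([wasserstein_distance(s_l[:, d], s_u[:, d])
                       for d in range(s_l.shape[1])])
print(f"approx. labeled/unlabeled discrepancy: {discrepancy:.3f}")

# Hybrid informativeness: a likelihood-based term (points poorly covered by the
# labeled-pool GMM) combined with a model-based term (predictive entropy from
# some classifier -- faked here with random probabilities).
log_lik = gmm_labeled.score_samples(X_unlabeled)            # higher = well covered
likelihood_term = -log_lik                                  # higher = more novel
probs = rng.dirichlet(np.ones(10), size=len(X_unlabeled))   # stand-in classifier output
entropy_term = -np.sum(probs * np.log(probs + 1e-12), axis=1)

def normalize(x):
    return (x - x.min()) / (x.max() - x.min() + 1e-12)

alpha = 0.5  # assumed mixing weight between the two terms
score = alpha * normalize(likelihood_term) + (1 - alpha) * normalize(entropy_term)

# Query the top-k highest-scoring unlabeled points for annotation.
k = 16
query_idx = np.argsort(score)[-k:]
print("indices selected for labeling:", query_idx)
```

In an actual active-learning loop, the selected indices would be sent to an annotator, moved into the labeled pool, and the classifier and GMMs refit before the next query round.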
Publisher
ICLR
Issue Date
2023-05-05
Language
English
Citation
The International Conference on Learning Representations, ICLR 2023
URI
http://hdl.handle.net/10203/315805
Appears in Collection
EE-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.
