Style Balancing and Test-Time Style Shifting for Domain Generalization

Recent works on domain generalization have shown great success by generating new feature statistics (or style statistics) during training, which exposes the model to diverse domains or styles. However, existing works suffer from the cross-domain class imbalance problem, which naturally arises in domain generalization settings. Their performance also degrades when the gap between the style statistics of the source and target domains is large (i.e., when the distribution shift is large in the feature-level style space). In this paper, we propose new strategies to improve robustness against potential domain shift. We first propose style balancing, which strategically balances the number of samples for each class across all source domains to improve domain diversity during training. We then propose test-time style shifting, which shifts the style of a test sample with a large style gap to the nearest source domain to improve prediction performance.
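
The abstract above describes both ideas only at a high level; the following is a minimal PyTorch sketch of how they might look in code. The function names, the oversampling scheme, and the AdaIN-style renormalization are illustrative assumptions, not the authors' released implementation.

```python
import torch

def balance_classes(samples_by_domain_class):
    """Style balancing (sketch): oversample with replacement so every
    (domain, class) cell contains the same number of samples, so no
    class is under-represented in any source domain's style pool.

    samples_by_domain_class: dict mapping (domain, cls) -> list of samples.
    """
    target = max(len(v) for v in samples_by_domain_class.values())
    balanced = {}
    for key, samples in samples_by_domain_class.items():
        idx = torch.randint(len(samples), (target,))  # sample with replacement
        balanced[key] = [samples[i] for i in idx]
    return balanced

def shift_style(feat, source_styles, eps=1e-6):
    """Test-time style shifting (sketch): replace the channel-wise
    statistics of a test feature map with those of the nearest source
    domain, in the spirit of AdaIN-style feature renormalization.

    feat:          (C, H, W) feature map of one test sample.
    source_styles: list of per-domain (mu, sigma) pairs, each of shape (C,),
                   e.g. running averages collected during training.
    """
    mu = feat.mean(dim=(1, 2))            # (C,) test-sample style mean
    sigma = feat.std(dim=(1, 2)) + eps    # (C,) test-sample style std
    style = torch.cat([mu, sigma])

    # Choose the source domain whose style statistics are closest in L2.
    dists = torch.stack([torch.norm(style - torch.cat([m, s]))
                         for m, s in source_styles])
    m_src, s_src = source_styles[int(dists.argmin())]

    # Whiten with the test sample's own statistics, then re-color with
    # the nearest source domain's statistics.
    normalized = (feat - mu[:, None, None]) / sigma[:, None, None]
    return normalized * s_src[:, None, None] + m_src[:, None, None]
```

Under this sketch, shift_style would be applied to an intermediate feature map of the trained backbone at inference time; which layer's statistics are matched, and how the per-domain (mu, sigma) pairs are collected during training, are design choices fixed by the paper itself.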
Publisher
ICML
Issue Date
2022-07-29
Language
English
Citation

ICML 2022 Workshop Principles of Distribution Shift (PODS)

URI
http://hdl.handle.net/10203/301173
Appears in Collection
EE-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.
