Energy-based contrastive learning of visual representations

Abstract
Contrastive learning is a method of learning visual representations by training Deep Neural Networks (DNNs) to increase the similarity between representations of positive pairs (transformations of the same image) and to reduce the similarity between representations of negative pairs (transformations of different images). Here we explore Energy-Based Contrastive Learning (EBCLR), which leverages the power of generative learning by combining contrastive learning with Energy-Based Models (EBMs). EBCLR can be theoretically interpreted as learning the joint distribution of positive pairs, and it shows promising results on small- and medium-scale datasets such as MNIST, Fashion-MNIST, CIFAR10, and CIFAR100. Specifically, we find that EBCLR demonstrates a 4× to 20× acceleration over SimCLR and MoCo v2 in terms of training epochs. Furthermore, in contrast to SimCLR, we observe that EBCLR achieves nearly the same performance with 254 negative pairs per positive pair (batch size 128) as with 30 negative pairs per positive pair (batch size 16), demonstrating the robustness of EBCLR to small numbers of negative pairs. Hence, EBCLR provides a novel avenue for improving contrastive learning methods, which usually require large datasets with a significant number of negative pairs per iteration to achieve reasonable performance on downstream tasks.
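For intuition, below is a minimal sketch of the standard SimCLR-style contrastive loss over a batch of positive pairs that the abstract describes. It is written in PyTorch as an assumption (the thesis's framework is not stated here), and the function name contrastive_loss and the temperature value are illustrative. It is not the thesis's exact EBCLR objective, which additionally interprets similarities as negative energies and adds a generative EBM term to learn the joint distribution of positive pairs.

import torch
import torch.nn.functional as F

def contrastive_loss(z1, z2, temperature=0.5):
    # z1, z2: (N, D) representations of two augmented views of the same N images.
    # Each view's positive is its counterpart; the remaining 2N - 2 views in the
    # batch act as negatives (254 for N = 128, 30 for N = 16, as in the abstract).
    n = z1.shape[0]
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, D), unit-norm
    sim = z @ z.t() / temperature                        # pairwise cosine similarities
    sim.fill_diagonal_(float('-inf'))                    # exclude self-similarity
    # sample i and sample i + n are a positive pair
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])
    return F.cross_entropy(sim, targets)

Usage: for an encoder f and two random augmentations t1, t2 of an image batch x, the loss is contrastive_loss(f(t1(x)), f(t2(x))).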
Advisors
Ye, Jong Chul (예종철)
Description
Korea Advanced Institute of Science and Technology (KAIST): Department of Mathematical Sciences
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2023
Identifier
325007
Language
eng
Description
Thesis (Master's) - Korea Advanced Institute of Science and Technology (KAIST): Department of Mathematical Sciences, 2023.2, [iv, 24 p.]

Keywords
EBM; Contrastive Learning; Deep Learning
URI
http://hdl.handle.net/10203/308919
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1032793&flag=dissertation
Appears in Collection
MA-Theses_Master(석사논문)
Files in This Item
There are no files associated with this item.
