Attention-based dropout layer for weakly supervised object localization

Weakly Supervised Object Localization (WSOL) techniques learn object locations using only image-level labels, without location annotations. A common limitation of these techniques is that they cover only the most discriminative part of the object rather than the entire object. To address this problem, we propose an Attention-based Dropout Layer (ADL), which utilizes a self-attention mechanism to process the feature maps of the model. The proposed method is composed of two key components: 1) hiding the most discriminative part from the model to capture the integral extent of the object, and 2) highlighting the informative region to improve the recognition power of the model. Through extensive experiments, we demonstrate that the proposed method effectively improves the accuracy of WSOL, achieving new state-of-the-art localization accuracy on the CUB-200-2011 dataset. We also show that the proposed method incurs much lower parameter and computation overheads than existing techniques.
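The abstract describes two uses of a self-attention map: a drop mask that hides the most discriminative region and an importance map that highlights informative regions. The following is a minimal PyTorch-style sketch of such a layer based only on that description; the channel-wise average pooling used to form the attention map and the drop_rate / threshold hyperparameters are assumptions for illustration, not the authors' released implementation.

import torch
import torch.nn as nn

class ADL(nn.Module):
    # Sketch of an attention-based dropout layer: during training, the
    # self-attention map is turned into either a drop mask (hide the most
    # discriminative region) or an importance map (highlight informative
    # regions), and one of the two is applied to the feature map.
    def __init__(self, drop_rate=0.75, threshold=0.9):
        super().__init__()
        self.drop_rate = drop_rate    # probability of choosing the drop mask (assumed)
        self.threshold = threshold    # fraction of the attention maximum treated as "most discriminative" (assumed)

    def forward(self, x):             # x: (B, C, H, W) feature map
        if not self.training:
            return x                  # the layer is a no-op at inference time

        # Self-attention map via channel-wise average pooling (assumption)
        attention = x.mean(dim=1, keepdim=True)              # (B, 1, H, W)

        # Importance map: highlights informative regions
        importance = torch.sigmoid(attention)

        # Drop mask: zeroes out the most discriminative region
        max_val = attention.amax(dim=(2, 3), keepdim=True)
        drop_mask = (attention < self.threshold * max_val).float()

        # Randomly pick one of the two maps for this forward pass
        mask = drop_mask if torch.rand(1).item() < self.drop_rate else importance

        return x * mask               # apply the chosen map spatially

In a setup like this, the layer would be inserted after intermediate feature maps of a CNN backbone and is active only during training, which is why the sketch returns the input unchanged in evaluation mode.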
Publisher
IEEE Computer Society
Issue Date
2019-06
Language
English
Citation

32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2019, pp. 2214-2223

ISSN
1063-6919
DOI
10.1109/CVPR.2019.00232
URI
http://hdl.handle.net/10203/298143
Appears in Collection
AI-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.
