Attention Masking for Improved Near Out-of-Distribution Image Detection

Detecting near out-of-distribution (OOD) data is important when deploying a deep learning model. The goal of near-OOD detection is to distinguish OOD samples when the distributions of inliers and outliers are similar. However, an input sample containing unexpected information may degrade OOD detection performance in downstream tasks. To address this problem, we propose an algorithm called attention masking, which masks the less-attended parts of a given input to more precisely calculate its Mahalanobis distance from the training distribution. In our experiments, we use a large-scale pre-trained model to measure the performance of our approach on vision OOD benchmark tasks. For instance, on CIFAR-100 vs. CIFAR-10 detection, we improve the AUROC of Mahalanobis distance-based OOD detection from 91.23% to 92.89% using the ViT-Base model. In addition, we measure the performance of the challenging zero-shot OOD detection setting, which uses only the pre-trained weights without fine-tuning on CIFAR-100 or CIFAR-10, and achieve an average AUROC improvement of 7% on CIFAR-100 vs. CIFAR-10 detection.
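
The following is a minimal sketch of how attention masking could be combined with Mahalanobis distance-based OOD scoring, based only on the description in the abstract. It assumes patch tokens and [CLS]-to-patch attention weights have already been extracted from a pre-trained ViT; the function names (fit_class_gaussians, mask_less_attended, mahalanobis_ood_score), the keep_ratio parameter, and the mean-pooling of the retained patches are illustrative assumptions, not the paper's implementation.

import numpy as np

def fit_class_gaussians(train_feats, train_labels):
    """Fit per-class means and a shared (tied) covariance on training features."""
    classes = np.unique(train_labels)
    means = {c: train_feats[train_labels == c].mean(axis=0) for c in classes}
    centered = np.concatenate(
        [train_feats[train_labels == c] - means[c] for c in classes], axis=0
    )
    cov = centered.T @ centered / len(centered)
    precision = np.linalg.pinv(cov)  # pseudo-inverse for numerical stability
    return means, precision

def mask_less_attended(patch_tokens, cls_attention, keep_ratio=0.5):
    """Drop the least-attended patch tokens and mean-pool the rest.

    patch_tokens: (num_patches, dim) token features from a ViT
    cls_attention: (num_patches,) attention of the [CLS] token to each patch
    keep_ratio: fraction of the most-attended patches to keep (assumed hyperparameter)
    """
    k = max(1, int(keep_ratio * len(cls_attention)))
    keep_idx = np.argsort(cls_attention)[-k:]   # indices of the top-k attended patches
    return patch_tokens[keep_idx].mean(axis=0)  # pooled feature of the attended region

def mahalanobis_ood_score(feature, means, precision):
    """OOD score = minimum class-conditional Mahalanobis distance (higher = more OOD)."""
    dists = []
    for mu in means.values():
        diff = feature - mu
        dists.append(float(diff @ precision @ diff))
    return min(dists)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim, n_train = 64, 500
    # Stand-ins for in-distribution training features and labels.
    train_feats = rng.normal(size=(n_train, dim))
    train_labels = rng.integers(0, 10, size=n_train)
    means, precision = fit_class_gaussians(train_feats, train_labels)

    # Stand-ins for a test image's 196 ViT patch tokens and its [CLS] attention.
    patch_tokens = rng.normal(size=(196, dim))
    cls_attention = rng.random(196)
    feat = mask_less_attended(patch_tokens, cls_attention, keep_ratio=0.5)
    print("OOD score:", mahalanobis_ood_score(feat, means, precision))

In this sketch the masking simply removes low-attention patches before pooling, so the Mahalanobis score is computed from the attended region only; how the actual method selects and masks regions, and which layer's attention it uses, is specified in the paper itself.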
Publisher
Institute of Electrical and Electronics Engineers Inc.
Issue Date
2023-02-15
Language
English
Citation
2023 IEEE International Conference on Big Data and Smart Computing, BigComp 2023, pp. 195-202
DOI
10.1109/BigComp57234.2023.00040
URI
http://hdl.handle.net/10203/308687
Appears in Collection
CS-Conference Papers(학술회의논문)
Files in This Item
There are no files associated with this item.
