Contrastive learning for knowledge distillation-based anomaly detection

In this work, we analyze two recent knowledge distillation-based anomaly detection methods and propose a solution to their shared limitation. Knowledge distillation-based methods have recently attracted attention in anomaly detection for their strong detection performance. Multiresolution Knowledge Distillation for Anomaly Detection (MKD) and Anomaly Detection via Reverse Distillation from One-Class Embedding (RD4AD) are representative knowledge distillation-based anomaly detection techniques. We found that these two state-of-the-art techniques have difficulty generalizing to input data with large diversity. To address this problem, we propose residual contrastive learning (RCL), a contrastive learning technique that can be applied when anomalous data is available in the training set. RCL operates on the residuals between the feature vectors of the teacher and student models used in the knowledge distillation-based anomaly detection model. We show that RCL enhances knowledge distillation-based anomaly detection methods across various datasets, and that our method outperforms previous state-of-the-art supervised anomaly detection methods.
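The abstract describes RCL only at a high level. As a rough illustration of the idea, below is a minimal PyTorch sketch of one plausible reading: a supervised contrastive (InfoNCE-style) loss computed over teacher-student feature residuals, with anomaly labels defining the positive pairs. The function name, temperature, normalization, and pairing rule are assumptions made for this sketch, not the thesis's actual formulation.

```python
import torch
import torch.nn.functional as F

def residual_contrastive_loss(teacher_feats, student_feats, labels, temperature=0.1):
    """Hypothetical sketch of a residual contrastive loss.

    teacher_feats, student_feats: (B, D) feature vectors from the two models.
    labels: (B,) binary labels, 0 = normal, 1 = anomalous.
    """
    # Residuals between teacher and student features, L2-normalized.
    residuals = F.normalize(teacher_feats - student_feats, dim=1)

    # Pairwise cosine similarities scaled by temperature: (B, B).
    sim = residuals @ residuals.t() / temperature

    # Exclude self-similarity with a large negative value on the diagonal.
    B = labels.size(0)
    eye = torch.eye(B, dtype=torch.bool, device=sim.device)
    sim = sim.masked_fill(eye, -1e9)

    # Positive pairs share a label (normal-normal or anomaly-anomaly).
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye

    # InfoNCE-style objective: log-softmax over all other samples,
    # averaged over each sample's positive pairs.
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(log_prob.masked_fill(~pos_mask, 0.0)).sum(dim=1) / pos_counts
    return loss.mean()

# Example usage with random stand-in features and binary labels.
if __name__ == "__main__":
    t = torch.randn(16, 128)        # teacher feature vectors
    s = torch.randn(16, 128)        # student feature vectors
    y = torch.randint(0, 2, (16,))  # 0 = normal, 1 = anomalous
    print(residual_contrastive_loss(t, s, y).item())
```

Under this reading, residuals of same-class samples are pulled together while normal and anomalous residuals are pushed apart, which would sharpen the teacher-student discrepancy that the distillation-based detector scores at test time.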
Advisors
Kim, Taekyun (김태균)
Description
KAIST: School of Computing
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2023
Identifier
325007
Language
eng
Description

Master's thesis - KAIST: School of Computing, 2023.2, [iii, 19 p.]

Keywords

Anomaly detection; Knowledge distillation-based anomaly detection; Contrastive learning

URI
http://hdl.handle.net/10203/309521
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1032971&flag=dissertation
Appears in Collection
CS-Theses_Master (Master's Theses)
Files in This Item
There are no files associated with this item.
