Robust Perturbation for Visual Explanation: Cross-checking Mask Optimization to Avoid Class Distortion

Abstract

Along with the outstanding performance of deep neural networks (DNNs), considerable research effort has been devoted to understanding the decisions of DNN architectures. In the computer vision domain, visualizing an attribution map is one of the most intuitive and understandable ways to achieve human-level interpretation. Among these approaches, perturbation-based visualization explains the "black box" behavior of a given network by optimizing perturbation masks that alter the network's prediction for the target class the most. However, existing perturbation methods can cause unexpected changes to the network's predictions once the perturbation mask is applied to the input image, resulting in a loss of robustness and fidelity of the perturbation mechanism. In this paper, we define such unexpected changes of the network prediction during the perturbation process as class distortion. To address this, we propose a novel visual interpretation framework, Robust Perturbation, which is robust against class distortion during mask optimization. With a new cross-checking mask optimization strategy, the proposed framework perturbs the target prediction of the network while upholding the non-target predictions, providing more reliable and accurate visual explanations. We evaluate our framework on three public datasets through extensive experiments and propose a new metric for evaluating class distortion. Both quantitative and qualitative experiments show that tackling the class distortion problem enhances the quality and fidelity of the visual explanation compared with existing perturbation-based methods.
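The cross-checking idea described in the abstract can be illustrated with a small mask-optimization loop. The PyTorch sketch below is not the paper's implementation: the specific loss terms, their weights (lam_preserve, lam_area), the KL-divergence preservation term, the 16x16 mask resolution, and the zero-image baseline are all assumptions chosen for illustration. It shows the general pattern of pushing down the target-class probability while holding the renormalized non-target distribution close to that of the unperturbed image.

```python
import torch
import torch.nn.functional as F

def cross_check_mask(model, image, target, steps=300, lr=0.1,
                     lam_preserve=1.0, lam_area=0.05):
    """Sketch of perturbation-mask optimization with a cross-checking term:
    suppress the target-class score while keeping the non-target part of the
    prediction close to the original (hypothetical loss formulation)."""
    model.eval()
    with torch.no_grad():
        base = F.softmax(model(image), dim=1)          # original prediction

    # Reference distribution over non-target classes (target removed, renormalized).
    keep = torch.ones_like(base, dtype=torch.bool)
    keep[:, target] = False
    base_nt = base[keep].view(base.size(0), -1)
    base_nt = base_nt / base_nt.sum(dim=1, keepdim=True)

    # Low-resolution mask upsampled to image size acts as a smoothness prior.
    m = torch.zeros(1, 1, 16, 16, requires_grad=True)
    opt = torch.optim.Adam([m], lr=lr)
    baseline = torch.zeros_like(image)                 # e.g. a zero or blurred image

    for _ in range(steps):
        mask = torch.sigmoid(m)
        mask_up = F.interpolate(mask, size=image.shape[-2:],
                                mode="bilinear", align_corners=False)
        perturbed = image * (1 - mask_up) + baseline * mask_up
        prob = F.softmax(model(perturbed), dim=1)

        # 1) Deletion term: push down the target-class probability.
        loss_del = prob[:, target].mean()
        # 2) Cross-checking term: keep non-target predictions unchanged.
        p_nt = prob[keep].view(prob.size(0), -1)
        p_nt = p_nt / p_nt.sum(dim=1, keepdim=True)
        loss_keep = F.kl_div(p_nt.log(), base_nt, reduction="batchmean")
        # 3) Area regularizer: prefer small, focused masks.
        loss_area = mask.mean()

        loss = loss_del + lam_preserve * loss_keep + lam_area * loss_area
        opt.zero_grad()
        loss.backward()
        opt.step()

    return torch.sigmoid(m).detach()
```

The preservation term on the renormalized non-target probabilities is what distinguishes this from a plain deletion-mask objective: the mask is penalized whenever suppressing the target class also distorts the rest of the prediction, which is the class-distortion behavior the paper aims to avoid.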
Publisher
IEEE (Institute of Electrical and Electronics Engineers)
Issue Date
2022-01
Language
English
Article Type
Article
Citation

IEEE Transactions on Image Processing, vol. 31, pp. 301-313

ISSN
1057-7149
DOI
10.1109/TIP.2021.3130526
URI
http://hdl.handle.net/10203/291320
Appears in Collection
EE-Journal Papers (Journal Papers)