Your lottery ticket is damaged: Towards all-alive pruning for extremely sparse networks

Network pruning has been widely adopted to reduce computational cost and memory consumption on low-resource devices. Recent studies show that the lottery ticket hypothesis achieves high accuracy at high compression ratios (i.e., 80-90% of the parameters in the original network are removed). Nevertheless, finding well-trainable networks with extremely sparse parameters (i.e., < 10% of the parameters remaining) remains challenging, a difficulty commonly attributed to a lack of model capacity. This paper revisits the training process of existing pruning methods and observes that they produce dead connections, which do not contribute to model capacity. To address this, we propose a novel pruning method, all-alive pruning (AAP), which produces pruned networks that contain only trainable weights and no dead connections. Notably, AAP is broadly applicable to various pruning methods and model architectures. We demonstrate that AAP, equipped with existing pruning methods (e.g., iterative pruning, one-shot pruning, and dynamic pruning), consistently improves the accuracy of the original methods at high compression ratios on various image- and language-based tasks.
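The abstract's key observation, dead connections, can be illustrated with a small mask-connectivity check. The sketch below is a minimal illustration, assuming binary pruning masks on two consecutive fully connected layers; the function name and setup are hypothetical and do not come from the paper.

```python
import torch

def count_dead_connections(mask_in: torch.Tensor, mask_out: torch.Tensor) -> int:
    """Count surviving weights that can no longer affect the network output.

    mask_in  -- binary mask of shape (hidden, in_features) for layer l
    mask_out -- binary mask of shape (out_features, hidden) for layer l+1
    """
    # A hidden unit is dead if every incoming weight is pruned (it can
    # only emit a constant) or every outgoing weight is pruned (its
    # activation is never consumed downstream).
    has_input = mask_in.sum(dim=1) > 0
    has_output = mask_out.sum(dim=0) > 0
    dead_units = ~(has_input & has_output)

    # Surviving weights attached to a dead unit receive no useful
    # gradient during retraining: these are the dead connections.
    dead_in = mask_in[dead_units].sum()
    dead_out = mask_out[:, dead_units].sum()
    return int(dead_in + dead_out)

# Example: two random masks at ~95% sparsity often strand many weights.
torch.manual_seed(0)
m1 = (torch.rand(512, 784) < 0.05).float()
m2 = (torch.rand(10, 512) < 0.05).float()
print(count_dead_connections(m1, m2))
```

At extreme sparsity (under 10% of weights remaining), such stranded weights become common, which is precisely the regime in which the paper reports AAP improves on existing pruning methods.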
Publisher
ELSEVIER SCIENCE INC
Issue Date
2023-07
Language
English
Article Type
Article
Citation

INFORMATION SCIENCES, v.634, pp. 608-620

ISSN
0020-0255
DOI
10.1016/j.ins.2023.03.122
URI
http://hdl.handle.net/10203/306415
Appears in Collection
CS-Journal Papers (Journal Papers); AI-Journal Papers (Journal Papers)