Your lottery ticket is damaged: Towards all-alive pruning for extremely sparse networks

Cited 2 times in Web of Science; cited 0 times in Scopus
  • Hits : 290
  • Downloads : 0
DC Field | Value | Language
dc.contributor.author | Kim, Daejin | ko
dc.contributor.author | Kim, Min-Soo | ko
dc.contributor.author | Shim, Hyunjung | ko
dc.contributor.author | Lee, Jongwuk | ko
dc.date.accessioned | 2023-05-02T08:00:28Z | -
dc.date.available | 2023-05-02T08:00:28Z | -
dc.date.created | 2023-05-02 | -
dc.date.issued | 2023-07 | -
dc.identifier.citation | INFORMATION SCIENCES, v.634, pp.608 - 620 | -
dc.identifier.issn | 0020-0255 | -
dc.identifier.uri | http://hdl.handle.net/10203/306415 | -
dc.description.abstract | Network pruning has been widely adopted to reduce computational costs and memory consumption on low-resource devices. Recent studies show that the lottery ticket hypothesis achieves high accuracy under high compression ratios (i.e., 80-90% of the parameters in the original network are removed). Nevertheless, finding well-trainable networks with sparse parameters (i.e., < 10% of the parameters remaining) remains challenging, a difficulty commonly attributed to a lack of model capacity. This paper revisits the training process of existing pruning methods and observes that they produce dead connections, which do not contribute to model capacity. To address this, we propose a novel pruning method, all-alive pruning (AAP), which produces pruned networks containing only trainable weights and no dead connections. Notably, AAP is broadly applicable to various pruning methods and model architectures. We demonstrate that AAP, combined with existing pruning methods (e.g., iterative pruning, one-shot pruning, and dynamic pruning), consistently improves their accuracy at high compression ratios on various image- and language-based tasks. | -
dc.language | English | -
dc.publisher | ELSEVIER SCIENCE INC | -
dc.title | Your lottery ticket is damaged: Towards all-alive pruning for extremely sparse networks | -
dc.type | Article | -
dc.identifier.wosid | 000965731500001 | -
dc.identifier.scopusid | 2-s2.0-85151435576 | -
dc.type.rims | ART | -
dc.citation.volume | 634 | -
dc.citation.beginningpage | 608 | -
dc.citation.endingpage | 620 | -
dc.citation.publicationname | INFORMATION SCIENCES | -
dc.identifier.doi | 10.1016/j.ins.2023.03.122 | -
dc.contributor.localauthor | Kim, Min-Soo | -
dc.contributor.localauthor | Shim, Hyunjung | -
dc.contributor.nonIdAuthor | Kim, Daejin | -
dc.contributor.nonIdAuthor | Lee, Jongwuk | -
dc.description.isOpenAccess | N | -
dc.type.journalArticle | Article | -
dc.subject.keywordAuthor | Model compression | -
dc.subject.keywordAuthor | Network pruning | -
dc.subject.keywordAuthor | Dead neurons | -
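The abstract's key observation is that standard pruning leaves dead connections: weights that survive the pruning mask but can never be trained, for example the outgoing weights of a unit whose incoming weights were all removed. The paper's AAP procedure is not reproduced here; what follows is a minimal, hypothetical sketch, assuming PyTorch-style layer-wise magnitude pruning, showing how such dead connections arise and can be counted (magnitude_prune, count_dead_connections, and the toy two-layer MLP are illustrative names, not from the paper).

```python
import torch

def magnitude_prune(weight: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Keep the largest-magnitude (1 - sparsity) fraction of weights; return a 0/1 mask."""
    k = int(weight.numel() * sparsity)  # number of weights to remove
    if k == 0:
        return torch.ones_like(weight)
    threshold = weight.abs().flatten().kthvalue(k).values
    return (weight.abs() > threshold).float()

def count_dead_connections(mask_in: torch.Tensor, mask_out: torch.Tensor) -> int:
    """Count surviving outgoing weights attached to units with no surviving input.

    mask_in:  mask of the layer feeding the hidden units, shape (hidden, in)
    mask_out: mask of the following layer, shape (out, hidden)
    A hidden unit whose mask_in row is all zeros emits a constant, so any kept
    weight in the matching mask_out column receives no useful gradient signal.
    """
    dead_units = mask_in.sum(dim=1) == 0               # units with every input pruned
    return int(mask_out[:, dead_units].sum().item())   # their surviving outgoing weights

# Toy example: prune a 2-layer MLP to 95% sparsity (5% of weights remain).
torch.manual_seed(0)
w1, w2 = torch.randn(64, 32), torch.randn(16, 64)
m1, m2 = magnitude_prune(w1, 0.95), magnitude_prune(w2, 0.95)
print("dead connections:", count_dead_connections(m1, m2))
```

At the extreme sparsity levels the abstract targets (under 10% of parameters remaining), many rows of the first mask end up empty, so the kept weights in the corresponding columns of the second mask are wasted budget; this is consistent with the abstract's point that the accuracy gap at high compression is partly caused by dead connections rather than by sparsity alone.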
Appears in Collections
CS-Journal Papers; AI-Journal Papers
Files in This Item
There are no files associated with this item.