Lazy Net: Lazy Entry Neural Networks for Accelerated and Efficient Inference

DC Field | Value | Language
dc.contributor.author | Park, Junyong | ko
dc.contributor.author | Kim, Daeyoung | ko
dc.contributor.author | Moon, Yong-Hyuk | ko
dc.date.accessioned | 2023-09-15T07:01:42Z | -
dc.date.available | 2023-09-15T07:01:42Z | -
dc.date.created | 2023-09-15 | -
dc.date.issued | 2022-10 | -
dc.identifier.citation | 13th International Conference on Information and Communication Technology Convergence, ICTC 2022, pp.495 - 497 | -
dc.identifier.uri | http://hdl.handle.net/10203/312677 | -
dc.description.abstract | Modern edge devices have become powerful enough to run deep learning tasks, but they still face challenges such as limited computing power, memory, and energy. To address these challenges, methods such as channel pruning, network quantization, and early exiting have been introduced to reduce the computational load of these tasks. In this paper, we propose LazyNet, an alternative network that applies skip modules instead of early exits to a pre-trained neural network. We use a small module that preserves the spatial information and provides a metric to decide the computational flow. If a data sample is easy, the network skips most of the computation; if not, it runs the full network for accurate classification. We test our model with various backbone networks on the CIFAR-10 dataset and show reduced inference time, lower memory consumption, and increased accuracy. (An illustrative sketch of the skip-module idea follows this metadata table.) | -
dc.language | English | -
dc.publisher | IEEE Computer Society | -
dc.title | Lazy Net: Lazy Entry Neural Networks for Accelerated and Efficient Inference | -
dc.type | Conference | -
dc.identifier.scopusid | 2-s2.0-85143254206 | -
dc.type.rims | CONF | -
dc.citation.beginningpage | 495 | -
dc.citation.endingpage | 497 | -
dc.citation.publicationname | 13th International Conference on Information and Communication Technology Convergence, ICTC 2022 | -
dc.identifier.conferencecountry | KO | -
dc.identifier.conferencelocation | Jeju Island | -
dc.identifier.doi | 10.1109/ICTC55196.2022.9953031 | -
dc.contributor.localauthor | Kim, Daeyoung | -
dc.contributor.nonIdAuthor | Park, Junyong | -
dc.contributor.nonIdAuthor | Moon, Yong-Hyuk | -
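The abstract describes LazyNet's skip modules only at a high level: a small gate summarizes the feature map, scores how easy the current sample is, and decides whether a backbone block can be skipped. The PyTorch sketch below is one possible reading of that idea, not the authors' implementation; the SkipGate/LazyBlock names, the global-average-pool gating head, and the 0.5 threshold are all illustrative assumptions.

import torch
import torch.nn as nn


class SkipGate(nn.Module):
    """Small gate that pools spatial features and scores how 'easy' a sample is."""

    def __init__(self, channels: int):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # cheap summary that keeps per-channel statistics
        self.score = nn.Linear(channels, 1)  # scalar easiness metric per sample

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.pool(x).flatten(1)          # (N, C)
        return torch.sigmoid(self.score(z))  # (N, 1), higher = easier


class LazyBlock(nn.Module):
    """Wraps a pre-trained backbone block and skips it for samples the gate deems easy."""

    def __init__(self, block: nn.Module, channels: int, threshold: float = 0.5):
        super().__init__()
        self.block = block          # backbone block whose computation may be skipped
        self.gate = SkipGate(channels)
        self.threshold = threshold  # illustrative fixed cut-off (assumption)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        easy = self.gate(x).squeeze(1) > self.threshold  # (N,) boolean mask
        if bool(easy.all()):
            return x                # every sample is easy: skip the block entirely
        out = x.clone()
        hard = ~easy
        out[hard] = self.block(x[hard])  # run the heavy block only for hard samples
        return out


if __name__ == "__main__":
    # Toy usage: wrap one residual-style block and feed a batch of CIFAR-10-sized features.
    backbone_block = nn.Sequential(
        nn.Conv2d(64, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU()
    )
    lazy = LazyBlock(backbone_block, channels=64)
    x = torch.randn(8, 64, 32, 32)
    print(lazy(x).shape)  # torch.Size([8, 64, 32, 32])

Unlike an early-exit head, a skip module of this kind leaves the spatial feature map untouched for easy samples, so later layers still receive full-resolution input; that is one way to read the abstract's statement that the small module preserves the spatial information.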
Appears in Collection
CS-Conference Papers (학술회의논문)
Files in This Item
There are no files associated with this item.
