A Parameter Efficient Multi-Scale Capsule Network

Capsule networks take spatial relationships in an input image into account, and their relationship-based feature propagation yields promising results. However, the large number of trainable parameters limits their widespread use. In this paper, we propose the Decomposed Capsule Network (DCN), which reduces the number of trainable parameters in the primary capsule generation stage. DCN represents each capsule as a combination of basis vectors; generating the basis vectors and their coefficients notably reduces the total number of trainable parameters. Moreover, we introduce an extension of the DCN architecture, the Multi-scale Decomposed Capsule Network (MDCN), which integrates features from multiple scales to synthesize capsules with even fewer parameters. The proposed networks achieve better performance on the Fashion-MNIST and CIFAR-10 datasets with fewer parameters than the original capsule network.
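
As a rough illustration of the parameter-saving idea summarized in the abstract (a minimal sketch only; the layer shapes, kernel size, stride, and module names below are assumptions, not the authors' published implementation), a primary-capsule layer can predict a small number of mixing coefficients per capsule and combine them with a shared bank of learnable basis vectors, instead of emitting the full capsule pose directly:

import torch
import torch.nn as nn

class DecomposedPrimaryCaps(nn.Module):
    """Primary capsules built as combinations of shared basis vectors.

    Instead of a convolution that emits num_caps * capsule_dim output
    channels, this sketch predicts only num_caps * num_basis coefficients
    and mixes a small shared bank of basis vectors, reducing parameters
    when num_basis is much smaller than capsule_dim.
    """

    def __init__(self, in_channels, num_caps, capsule_dim, num_basis):
        super().__init__()
        # Coefficient generator (hypothetical kernel size / stride).
        self.coeff_conv = nn.Conv2d(in_channels, num_caps * num_basis,
                                    kernel_size=3, stride=2, padding=1)
        # Shared bank of learnable basis vectors, reused at every position.
        self.basis = nn.Parameter(torch.randn(num_basis, capsule_dim) * 0.01)
        self.num_caps, self.num_basis = num_caps, num_basis

    def forward(self, x):
        b = x.size(0)
        coeff = self.coeff_conv(x)                      # (B, caps*K, H, W)
        h, w = coeff.shape[-2:]
        coeff = coeff.view(b, self.num_caps, self.num_basis, h, w)
        coeff = coeff.permute(0, 1, 3, 4, 2)            # (B, caps, H, W, K)
        # Each capsule pose is a coefficient-weighted sum of basis vectors.
        caps = coeff.reshape(-1, self.num_basis) @ self.basis
        return caps.reshape(b, self.num_caps, h, w, -1) # (B, caps, H, W, D)

In this sketch the coefficient convolution emits num_caps * num_basis channels rather than num_caps * capsule_dim, so the saving grows as the capsule dimension exceeds the number of basis vectors; per the abstract, the multi-scale variant (MDCN) would derive such capsules from features at several scales, though its exact construction is not given here.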
Publisher
IEEE Signal Processing Society
Issue Date
2021-09
Language
English
Citation

IEEE International Conference on Image Processing (ICIP), pp. 739-743

ISSN
1522-4880
DOI
10.1109/ICIP42928.2021.9506364
URI
http://hdl.handle.net/10203/289609
Appears in Collection
EE-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.