Deep Pyramidal Residual Networks

Cited 203 times in Web of Science · Cited 0 times in Scopus
  • Hit : 386
  • Download : 0
Abstract
Deep convolutional neural networks (DCNNs) have shown remarkable performance in image classification tasks in recent years. Generally, deep neural network architectures are stacks consisting of a large number of convolutional layers, and they perform downsampling along the spatial dimension via pooling to reduce memory usage. Concurrently, the feature map dimension (i.e., the number of channels) is sharply increased at downsampling locations, which is essential to ensure effective performance because it increases the diversity of high-level attributes. This also applies to residual networks and is very closely related to their performance. In this research, instead of sharply increasing the feature map dimension at units that perform downsampling, we gradually increase the feature map dimension at all units to involve as many locations as possible. This design, which is discussed in depth together with our new insights, has proven to be an effective means of improving generalization ability. Furthermore, we propose a novel residual unit capable of further improving the classification accuracy with our new network architecture. Experiments on the benchmark CIFAR-10, CIFAR-100, and ImageNet datasets have shown that our network architecture has superior generalization ability compared to the original residual networks.
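The contrast the abstract draws can be sketched numerically: a conventional residual network doubles the channel count only at the few downsampling units, while the pyramidal design spreads the same total widening over every unit. The additive step size `alpha`, the stage sizes, and the rounding below are illustrative assumptions, not values taken from the paper.

```python
def resnet_widths(base=16, units_per_stage=18, stages=3):
    """Step-wise widening: channels change only at stage boundaries,
    doubling at each downsampling location (conventional ResNet style)."""
    widths = []
    channels = base
    for _ in range(stages):
        widths += [channels] * units_per_stage
        channels *= 2
    return widths

def pyramid_widths(base=16, total_units=54, alpha=48):
    """Gradual widening: every residual unit adds alpha/total_units
    channels, so the width grows linearly across the whole network
    (hypothetical additive schedule for illustration)."""
    widths = []
    channels = float(base)
    for _ in range(total_units):
        channels += alpha / total_units
        widths.append(int(round(channels)))
    return widths

step = resnet_widths()      # flat within each stage, jumps of 2x between stages
gradual = pyramid_widths()  # rises smoothly from ~17 up to base + alpha = 64
```

Both schedules start from the same base width and end at the same final width; the difference is that the pyramidal schedule widens at every unit, so no single unit bears the abrupt dimension jump.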
Publisher
IEEE Computer Society and the Computer Vision Foundation (CVF)
Issue Date
2017-07-25
Language
English
Citation

30th IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pp. 6307–6315

ISSN
1063-6919
DOI
10.1109/CVPR.2017.668
URI
http://hdl.handle.net/10203/227655
Appears in Collection
EE-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.