A brain-inspired network architecture for cost-efficient object recognition in shallow hierarchical neural networks

DC Field: Value (Language)
dc.contributor.author: Park, Youngjin (ko)
dc.contributor.author: Baek, Seungdae (ko)
dc.contributor.author: Paik, Se-Bum (ko)
dc.date.accessioned: 2021-02-02T05:50:09Z
dc.date.available: 2021-02-02T05:50:09Z
dc.date.created: 2021-01-27
dc.date.issued: 2021-02
dc.identifier.citation: NEURAL NETWORKS, v.134, pp.76-85
dc.identifier.issn: 0893-6080
dc.identifier.uri: http://hdl.handle.net/10203/280461
dc.description.abstract: The brain successfully performs visual object recognition with a limited number of hierarchical networks that are much shallower than the artificial deep neural networks (DNNs) that perform similar tasks. Here, we show that long-range horizontal connections (LRCs), often observed in the visual cortex of mammalian species, enable cost-efficient visual object recognition in shallow neural networks. Using simulations of a model hierarchical network with convergent feedforward connections and LRCs, we found that adding LRCs to the shallow feedforward network significantly enhances image-classification performance, to a degree comparable to that of much deeper networks. We found that a combination of sparse LRCs and dense local connections dramatically increases performance per wiring cost. From network pruning with gradient-based optimization, we also confirmed that LRCs can emerge spontaneously when the total connection length is minimized while performance is maintained. Ablation of the emerged LRCs led to a significant reduction in classification performance, implying that these LRCs are crucial for image classification. Taken together, our findings suggest a brain-inspired strategy for constructing a cost-efficient network architecture that implements parsimonious object recognition under physical constraints such as shallow hierarchical depth.
dc.language: English
dc.publisher: PERGAMON-ELSEVIER SCIENCE LTD
dc.title: A brain-inspired network architecture for cost-efficient object recognition in shallow hierarchical neural networks
dc.type: Article
dc.identifier.wosid: 000603296800007
dc.identifier.scopusid: 2-s2.0-85097338488
dc.type.rims: ART
dc.citation.volume: 134
dc.citation.beginningpage: 76
dc.citation.endingpage: 85
dc.citation.publicationname: NEURAL NETWORKS
dc.identifier.doi: 10.1016/j.neunet.2020.11.013
dc.contributor.localauthor: Paik, Se-Bum
dc.description.isOpenAccess: Y
dc.type.journal: Article
dc.subject.keywordAuthor: Visual cortex
dc.subject.keywordAuthor: Long-range horizontal connection
dc.subject.keywordAuthor: Object recognition
dc.subject.keywordAuthor: Shallow network
dc.subject.keywordAuthor: Artificial neural network
dc.subject.keywordAuthor: Cost-efficiency
dc.subject.keywordPlus: PRIMARY VISUAL-CORTEX
dc.subject.keywordPlus: HORIZONTAL CONNECTIONS
dc.subject.keywordPlus: CONTEXTUAL INTERACTIONS
dc.subject.keywordPlus: ORIENTATION SELECTIVITY
dc.subject.keywordPlus: FUNCTIONAL ARCHITECTURE
dc.subject.keywordPlus: LAYER-III
dc.subject.keywordPlus: MODEL
dc.subject.keywordPlus: SYNCHRONIZATION
dc.subject.keywordPlus: ORGANIZATION
dc.subject.keywordPlus: ARRANGEMENT
