Revisiting Orthogonality Regularization: A Study for Convolutional Neural Networks in Image Classification

Cited 1 time in Web of Science; cited 0 times in Scopus.
DC Field: Value (Language)
dc.contributor.author: Kim, Taehyeon (ko)
dc.contributor.author: Yun, Se-Young (ko)
dc.date.accessioned: 2022-08-25T08:01:23Z
dc.date.available: 2022-08-25T08:01:23Z
dc.date.created: 2022-08-25
dc.date.issued: 2022
dc.identifier.citation: IEEE ACCESS, v.10, pp.69741 - 69749
dc.identifier.issn: 2169-3536
dc.identifier.uri: http://hdl.handle.net/10203/298114
dc.description.abstract: Recent research on deep Convolutional Neural Networks (CNNs) faces the challenges of vanishing/exploding gradients, training instability, and feature redundancy. Orthogonality Regularization (OR), which introduces a penalty function on the orthogonality of a network's weights, could be a remedy to these challenges but is surprisingly unpopular in the literature. This work revisits OR approaches and empirically answers the question: which OR technique is the most powerful, even when compared against other regularizers such as weight decay and spectral norm regularization? We begin by surveying the improvements brought by various regularization techniques, focusing on OR approaches across a variety of architectures. We then disentangle the benefits of OR from those of other regularization approaches, connecting them to how each affects norm preservation and feature redundancy in the forward and backward passes. Our investigations show that Kernel Orthogonality Regularization (KOR) approaches, which directly penalize the non-orthogonality of convolutional kernel matrices, consistently outperform other techniques. We propose a simple KOR method that considers both row- and column-orthogonality and is empirically the most effective at mitigating the aforementioned challenges. We further discuss circumstances, across recent CNN models and benchmark datasets, in which KOR is particularly effective.
dc.language: English
dc.publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.title: Revisiting Orthogonality Regularization: A Study for Convolutional Neural Networks in Image Classification
dc.type: Article
dc.identifier.wosid: 000838367900001
dc.identifier.scopusid: 2-s2.0-85133615093
dc.type.rims: ART
dc.citation.volume: 10
dc.citation.beginningpage: 69741
dc.citation.endingpage: 69749
dc.citation.publicationname: IEEE ACCESS
dc.identifier.doi: 10.1109/ACCESS.2022.3185621
dc.contributor.localauthor: Yun, Se-Young
dc.description.isOpenAccess: N
dc.type.journalArticle: Article
dc.subject.keywordAuthor: Training
dc.subject.keywordAuthor: Kernel
dc.subject.keywordAuthor: Convolutional neural networks
dc.subject.keywordAuthor: Redundancy
dc.subject.keywordAuthor: Artificial intelligence
dc.subject.keywordAuthor: Licenses
dc.subject.keywordAuthor: Signal to noise ratio
dc.subject.keywordAuthor: Deep neural network (DNN)
dc.subject.keywordAuthor: kernel
dc.subject.keywordAuthor: orthogonality regularization
dc.subject.keywordAuthor: convolutional neural network (CNN)
dc.subject.keywordAuthor: regularization
dc.subject.keywordAuthor: image classification
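The abstract describes KOR as directly penalizing the non-orthogonality of convolutional kernel matrices in both their rows and columns. The paper's exact loss is not reproduced in this record, but a common formulation flattens a conv kernel of shape (out_channels, in_channels, kh, kw) into a 2-D matrix and penalizes the squared Frobenius distance of both Gram matrices from the identity. The sketch below illustrates that standard double-sided penalty; the function name `kor_penalty` and the NumPy setting are illustrative, not taken from the paper.

```python
import numpy as np

def kor_penalty(weight_2d):
    """Double-sided (row- and column-) soft orthogonality penalty.

    weight_2d: conv kernel flattened to shape
    (out_channels, in_channels * kh * kw). Illustrative sketch only;
    the paper's exact formulation may differ.
    """
    rows, cols = weight_2d.shape
    # Row orthogonality: W W^T should be close to the identity.
    row_gram = weight_2d @ weight_2d.T
    row_pen = np.sum((row_gram - np.eye(rows)) ** 2)
    # Column orthogonality: W^T W should be close to the identity.
    col_gram = weight_2d.T @ weight_2d
    col_pen = np.sum((col_gram - np.eye(cols)) ** 2)
    return row_pen + col_pen

# Usage: flatten a (8, 3, 3, 3) kernel to (8, 27) before applying.
kernel = np.random.randn(8, 3, 3, 3)
penalty = kor_penalty(kernel.reshape(8, -1))
```

In training, this penalty would be scaled by a regularization coefficient and added to the task loss; an exactly orthogonal matrix (e.g. the identity) incurs zero penalty.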
Appears in Collection
AI-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.
