FlowerFormer: empowering neural architecture encoding using a flow-aware graph transformer

DC Field: Value
dc.contributor.advisor: 신기정 (Shin, Kijung)
dc.contributor.author: Hwang, Dongyeong
dc.contributor.author: 황동영
dc.date.accessioned: 2024-07-30T19:30:37Z
dc.date.available: 2024-07-30T19:30:37Z
dc.date.issued: 2024
dc.identifier.uri: http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1096055&flag=dissertation (en_US)
dc.identifier.uri: http://hdl.handle.net/10203/321350
dc.description: Thesis (Master's) - Korea Advanced Institute of Science and Technology (KAIST): Kim Jaechul Graduate School of AI, 2024.2, [iv, 25 p.]
dc.description.abstract: The success of a specific neural network architecture is closely tied to the dataset and task it tackles; there is no one-size-fits-all solution. Thus, considerable efforts have been made to quickly and accurately estimate the performances of neural architectures, without full training or evaluation, for given tasks and datasets. Neural architecture encoding has played a crucial role in this estimation, and graph-based methods, which treat an architecture as a graph, have shown prominent performance. For enhanced representation learning of neural architectures, we introduce FlowerFormer, a powerful graph transformer that incorporates the information flows within a neural architecture. FlowerFormer consists of two key components: (a) bidirectional asynchronous message passing, inspired by the flows within the architecture, and (b) global attention built on flow-based masking. Our extensive experiments demonstrate the superiority of FlowerFormer over existing neural architecture encoding methods, and its effectiveness extends beyond computer vision models to include graph neural networks and automatic speech recognition models. (A toy sketch of these two components follows the record below.)
dc.language: eng
dc.publisher: 한국과학기술원 (Korea Advanced Institute of Science and Technology)
dc.subject: Graph transformers; Graph neural networks; Neural architecture encoding; Neural architecture performance prediction
dc.title: FlowerFormer: empowering neural architecture encoding using a flow-aware graph transformer
dc.title.alternative: 신경망 구조 인코딩을 위한 신경망 흐름을 고려한 그래프 트랜스포머 (a flow-aware graph transformer for neural architecture encoding)
dc.type: Thesis (Master)
dc.identifier.CNRN: 325007
dc.description.department: Korea Advanced Institute of Science and Technology (KAIST), Kim Jaechul Graduate School of AI
dc.contributor.alternativeauthor: Shin, Kijung
Appears in Collection
AI-Theses_Master (석사논문, Master's theses)
Files in This Item
There are no files associated with this item.
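
The abstract names two components: bidirectional asynchronous message passing over the architecture's DAG, and global attention restricted by a flow-based mask. Below is a minimal pure-Python sketch of those two ideas, not the thesis's implementation; the generation-wise averaging update, the toy DAG, and all function names are illustrative assumptions.

# Sketch (not the authors' code) of the two ideas the abstract names:
# (a) asynchronous message passing that follows topological order, and
# (b) a flow-based mask that lets a node attend only to nodes on its
# information flow (its ancestors or descendants in the DAG).
from collections import defaultdict, deque

def topo_generations(n, edges):
    """Group DAG nodes into generations that can be updated one after another."""
    indeg = [0] * n
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        indeg[v] += 1
    frontier = deque(i for i in range(n) if indeg[i] == 0)
    gens = []
    while frontier:
        gen = list(frontier)
        gens.append(gen)
        frontier = deque()
        for u in gen:
            for v in adj[u]:
                indeg[v] -= 1
                if indeg[v] == 0:
                    frontier.append(v)
    return gens

def async_message_passing(h, edges, gens):
    """Forward pass: update features generation by generation, so each node
    sees already-updated predecessors. A backward pass would repeat this
    with every edge reversed (the 'bidirectional' part)."""
    preds = defaultdict(list)
    for u, v in edges:
        preds[v].append(u)
    for gen in gens:
        for v in gen:
            if preds[v]:  # toy update rule (assumption): mix self with predecessor mean
                h[v] = 0.5 * h[v] + 0.5 * sum(h[u] for u in preds[v]) / len(preds[v])
    return h

def flow_mask(n, edges):
    """mask[i][j] is True iff j lies on a flow through i (ancestor or descendant),
    i.e., the pair is ordered by reachability in the DAG."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
    reach = [set([i]) for i in range(n)]
    for i in range(n):  # DFS per node; fine for small architecture-cell DAGs
        stack = list(adj[i])
        while stack:
            v = stack.pop()
            if v not in reach[i]:
                reach[i].add(v)
                stack.extend(adj[v])
    return [[(j in reach[i]) or (i in reach[j]) for j in range(n)] for i in range(n)]

# Toy architecture DAG: input -> conv -> pool -> output, plus a skip edge.
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
h = [1.0, 2.0, 3.0, 4.0]
gens = topo_generations(4, edges)
h = async_message_passing(h, edges, gens)
print(gens)              # [[0], [1], [2], [3]]
print(flow_mask(4, edges))

In an actual transformer layer, a mask like flow_mask would zero out attention scores between nodes that share no flow, so global attention still respects the architecture's information flow.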
