FlowerFormer: empowering neural architecture encoding using a flow-aware graph transformer

Abstract
The success of a specific neural network architecture is closely tied to the dataset and task it tackles; there is no one-size-fits-all solution. Thus, considerable efforts have been made to quickly and accurately estimate the performance of neural architectures, without full training or evaluation, for given tasks and datasets. Neural architecture encoding has played a crucial role in this estimation, and graph-based methods, which treat an architecture as a graph, have shown prominent performance. For enhanced representation learning of neural architectures, we introduce FlowerFormer, a powerful graph transformer that incorporates the information flows within a neural architecture. FlowerFormer consists of two key components: (a) bidirectional asynchronous message passing, inspired by the flows; (b) global attention built on flow-based masking. Our extensive experiments demonstrate the superiority of FlowerFormer over existing neural encoding methods, and its effectiveness extends beyond computer vision models to include graph neural networks and automatic speech recognition models.
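Based only on the two components the abstract names, the following is a minimal, illustrative Python sketch, not the authors' implementation. It treats an architecture as a DAG of operations and shows (a) asynchronous message passing that updates each node only after all of its predecessors, so updates follow the forward flow of activations (a second sweep over reversed edges would make it bidirectional), and (b) a global-attention mask that lets two nodes attend to each other only when information flows between them, modeled here as reachability in the DAG. The graph encoding, the combine rule, and the reachability-based mask are assumptions for illustration.

    from collections import defaultdict, deque

    def topo_order(num_nodes, edges):
        """Topological order of a DAG (Kahn's algorithm); edges are (src, dst)."""
        indeg = [0] * num_nodes
        succ = defaultdict(list)
        for s, d in edges:
            succ[s].append(d)
            indeg[d] += 1
        queue = deque(i for i in range(num_nodes) if indeg[i] == 0)
        order = []
        while queue:
            u = queue.popleft()
            order.append(u)
            for v in succ[u]:
                indeg[v] -= 1
                if indeg[v] == 0:
                    queue.append(v)
        return order, succ

    def async_message_passing(h, edges, combine):
        """(a) One forward sweep: each node is updated only after all of its
        predecessors, so updates follow the forward flow of activations.
        A backward sweep would repeat this with every edge reversed."""
        preds = defaultdict(list)
        for s, d in edges:
            preds[d].append(s)
        order, _ = topo_order(len(h), edges)
        for v in order:
            if preds[v]:
                h[v] = combine(h[v], [h[u] for u in preds[v]])
        return h

    def flow_mask(num_nodes, edges):
        """(b) Attention mask: node i may attend to node j only when one is
        reachable from the other in the DAG, i.e. information flows between them."""
        order, succ = topo_order(num_nodes, edges)
        reach = [{i} for i in range(num_nodes)]   # nodes reachable from i
        for u in reversed(order):                 # successors already complete
            for v in succ[u]:
                reach[u] |= reach[v]
        return [[j in reach[i] or i in reach[j] for j in range(num_nodes)]
                for i in range(num_nodes)]

    # Tiny cell with two parallel branches: 0 -> 1 -> 3 and 0 -> 2 -> 3.
    edges = [(0, 1), (0, 2), (1, 3), (2, 3)]
    h = async_message_passing([1.0] * 4, edges, lambda x, msgs: x + sum(msgs))
    mask = flow_mask(4, edges)
    print(h)           # [1.0, 2.0, 2.0, 5.0] -- node 3 aggregates both branches
    print(mask[1][2])  # False: no flow between the parallel nodes 1 and 2

In this toy cell, the forward sweep lets node 3 see both branches, while the mask blocks attention between nodes 1 and 2, between which no information flows; this is one plausible reading of what flow-based masking restricts.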
Advisors
신기정 (Kijung Shin)
Department
Korea Advanced Institute of Science and Technology (KAIST): Kim Jaechul Graduate School of AI
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2024
Identifier
325007
Language
eng
Description

Master's thesis - Korea Advanced Institute of Science and Technology: Kim Jaechul Graduate School of AI, 2024.2, [iv, 25 p.]

Keywords

Graph transformers; Graph neural networks; Neural architecture encoding; Neural architecture performance prediction

URI
http://hdl.handle.net/10203/321350
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1096055&flag=dissertation
Appears in Collection
AI-Theses_Master (Master's theses)
Files in This Item
There are no files associated with this item.
