Modeling and simulation of semiconductor devices using machine learning

This thesis addresses the modeling and simulation of semiconductor devices using machine learning. We propose an approach that models device performance and physical characteristics as functions of semiconductor device parameters, and then uses these models for optimization and simulation. Specifically, we apply Bayesian optimization to maximize the performance of a state-of-the-art 3-nanometer-node nanosheet field-effect transistor over a five-dimensional design space. In addition, we use physics-informed machine learning to efficiently model and simulate the spatial physical properties of nanowires, a candidate next-generation gate-all-around device. This study thereby proposes a methodology for addressing the growing complexity of semiconductor device modeling by harnessing recent advances in machine learning.
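The two techniques named in the abstract can be illustrated with small, self-contained sketches. The first is a minimal Bayesian optimization loop over a hypothetical five-parameter nanosheet-FET design space; the parameter names, bounds, surrogate model, and placeholder objective are assumptions for illustration only and are not taken from the thesis, which would instead evaluate designs with a device simulator returning a figure of merit.

```python
# Minimal Bayesian optimization sketch (illustrative, not the author's code):
# Gaussian-process surrogate + expected-improvement acquisition over a
# hypothetical 5-parameter design space. The objective is a stand-in for a
# TCAD device simulation that returns a figure of merit to maximize.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

# Hypothetical design variables (names and ranges assumed for illustration):
# gate length, sheet width, sheet thickness, sheet spacing, doping exponent.
bounds = np.array([[12, 20], [10, 50], [4, 8], [8, 14], [18, 20]], dtype=float)

def objective(x):
    """Placeholder for a device simulation; returns a value to maximize."""
    return -np.sum((x - bounds.mean(axis=1)) ** 2)

def sample(n):
    """Draw n random points uniformly inside the design-space bounds."""
    return rng.uniform(bounds[:, 0], bounds[:, 1], size=(n, len(bounds)))

X = sample(8)                              # initial random designs
y = np.array([objective(x) for x in X])    # evaluated figures of merit

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(30):                        # BO iterations
    gp.fit(X, y)
    cand = sample(2048)                    # random candidate pool
    mu, sigma = gp.predict(cand, return_std=True)
    imp = mu - y.max()
    z = imp / np.maximum(sigma, 1e-9)
    ei = imp * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
    x_next = cand[np.argmax(ei)]           # most promising candidate
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next))

print("best design:", X[np.argmax(y)], "value:", y.max())
```

The second is a toy physics-informed neural network that fits a one-dimensional Poisson-type equation by penalizing the PDE residual computed with automatic differentiation; the equation, boundary conditions, and network size are placeholders standing in for the nanowire electrostatics treated in the thesis.

```python
# Toy physics-informed neural network sketch (illustrative assumption):
# fit u(x) satisfying d^2u/dx^2 = f(x) on [0, 1] with u(0) = u(1) = 0.
import torch

net = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(),
                          torch.nn.Linear(32, 32), torch.nn.Tanh(),
                          torch.nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
f = lambda x: -(torch.pi ** 2) * torch.sin(torch.pi * x)  # source term

for step in range(2000):
    x = torch.rand(256, 1, requires_grad=True)            # collocation points
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    pde_loss = ((d2u - f(x)) ** 2).mean()                  # PDE residual
    bc_loss = (net(torch.tensor([[0.0], [1.0]])) ** 2).mean()  # boundary terms
    loss = pde_loss + bc_loss
    opt.zero_grad()
    loss.backward()
    opt.step()
```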
Advisors
신민철
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2024
Identifier
325007
Language
eng
Description

Doctoral thesis - Korea Advanced Institute of Science and Technology (KAIST), School of Electrical Engineering, 2024.2, [xviii, 130 p.]

Keywords

Semiconductor devices; Nanowire; Machine learning; Bayesian optimization; Physics-informed machine learning

URI
http://hdl.handle.net/10203/322166
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1100072&flag=dissertation
Appears in Collection
EE-Theses_Ph.D. (Doctoral theses)
Files in This Item
There are no files associated with this item.
