DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Shin, Youngsoo | - |
dc.contributor.advisor | 신영수 | - |
dc.contributor.author | Jung, Giyoon | - |
dc.date.accessioned | 2021-05-13T19:39:12Z | - |
dc.date.available | 2021-05-13T19:39:12Z | - |
dc.date.issued | 2020 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=925213&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/285049 | - |
dc.description | Thesis (Master's) - Korea Advanced Institute of Science and Technology (KAIST): School of Electrical Engineering, 2020.8, [iv, 31 p.] | - |
dc.description.abstract | As technology nodes keep shrinking, IR drop has become a critical issue in VLSI design. IR drop impairs circuit performance and may lead to timing problems such as setup/hold violations. Conventional IR drop analyzers solve a large system of linear equations to obtain static/dynamic IR drop, which takes substantial runtime when a design has millions of nodes. In this thesis, we present a machine-learning model that uses image-to-image translation to predict static/dynamic IR drop rapidly. In contrast to most previous approaches that apply machine learning to IR drop estimation, our model can predict the IR drop of designs with different power distribution network (PDN) structures. In addition, we predict IR drop region by region rather than cell by cell to reduce inference time. Compared to a commercial IR drop analysis tool (ANSYS RedHawk), our model reduces runtime by 4-5x while keeping the error rate below 18% (see the sketch following this record). | - |
dc.language | eng | - |
dc.publisher | Korea Advanced Institute of Science and Technology (KAIST) | - |
dc.subject | Static/dynamic IR drop; power distribution network; U-Net; effective resistance; design independent | - |
dc.subject | Static/dynamic voltage drop; power distribution network; U-shaped neural network (U-Net); effective resistance; design-independent | - |
dc.title | Fast IR drop analysis using U-Net convolutional network | - |
dc.title.alternative | Fast voltage drop analysis using a U-Net convolutional network | - |
dc.type | Thesis (Master) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | KAIST: School of Electrical Engineering | - |
dc.contributor.alternativeauthor | 정기윤 | - |
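
The abstract above describes the method only at a high level: rasterize per-region PDN features into image channels and train a U-Net to translate them into an IR-drop map. The code below is a minimal, hypothetical PyTorch sketch of that idea, not the thesis's actual architecture; the three input channels (current density, PDN resistance, distance to power pads), the layer widths, and the 64x64 region grid are all assumptions made for illustration.

```python
import torch
import torch.nn as nn

class ConvBlock(nn.Module):
    """Two 3x3 conv + ReLU layers, the standard U-Net building block."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)

class IRDropUNet(nn.Module):
    """Small U-Net mapping per-region PDN feature maps to an IR-drop map."""
    def __init__(self, in_ch=3, out_ch=1):
        super().__init__()
        self.enc1 = ConvBlock(in_ch, 32)       # full resolution
        self.enc2 = ConvBlock(32, 64)          # 1/2 resolution
        self.bott = ConvBlock(64, 128)         # 1/4 resolution
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = ConvBlock(128, 64)         # 64 (upsampled) + 64 (skip)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = ConvBlock(64, 32)          # 32 (upsampled) + 32 (skip)
        self.head = nn.Conv2d(32, out_ch, 1)   # one IR-drop value per region

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bott(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))  # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        return self.head(d1)

# Hypothetical usage: 3 assumed input channels (current density, PDN
# resistance, distance to power pads) over a 64x64 grid of layout regions.
model = IRDropUNet(in_ch=3, out_ch=1)
features = torch.randn(1, 3, 64, 64)
ir_drop_map = model(features)
print(ir_drop_map.shape)  # torch.Size([1, 1, 64, 64])
```

As the abstract notes, predicting one value per layout region rather than per standard cell keeps the output map small, which is consistent with the reported reduction in inference time relative to a node-by-node linear solve.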