DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | 김동준 | - |
dc.contributor.advisor | Kim, John | - |
dc.contributor.author | Cho, Sanghun | - |
dc.date.accessioned | 2022-04-27T19:30:52Z | - |
dc.date.available | 2022-04-27T19:30:52Z | - |
dc.date.issued | 2021 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=948980&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/295930 | - |
dc.description | Thesis (Master's) - Korea Advanced Institute of Science and Technology (KAIST) : School of Electrical Engineering, 2021.2, [iii, 27 p.] | - |
dc.description.abstract | Machine learning has recently come into the spotlight as a solution to problems that were previously difficult to solve. Distributed processing with graphics processing units (GPUs) is widely used to handle the vast amounts of data required to train neural networks in deep learning, the most widely used branch of machine learning. As a result, collective communication within the distributed system becomes the main performance bottleneck and limits the system's scalability. In this dissertation, we propose techniques and computer architectures that address these communication bottlenecks and improve training performance, taking into account the characteristics of the intermediate values exchanged during collective communication, the layered structure of deep neural networks, and the structure of the distributed systems that perform the actual computation. | - |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | Collective Communication; Deep Learning; Distributed Processing; Graphics Processing Unit; Interconnect | - |
dc.subject | 그래픽 처리 장치; 딥 러닝; 분산 처리; 상호 연결; 집단 통신 | - |
dc.title | Communication optimization for deep learning in distributed processing environments | - |
dc.title.alternative | 분산 처리 환경에서 딥 러닝을 위한 통신 기법 최적화 | - |
dc.type | Thesis (Master) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | Korea Advanced Institute of Science and Technology : School of Electrical Engineering | - |
dc.contributor.alternativeauthor | 조상훈 | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
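The abstract identifies collective communication (e.g. the all-reduce of gradients across workers) as the main scalability bottleneck in distributed deep learning. The thesis's own optimizations are not reproduced in this record; purely as background, the sketch below simulates the standard ring all-reduce algorithm in plain Python. The function name and worker model are illustrative assumptions, not code from the thesis.

```python
def ring_allreduce(vectors):
    """Simulate ring all-reduce among len(vectors) workers.

    Each worker starts with one vector; every worker ends with the
    element-wise sum, using 2*(n-1) neighbour-to-neighbour transfers
    per worker instead of an all-to-all exchange. This sketch assumes
    the vector length is divisible by the number of workers.
    """
    n = len(vectors)
    chunk = len(vectors[0]) // n
    # Each worker's vector is split into n chunks.
    bufs = [[list(v[i * chunk:(i + 1) * chunk]) for i in range(n)]
            for v in vectors]

    # Phase 1: reduce-scatter. At step s, worker w sends chunk (w - s) mod n
    # to its right neighbour, which adds it element-wise. After n - 1 steps,
    # worker w holds the fully reduced chunk (w + 1) mod n.
    for s in range(n - 1):
        for w in range(n):
            c = (w - s) % n
            dst = (w + 1) % n
            bufs[dst][c] = [a + b for a, b in zip(bufs[dst][c], bufs[w][c])]

    # Phase 2: all-gather. Each worker forwards its newest reduced chunk;
    # after n - 1 more steps every worker holds every reduced chunk.
    for s in range(n - 1):
        for w in range(n):
            c = (w + 1 - s) % n
            bufs[(w + 1) % n][c] = list(bufs[w][c])

    # Reassemble each worker's chunks into a flat vector.
    return [[x for ch in b for x in ch] for b in bufs]
```

Because each worker only ever talks to its ring neighbour and moves a constant amount of data per step, the per-worker traffic is independent of the number of workers, which is why ring all-reduce is the usual baseline that communication optimizations for distributed training are measured against.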