DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Yun, Seyoung | - |
dc.contributor.advisor | 윤세영 | - |
dc.contributor.author | Kim, Nakyil | - |
dc.date.accessioned | 2023-06-22T19:31:16Z | - |
dc.date.available | 2023-06-22T19:31:16Z | - |
dc.date.issued | 2022 | - |
dc.identifier.uri | http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1008200&flag=dissertation | en_US |
dc.identifier.uri | http://hdl.handle.net/10203/308193 | - |
dc.description | Thesis (Master's) - Korea Advanced Institute of Science and Technology : Kim Jaechul Graduate School of AI, 2022.8, [v, 25 p.] | - |
dc.description.abstract | Federated learning collaboratively trains several local client models, with the aim of obtaining a global server model without directly accessing private client data. However, federated learning suffers from the heterogeneity of client data. Ensemble distillation, which distills the clients' knowledge into the server model by matching predictions, is a possible remedy for data heterogeneity, but it requires a public dataset. In this research we propose a method that uses condensed client data in federated learning, removing the need for a public dataset. To mitigate the data-heterogeneity issue, we condense each client's data, use the condensed data as the input to ensemble distillation, and empirically show that this improves overall performance in heterogeneous settings. | - |
dc.language | eng | - |
dc.publisher | 한국과학기술원 | - |
dc.subject | Deep learning; Federated learning; Ensemble distillation; Dataset condensation; Data heterogeneity | - |
dc.subject | 심층학습; 연합 학습; 앙상블 증류; 데이터 축약; 이기종 데이터 | - |
dc.title | Ensemble distillation in federated learning with dataset condensation | - |
dc.title.alternative | 연합학습에서 데이터 합성을 이용한 앙상블 증류 | - |
dc.type | Thesis (Master) | - |
dc.identifier.CNRN | 325007 | - |
dc.description.department | Korea Advanced Institute of Science and Technology : Kim Jaechul Graduate School of AI | - |
dc.contributor.alternativeauthor | 김낙일 | - |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
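The abstract above describes a server-side step in which each client sends a small condensed dataset, and the server is distilled to match the clients' averaged predictions on that data. The following is a minimal NumPy sketch of that idea only, not the thesis's actual implementation: the client models, their condensed data, and all names (`n_clients`, `softmax`, `W_server`, learning rate, iteration count) are illustrative assumptions, with simple linear classifiers standing in for real networks.

```python
# Sketch of ensemble distillation on condensed client data (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
n_clients, dim, n_classes = 3, 5, 4

def softmax(z):
    # Numerically stable row-wise softmax.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical client models (linear classifiers) and their condensed datasets;
# in the thesis these would come from local training and dataset condensation.
client_weights = [rng.normal(size=(dim, n_classes)) for _ in range(n_clients)]
condensed = np.concatenate([rng.normal(size=(8, dim)) for _ in range(n_clients)])

# Ensemble soft labels: average the client predictions on the condensed inputs.
soft_labels = np.mean([softmax(condensed @ W) for W in client_weights], axis=0)

# Distill: fit the server model to the soft labels by cross-entropy gradient descent.
W_server = np.zeros((dim, n_classes))
for _ in range(500):
    p = softmax(condensed @ W_server)
    W_server -= 0.1 * condensed.T @ (p - soft_labels) / len(condensed)

p_final = softmax(condensed @ W_server)
print(float(np.abs(p_final - soft_labels).mean()))  # gap to the ensemble soft labels
```

Because the condensed data replaces a public dataset as the distillation input, this step needs no data beyond what the clients transmit; the averaged softmax acts as the ensemble's knowledge.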