Ensemble distillation in federated learning with dataset condensation

Federated learning is a collaborative training process over several local client models. It aims to train a global server model from the client models without directly accessing private client data. However, federated learning suffers from the heterogeneity of client data. Ensemble distillation, which distills the clients' knowledge into the server by matching predictions, is a possible remedy for data heterogeneity, but it requires a public dataset. In this research we propose a method that uses condensed client data in federated learning and therefore requires no public dataset. To mitigate the data-heterogeneity issue, we condense the client data, use it as the input to ensemble distillation, and empirically show that this improves overall performance in heterogeneous settings.
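The abstract describes the method at a high level: each client condenses its local data into a small synthetic set, and the server uses the pooled condensed data as the transfer set for ensemble distillation in place of a public dataset. The following PyTorch sketch illustrates only that server-side distillation step, under stated assumptions: the function name, the KL-divergence objective on temperature-softened predictions, and a loader that yields unlabeled condensed images are illustrative choices, not the thesis's actual implementation.

```python
import torch
import torch.nn.functional as F

def ensemble_distill(server_model, client_models, condensed_loader,
                     lr=1e-3, temperature=1.0, epochs=1, device="cpu"):
    """Distill the averaged predictions of client models into the server
    model, using condensed (synthetic) client data as the transfer set.
    Hypothetical sketch; names and hyperparameters are assumptions."""
    server_model = server_model.to(device).train()
    for m in client_models:
        m.to(device).eval()
    optimizer = torch.optim.Adam(server_model.parameters(), lr=lr)

    for _ in range(epochs):
        # Assumes the loader yields batches of condensed images only;
        # no labels are needed, since the client ensemble is the teacher.
        for x in condensed_loader:
            x = x.to(device)
            with torch.no_grad():
                # Ensemble teacher: average the clients' softened predictions.
                teacher = torch.stack(
                    [F.softmax(m(x) / temperature, dim=1) for m in client_models]
                ).mean(dim=0)
            student_log_probs = F.log_softmax(server_model(x) / temperature, dim=1)
            # Match the server's predictions to the ensemble's predictions.
            loss = F.kl_div(student_log_probs, teacher, reduction="batchmean")
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return server_model
```

In the full pipeline, the condensed sets would be produced on each client by a dataset-condensation method and communicated to the server alongside the model updates; only the distillation step is sketched here.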
Advisors
Yun, Seyoung (윤세영)
Publisher
Korea Advanced Institute of Science and Technology (KAIST)
Issue Date
2022
Identifier
325007
Language
eng
Description

Master's thesis - Korea Advanced Institute of Science and Technology : Kim Jaechul Graduate School of AI, 2022.8, [v, 25 p.]

Keywords

Deep learning; Federated learning; Ensemble distillation; Dataset condensation; Data heterogeneity

URI
http://hdl.handle.net/10203/308193
Link
http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1008200&flag=dissertation
Appears in Collection
AI-Theses_Master (Master's theses)
Files in This Item
There are no files associated with this item.
