Ensemble distillation in federated learning with dataset condensation (연합학습에서 데이터 합성을 이용한 앙상블 증류)

DC Field / Value
dc.contributor.advisor: Yun, Seyoung
dc.contributor.advisor: 윤세영
dc.contributor.author: Kim, Nakyil
dc.date.accessioned: 2023-06-22T19:31:16Z
dc.date.available: 2023-06-22T19:31:16Z
dc.date.issued: 2022
dc.identifier.uri: http://library.kaist.ac.kr/search/detail/view.do?bibCtrlNo=1008200&flag=dissertation
dc.identifier.uri: http://hdl.handle.net/10203/308193
dc.description: Master's thesis - Korea Advanced Institute of Science and Technology (KAIST): Kim Jaechul Graduate School of AI, 2022.8, [v, 25 p.]
dc.description.abstract: Federated learning collaboratively trains several local client models, aiming to obtain a global server model from them without directly accessing private client data. However, federated learning suffers from the heterogeneity of client data. Ensemble distillation, which distills client knowledge into the server by matching predictions, is a possible remedy for data heterogeneity, but it requires a public dataset. In this research we propose a method that uses condensed client data in federated learning and therefore requires no public dataset. To mitigate the data-heterogeneity issue, we condense the client data, use it as the input of ensemble distillation, and empirically show that this improves overall performance in heterogeneous settings.
dc.language: eng
dc.publisher: Korea Advanced Institute of Science and Technology (KAIST)
dc.subject: Deep learning; Federated learning; Ensemble distillation; Dataset condensation; Data heterogeneity
dc.subject: 심층학습 (deep learning); 연합 학습 (federated learning); 앙상블 증류 (ensemble distillation); 데이터 축약 (dataset condensation); 이기종 데이터 (heterogeneous data)
dc.title: Ensemble distillation in federated learning with dataset condensation
dc.title.alternative: 연합학습에서 데이터 합성을 이용한 앙상블 증류 (Ensemble distillation using data synthesis in federated learning)
dc.type: Thesis (Master)
dc.identifier.CNRN: 325007
dc.description.department: Korea Advanced Institute of Science and Technology (KAIST): Kim Jaechul Graduate School of AI
dc.contributor.alternativeauthor: 김낙일 (Kim, Nakyil)
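The abstract describes server-side ensemble distillation driven by condensed client data rather than a public transfer set. The sketch below illustrates that pipeline under strong simplifying assumptions, none of which come from the thesis itself: class-wise feature means stand in for a learned condensed set, linear softmax classifiers stand in for the client and server networks, and all names, sizes, and hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_classes, n_clients = 8, 2, 3  # toy dimensions (illustrative, not from the thesis)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def condense(X, y):
    # Toy stand-in for dataset condensation: one class-mean sample per class.
    # (The thesis condenses data by optimisation; class means are an assumption here.)
    return np.stack([X[y == c].mean(axis=0) for c in range(n_classes)])

def train_linear(X, y, steps=200, lr=0.5):
    # Local training: plain gradient descent on a linear softmax classifier.
    W = np.zeros((d, n_classes))
    Y = np.eye(n_classes)[y]
    for _ in range(steps):
        P = softmax(X @ W)
        W -= lr * X.T @ (P - Y) / len(X)
    return W

# Synthetic heterogeneous clients: each client's labels are dominated by one class.
clients = []
for k in range(n_clients):
    major = k % n_classes
    p = [0.9 if c == major else 0.1 / (n_classes - 1) for c in range(n_classes)]
    y = rng.choice(n_classes, size=100, p=p)
    X = rng.normal(loc=y[:, None].astype(float), scale=0.5, size=(100, d))
    clients.append((X, y))

client_Ws = [train_linear(X, y) for X, y in clients]

# Server gathers condensed client data as the distillation transfer set
# (replacing the public dataset that plain ensemble distillation needs).
X_dist = np.concatenate([condense(X, y) for X, y in clients])

# Ensemble distillation: match the averaged client predictions on X_dist,
# starting from the FedAvg model (mean of client weights).
teacher = np.mean([softmax(X_dist @ W) for W in client_Ws], axis=0)
W_server = np.mean(client_Ws, axis=0)
for _ in range(300):
    P = softmax(X_dist @ W_server)
    W_server -= 0.5 * X_dist.T @ (P - teacher) / len(X_dist)

# Evaluate the distilled server model on the pooled client data.
X_all = np.concatenate([X for X, _ in clients])
y_all = np.concatenate([y for _, y in clients])
acc = (softmax(X_all @ W_server).argmax(axis=1) == y_all).mean()
```

The design point the thesis makes survives even in this toy form: the transfer set is built from client-side condensed data, so the server needs no auxiliary public dataset to run the distillation step.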
Appears in Collection: AI-Theses_Master (석사논문, master's theses)
Files in This Item
There are no files associated with this item.
