FedBalancer: Data and Pace Control for Efficient Federated Learning on Heterogeneous Clients

DC Field | Value | Language
dc.contributor.author | Shin, Jaemin | ko
dc.contributor.author | Li, Yuanchun | ko
dc.contributor.author | Liu, Yunxin | ko
dc.contributor.author | Lee, Sung-Ju | ko
dc.date.accessioned | 2022-09-29T02:00:17Z | -
dc.date.available | 2022-09-29T02:00:17Z | -
dc.date.created | 2022-09-27 | -
dc.date.issued | 2022-06 | -
dc.identifier.citation | 20th ACM International Conference on Mobile Systems, Applications and Services, MobiSys 2022, pp. 436-449 | -
dc.identifier.uri | http://hdl.handle.net/10203/298762 | -
dc.description.abstract | Federated Learning (FL) trains a machine learning model on distributed clients without exposing individual data. Unlike centralized training, which is usually based on carefully organized data, FL deals with on-device data that are often unfiltered and imbalanced. As a result, the conventional FL training protocol that treats all data equally wastes local computational resources and slows down the global learning process. To this end, we propose FedBalancer, a systematic FL framework that actively selects clients' training samples. Our sample selection strategy prioritizes more informative data while respecting the privacy and computational capabilities of clients. To better utilize sample selection for speeding up global training, we further introduce an adaptive deadline control scheme that predicts the optimal deadline for each round with varying client training data. Compared with existing FL algorithms with deadline configuration methods, our evaluation on five datasets from three different domains shows that FedBalancer improves the time-to-accuracy performance by 1.20∼4.48× while improving the model accuracy by 1.1∼5.0%. We also show that FedBalancer is readily applicable to other FL approaches by demonstrating that it improves the convergence speed and accuracy when operating jointly with three different FL algorithms. | -
dc.language | English | -
dc.publisher | Association for Computing Machinery, Inc | -
dc.title | FedBalancer: Data and Pace Control for Efficient Federated Learning on Heterogeneous Clients | -
dc.type | Conference | -
dc.identifier.scopusid | 2-s2.0-85134001634 | -
dc.type.rims | CONF | -
dc.citation.beginningpage | 436 | -
dc.citation.endingpage | 449 | -
dc.citation.publicationname | 20th ACM International Conference on Mobile Systems, Applications and Services, MobiSys 2022 | -
dc.identifier.conferencecountry | US | -
dc.identifier.conferencelocation | Portland | -
dc.identifier.doi | 10.1145/3498361.3538917 | -
dc.contributor.localauthor | Lee, Sung-Ju | -
dc.contributor.nonIdAuthor | Li, Yuanchun | -
dc.contributor.nonIdAuthor | Liu, Yunxin | -
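The abstract above describes two mechanisms: loss-based selection of informative training samples on each client, and adaptive per-round deadline control. As an illustration only, the following is a minimal Python sketch of the first idea; the function name, the fixed loss threshold, and the `extra_fraction` parameter are assumptions for this sketch and are not taken from the paper's implementation.

```python
import numpy as np

def select_samples(losses, loss_threshold, extra_fraction=0.25, rng=None):
    """Pick the sample indices a client trains on this round.

    Keeps every sample whose latest loss exceeds `loss_threshold`
    (treated as more informative) plus a random `extra_fraction` of the
    low-loss samples so they are still revisited occasionally.
    """
    rng = rng or np.random.default_rng()
    losses = np.asarray(losses)
    high = np.flatnonzero(losses > loss_threshold)   # informative samples
    low = np.flatnonzero(losses <= loss_threshold)   # already well-learned samples
    n_extra = int(np.ceil(extra_fraction * low.size))
    extra = rng.choice(low, size=n_extra, replace=False) if n_extra else np.array([], dtype=int)
    return np.concatenate([high, extra])

# Example: per-sample losses from one client's latest forward pass.
losses = [2.3, 0.1, 1.7, 0.05, 0.9, 3.1]
print(select_samples(losses, loss_threshold=1.0))
```

In the paper, the selection threshold and the round deadline are adapted jointly across rounds; this sketch fixes both and only shows the per-round selection step.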
Appears in Collection
EE-Conference Papers (학술회의논문)
Files in This Item
There are no files associated with this item.
