FedMes: Speeding Up Federated Learning with Multiple Edge Servers

Cited 18 times in Web of Science; cited 0 times in Scopus
  • Hit : 223
  • Download : 0
DC Field: Value (Language)
dc.contributor.author: Han, Dong-Jun (ko)
dc.contributor.author: Park, Jungwuk (ko)
dc.contributor.author: Choi, Minseok (ko)
dc.contributor.author: Moon, Jaekyun (ko)
dc.date.accessioned: 2021-11-25T06:40:14Z
dc.date.available: 2021-11-25T06:40:14Z
dc.date.created: 2021-11-24
dc.date.issued: 2021-12
dc.identifier.citation: IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, v.39, no.12, pp.3870 - 3885
dc.identifier.issn: 0733-8716
dc.identifier.uri: http://hdl.handle.net/10203/289459
dc.description.abstract: We consider federated learning (FL) with multiple wireless edge servers having their own local coverage. We focus on speeding up training in this increasingly practical setup. Our key idea is to utilize the clients located in the overlapping coverage areas among adjacent edge servers (ESs); in the model-downloading stage, the clients in the overlapping areas receive multiple models from different ESs, take the average of the received models, and then update the averaged model with their local data. These clients send their updated model to multiple ESs by broadcasting, thereby acting as bridges for sharing the trained models between servers. Even when some ESs are given biased datasets within their coverage regions, their training processes can be assisted by adjacent servers through the clients in their overlapping regions. As a result, the proposed scheme does not require costly communications with the central cloud server (located at the tier above the edge servers) for model synchronization, significantly reducing the overall training time compared to conventional cloud-based FL systems. Extensive experimental results show remarkable performance gains of our scheme compared to existing methods. Our design targets latency-sensitive applications where edge-based FL is essential, e.g., when a number of connected cars/drones must cooperate (via FL) to quickly adapt to dynamically changing environments.
dc.language: English
dc.publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.title: FedMes: Speeding Up Federated Learning with Multiple Edge Servers
dc.type: Article
dc.identifier.wosid: 000720517900022
dc.identifier.scopusid: 2-s2.0-85119597037
dc.type.rims: ART
dc.citation.volume: 39
dc.citation.issue: 12
dc.citation.beginningpage: 3870
dc.citation.endingpage: 3885
dc.citation.publicationname: IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS
dc.identifier.doi: 10.1109/JSAC.2021.3118422
dc.contributor.localauthor: Moon, Jaekyun
dc.contributor.nonIdAuthor: Choi, Minseok
dc.description.isOpenAccess: N
dc.type.journalArticle: Article
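The abstract describes the key client-side step of FedMes: a client in an overlapping coverage area averages the models it receives from multiple edge servers, runs a local update on the averaged model, and broadcasts the result back. A minimal sketch of that step is below; all names (`average_models`, `local_update`, the toy gradient) are illustrative assumptions, not from the paper's implementation, and models are represented as plain dicts of NumPy arrays.

```python
import numpy as np

def average_models(models):
    """Element-wise average of the models received from multiple edge
    servers; each model is a dict mapping parameter names to arrays."""
    keys = models[0].keys()
    return {k: np.mean([m[k] for m in models], axis=0) for k in keys}

def local_update(model, grad_fn, lr=0.5, steps=1):
    """Plain SGD on the averaged model, standing in for the client's
    local training on its own data."""
    for _ in range(steps):
        grads = grad_fn(model)
        model = {k: model[k] - lr * grads[k] for k in model}
    return model

# Example: a client covered by two adjacent edge servers.
model_es1 = {"w": np.array([1.0, 2.0])}
model_es2 = {"w": np.array([3.0, 4.0])}

avg = average_models([model_es1, model_es2])  # w -> [2.0, 3.0]

# Toy gradient (weights themselves), a placeholder for real local-data
# gradients; one SGD step with lr=0.5 halves each weight.
updated = local_update(avg, lambda m: dict(m))
# The client would then broadcast `updated` to both edge servers,
# bridging their training processes without a central cloud server.
```

The broadcast step is what lets adjacent servers share trained models through overlapped clients, which is the mechanism the abstract credits for avoiding cloud-based synchronization.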
Appears in Collection
EE-Journal Papers (Journal Papers)
Files in This Item
There are no files associated with this item.