Regulated Subspace Projection Based Local Model Update Compression for Communication-Efficient Federated Learning

Despite its high utility in distributed networks, federated learning entails enormous communication overhead because trained models must be exchanged at every global iteration. When communication resources are limited, as in wireless environments, this overhead can severely degrade learning performance. On this account, communication efficiency is one of the primary concerns in federated learning. In this paper, we put forth a communication-efficient federated learning system based on the projection of local model updates. Leveraging the correlation between consecutive local model updates, we devise a novel compression scheme that projects each local model update onto a selected subspace. Furthermore, to avoid error propagation across global iterations and thereby improve learning performance, we also develop novel criteria for deciding whether each local model update should be compressed. The convergence of the proposed algorithm is proved mathematically by deriving an upper bound on the mean squared error of the global parameter. The merits of the proposed algorithm over state-of-the-art benchmark schemes are verified through various simulations.
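
The compression idea summarized above can be illustrated with a minimal sketch. This is an assumption-laden illustration, not the paper's exact construction: here the subspace is simply the span of the few most recent local updates, and a relative-residual threshold stands in for the paper's compression-decision criteria. The names build_subspace and compress_update, the threshold residual_tol, and all dimensions are hypothetical.

import numpy as np

def build_subspace(prev_updates, rank):
    # Orthonormal basis (d x rank) spanning the most recent local updates.
    U, _, _ = np.linalg.svd(np.stack(prev_updates, axis=1), full_matrices=False)
    return U[:, :rank]

def compress_update(update, basis, residual_tol=0.1):
    # Project the update onto the subspace; if the relative projection
    # residual is small, transmit only the low-dimensional coefficients,
    # otherwise fall back to the full (uncompressed) update.
    coeffs = basis.T @ update
    residual = np.linalg.norm(update - basis @ coeffs) / np.linalg.norm(update)
    if residual <= residual_tol:
        return coeffs, True      # rank scalars instead of d scalars
    return update, False         # full d-dimensional update

# Toy usage: d = 1000 parameters, subspace built from the 5 previous updates.
rng = np.random.default_rng(0)
prev = [rng.standard_normal(1000) for _ in range(5)]
new_update = 0.95 * prev[-1] + 0.05 * rng.standard_normal(1000)  # correlated with history
basis = build_subspace(prev, rank=5)
payload, compressed = compress_update(new_update, basis)
print("compressed:", compressed, "transmitted values:", payload.size)

When consecutive updates are strongly correlated, as in the toy example, only the subspace coefficients need to be sent; when the residual exceeds the threshold, the full update is transmitted, which loosely mirrors the paper's goal of limiting error propagation across global iterations.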
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
Issue Date
2023-04
Language
English
Article Type
Article
Citation

IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, v.41, no.4, pp.964 - 976

ISSN
0733-8716
DOI
10.1109/JSAC.2023.3242722
URI
http://hdl.handle.net/10203/306426
Appears in Collection
RIMS Journal Papers
Files in This Item
There are no files associated with this item.