Wireless Federated Distillation for Distributed Edge Learning with Heterogeneous Data

Cited 46 times in Web of Science; cited 33 times in Scopus
Abstract
Cooperative training methods for distributed machine learning typically assume noiseless and ideal communication channels. This work studies some of the opportunities and challenges arising from the presence of wireless communication links. We specifically consider wireless implementations of Federated Learning (FL) and Federated Distillation (FD), as well as of a novel Hybrid Federated Distillation (HFD) scheme. Both digital implementations based on separate source-channel coding and over-the-air computing implementations based on joint source-channel coding are proposed and evaluated over Gaussian multiple-access channels.
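The over-the-air computing idea mentioned in the abstract can be illustrated with a short, hypothetical Python/NumPy sketch; it is not the paper's actual scheme, and every name and parameter below (K, d, P, noise_std, alpha) is an assumption made for illustration. The sketch assumes K devices scale their local updates (model weights for FL, averaged logits for FD/HFD), all transmit simultaneously so the Gaussian multiple-access channel superposes the signals and adds noise, and the server rescales the received sum to recover a noisy average.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical setup: K edge devices each hold a local update vector
    # (model weights for FL, or per-label averaged logits for FD/HFD).
    K, d = 10, 128
    local_updates = [rng.normal(size=d) for _ in range(K)]

    # Ideal (noiseless) average, for comparison with the over-the-air estimate.
    ideal_avg = np.mean(local_updates, axis=0)

    # Over-the-air computing over a Gaussian multiple-access channel:
    # devices transmit scaled analog symbols at the same time, so the
    # channel itself performs the summation.
    P = 1.0          # assumed per-device power budget
    noise_std = 0.1  # assumed channel noise standard deviation
    gains = [np.sqrt(P) / np.linalg.norm(u) for u in local_updates]
    alpha = min(gains)  # common scaling so every device respects its budget

    superposed = sum(alpha * u for u in local_updates)           # channel sum
    received = superposed + rng.normal(scale=noise_std, size=d)  # additive Gaussian noise
    ota_avg = received / (alpha * K)                             # server rescales

    print("error vs. ideal average:", np.linalg.norm(ota_avg - ideal_avg))

A digital implementation with separate source-channel coding would instead quantize and encode each update on its own channel resources; the sketch above only shows the analog, joint source-channel alternative under idealized power control and no fading.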
Publisher
Institute of Electrical and Electronics Engineers Inc.
Issue Date
2019-09-08
Language
English
Citation

30th IEEE Annual International Symposium on Personal, Indoor and Mobile Radio Communications, PIMRC 2019, pp. 1138-1143

DOI
10.1109/PIMRC.2019.8904164
URI
http://hdl.handle.net/10203/274421
Appears in Collection
EE-Conference Papers
Files in This Item
There are no files associated with this item.