Forget-SVGD: Particle-Based Bayesian Federated Unlearning

Cited 6 times in Web of Science; cited 0 times in Scopus
Variational particle-based Bayesian learning methods have the advantage of not being limited by the bias affecting more conventional parametric techniques. This paper leverages the flexibility of non-parametric Bayesian approximate inference to develop a novel Bayesian federated unlearning method, referred to as Forget-Stein Variational Gradient Descent (Forget-SVGD). Forget-SVGD builds on SVGD – a particle-based approximate Bayesian inference scheme that uses gradient-based deterministic updates – and on its distributed (federated) extension, Distributed SVGD (DSVGD). Upon the completion of federated learning, when one or more participating agents request that their data be “forgotten”, Forget-SVGD carries out local SVGD updates at the agents whose data are to be “unlearned”, interleaved with communication rounds with a parameter server. The proposed method is validated via performance comparisons with non-parametric schemes that train from scratch while excluding the data to be forgotten, as well as with existing parametric Bayesian unlearning methods.
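The SVGD building block described in the abstract moves a set of particles with deterministic, kernel-smoothed gradient steps. Below is a minimal single-machine sketch of the standard SVGD update on a toy 1-D Gaussian target, assuming an RBF kernel with a fixed bandwidth; the function names and hyperparameters are illustrative and are not taken from the paper's implementation:

```python
import numpy as np

def svgd_update(particles, grad_log_p, bandwidth=1.0):
    """One SVGD step direction for a set of 1-D particles.

    particles: array of shape (n,)
    grad_log_p: callable returning d/dx log p(x) elementwise
    """
    diffs = particles[:, None] - particles[None, :]      # x_j - x_i, shape (n, n)
    k = np.exp(-diffs**2 / (2 * bandwidth**2))           # RBF kernel k(x_j, x_i)
    grad_k = -diffs / bandwidth**2 * k                   # d k(x_j, x_i) / d x_j
    # phi(x_i) = mean_j [ k(x_j, x_i) * grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    phi = (k * grad_log_p(particles)[:, None]).mean(axis=0) + grad_k.mean(axis=0)
    return phi

# Toy target: standard normal posterior, so grad log p(x) = -x.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=1.0, size=50)  # particles initialized far from the target
for _ in range(1000):
    x = x + 0.1 * svgd_update(x, lambda z: -z)
```

In the federated setting of the paper, updates of this form are carried out locally at the agents whose data must be unlearned and are interleaved with communication rounds with the parameter server.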
Publisher
Institute of Electrical and Electronics Engineers Inc.
Issue Date
2022-05
Language
English
Citation

2022 IEEE Data Science and Learning Workshop (DSLW)

DOI
10.1109/DSLW53931.2022.9820602
URI
http://hdl.handle.net/10203/298079
Appears in Collection
EE-Conference Papers (Conference Papers)
Files in This Item
There are no files associated with this item.