Gauging Variational Inference

DC Field | Value
dc.contributor.author | Ahn, Sungsoo
dc.contributor.author | Chertkov, Michael
dc.contributor.author | Shin, Jinwoo
dc.identifier.citation | 31st Annual Conference on Neural Information Processing Systems, NIPS 2017, pp. 2882–2891
dc.description.abstract | Computing the partition function is the most important statistical inference task arising in applications of Graphical Models (GMs). Since it is computationally intractable, approximate methods are used in practice, where mean-field (MF) and belief propagation (BP) are arguably the most popular and successful approaches of a variational type. In this paper, we propose two new variational schemes, coined Gauged-MF (G-MF) and Gauged-BP (G-BP), improving MF and BP, respectively. Both provide lower bounds on the partition function by utilizing the so-called gauge transformation, which modifies the factors of a GM while keeping the partition function invariant. Moreover, we prove that both G-MF and G-BP are exact for GMs with a single loop of a special structure, even though bare MF and BP perform poorly in this case. Our extensive experiments confirm that the proposed algorithms outperform and generalize MF and BP. © 2017 Neural Information Processing Systems Foundation. All rights reserved.
dc.publisher | Neural Information Processing Systems Foundation
dc.title | Gauging Variational Inference
dc.citation.publicationname | 31st Annual Conference on Neural Information Processing Systems, NIPS 2017
dc.identifier.conferencelocation | Long Beach Convention Center
dc.contributor.localauthor | Shin, Jinwoo
dc.contributor.nonIdAuthor | Chertkov, Michael
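The abstract rests on two facts: the partition function sums products of factors over all variable configurations (hence is intractable in general), and variational schemes such as MF give lower bounds on it. A minimal Python sketch of both on a toy pairwise binary model; the model, its couplings, and the uniform mean-field distribution are illustrative assumptions, not taken from the paper:

```python
import itertools
import math

# Hypothetical tiny pairwise binary GM (not from the paper):
# unnormalized weight w(x) = exp( sum_{(i,j)} J[i,j] * x_i * x_j ), x_i in {-1, +1}
J = {(0, 1): 0.5, (1, 2): 0.3, (0, 2): -0.2}
n = 3

def log_weight(x):
    """Log of the unnormalized weight of configuration x."""
    return sum(Jij * x[i] * x[j] for (i, j), Jij in J.items())

# Exact partition function Z by brute-force enumeration over all 2^n states
# (this is precisely what becomes intractable for large n).
Z = sum(math.exp(log_weight(x)) for x in itertools.product([-1, 1], repeat=n))

# Mean-field-style lower bound: for any product distribution q(x) = prod_i q_i(x_i),
# Jensen's inequality gives  log Z >= E_q[log w(x)] + H(q)  (the ELBO).
# With q uniform, E_q[x_i * x_j] = 0 for i != j, so E_q[log w] = 0 and H(q) = n*log(2).
mf_bound = 0.0 + n * math.log(2)

print(math.log(Z), mf_bound)  # log Z ≈ 2.24, bound ≈ 2.08: the bound holds
```

G-MF and G-BP, per the abstract, tighten this kind of bound by first gauge-transforming the factors (here, the entries of `J`) in a way that leaves `Z` unchanged.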
Appears in Collection
EE – Conference Papers