Bucket renormalization for approximate inference

Abstract
Probabilistic graphical models are a key tool in machine learning applications. Computing the partition function, i.e., the normalizing constant, is a fundamental task of statistical inference, but it is generally computationally intractable, leading to extensive study of approximation methods. Iterative variational methods are a popular and successful family of approaches. However, even state-of-the-art variational methods can return poor results or fail to converge on difficult instances. In this paper, we instead consider computing the partition function via sequential summation over variables. We develop robust approximate algorithms by combining ideas from mini-bucket elimination with tensor-network and renormalization-group methods from statistical physics. The resulting 'convergence-free' methods show good empirical performance on both synthetic and real-world benchmark models, even for difficult instances.
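
The abstract's starting point, computing the partition function by summing out variables one at a time, can be illustrated with a short sketch. The Python code below is a toy exact bucket (variable) elimination routine on an invented three-variable model; the function names and factor tables are hypothetical, and this is not the paper's mini-bucket or bucket-renormalization algorithm, only the exact summation that those methods approximate.

import numpy as np
from itertools import product

# Toy sketch: exact "sequential summation over variables" (bucket/variable
# elimination) for the partition function Z of a small discrete model.
# Not the paper's method; the factors below are invented for illustration.

def eliminate_partition_function(factors, order, card=2):
    """Sum out the variables in `order` one at a time and return Z.

    Each factor is (scope, table), where the table's axes follow sorted(scope).
    """
    factors = [(frozenset(scope), np.asarray(table)) for scope, table in factors]
    for var in order:
        bucket = [f for f in factors if var in f[0]]       # factors touching var
        factors = [f for f in factors if var not in f[0]]
        scope = sorted(frozenset().union(*(s for s, _ in bucket)))
        new_scope = [v for v in scope if v != var]
        new_table = np.zeros((card,) * len(new_scope))
        # Multiply the bucket's factors and sum over all values of `var`.
        for config in product(range(card), repeat=len(scope)):
            assign = dict(zip(scope, config))
            val = 1.0
            for s, t in bucket:
                val *= t[tuple(assign[v] for v in sorted(s))]
            new_table[tuple(assign[v] for v in new_scope)] += val
        factors.append((frozenset(new_scope), new_table))
    # With every variable eliminated, only scalar factors remain.
    Z = 1.0
    for _, t in factors:
        Z *= float(t)
    return Z

def agree_factor(coupling, card=2):
    """Toy pairwise factor: exp(+coupling) if the two variables agree, exp(-coupling) otherwise."""
    return np.array([[np.exp(coupling if a == b else -coupling)
                      for b in range(card)] for a in range(card)])

# Toy three-variable cycle with pairwise interactions.
factors = [((0, 1), agree_factor(0.5)),
           ((1, 2), agree_factor(-0.3)),
           ((0, 2), agree_factor(0.8))]
print("Z =", eliminate_partition_function(factors, order=[0, 1, 2]))

Exact elimination of this kind becomes intractable when the intermediate tables grow large; roughly speaking, mini-bucket elimination caps the size of those tables by splitting buckets, and the renormalization-group ideas referenced in the abstract refine that approximation.
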
Publisher
IOP PUBLISHING LTD
Issue Date
2019-12
Language
English
Article Type
Article
Citation
JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, v.2019, no.12, pp.124022
ISSN
1742-5468
DOI
10.1088/1742-5468/ab3218
URI
http://hdl.handle.net/10203/272611
Appears in Collection
AI-Journal Papers (Journal Papers)