Minimizing expected losses in perturbation models with multidimensional parametric min-cuts

We consider the problem of learning perturbation-based probabilistic models by computing and differentiating expected losses. This is a challenging computational problem that has traditionally been tackled using Monte Carlo-based methods. In this work, we show how a generalization of parametric min-cuts can be used to address the same problem, achieving higher accuracy and faster performance than a sampling-based baseline. Utilizing our proposed Skeleton Method, we show that we can learn the perturbation model so as to directly minimize expected losses. Experimental results show that this approach offers promise as a new way of training structured prediction models under complex loss functions.
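To make the contrast in the abstract concrete, the following is a minimal, self-contained sketch of the sampling-based baseline it refers to: estimating the expected loss of a perturb-and-MAP model by Monte Carlo, here with i.i.d. Gumbel perturbations of the unary potentials, a toy binary chain model, brute-force MAP, and Hamming loss. All names (`expected_hamming_loss`, the toy potentials, the loss choice) are illustrative assumptions, not the paper's Skeleton Method or its parametric min-cut machinery.

```python
import itertools
import numpy as np

def map_assignment(unary, pairwise, edges):
    """Brute-force MAP over binary labels (adequate for the toy sizes used here)."""
    n = unary.shape[0]
    best_x, best_score = None, -np.inf
    for x in itertools.product([0, 1], repeat=n):
        score = sum(unary[i, x[i]] for i in range(n))
        score += sum(pairwise[k, x[i], x[j]] for k, (i, j) in enumerate(edges))
        if score > best_score:
            best_x, best_score = np.array(x), score
    return best_x

def expected_hamming_loss(unary, pairwise, edges, y_true, num_samples=200, rng=None):
    """Monte Carlo estimate of the expected loss under Gumbel perturbations of the
    unaries -- the sampling-based baseline the abstract compares against."""
    rng = np.random.default_rng() if rng is None else rng
    losses = []
    for _ in range(num_samples):
        gamma = rng.gumbel(size=unary.shape)      # i.i.d. Gumbel noise on unary potentials
        y_hat = map_assignment(unary + gamma, pairwise, edges)
        losses.append(np.mean(y_hat != y_true))   # Hamming loss of the perturbed MAP
    return float(np.mean(losses))

# Toy chain model with 4 binary variables.
rng = np.random.default_rng(0)
edges = [(0, 1), (1, 2), (2, 3)]
unary = rng.normal(size=(4, 2))
pairwise = np.stack([np.array([[0.5, 0.0], [0.0, 0.5]]) for _ in edges])  # attractive couplings
y_true = np.array([0, 0, 1, 1])
print("estimated expected Hamming loss:",
      expected_hamming_loss(unary, pairwise, edges, y_true, rng=rng))
```

The paper's contribution is to replace this sampling step with a generalization of parametric min-cuts (the Skeleton Method), computing and differentiating the expected loss so the perturbation model can be trained to minimize it directly.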
Publisher
Association for Uncertainty in Artificial Intelligence
Issue Date
2015-07
Language
English
Citation
31st Conference on Uncertainty in Artificial Intelligence, UAI 2015, pp.435 - 443
URI
http://hdl.handle.net/10203/313370
Appears in Collection
RIMS Conference Papers
Files in This Item
There are no files associated with this item.