Variational Inference for Sequential Data with Future Likelihood Estimates

The recent development of flexible and scalable variational inference algorithms has popularized the use of deep probabilistic models in a wide range of applications. However, learning and reasoning about high-dimensional models with non-differentiable densities remain challenging: for such models, inference algorithms struggle to estimate the gradients of variational objectives accurately because their estimates have high variance. To tackle this challenge, we present a novel variational inference algorithm for sequential data that performs well even when the model's density is not differentiable, for instance due to the use of discrete random variables. The key feature of our algorithm is that it estimates future likelihoods at all time steps. These estimated future likelihoods form the core of our new low-variance gradient estimator. We formally analyze our gradient estimator from the perspective of the variational objective, and demonstrate the effectiveness of our algorithm on synthetic and real datasets.
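The abstract's central idea can be illustrated with a standard variance-reduction trick that it generalizes. The sketch below is an illustrative assumption, not the paper's actual estimator: for a toy sequential model with discrete (Bernoulli) latents and a made-up per-step log-likelihood, a score-function (REINFORCE-style) gradient that scales each step's score by only the likelihood terms from that step onward (a "future likelihood", or reward-to-go) stays unbiased while its variance drops, compared with scaling every score by the full objective.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 5            # sequence length (toy choice)
theta = 0.3      # q(z_t = 1) = theta at every step (toy mean-field proposal)

def sample_and_scores(theta, T, rng):
    # Sample discrete latents z_t ~ Bernoulli(theta) and return per-step
    # "likelihood" terms plus score functions d/dtheta log q(z_t).
    z = (rng.random(T) < theta).astype(float)
    step_ll = np.where(z == 1.0, 0.5, -0.5)       # toy per-step log-likelihood
    score = z / theta - (1.0 - z) / (1.0 - theta)  # d/dtheta log Bernoulli(z; theta)
    return step_ll, score

def naive_grad(theta, T, rng):
    # Every score is scaled by the FULL objective, including past terms
    # that do not depend on z_t -- unbiased but high variance.
    step_ll, score = sample_and_scores(theta, T, rng)
    return score.sum() * step_ll.sum()

def future_grad(theta, T, rng):
    # Scale each score only by the terms from step t onward; past terms
    # are constants w.r.t. z_t, so dropping them preserves unbiasedness.
    step_ll, score = sample_and_scores(theta, T, rng)
    future = np.cumsum(step_ll[::-1])[::-1]        # suffix sums: "future likelihood" at each t
    return (score * future).sum()

naive = np.array([naive_grad(theta, T, rng) for _ in range(20000)])
future = np.array([future_grad(theta, T, rng) for _ in range(20000)])
print(f"naive:  mean={naive.mean():.3f}  var={naive.var():.1f}")
print(f"future: mean={future.mean():.3f}  var={future.var():.1f}")
```

Both estimators target the same gradient (their sample means agree), but the future-likelihood version has markedly lower variance. The paper's contribution goes further, estimating these future likelihoods with learned quantities rather than the raw suffix sums used here.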
Publisher
ICML Organisation

Citation
The 37th International Conference on Machine Learning (ICML 2020), pp. 6843-6852

Appears in Collection
CS-Conference Papers (Conference Papers); RIMS Conference Papers

