Academic Journals Database

Approximate inference via variational sampling

Author(s): Alexis Roche

Journal: International Journal of Advanced Statistics and Probability
ISSN 2307-9045

Volume: 1
Issue: 3
Start page: 110
Date: 2013

ABSTRACT
A new method called “variational sampling” is proposed to estimate integrals under probability distributions that can be evaluated up to a normalizing constant. The key idea is to fit the target distribution with an exponential family model by minimizing a strongly consistent empirical approximation to the Kullback-Leibler divergence, computed using either deterministic or random sampling. It is shown how variational sampling differs conceptually from both quadrature and importance sampling, and it is established that, in the case of random independence sampling, it may achieve much faster stochastic convergence than importance sampling under mild conditions. The implementation presented in this paper requires a rough initial approximation to the target distribution, which may be found, e.g., using the Laplace method. It is then shown to have the potential to substantially improve over several existing approximate inference techniques for estimating moments of order up to two of nearly-Gaussian distributions, which occur frequentlyly in Bayesian analysis. In particular, an application of variational sampling to Bayesian logistic regression in moderate dimension is presented.
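The core idea of the abstract can be sketched in a minimal form. For a fully parameterized Gaussian family, minimizing a sampled approximation to the Kullback-Leibler divergence reduces to matching importance-weighted moments of the unnormalized target. The target density, the initial approximation, and all names below are illustrative assumptions and not taken from the paper; the paper's actual scheme is more general (e.g., it allows deterministic sampling rules).

```python
import numpy as np

def log_target(x):
    # Hypothetical unnormalized log-density: a nearly-Gaussian target,
    # chosen here only for illustration (not from the paper).
    return -0.5 * x**2 + 0.2 * np.sin(x)

rng = np.random.default_rng(0)

# Rough initial approximation to the target (e.g., from a Laplace fit):
# here an assumed N(0, 1.5^2).
mu0, sigma0 = 0.0, 1.5
n = 10_000
x = rng.normal(mu0, sigma0, size=n)  # sample from the initial fit

# Log importance weights: unnormalized target over sampling density.
log_q0 = -0.5 * ((x - mu0) / sigma0) ** 2 - np.log(sigma0) - 0.5 * np.log(2 * np.pi)
log_w = log_target(x) - log_q0
w = np.exp(log_w - log_w.max())
w /= w.sum()  # self-normalized weights

# For a Gaussian (exponential family) fit, minimizing the empirical KL
# approximation reduces to matching the weighted first and second moments.
mu_hat = np.sum(w * x)
var_hat = np.sum(w * (x - mu_hat) ** 2)
print(f"fitted mean {mu_hat:.3f}, fitted variance {var_hat:.3f}")
```

In this simplified one-dimensional sketch, the fitted mean and variance are moment estimates of the target of order up to two, the quantities the abstract highlights for nearly-Gaussian posteriors.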
