pith. machine review for the scientific record.

arxiv: 1603.01140 · v1 · submitted 2016-03-03 · 📊 stat.ML

Recognition: unknown

Overdispersed Black-Box Variational Inference

Authors on Pith: no claims yet
classification 📊 stat.ML
keywords: variational · black-box · inference · overdispersed · distribution · importance · samples · variance
read the original abstract

We introduce overdispersed black-box variational inference, a method to reduce the variance of the Monte Carlo estimator of the gradient in black-box variational inference. Instead of taking samples from the variational distribution, we use importance sampling to take samples from an overdispersed distribution in the same exponential family as the variational approximation. Our approach is general since it can be readily applied to any exponential family distribution, which is the typical choice for the variational approximation. We run experiments on two non-conjugate probabilistic models to show that our method effectively reduces the variance, and the overhead introduced by the computation of the proposal parameters and the importance weights is negligible. We find that our overdispersed importance sampling scheme provides lower variance than black-box variational inference, even when the latter uses twice the number of samples. This results in faster convergence of the black-box inference procedure.
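The estimator described in the abstract can be sketched in a minimal 1-D Gaussian setting: sample from a proposal in the same family as q but with inflated variance, then reweight the score-function gradient estimator by the importance weights w = q/r. This is an illustrative sketch only; the target `log_joint`, the dispersion factor `tau`, and all function names are hypothetical stand-ins, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_joint(z):
    # Toy unnormalized target: log p(x, z) = log N(z; 1, 1) + const
    return -0.5 * (z - 1.0) ** 2

def log_q(z, mu, sigma):
    # Log-density of the Gaussian variational approximation q(z; mu, sigma)
    return -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * ((z - mu) / sigma) ** 2

def score_mu(z, mu, sigma):
    # Score function d/dmu log q(z; mu, sigma)
    return (z - mu) / sigma**2

def od_bbvi_grad_mu(mu, sigma, tau=2.0, n_samples=20000):
    """Monte Carlo ELBO gradient w.r.t. mu, sampling from an
    overdispersed proposal r(z) = N(mu, tau * sigma^2) in the same
    (Gaussian) family and correcting with importance weights q/r."""
    s_prop = np.sqrt(tau) * sigma          # inflated proposal std
    z = rng.normal(mu, s_prop, size=n_samples)
    w = np.exp(log_q(z, mu, sigma) - log_q(z, mu, s_prop))  # q/r
    f = score_mu(z, mu, sigma) * (log_joint(z) - log_q(z, mu, sigma))
    return np.mean(w * f)

g = od_bbvi_grad_mu(mu=0.0, sigma=1.0)
```

For this toy target the true gradient of the ELBO with respect to mu at (mu=0, sigma=1) is 1, so `g` should land near 1; the claimed benefit in the paper is that sampling from the overdispersed proposal lowers the variance of this estimate relative to sampling from q itself.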

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Time-dependent variational Monte Carlo without bias

    quant-ph 2026-05 unverdicted novelty 6.0

    An unbiased time-dependent variational Monte Carlo method is introduced via self-normalized importance sampling on a cutoff-deformed Born distribution, with a complementary tensor cross interpolation approach explored.