Pith: machine review for the scientific record

arXiv: 1802.06847 · v4 · submitted 2018-02-19 · stat.ML · cs.LG


Distribution Matching in Variational Inference

Keywords: distributions · limitations · GANs · hybrids · VAEs · generative · inference · learning
Abstract

With the increasingly widespread deployment of generative models, there is a mounting need for a deeper understanding of their behaviors and limitations. In this paper, we expose the limitations of Variational Autoencoders (VAEs), which consistently fail to learn marginal distributions in both latent and visible spaces. We show this to be a consequence of learning by matching conditional distributions, and the limitations of explicit model and posterior distributions. It is popular to consider Generative Adversarial Networks (GANs) as a means of overcoming these limitations, leading to hybrids of VAEs and GANs. We perform a large-scale evaluation of several VAE-GAN hybrids and analyze the implications of class probability estimation for learning distributions. While promising, we conclude that at present, VAE-GAN hybrids have limited applicability: they are harder to scale, evaluate, and use for inference compared to VAEs; and they do not improve over the generation quality of GANs.
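The abstract's claim that VAEs learn by "matching conditional distributions" can be made concrete with a standard identity: averaging the ELBO over the data distribution p*(x) equals, up to the constant entropy H[p*], the negative KL divergence between the two joints q(z|x)p*(x) and p(x|z)p(z). Matching these joints constrains the conditionals, not the marginals q(z) and p(x) directly. The toy discrete check below is our own sketch, not code from the paper; all distribution names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize(a, axis=-1):
    """Scale a nonnegative array so it sums to 1 along `axis`."""
    return a / a.sum(axis=axis, keepdims=True)

p_star = normalize(rng.random(3))             # data marginal p*(x)
p_z = normalize(rng.random(4))                # prior p(z)
p_x_given_z = normalize(rng.random((4, 3)))   # likelihood p(x|z): row z sums over x
q_z_given_x = normalize(rng.random((3, 4)))   # posterior q(z|x): row x sums over z

# Model joint p(x, z) = p(z) p(x|z), indexed [x, z].
p_joint = (p_z[:, None] * p_x_given_z).T
# Inference joint q(x, z) = p*(x) q(z|x), indexed [x, z].
q_joint = p_star[:, None] * q_z_given_x

# Average ELBO: E_{p*(x)} E_{q(z|x)} [log p(x, z) - log q(z|x)].
avg_elbo = np.sum(q_joint * (np.log(p_joint) - np.log(q_z_given_x)))
kl_joints = np.sum(q_joint * (np.log(q_joint) - np.log(p_joint)))
entropy = -np.sum(p_star * np.log(p_star))

# Identity: E_{p*}[ELBO] = -KL(q(x,z) || p(x,z)) - H[p*(x)].
assert np.isclose(avg_elbo, -kl_joints - entropy)
```

Since H[p*] is fixed by the data, maximizing the average ELBO is exactly joint-KL minimization; the marginals are matched only indirectly, which is the failure mode the paper examines.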

This paper has not been read by Pith yet.

Discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.

Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Molecular Design beyond Training Data with Novel Extended Objective Functionals of Generative AI Models Driven by Quantum Annealing Computer

    q-bio.QM · 2026-02 · unverdicted · novelty 5.0

    Quantum annealing combined with a Neural Hash Function lets generative models create molecules that are more drug-like than classical versions or the training set itself.