pith. machine review for the scientific record.

arxiv: 1706.02262 · v3 · submitted 2017-06-07 · 💻 cs.LG · cs.AI · stat.ML

Recognition: unknown

InfoVAE: Information Maximizing Variational Autoencoders

Authors on Pith: no claims yet
classification: 💻 cs.LG · cs.AI · stat.ML
keywords: variational autoencoders · inference · models · amortized · decoding · distribution · distributions
read the original abstract

A key advance in learning generative models is the use of amortized inference distributions that are jointly trained with the models. We find that existing training objectives for variational autoencoders can lead to inaccurate amortized inference distributions and, in some cases, improving the objective provably degrades the inference quality. In addition, it has been observed that variational autoencoders tend to ignore the latent variables when combined with a decoding distribution that is too flexible. We again identify the cause in existing training criteria and propose a new class of objectives (InfoVAE) that mitigate these problems. We show that our model can significantly improve the quality of the variational posterior and can make effective use of the latent features regardless of the flexibility of the decoding distribution. Through extensive qualitative and quantitative analyses, we demonstrate that our models outperform competing approaches on multiple performance metrics.
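The abstract's core claim is that the standard ELBO can be rebalanced with an extra divergence between the aggregated posterior q(z) and the prior p(z). A well-known special case of the InfoVAE family uses maximum mean discrepancy (MMD) for that divergence. Below is a minimal NumPy sketch of such an MMD-regularized loss; the function names, the RBF bandwidth, and the weight `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def rbf_mmd(x, y, sigma=1.0):
    """Squared MMD between sample sets x and y under an RBF kernel."""
    def k(a, b):
        # Pairwise squared distances via broadcasting, then Gaussian kernel.
        d = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d / (2.0 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2.0 * k(x, y).mean()

def mmd_vae_loss(x, x_recon, z, lam=10.0, rng=None):
    """Sketch of an InfoVAE/MMD-style objective:
    reconstruction error + lam * MMD(q(z), p(z)), with p(z) = N(0, I)."""
    rng = rng or np.random.default_rng(0)
    recon = ((x - x_recon) ** 2).mean()          # per-element squared error
    z_prior = rng.standard_normal(z.shape)       # samples from the prior
    return recon + lam * rbf_mmd(z, z_prior)
```

Because MMD compares q(z) to the prior in aggregate rather than per data point, this term penalizes a mismatched latent marginal without forcing each posterior toward the prior, which is how objectives in this family can keep the latent code informative even with a flexible decoder.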

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read papers on Pith without signing in.

Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Molecular Design beyond Training Data with Novel Extended Objective Functionals of Generative AI Models Driven by Quantum Annealing Computer

    q-bio.QM · 2026-02 · unverdicted · novelty 5.0

    Quantum annealing combined with a Neural Hash Function lets generative models create molecules that are more drug-like than classical versions or the training set itself.