pith. machine review for the scientific record.

arxiv: 1703.00573 · v5 · submitted 2017-03-02 · cs.LG · cs.NE · stat.ML


Generalization and Equilibrium in Generative Adversarial Nets (GANs)

keywords: training · equilibrium · generalization · adversarial · distribution · generative · generator · appear
Abstract

We show that training of a generative adversarial network (GAN) may not have good generalization properties; e.g., training may appear successful, but the trained distribution may be far from the target distribution in standard metrics. However, generalization does occur for a weaker metric called neural net distance. It is also shown that an approximate pure equilibrium exists in the discriminator/generator game for a special class of generators with natural training objectives when generator capacity and training set sizes are moderate. This existence of equilibrium inspires the MIX+GAN protocol, which can be combined with any existing GAN training method and is empirically shown to improve some of them.
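The neural net distance the abstract refers to is, informally, the largest gap in expected discriminator output between two distributions, maximized over a class of discriminators. A minimal sketch, assuming a toy class of 1-D threshold discriminators as an illustrative stand-in for the paper's neural-net class (the function and variable names here are hypothetical, not from the paper):

```python
import random

def neural_net_distance(xs, ys, thresholds):
    """Empirical d_F(xs, ys) = max over discriminators D in F of
    mean(D over xs) - mean(D over ys), where F is a toy family of
    threshold rules D(x) = 1 if sign * (x - th) > 0 else 0."""
    best = 0.0
    for th in thresholds:
        for sign in (1.0, -1.0):
            gap = (sum(1.0 for x in xs if sign * (x - th) > 0) / len(xs)
                   - sum(1.0 for y in ys if sign * (y - th) > 0) / len(ys))
            best = max(best, gap)
    return best

random.seed(0)
real = [random.gauss(0.0, 1.0) for _ in range(500)]  # "target" samples
fake = [random.gauss(2.0, 1.0) for _ in range(500)]  # shifted "generated" samples
grid = [i / 10.0 for i in range(-30, 51)]            # candidate thresholds

d_same = neural_net_distance(real, real, grid)  # identical samples: distance 0
d_diff = neural_net_distance(real, fake, grid)  # shifted distribution: large gap
```

Because the discriminator class is small, the empirical distance concentrates with moderate sample sizes, which is the intuition behind the paper's claim that generalization holds for this weaker metric even when it fails for standard metrics.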

This paper has not been read by Pith yet.


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Demystifying MMD GANs

stat.ML · 2018-01 · accept · novelty 6.0

    MMD GANs have unbiased critic gradients but biased generator gradients from sample-based learning, and the Kernel Inception Distance provides a practical new measure for GAN convergence and dynamic learning rate adaptation.
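The Kernel Inception Distance mentioned in this blurb is built on an unbiased estimator of squared MMD. A toy sketch of that estimator, assuming raw 1-D samples and a simple polynomial kernel in place of Inception features (all names and constants here are illustrative, not the cited paper's setup):

```python
import random

def poly_kernel(x, y):
    # Cubic polynomial kernel; KID uses a similar kernel on Inception features.
    return (x * y + 1.0) ** 3

def mmd2_unbiased(xs, ys):
    """Unbiased estimate of squared MMD: same-sample diagonal terms
    (i == j) are excluded, so the estimate can dip slightly below zero."""
    m, n = len(xs), len(ys)
    xx = sum(poly_kernel(a, b) for i, a in enumerate(xs)
             for j, b in enumerate(xs) if i != j)
    yy = sum(poly_kernel(a, b) for i, a in enumerate(ys)
             for j, b in enumerate(ys) if i != j)
    xy = sum(poly_kernel(a, b) for a in xs for b in ys)
    return xx / (m * (m - 1)) + yy / (n * (n - 1)) - 2.0 * xy / (m * n)

random.seed(1)
p = [random.gauss(0.0, 1.0) for _ in range(200)]
q = [random.gauss(1.5, 1.0) for _ in range(200)]

mmd_diff = mmd2_unbiased(p, q)           # different distributions: clearly positive
mmd_same = mmd2_unbiased(p[:100], p[100:])  # same distribution: near zero
```

The unbiasedness of this estimator is what makes it usable as a convergence measure during training, which is the role the blurb describes for KID.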