pith. machine review for the scientific record.

Spectral Normalization for Generative Adversarial Networks

19 Pith papers cite this work. Polarity classification is still indexing.
abstract

One of the challenges in the study of generative adversarial networks is the instability of their training. In this paper, we propose a novel weight normalization technique, called spectral normalization, to stabilize the training of the discriminator. Our new normalization technique is computationally light and easy to incorporate into existing implementations. We tested the efficacy of spectral normalization on the CIFAR10, STL-10, and ILSVRC2012 datasets, and experimentally confirmed that spectrally normalized GANs (SN-GANs) are capable of generating images of better or equal quality relative to previous training stabilization techniques.
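The abstract's core idea, dividing each weight matrix by an estimate of its largest singular value, can be sketched with a few lines of NumPy. This is a minimal illustration using power iteration (the cheap estimator the SN-GAN paper relies on); the function and variable names here are ours, not from any official implementation:

```python
import numpy as np

def spectral_normalize(W, u=None, n_iters=1):
    """Divide W by an estimate of its spectral norm (largest singular value).

    The estimate comes from power iteration on W W^T; during GAN training
    a single iteration per step, with u carried over, is typically enough.
    """
    rng = np.random.default_rng(0)
    if u is None:
        u = rng.standard_normal(W.shape[0])
    for _ in range(n_iters):
        # One power-iteration step: v ~ top right singular vector,
        # u ~ top left singular vector.
        v = W.T @ u
        v /= np.linalg.norm(v) + 1e-12
        u = W @ v
        u /= np.linalg.norm(u) + 1e-12
    sigma = u @ W @ v          # Rayleigh-quotient estimate of sigma_max(W)
    return W / sigma, u        # return u so callers can reuse it next step

# Example: a matrix with singular values {3, 1}.
W = np.array([[3.0, 0.0],
              [0.0, 1.0]])
W_sn, _ = spectral_normalize(W, n_iters=50)
# After normalization, the largest singular value of W_sn is ~1,
# which is what constrains the discriminator's Lipschitz constant.
```

In practice, frameworks apply this as a reparameterization of each layer's weight before the forward pass, so the constraint holds throughout training without an explicit projection step.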


years

2026 (18) · 2021 (1)

representative citing papers

CAWI: Copula-Aligned Weight Initialization for Randomized Neural Networks

cs.LG · 2026-05-12 · unverdicted · novelty 7.0

CAWI replaces standard random initialization of input-to-hidden weights in randomized neural networks with samples drawn from a data-fitted copula that preserves observed feature dependencies, yielding consistent accuracy gains on 83 classification benchmarks.

Tessellations of Semi-Discrete Flow Matching

cs.LG · 2026-05-08 · unverdicted · novelty 7.0

Semi-discrete Flow Matching produces terminal assignment regions that are topologically simple (open, simply connected, homeomorphic to the ball under assumption) yet geometrically distinct from optimal transport Laguerre cells, as they can be non-convex with curved boundaries.

KANs need curvature: penalties for compositional smoothness

cs.LG · 2026-05-04 · unverdicted · novelty 7.0

A curvature penalty for KANs, derived to respect compositional effects and equipped with a proven upper bound on full-model curvature, produces smoother activations while preserving accuracy.

Diffusion Models Beat GANs on Image Synthesis

cs.LG · 2021-05-11 · accept · novelty 7.0

Diffusion models with architecture improvements and classifier guidance achieve superior FID scores to GANs on unconditional and conditional ImageNet image synthesis.

AdamO: A Collapse-Suppressed Optimizer for Offline RL

cs.LG · 2026-05-03 · unverdicted · novelty 6.0

AdamO modifies Adam with an orthogonality correction to ensure the spectral radius of the TD update operator stays below one, providing a theoretical stability guarantee for offline RL.

citing papers explorer

Showing 19 of 19 citing papers.