2 Pith papers cite this work.
- Sobolev Regularized MMD Gradient Flow: Sobolev regularization on the witness function enables global convergence of MMD gradient flows for both sampling and generative modeling without isoperimetric assumptions.
- Quantitative Local Convergence of Mean-Field Stein Variational Gradient Flow: Mean-field SVGD flow converges locally at explicit polynomial L2 rates to the target on the torus for Riesz kernels, with rates depending on dimension and regularity, sharpness in some regimes, and recovery of global exponential convergence for Coulomb kernels.
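Neither paper's method is reproduced here. As a generic illustration of the plain (unregularized) MMD gradient flow that both summaries build on, the sketch below runs an explicit-Euler particle discretization with a Gaussian kernel; all names, the kernel choice, and the step/bandwidth values are illustrative assumptions, not either paper's algorithm.

```python
import numpy as np

def mmd_flow_step(X, Y, step=0.5, sigma=1.0):
    """One explicit-Euler step of the particle MMD gradient flow.

    Each particle x_i moves along the negative gradient of the MMD
    witness function: attraction toward target samples Y, repulsion
    among the particles X themselves (Gaussian kernel, bandwidth sigma).
    """
    def grad_k(A, B):
        # Pairwise gradients of k(a, b) = exp(-|a-b|^2 / (2 sigma^2))
        # with respect to the first argument; shape (n, m, d).
        diff = A[:, None, :] - B[None, :, :]
        sq = np.sum(diff ** 2, axis=-1, keepdims=True)
        return -(diff / sigma ** 2) * np.exp(-sq / (2 * sigma ** 2))

    # Velocity field: self-interaction term minus attraction term.
    v = grad_k(X, X).mean(axis=1) - grad_k(X, Y).mean(axis=1)
    return X - step * v

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(50, 2))   # particles (source)
Y = rng.normal(3.0, 1.0, size=(50, 2))   # samples from the target
X0 = X.copy()
for _ in range(200):
    X = mmd_flow_step(X, Y)
```

After the loop the particle cloud has drifted toward the target cloud; global-convergence results such as the Sobolev-regularized flow above are precisely about guaranteeing that this kind of descent does not stall at spurious local minima.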