High-accuracy sampling for diffusion models and log-concave distributions
We present algorithms for diffusion model sampling which obtain $\delta$-error in $\mathrm{polylog}(1/\delta)$ steps, given access to $\widetilde O(\delta)$-accurate score estimates in $L^2$. This is an exponential improvement over all previous results. Specifically, under minimal data assumptions, the complexity is $\widetilde O(d_\star \mathrm{polylog}(1/\delta))$ where $d_\star$ is the intrinsic dimension of the data. Further, under a non-uniform $L$-Lipschitz condition, the complexity reduces to $\widetilde O(L \mathrm{polylog}(1/\delta))$. Our approach also yields the first $\mathrm{polylog}(1/\delta)$ complexity sampler for general log-concave distributions using only gradient evaluations.
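To make the setting concrete, here is a minimal sketch of what "access to score estimates" means in score-based sampling. This is an illustrative unadjusted Langevin sampler against a standard Gaussian target (whose exact score is $-x$), standing in for a learned score network; it is NOT the paper's $\mathrm{polylog}(1/\delta)$ algorithm, and the function names are hypothetical.

```python
import numpy as np

def score_estimate(x, t):
    # Placeholder for a learned score model s_theta(x, t).
    # For a standard Gaussian target the exact score of log p(x) is -x.
    return -x

def langevin_sample(dim=2, n_steps=500, step=0.01, seed=0):
    # Unadjusted Langevin dynamics: each iteration issues one score
    # query, so n_steps is the query complexity of this naive sampler.
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(dim) * 3.0  # start far from the target
    for k in range(n_steps):
        noise = rng.standard_normal(dim)
        x = x + step * score_estimate(x, k) + np.sqrt(2 * step) * noise
    return x
```

A naive sampler like this needs polynomially many score queries to reach $\delta$ error; the abstract's claim is that more careful discretizations bring this down to $\mathrm{polylog}(1/\delta)$ steps even with $\widetilde O(\delta)$-accurate scores in $L^2$.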
Forward citations (cited by 3 Pith papers)
- Query Lower Bounds for Diffusion Sampling: Diffusion sampling from $d$-dimensional distributions requires at least $\sim\sqrt{d}$ adaptive score queries when score estimates have polynomial accuracy.
- Metropolis-Adjusted Diffusion Models: Metropolis-adjusted Langevin correctors using score-based acceptance probabilities, including an exact Bernoulli factory method and a Simpson's rule approximation, reduce sampling bias in diffusion models and improve ...
- A proximal gradient algorithm for composite log-concave sampling: A proximal gradient sampler for composite log-concave distributions achieves near-optimal iteration complexity of order $\kappa\sqrt{d}\log^4(1/\epsilon)$ in total variation distance under strong convexity and smoothness.