Diffusion model for data-driven black-box optimization
Three papers indexed by Pith cite this work; polarity classification is still in progress.
Years: 2026. Verdicts: 3 (all unverdicted). Representative citing papers: 3.
Citing papers
- Regret Analysis of Guided Diffusion for Black-Box Optimization over Structured Inputs. Introduces a certificate-based regret-analysis framework for guided-diffusion black-box optimization, with mass lift as the central quantity explaining convergence from pretrained generators.
- Proximal-Based Generative Modeling for Bayesian Inverse Problems. PGM replaces the intractable likelihood score in diffusion models with a closed-form Moreau score computed via proximal operators, enabling non-asymptotic sampling for inverse problems trained only on prior data.
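The closed-form Moreau score can be illustrated with a minimal toy sketch (an illustrative assumption, not PGM's actual implementation): for f(x) = |x| the proximal operator is soft-thresholding, and the gradient of the Moreau envelope is available in closed form as (x - prox_{λf}(x)) / λ, so no intractable likelihood gradient is needed.

```python
import numpy as np

def prox_abs(x, lam):
    # Proximal operator of f(x) = |x| with step lam: soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_score(x, lam):
    # Gradient of the Moreau envelope of |x|, in closed form:
    # grad M_lam(x) = (x - prox_{lam|.|}(x)) / lam.
    return (x - prox_abs(x, lam)) / lam

x = np.array([-2.0, 0.3, 1.5])
print(moreau_score(x, 0.5))  # prints [-1.   0.6  1. ]
```

For |x| >= λ the Moreau score equals sign(x), and inside the kink it is the smooth ramp x/λ, which is what makes it usable where the raw subgradient is not.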
- On the Robustness of Distribution Support under Diffusion Guidance. Guided diffusion generates samples near the support of the target distribution under exact score access, explaining its empirical success in producing plausible outputs.
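The guidance mechanism behind this claim can be sketched in a toy example (a hypothetical Gaussian setup, not the paper's construction): the guided score adds a likelihood gradient grad log p(y|x) to the prior score, and Langevin steps with that combined score drift samples toward data-consistent regions while the prior term keeps them near its support.

```python
import numpy as np

def prior_score(x):
    # Score of a standard Gaussian prior: grad log N(0, I) = -x.
    return -x

def guidance(x, y, sigma=0.5):
    # grad log p(y|x) under an assumed Gaussian observation model y ~ N(x, sigma^2 I).
    return (y - x) / sigma**2

def guided_langevin_step(x, y, step=0.05, rng=None):
    # One Langevin step using the guided score s(x) = prior score + guidance.
    rng = np.random.default_rng() if rng is None else rng
    s = prior_score(x) + guidance(x, y)
    return x + step * s + np.sqrt(2 * step) * rng.normal(size=x.shape)

x = np.zeros(2)
y = np.array([1.0, -1.0])
rng = np.random.default_rng(0)
for _ in range(500):
    x = guided_langevin_step(x, y, rng=rng)
# x fluctuates around the Gaussian posterior mean 0.8 * y, between
# the prior mode at 0 and the observation y.
```

With exact scores for both terms, the chain samples the exact posterior; the paper's claim concerns how far samples can stray from the target support when only this exact-score regime is assumed.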