Normalizing flows for probabilistic modeling and inference. Journal of Machine Learning Research, 22(57):1–64.
2 Pith papers cite this work. Polarity classification is still indexing.
Years: 2026 (2)
Verdicts: UNVERDICTED (2)
2 representative citing papers
Citing papers explorer
- Learning stochastic multiscale models through normalizing flows: A framework that learns effective multiscale stochastic dynamics from single slow-variable paths by parameterizing the invariant distribution of the fast process with a normalizing flow, trained end-to-end via a penalized likelihood derived from stochastic averaging.
- Learning plug-in surrogate endpoints for randomized experiments: Two methods are introduced for learning plug-in composite surrogates that maximize effect predictiveness; the direct surrogate-effect modeling approach outperforms baselines on synthetic data with known effects and on real-world experiment data.
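To make the normalizing-flow idea behind the first citing paper concrete, here is a minimal, self-contained sketch of density estimation with a flow: a base standard normal pushed through an invertible affine map, with the density computed by the change-of-variables formula. This is an illustration of the general technique only, not the paper's actual architecture or penalized-likelihood objective; for this one-layer affine flow the maximum-likelihood fit happens to have a closed form (sample mean and standard deviation).

```python
import numpy as np

def fit_affine_flow(samples):
    """Fit the affine flow x = mu + sigma * z with base z ~ N(0, 1).
    For this flow, maximizing the likelihood reduces to the sample
    mean and standard deviation (closed-form MLE)."""
    mu = samples.mean()
    sigma = samples.std()
    return mu, sigma

def log_density(x, mu, sigma):
    """Change of variables: log p(x) = log N(z; 0, 1) - log|dx/dz|,
    where z = (x - mu) / sigma and dx/dz = sigma."""
    z = (x - mu) / sigma
    return -0.5 * (z ** 2 + np.log(2 * np.pi)) - np.log(sigma)

# Synthetic data standing in for samples from an invariant distribution.
rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=0.5, size=10_000)
mu, sigma = fit_affine_flow(data)
```

Deeper flows compose many such invertible maps and sum their log-Jacobian terms; the fitting then requires gradient-based optimization rather than a closed form.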
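The plug-in surrogate idea in the second citing paper can be illustrated with a generic sketch (not the paper's method): in a randomized experiment, regress the long-term outcome Y on short-term outcomes S to learn a composite surrogate f(S), then estimate the treatment effect on that learned index. All variable names and the synthetic data-generating process below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical synthetic experiment: treatment T shifts short-term
# outcomes S, which in turn determine the long-term outcome Y.
n = 5_000
T = rng.integers(0, 2, size=n)                       # randomized assignment
shift = np.array([0.5, 0.0, 0.2])                    # effect of T on S
S = rng.normal(size=(n, 3)) + T[:, None] * shift
beta = np.array([1.0, -0.5, 2.0])                    # true link from S to Y
Y = S @ beta + rng.normal(scale=0.1, size=n)

# Step 1: learn the plug-in composite surrogate f(S) ~ E[Y | S]
# by ordinary least squares (with an intercept column).
X = np.column_stack([np.ones(n), S])
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
f_S = X @ coef

# Step 2: estimate the treatment effect on the learned surrogate index.
surrogate_effect = f_S[T == 1].mean() - f_S[T == 0].mean()

# Ground truth transmitted through S: shift @ beta = 0.9.
true_effect = shift @ beta
```

With the effect fully mediated by S, the difference in means of f(S) across arms recovers the effect on Y up to sampling noise; the cited paper's contribution lies in how the surrogate is learned to maximize effect predictiveness, which this least-squares sketch does not capture.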