Sequential Neural Likelihood: Fast Likelihood-free Inference with Autoregressive Flows
We present Sequential Neural Likelihood (SNL), a new method for Bayesian inference in simulator models, where the likelihood is intractable but simulating data from the model is possible. SNL trains an autoregressive flow on simulated data in order to learn a model of the likelihood in the region of high posterior density. A sequential training procedure guides simulations and reduces simulation cost by orders of magnitude. We show that SNL is more robust, more accurate and requires less tuning than related neural-based methods, and we discuss diagnostics for assessing calibration, convergence and goodness-of-fit.
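The loop the abstract describes (simulate, fit a conditional density model of the likelihood, sample parameters from the resulting approximate posterior, repeat) can be sketched in a toy setting. This is a minimal illustration, not the paper's implementation: the autoregressive flow is replaced by a conditional Gaussian with a linearly regressed mean, the simulator is a one-dimensional Gaussian shift model, and all names (`GaussianLikelihoodModel`, `mcmc_sample`, round counts) are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy simulator: x ~ N(theta, 1); the observation we condition on.
def simulate(theta):
    return theta + rng.normal()

x_obs = 1.5
prior_lo, prior_hi = -5.0, 5.0  # uniform prior on theta

def log_prior(theta):
    return 0.0 if prior_lo <= theta <= prior_hi else -np.inf

# Stand-in for the autoregressive flow: a conditional Gaussian q(x | theta)
# whose mean is linear in theta, fit by least squares on all simulations so far.
class GaussianLikelihoodModel:
    def fit(self, thetas, xs):
        A = np.column_stack([thetas, np.ones_like(thetas)])
        coef, *_ = np.linalg.lstsq(A, xs, rcond=None)
        self.a, self.b = coef
        self.sigma = max((xs - A @ coef).std(), 1e-3)

    def log_prob(self, x, theta):
        mu = self.a * theta + self.b
        return -0.5 * ((x - mu) / self.sigma) ** 2 - np.log(self.sigma)

# Simple Metropolis-Hastings sampler over the approximate posterior.
def mcmc_sample(log_post, n, start=0.0, step=0.5):
    samples, cur, cur_lp = [], start, log_post(start)
    for _ in range(n):
        prop = cur + step * rng.normal()
        lp = log_post(prop)
        if np.log(rng.uniform()) < lp - cur_lp:
            cur, cur_lp = prop, lp
        samples.append(cur)
    return np.array(samples)

thetas, xs = [], []
model = GaussianLikelihoodModel()
proposal = lambda n: rng.uniform(prior_lo, prior_hi, n)  # round 0: draw from the prior

for rnd in range(3):
    new_thetas = proposal(200)                 # propose parameters
    thetas.extend(new_thetas)
    xs.extend(simulate(t) for t in new_thetas)  # simulate at those parameters
    model.fit(np.array(thetas), np.array(xs))   # refit the likelihood model
    log_post = lambda th: model.log_prob(x_obs, th) + log_prior(th)
    chain = mcmc_sample(log_post, 2000)
    # Later rounds propose from the current approximate posterior,
    # concentrating simulations in the region of high posterior density.
    proposal = lambda n, c=chain: rng.choice(c, size=n)

print(chain[500:].mean())  # posterior mean estimate for this toy problem
```

The key structural point, mirrored from the abstract, is that each round's proposal comes from the previous round's approximate posterior, so simulation effort concentrates where it matters; swapping the Gaussian stand-in for a masked autoregressive flow recovers the shape of the actual method.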
Forward citations
Cited by 3 Pith papers
- Tokenised Flow Matching for Hierarchical Simulation Based Inference: TFMPE combines likelihood factorisation with tokenised flow matching to enable efficient hierarchical SBI from single-site simulations, producing well-calibrated posteriors at lower computational cost on a new benchma...
- Machine Learning Techniques for Astrophysics and Cosmology: Photometric Redshifts: AI techniques for photometric redshift estimation have converged and are now limited by the size, systematics, and selection effects in spectroscopic training samples rather than by methodology.
- A Review of Diffusion-based Simulation-Based Inference: Foundations and Applications in Non-Ideal Data Scenarios: A synthesis of diffusion-based simulation-based inference methods that address model misspecification, irregular observations, and missing data in scientific applications.