pith. machine review for the scientific record.

arxiv: 1701.02434 · v2 · submitted 2017-01-10 · 📊 stat.ME

Recognition: unknown

A Conceptual Introduction to Hamiltonian Monte Carlo

Authors on Pith: no claims yet
classification 📊 stat.ME
keywords hamiltonian monte carlo, conceptual, applied, understanding, account, when
Original abstract

Hamiltonian Monte Carlo has proven a remarkable empirical success, but only recently have we begun to develop a rigorous understanding of why it performs so well on difficult problems and how it is best applied in practice. Unfortunately, that understanding is confined within the mathematics of differential geometry which has limited its dissemination, especially to the applied communities for which it is particularly important. In this review I provide a comprehensive conceptual account of these theoretical foundations, focusing on developing a principled intuition behind the method and its optimal implementations rather than any exhaustive rigor. Whether a practitioner or a statistician, the dedicated reader will acquire a solid grasp of how Hamiltonian Monte Carlo works, when it succeeds, and, perhaps most importantly, when it fails.
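For readers skimming this listing, the method the abstract describes can be sketched in a few lines: sample an auxiliary momentum, simulate Hamiltonian dynamics with a leapfrog integrator, and apply a Metropolis correction. This is a minimal illustrative sketch, not the paper's own implementation; the function names, step size, and trajectory length below are assumptions chosen for a toy Gaussian target.

```python
import numpy as np

def hmc_step(x, log_prob, grad_log_prob, step_size=0.1, n_leapfrog=20, rng=None):
    """One Hamiltonian Monte Carlo transition targeting exp(log_prob)."""
    rng = np.random.default_rng() if rng is None else rng
    p0 = rng.standard_normal(x.shape)      # resample momentum from N(0, I)
    x_new, p = x.copy(), p0.copy()

    # Leapfrog integration of the Hamiltonian dynamics
    p += 0.5 * step_size * grad_log_prob(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p
        p += step_size * grad_log_prob(x_new)
    x_new += step_size * p
    p += 0.5 * step_size * grad_log_prob(x_new)

    # Metropolis accept/reject corrects the integrator's discretization error
    h_old = -log_prob(x) + 0.5 * p0 @ p0
    h_new = -log_prob(x_new) + 0.5 * p @ p
    return x_new if np.log(rng.uniform()) < h_old - h_new else x

# Usage: sample a standard 2-D Gaussian target
log_prob = lambda x: -0.5 * x @ x
grad_log_prob = lambda x: -x
rng = np.random.default_rng(0)
x = np.zeros(2)
samples = np.array([x := hmc_step(x, log_prob, grad_log_prob, rng=rng)
                    for _ in range(2000)])
```

The review's central point shows up even in this toy: a too-large `step_size` makes `h_new` diverge from `h_old` and the chain stalls, which is exactly the "when it fails" regime the abstract flags.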

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 17 Pith papers

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Transfer Learning of Multiobjective Indirect Low-Thrust Trajectories Using Diffusion Models and Markov Chain Monte Carlo

    eess.SY 2026-05 unverdicted novelty 7.0

    A homotopy-plus-MCMC data-generation pipeline trains a mass-conditioned diffusion model that yields 40% more feasible initial costates and a better Pareto front for multiobjective indirect low-thrust transfers than ad...

  2. Bayesian Doppler Imaging: Simultaneous Inference of Surface Maps and Geometric Parameters

    astro-ph.EP 2026-05 conditional novelty 7.0

    A fully Bayesian pixel-based Doppler imaging framework uses Gaussian Process priors and Hamiltonian Monte Carlo to simultaneously infer surface maps and geometric parameters from spectral data.

  3. Adaptive Meta-Learning Stochastic Gradient Hamiltonian Monte Carlo Simulation for Bayesian Updating of Structural Dynamic Models

    stat.AP 2026-04 unverdicted novelty 7.0

    AM-SGHMC combines adaptive neural networks with SGHMC to produce a reusable MCMC sampler for Bayesian updating of similar structural dynamic models without per-task retraining.

  4. Adaptive Riemannian Manifold Hamiltonian Monte Carlo with Hierarchical Metric

    stat.CO 2026-04 unverdicted novelty 7.0

    An adaptive hierarchical RMHMC sampler with closed-form leapfrog integrator and automatic mass matrix tuning for efficient MCMC in high-dimensional Bayesian problems.

  5. Many Wrongs Make a Right: Leveraging Biased Simulations Towards Unbiased Parameter Inference

    hep-ph 2026-04 unverdicted novelty 7.0

    Template-Adapted Mixture Model uses many biased simulations for data-driven estimates of signal and background distributions, yielding unbiased signal fraction estimates with well-calibrated uncertainties.

  6. Mortality Forecasting as a Flow Field in Tucker Decomposition Space

    stat.ME 2026-03 unverdicted novelty 7.0

    Mortality forecasting is recast as integrating a flow field through the low-dimensional Tucker decomposition score space of the Human Mortality Database, yielding lower bias and error than Lee-Carter, Hyndman-Ullah, o...

  7. Self-Supervised Laplace Approximation for Bayesian Uncertainty Quantification

    stat.ML 2026-05 unverdicted novelty 6.0

    SSLA approximates the posterior predictive distribution by refitting Bayesian models on self-predicted data, providing a sampling-free method that improves predictive calibration over classical Laplace approximations ...

  8. Amortized Variational Inference for Joint Posterior and Predictive Distributions in Bayesian Uncertainty Quantification

    stat.ML 2026-05 unverdicted novelty 6.0

    An amortized variational framework jointly targets the posterior and posterior-predictive distributions via a KL upper bound and moment regularization, yielding more accurate predictions at lower online cost than two-...

  9. Bayesian X-Learner: Calibrated Posterior Inference for Heterogeneous Treatment Effects under Heavy-Tailed Outcomes

    stat.ML 2026-04 unverdicted novelty 6.0

    Bayesian X-Learner delivers calibrated posterior inference for CATE by combining cross-fitted doubly robust pseudo-outcomes with a Welsch redescending pseudo-likelihood and MCMC sampling.

  10. Tempered Sequential Monte Carlo for Trajectory and Policy Optimization with Differentiable Dynamics

    cs.LG 2026-04 unverdicted novelty 6.0

    Tempered sequential Monte Carlo samples efficiently from a temperature-annealed distribution over controller parameters to solve trajectory and policy optimization under differentiable dynamics.

  12. Combining Bayesian and Frequentist Inference for Laboratory-Specific Performance Guarantees in Copy Number Variation Detection

    stat.ME 2026-04 unverdicted novelty 6.0

    A hybrid method models squared losses from Bayesian CNV posteriors with a Gamma distribution on validation samples to produce tolerance intervals with valid frequentist coverage, achieving single-digit mean absolute c...

  13. Jeffreys Flow: Robust Boltzmann Generators for Rare Event Sampling via Parallel Tempering Distillation

    cs.LG 2026-04 unverdicted novelty 6.0

    Jeffreys Flow distills Parallel Tempering trajectories via Jeffreys divergence to produce robust Boltzmann generators that suppress mode collapse and correct sampling inaccuracies for rare event sampling.

  14. Uncertainty-Aware Prediction of Lung Tumor Growth from Sparse Longitudinal CT Data via Bayesian Physics-Informed Neural Networks

    cs.LG 2026-05 unverdicted novelty 5.0

    Bayesian PINN integrates Gompertz dynamics and HMC sampling to predict tumor growth from sparse CT data, achieving log-space RMSE of 0.20 with well-calibrated 95% credible intervals on 30 NLST patients.

  15. Uncertainty Quantification for Cardiac Shape Reconstruction with Deep Signed Distance Functions via MCMC methods

    eess.IV 2026-05 unverdicted novelty 5.0

    Deep signed distance functions combined with MCMC sampling enable uncertainty-aware reconstruction of left and right ventricles from limited data.

  16. Singularity Formation: Synergy in Theoretical, Numerical and Machine Learning Approaches

    math.NA 2026-04 unverdicted novelty 5.0

    The work introduces a modulation-based analytical method for singularity proofs in singular PDEs and refines ML techniques like PINNs and KANs to identify blowup solutions, with application to the open 3D Keller-Segel...

  17. Uncertainty in Physics and AI: Taxonomy, Quantification, and Validation

    stat.ML 2026-05 accept novelty 4.0

    A unified taxonomy of uncertainty in ML for physics is introduced together with validation tools such as coverage, calibration, and proper scoring rules, illustrated on regression and classification tasks.