pith. machine review for the scientific record.

arxiv: 2604.20301 · v1 · submitted 2026-04-22 · 📊 stat.ML · cs.LG · stat.CO · stat.ME

Recognition: unknown

Properties and limitations of geometric tempering for gradient flow dynamics

Authors on Pith: no claims yet

Pith reviewed 2026-05-09 22:56 UTC · model grok-4.3

classification 📊 stat.ML · cs.LG · stat.CO · stat.ME
keywords geometric tempering · Wasserstein gradient flow · Fisher-Rao gradient flow · KL divergence · sampling · exponential convergence · adaptive tempering

The pith

Geometric tempering preserves exponential convergence for the Wasserstein and Fisher-Rao gradient flows but never accelerates the Fisher-Rao flow beyond its untempered rate.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper studies sampling as minimization of Kullback-Leibler divergence and replaces the fixed target distribution with a sequence of moving targets obtained by geometric tempering. It proves that both the Wasserstein and Fisher-Rao gradient flows under this tempering converge exponentially fast in continuous time and supplies explicit rate bounds for each geometry. The same tempering is then examined under standard time discretizations. A central negative result follows: in the Fisher-Rao geometry the geometric mixture of initial and target distributions produces no improvement in convergence speed, whether the dynamics remain continuous or are discretized. The work also derives new adaptive tempering schedules from the underlying gradient-flow structure.
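In the paper's 1D Gaussian experiments the tempered path has a closed form, because a geometric mixture of Gaussians is again Gaussian: precisions interpolate linearly and the mean is the precision-weighted average. The sketch below is not the authors' code; it simply traces the path between the endpoints of the paper's Figure 1 example, μ0 = N(0, 1) and π = N(20, 0.1):

```python
def geometric_mixture_gaussian(m0, v0, m1, v1, lam):
    """Parameters of pi_lam ∝ pi^lam * mu0^(1-lam) for 1D Gaussians
    mu0 = N(m0, v0) and pi = N(m1, v1).

    Geometric mixtures of Gaussians stay Gaussian: precisions
    interpolate linearly, and the mean is the precision-weighted
    average of the endpoint means.
    """
    prec = (1 - lam) / v0 + lam / v1
    v = 1.0 / prec
    m = v * ((1 - lam) * m0 / v0 + lam * m1 / v1)
    return m, v

# Path from mu0 = N(0, 1) to pi = N(20, 0.1), as in the paper's Figure 1
for lam in [0.0, 0.25, 0.5, 0.75, 1.0]:
    m, v = geometric_mixture_gaussian(0.0, 1.0, 20.0, 0.1, lam)
    print(f"lambda={lam:.2f}: mean={m:.3f}, var={v:.3f}")
```

Because the narrow target has much higher precision, the intermediate means rush toward 20 even for moderate λ, which is the behaviour visible in the first row of Figure 1.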

Core claim

Replacing the target with a geometrically tempered path yields exponential convergence in continuous time for both the Wasserstein and Fisher-Rao gradient flows that minimize KL divergence, together with new explicit bounds. In the Fisher-Rao geometry, however, the geometric mixture of initial and target distributions never produces a faster rate than the untempered flow, and the same absence of speedup persists after discretization. The paper further identifies the gradient-flow structure induced by the tempering and uses it to construct novel adaptive schedules for the tempering parameter.

What carries the argument

Geometric tempering, which replaces the fixed target with a continuous path of intermediate distributions formed by geometric mixtures of the initial and target measures, applied inside the Wasserstein and Fisher-Rao gradient flows.
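In symbols, using the form that appears in the paper's Figure 1 caption, the tempered path is

```latex
% Geometric tempering path from \mu_0 to \pi (notation as in Figure 1):
\[
  \pi_{\lambda_t} \;\propto\; \pi^{\lambda_t}\,\mu_0^{\,1-\lambda_t},
  \qquad
  \lambda_t \colon \mathbb{R}_+ \to [0,1],
  \quad \lambda_0 = 0,
\]
```

so the flow starts at μ0 when λ = 0 and targets π itself once λ reaches 1.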

If this is right

  • Both geometries deliver exponential convergence in continuous time with explicit rate bounds.
  • Popular discretizations of the tempered flows inherit convergence properties that can be quantified.
  • Geometric mixtures produce no convergence speedup in the Fisher-Rao geometry in either continuous or discrete time.
  • The gradient-flow structure of tempered dynamics yields new adaptive schedules for the tempering parameter.
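Three of the fixed schedules the paper benchmarks against in its Figure 3 have simple closed forms. The sketch below uses illustrative values for the horizon T and the rate α; the paper's actual choices are not reproduced here:

```python
import math

def linear(s, T=10.0):
    """Linear ramp lambda_s = s/T, capped at 1."""
    return min(s / T, 1.0)

def exponential(s, alpha=0.5):
    """Exponential approach lambda_s = 1 - exp(-alpha * s)."""
    return 1.0 - math.exp(-alpha * s)

def rational(s):
    """Slow rational approach lambda_s = 1 - 1/(2 + s)."""
    return 1.0 - 1.0 / (2.0 + s)

for s in [0.0, 1.0, 5.0, 10.0]:
    print(f"s={s:4.1f}: linear={linear(s):.3f} "
          f"exp={exponential(s):.3f} rational={rational(s):.3f}")
```

All three are monotone maps from [0, ∞) into [0, 1], but they approach 1 at very different speeds, which is exactly the degree of freedom the adaptive schedules are meant to tune.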

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the authors make directly.

  • The lack of speedup in Fisher-Rao suggests that acceleration strategies for tempered sampling must be chosen according to the underlying geometry.
  • Practitioners working with Fisher-Rao flows may obtain better efficiency by forgoing geometric tempering altogether or by adopting non-geometric paths.
  • The adaptive schedules derived here could be combined with other discretization schemes or hybrid geometries to improve practical sampling performance.

Load-bearing premise

The target distribution and the tempered sequence are regular enough for the Wasserstein and Fisher-Rao gradient flows to exist and for the geometric path to remain absolutely continuous with respect to each metric.

What would settle it

A concrete counter-example, such as a pair of Gaussian distributions, in which the measured convergence rate of the Fisher-Rao flow under geometric tempering strictly exceeds that of the untempered flow; any such example would falsify the no-speedup theorem.
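A numerical version of this check can be sketched directly: evolve the Fisher-Rao flow of the KL divergence on a grid, once against the fixed target and once against a linearly tempered target, and compare the final KL to π. Everything below (the grid, the step size, the linear schedule, and the Gaussian pair N(0, 1) → N(1, 0.5)) is an illustrative choice, not the paper's experiment:

```python
import numpy as np

x = np.linspace(-10, 10, 2001)
dx = x[1] - x[0]

def normal(m, v):
    # Grid-normalised Gaussian density N(m, v)
    p = np.exp(-(x - m) ** 2 / (2 * v))
    return p / (p.sum() * dx)

def kl(p, q):
    mask = p > 1e-300
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])) * dx)

mu0, pi = normal(0.0, 1.0), normal(1.0, 0.5)
dt, T = 1e-3, 5.0
steps = int(T / dt)

def run(tempered):
    # Explicit Euler on the Fisher-Rao flow
    #   d/dt mu = -mu * (log(mu/target) - E_mu[log(mu/target)])
    mu = mu0.copy()
    for n in range(steps):
        lam = min(n * dt / T, 1.0) if tempered else 1.0
        target = pi ** lam * mu0 ** (1 - lam)
        target = target / (target.sum() * dx)
        ratio = np.log(mu / target)
        mean_ratio = np.sum(mu * ratio) * dx
        mu = mu * (1.0 - dt * (ratio - mean_ratio))
        mu = np.clip(mu, 1e-300, None)
        mu = mu / (mu.sum() * dx)
    return kl(mu, pi)

kl_untempered = run(False)
kl_tempered = run(True)
print("KL after T=5, untempered:", kl_untempered)
print("KL after T=5, tempered  :", kl_tempered)
```

Consistent with the no-speedup result, the linearly tempered run should finish no closer to π than the untempered one; a configuration where the tempered run wins decisively would be the counter-example described above.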

Figures

Figures reproduced from arXiv: 2604.20301 by Francesca Romana Crucinio, Sahani Pathiraja.

Figure 1
Figure 1: Evolution of mean, variance and D_KL(μt‖π), with μt given by the PDE flows in equations 1, 4 and 10, in the 1D Gaussian case with μ0(x) = N(x; 0, 1) and π(x) = N(x; 20, 0.1) (first row), π(x) = N(x; 1, 5) (second row). The tempered targets are πn ∝ π^{λn} μ0^{1−λn}, obtained from a time discretisation 0 = λ0 ≤ λ1 ≤ ⋯ ≤ λn ≤ λn+1 ≤ ⋯ of λt : ℝ+ → [0, 1] with λn ≡ λ_{tn}. …
Figure 2
Figure 2: Comparison of tempering sequences for tempered dynamics …
Figure 3
Figure 3: Convergence of approximations to π given by tempered and standard PDE flows with target the Gaussian mixture of equation 26. Convergence is measured through the MMD with a standard Gaussian kernel and is averaged over 50 replications. We plot the percentage of iterations required to achieve MMD < 0.01 for increasing m. We include standard W and WFR flows and tempered flows with λs = s/t, λs = 1 − e^{−αs}, λs = 1 − 1/(2 + s). …
Figure 4
Figure 4: Evolution of D_KL along the tempered WFR PDE flow with linear schedule and two approximations via SMC. We have μ0(x) = N(x; 0, 1) and π(x) = N(x; 20, 0.1); γ ranges from 0.001 (lighter color) to 0.1 (darker color).
Figure 5
Figure 5: Distribution of ESS for tempered SMC-WFR and tempering SMC for different values of …
Figure 6
Figure 6: Comparison of adaptive strategies for λt and resulting MMD decay for μ0(x) = N(x; 0, 1) and π(x) = ½N(x; 0, 1) + ½N(x; m, 1).
read the original abstract

We consider the problem of sampling from a probability distribution $\pi$. It is well known that this can be written as an optimisation problem over the space of probability distributions in which we aim to minimise the Kullback--Leibler divergence from $\pi$. We consider the effect of replacing $\pi$ with a sequence of moving targets $(\pi_t)_{t\ge0}$ defined via geometric tempering on the Wasserstein and Fisher--Rao gradient flows. We show that convergence occurs exponentially in continuous time, providing novel bounds in both cases. We also consider popular time discretisations and explore their convergence properties. We show that in the Fisher--Rao case, replacing the target distribution with a geometric mixture of initial and target distribution never leads to a convergence speed up both in continuous time and in discrete time. Finally, we explore the gradient flow structure of tempered dynamics and derive novel adaptive tempering schedules.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, and this is the friction.

Referee Report

2 major / 2 minor

Summary. The manuscript analyzes geometric tempering applied to Wasserstein and Fisher-Rao gradient flows for KL-divergence minimization in sampling from a target π. It claims to establish exponential convergence in continuous time together with novel bounds in both geometries, studies convergence under popular time discretizations, proves that geometric mixtures of initial and target distributions yield no convergence speedup in the Fisher-Rao case (continuous and discrete time), and derives novel adaptive tempering schedules from the underlying gradient-flow structure.

Significance. If the central claims hold under the stated assumptions, the work supplies useful theoretical limits on tempering strategies within optimal transport and information geometry. The no-speedup result for Fisher-Rao geometric mixtures is particularly valuable because it constrains the design space for annealed sampling algorithms; the adaptive schedules could translate into practical improvements once the regularity conditions are clarified.

major comments (2)
  1. [Abstract and §2] Abstract and §2 (setup): the exponential-convergence claims and novel bounds presuppose that each tempered distribution π_t and the path t ↦ π_t remain in the manifold on which the Wasserstein and Fisher-Rao gradient flows are well-defined and unique. No explicit regularity conditions (absolute continuity w.r.t. the reference measure, finite entropy or Fisher information, tangent-space membership of velocity fields) are stated, nor is a single concrete target distribution exhibited for which the entire tempering schedule satisfies them. This assumption is load-bearing for both the convergence rates and the no-speedup theorem.
  2. [§4] §4 (Fisher-Rao no-speedup): the statement that geometric mixtures never accelerate convergence is not accompanied by a precise definition of 'speed-up' (e.g., comparison of the decay constant of KL(·||π_t) or of the squared Wasserstein/Fisher-Rao distance) nor by the explicit form of the velocity field under the mixture. Without these, it is impossible to verify whether the result is parameter-free or merely a consequence of the chosen metric.
minor comments (2)
  1. The abstract refers to 'popular time discretisations' without naming them; the main text should list the specific schemes (e.g., explicit Euler, implicit, or proximal) whose convergence is analyzed.
  2. Notation for the tempered path (π_t) and the geometric mixture should be introduced once and used consistently; several passages mix π_t with the mixture parameter without re-stating the definition.

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for their careful reading and constructive comments on our manuscript. We appreciate the positive assessment of the potential value of our results on geometric tempering. We address each major comment below and indicate the revisions we will make to strengthen the presentation.

read point-by-point responses
  1. Referee: [Abstract and §2] Abstract and §2 (setup): the exponential-convergence claims and novel bounds presuppose that each tempered distribution π_t and the path t ↦ π_t remain in the manifold on which the Wasserstein and Fisher-Rao gradient flows are well-defined and unique. No explicit regularity conditions (absolute continuity w.r.t. the reference measure, finite entropy or Fisher information, tangent-space membership of velocity fields) are stated, nor is a single concrete target distribution exhibited for which the entire tempering schedule satisfies them. This assumption is load-bearing for both the convergence rates and the no-speedup theorem.

    Authors: We agree that the regularity conditions should be stated explicitly. In the revised manuscript we will add a dedicated paragraph in Section 2 listing the standing assumptions: absolute continuity of each π_t with respect to Lebesgue measure, finite entropy and Fisher information along the entire tempering path, and membership of the velocity fields in the appropriate tangent spaces of the Wasserstein and Fisher-Rao manifolds. We will also include a concrete one-dimensional Gaussian example for which the full geometric-tempering schedule satisfies these conditions at every t. These hypotheses are standard in the gradient-flow literature but making them explicit will remove any ambiguity. revision: yes

  2. Referee: [§4] §4 (Fisher-Rao no-speedup): the statement that geometric mixtures never accelerate convergence is not accompanied by a precise definition of 'speed-up' (e.g., comparison of the decay constant of KL(·||π_t) or of the squared Wasserstein/Fisher-Rao distance) nor by the explicit form of the velocity field under the mixture. Without these, it is impossible to verify whether the result is parameter-free or merely a consequence of the chosen metric.

    Authors: We thank the referee for highlighting the need for precision. In the revision we will define 'no speed-up' explicitly as the property that the dissipation rate of the KL functional (equivalently, the squared Fisher-Rao distance to the target) is never strictly larger than in the untempered flow. We will also derive and display the explicit velocity field induced by the geometric mixture in the Fisher-Rao geometry, showing that the resulting ODE for the KL divergence yields a decay constant that is at most equal to the untempered case for any tempering parameter. This establishes the result as parameter-free under the geometric-mixture construction. revision: yes
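For reference, a standard computation (not quoted from the manuscript) illustrates why no tempering along the geometric path can help here: the untempered Fisher-Rao gradient flow of the KL divergence already admits a closed-form solution that travels along geometric mixtures of μ0 and π,

```latex
% Fisher-Rao gradient flow of mu |-> KL(mu || pi) and its
% well-known closed-form solution (a standard result, used here
% only as an illustration of the no-speedup intuition):
\[
  \partial_t \mu_t
    = -\mu_t\Bigl(\log\tfrac{\mu_t}{\pi} - \mathrm{KL}(\mu_t\,\|\,\pi)\Bigr),
  \qquad
  \mu_t \;\propto\; \mu_0^{\,e^{-t}}\,\pi^{\,1-e^{-t}} .
\]
```

Since the untempered flow already moves along the geometric path, re-targeting it at intermediate points of that same path can only re-parameterise the trajectory, which is consistent with the parameter-free no-speedup statement the authors propose to make explicit.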

Circularity Check

0 steps flagged

No significant circularity detected; derivations rely on standard gradient flow properties.

full rationale

The paper derives exponential convergence bounds and the no-speedup result for geometric tempering directly from the definitions of Wasserstein and Fisher-Rao gradient flows applied to the tempered sequence π_t. These steps use explicit computations of the velocity fields and dissipation rates under the respective metrics, without reducing to quantities defined from the paper's own outputs or fitted parameters. The adaptive tempering schedules are obtained by optimizing the flow structure itself, and all claims rest on external regularity assumptions about absolute continuity and finite entropy rather than self-citations or ansatzes that would create circularity. No load-bearing step collapses to a self-definitional or fitted-input reduction.

Axiom & Free-Parameter Ledger

0 free parameters · 2 axioms · 0 invented entities

The central claims rest on standard background results from optimal transport and information geometry; no free parameters, ad-hoc axioms, or new invented entities are introduced in the abstract.

axioms (2)
  • domain assumption The Wasserstein and Fisher-Rao gradient flows of the KL divergence are well-defined for the tempered sequence of measures.
    Invoked when stating exponential convergence in continuous time.
  • domain assumption Geometric tempering produces a valid absolutely continuous path in the chosen metric space.
    Required for the flow equations and convergence analysis to hold.

pith-pipeline@v0.9.0 · 5460 in / 1363 out tokens · 48520 ms · 2026-05-09T22:56:57.387925+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

154 extracted references · 30 canonical work pages
