Pith · machine review for the scientific record

arXiv: 2605.13648 · v1 · submitted 2026-05-13 · 🧮 math.PR · cs.NA · math.NA

Recognition: no theorem link

Sticky CIR process with potential: invariant measure and exact sampling

Tony Shardlow


Pith reviewed 2026-05-14 18:01 UTC · model grok-4.3

classification 🧮 math.PR · cs.NA · math.NA
keywords sticky CIR process · invariant measure · exact sampling · Girsanov change of measure · confluent hypergeometric functions · Metropolis-Hastings sampler · unadjusted Langevin algorithm

The pith

For δ in (1,2), the sticky CIR process is well-posed and possesses a unique invariant measure that mixes a point mass at zero with a weighted gamma-type density on the interior.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper proves well-posedness of the sticky Cox-Ingersoll-Ross diffusion on the non-negative half-line when the origin is accessible but not absorbing. It establishes uniqueness of the invariant measure, expressed as a mixture of a Dirac mass at zero and a gamma-type density on the interior, tilted by the potential function. An explicit Green's function for the resolvent, written in confluent hypergeometric functions, supplies the density and yields an exact sampler when the potential is zero. For a non-zero potential, a Girsanov change of measure produces the tilted invariant measure, which can be targeted exactly by a Metropolis-Hastings sampler or approximated by an unadjusted Langevin algorithm that carries an O(h) bias.
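In the zero-potential case, exact sampling from such a mixture is mechanically simple once the boundary weight and interior density are known. A minimal sketch, assuming the boundary mass p0 and the gamma-type shape and scale have already been computed from the paper's Green's function (the values below are illustrative, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_sticky_invariant(p0, shape, scale, n):
    """Exact draws from the mixture p0 * delta_0 + (1 - p0) * Gamma(shape, scale).

    p0, shape and scale are assumed inputs here; in the paper they are
    derived from the resolvent Green's function, not reproduced in this sketch.
    """
    at_zero = rng.random(n) < p0           # Bernoulli choice: boundary vs interior
    x = rng.gamma(shape, scale, size=n)    # interior gamma-type draws
    x[at_zero] = 0.0                       # point mass at the origin
    return x

draws = sample_sticky_invariant(p0=0.3, shape=1.5, scale=1.0, n=100_000)
print(np.mean(draws == 0.0))  # empirical boundary mass, close to p0 = 0.3
```

Because the draws are independent and exact, the empirical boundary fraction concentrates at p0 at the usual Monte Carlo rate, with no discretisation bias.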

Core claim

The sticky CIR process with parameter δ in (1,2) is well-posed and possesses a unique invariant measure that is a mixture of a point mass at zero and a weighted gamma-type density. This measure is characterized by an explicit Green's function for the resolvent expressed using confluent hypergeometric functions, which also yields an exact sampling method in the zero-potential case. For non-trivial potentials, existence and uniqueness follow from a Girsanov change of measure, supported by two numerical sampling algorithms.

What carries the argument

The resolvent Green's function expressed in confluent hypergeometric functions, which determines the invariant distribution and enables construction of exact samplers.
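The special functions involved are available in standard libraries. A quick sanity check, independent of the paper's specific Green's function, confirms that Kummer's M (scipy's hyp1f1) and Tricomi's U (scipy's hyperu) both solve Kummer's equation, the two building blocks from which resolvent Green's functions of this type are typically assembled:

```python
from scipy.special import hyp1f1, hyperu

# Kummer's equation: x y'' + (b - x) y' - a y = 0.
# M(a, b, x) = hyp1f1 and U(a, b, x) = hyperu are its two standard solutions.
a, b, x, h = 0.75, 1.5, 2.0, 1e-4

def kummer_residual(f):
    """Finite-difference residual of Kummer's ODE for a candidate solution f."""
    y = f(a, b, x)
    yp = (f(a, b, x + h) - f(a, b, x - h)) / (2 * h)
    ypp = (f(a, b, x + h) - 2 * y + f(a, b, x - h)) / h ** 2
    return x * ypp + (b - x) * yp - a * y

print(kummer_residual(hyp1f1), kummer_residual(hyperu))  # both near zero
```

The parameter values are arbitrary; the paper's Green's function fixes a and b in terms of δ and the resolvent parameter.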

If this is right

  • The invariant measure is unique for the given parameter range.
  • Exact sampling from the invariant measure is possible when the potential is zero.
  • A Metropolis-Hastings sampler targets the exact invariant measure for any potential.
  • An unadjusted Langevin algorithm approximates the invariant measure with an O(h) bias.
  • Existence and uniqueness of the tilted invariant measure hold via the Girsanov change of measure.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The explicit Green's function construction may extend to other one-dimensional sticky diffusions that appear in sparse Bayesian models.
  • The mixture structure implies that the process spends a positive proportion of time at the origin, which could slow mixing in MCMC applications.
  • Numerical comparison of the exact Metropolis-Hastings sampler against the unadjusted Langevin algorithm in higher dimensions would quantify practical bias-variance trade-offs.

Load-bearing premise

The Girsanov change of measure correctly tilts the invariant distribution for non-trivial potential while preserving the sticky boundary behavior at the origin.
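If the tilt acts as a density exp(−G) with respect to the zero-potential invariant π0 (an assumption made here for illustration; the paper's exact Radon-Nikodym derivative comes from the Girsanov change of measure), then an independence Metropolis-Hastings sampler that proposes exact π0 draws targets the tilted measure with acceptance ratio exp(G(x) − G(y)):

```python
import numpy as np

rng = np.random.default_rng(2)

def mh_tilted(G, sample_pi0, n_steps, x0=0.0):
    """Independence Metropolis-Hastings for a target with density exp(-G)
    relative to pi0. Proposals are exact pi0 draws, so the pi0 factors cancel
    and the acceptance probability is min(1, exp(G(x) - G(y)))."""
    x = x0
    out = np.empty(n_steps)
    for i in range(n_steps):
        y = sample_pi0()
        if rng.random() < np.exp(G(x) - G(y)):
            x = y
        out[i] = x
    return out

# Illustrative pi0: 30% point mass at 0, 70% Gamma(1.5, 1); potential G(u) = u.
sample_pi0 = lambda: 0.0 if rng.random() < 0.3 else rng.gamma(1.5)
chain = mh_tilted(G=lambda u: u, sample_pi0=sample_pi0, n_steps=50_000)
print(np.mean(chain == 0.0))  # tilt exp(-u) raises the boundary fraction above 0.3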

What would settle it

A long Monte Carlo trajectory of the process that fails to converge in distribution to the predicted mixture of point mass at zero and gamma density would falsify uniqueness of the invariant measure.
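Such a test reduces to two comparisons: the occupation fraction at the origin against the predicted atom, and the interior samples against the gamma-type density. A sketch of the diagnostic, with an iid stand-in for the simulated trajectory (all names and parameters are illustrative):

```python
import numpy as np
from scipy.stats import gamma, kstest

def check_invariant(traj, p0, shape, scale=1.0, tol=0.02):
    """Compare a trajectory's time at the origin with the predicted atom p0,
    and its interior values with the gamma-type density (one-sample KS test).
    In a real test, traj would be a long simulated sticky CIR path."""
    at_zero = traj == 0.0
    boundary_ok = abs(at_zero.mean() - p0) < tol
    ks = kstest(traj[~at_zero], gamma(shape, scale=scale).cdf)
    return boundary_ok, ks.pvalue

# Stand-in trajectory: iid draws from the predicted mixture itself.
rng = np.random.default_rng(3)
u = rng.random(100_000)
traj = np.where(u < 0.3, 0.0, rng.gamma(1.5, 1.0, size=100_000))
ok, pval = check_invariant(traj, p0=0.3, shape=1.5)
print(ok)
```

A sustained failure of either comparison on a genuine long trajectory, after accounting for autocorrelation, would be the falsifying evidence the claim invites.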

Figures

Figures reproduced from arXiv: 2605.13648 by Tony Shardlow.

Figure 1. Signed boundary mass error π̂({0}) − π({0}) for both algorithms across potentials (columns) and stickiness parameters (rows), as a function of step rate α. Error bars are ±2 standard errors across 4 chains. Small values confirm that MCMC targets the correct invariant measure, with ULA exhibiting bias that increases for small α.
Figure 2. Bulk ESS per second (interior chain) for two algorithms across potentials (columns).
Figure 3. Empirical interior density (histogram) versus theoretical.
Figure 4. MCMC acceptance rates at α = 5 (h = 0.2) for four test potentials (G = 0, G = u²/2, G = (u − 1)²/2, G = 2u). Left: interior–interior rate. Centre: interior-to-boundary rate. Right: boundary-to-interior rate. Dashed line marks perfect acceptance (= 1).
Figure 5. Empirical boundary fraction π̂({0}) of the MCMC chain (30k steps, α = 5) versus theory π({0}).
Figure 6. ULA boundary fraction (left) and absolute bias (right) versus step size.
Original abstract

We study the sticky Cox-Ingersoll-Ross (CIR) process in one dimension, a diffusion on $[0,\infty)$ with a sticky boundary condition at the origin, arising as the marginal process in a sparse Bayesian inference framework based on Hadamard-Langevin dynamics. For the parameter range $\delta\in(1,2)$, in which the origin is accessible but not absorbing, we prove well-posedness of the process and uniqueness of its invariant measure, which is a mixture of a point mass at zero and a weighted gamma-type density on the interior. We derive an explicit Green's function for the resolvent in terms of confluent hypergeometric functions, and use this to construct an exact sampler for the invariant measure in the zero-potential case. For a non-trivial potential $G$, we establish existence and uniqueness of the tilted invariant measure via a Girsanov change of measure, and develop two sampling algorithms: a Metropolis-Hastings corrected sampler that targets the invariant measure exactly, and an unadjusted Langevin algorithm (ULA) that is cheaper per step but introduces an $O(h)$ bias. Numerical experiments confirm the predicted behaviour: the Metropolis-Hastings sampler achieves the target invariant measure at all step sizes, while the ULA exhibits the expected $O(h)$ bias.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it: the pith above is the substance; this is the friction.

Circularity Check

0 steps flagged

Derivations rely on standard stochastic analysis and explicit constructions with no self-referential reductions

full rationale

The paper proves well-posedness and uniqueness of the invariant measure for the sticky CIR process in the range δ∈(1,2) via standard diffusion theory, constructs an explicit resolvent Green's function in terms of confluent hypergeometric functions, and applies a Girsanov change of measure to obtain the tilted invariant for non-zero potential G. These steps use external mathematical results (e.g., properties of hypergeometric functions and Girsanov theorem) and direct constructions rather than any fitted parameters renamed as predictions, self-definitional loops, or load-bearing self-citations that reduce the central claims to their own inputs by construction. The derivation chain is therefore self-contained against external benchmarks.

Axiom & Free-Parameter Ledger

0 free parameters · 2 axioms · 0 invented entities

The work rests on standard results from stochastic calculus for SDE well-posedness and properties of confluent hypergeometric functions; no free parameters, ad-hoc axioms, or new invented entities are introduced in the abstract.

axioms (2)
  • standard math Existence and uniqueness theorems for one-dimensional SDEs with sticky boundary conditions
    Invoked to establish well-posedness for δ in (1,2).
  • standard math Analytic properties of confluent hypergeometric functions for constructing Green's functions
    Used to obtain explicit resolvent expression.

pith-pipeline@v0.9.0 · 5527 in / 1492 out tokens · 50194 ms · 2026-05-14T18:01:40.114735+00:00 · methodology


Reference graph

Works this paper leans on

14 extracted references · 12 canonical work pages · 1 internal anchor

  1. [1] BORODIN, A. N. and SALMINEN, P. (2002). Handbook of Brownian Motion: Facts and Formulae, 2nd ed. Probability and Its Applications. Birkhäuser, Basel. https://doi.org/10.1007/978-3-0348-8163-0

  2. [2] CHELTSOV, I., CORNALBA, F., POON, C. and SHARDLOW, T. (2026). Hadamard–Langevin dynamics for sampling the l1-prior. Bernoulli. To appear. https://doi.org/10.48550/arXiv.2411.11403

  3. [3] COX, J. C., INGERSOLL, J. E. and ROSS, S. A. (1985). A Theory of the Term Structure of Interest Rates. Econometrica 53, 385–407. https://doi.org/10.2307/1911242

  4. [4] GEORGE, E. I. and MCCULLOCH, R. E. (1993). Variable Selection via Gibbs Sampling. Journal of the American Statistical Association 88, 881–889. https://doi.org/10.1080/01621459.1993.10476353
     GÖING-JAESCHKE, A. and YOR, M. (2003). A Survey and Some Generalizations of Bessel Processes. Bernoulli 9, 313–349. https://doi.org/10.3150/bj/1068128980

  5. [5] HAIRER, M. and MATTINGLY, J. C. (2011). Yet another look at Harris' ergodic theorem for Markov chains. In Seminar on Stochastic Analysis, Random Fields and Applications VI. Progress in Probability 63, 109–117. Birkhäuser/Springer Basel AG, Basel. https://doi.org/10.1007/978-3-0348-0021-1_7
     ITÔ, K. and MCKEAN, H. P. (1965). Diffusion Processes and Their Sample P...

  6. [6] KARLIN, S. and TAYLOR, H. M. (1981). A Second Course in Stochastic Processes, 1st ed. Academic Press, New York.

  7. [7] KUMAR, R., CARROLL, C., HARTIKAINEN, A. and MARTIN, O. (2019). ArviZ: a unified library for exploratory analysis of Bayesian models in Python. Journal of Open Source Software 4, 1143. https://doi.org/10.21105/joss.01143

  8. [8] MEYN, S. P. and TWEEDIE, R. L. (1993). Markov Chains and Stochastic Stability. Communications and Control Engineering. Springer. https://doi.org/10.1007/978-1-4471-3267-7

  9. [9] MITCHELL, T. J. and BEAUCHAMP, J. J. (1988). Bayesian Variable Selection in Linear Regression. Journal of the American Statistical Association 83, 1023–1032. https://doi.org/10.1080/01621459.1988.10478694

  10. [10] OLVER, F. W. J., LOZIER, D. W., BOISVERT, R. F. and CLARK, C. W., eds. (2010). NIST Handbook of Mathematical Functions. Cambridge University Press, New York. Also available as the NIST Digital Library of Mathematical Functions at https://dlmf.nist.gov

  11. [11] PESKIR, G. (2022). Sticky Bessel Diffusions. Stochastic Processes and Their Applications 150, 1015–1036. https://doi.org/10.1016/j.spa.2022.05.003

  12. [12] REVUZ, D. and YOR, M. (1999). Continuous Martingales and Brownian Motion, 3rd ed. Grundlehren der mathematischen Wissenschaften 293. Springer. https://doi.org/10.1007/978-3-662-06400-9

  13. [13] ROGERS, L. C. G. and WILLIAMS, D. (2000). Diffusions, Markov Processes, and Martingales, Vol. 2, 2nd ed. Cambridge University Press. https://doi.org/10.1017/CBO9781107590120

  14. [14] VOLKONSKII, V. A. (1958). Random substitution of time in strong Markov processes. Theory of Probability and Its Applications 3, 310–326. https://doi.org/10.1137/1103025