pith. machine review for the scientific record.

arxiv: 2512.21411 · v3 · submitted 2025-12-24 · 🧮 math.ST · stat.ML · stat.TH


Singular Fluctuation as Specific Heat in Bayesian Learning

classification 🧮 math.ST · stat.ML · stat.TH
keywords fluctuation · singular · waic · bayesian · models · coefficient · complexity · curvature
Abstract

Singular learning theory characterizes Bayesian models with non-identifiable parameterizations through two central quantities: the real log canonical threshold (RLCT), which governs marginal likelihood asymptotics, and the singular fluctuation, which determines second-order generalization behavior and the complexity term in WAIC. While the geometric meaning of the RLCT is well understood, the interpretation of singular fluctuation has remained comparatively opaque. We show that singular fluctuation admits a precise thermodynamic interpretation. Under a tempered (Gibbs) posterior, it is exactly the curvature of the Bayesian free energy with respect to inverse temperature; equivalently, the variance of the log-likelihood observable. In this sense, singular fluctuation is the statistical analogue of specific heat. This identity clarifies why singular fluctuation controls the equation of state relating training and generalization error and explains the success of WAIC in singular models: WAIC estimates a fluctuation coefficient rather than a parameter dimension. Across Gaussian mixture models and reduced-rank regression, we demonstrate that singular fluctuation behaves as a thermodynamic response coefficient. As temperature decreases, posterior reorganization suppresses fluctuation directions that affect predictive performance, and model-specific geometric observables track the decay of singular fluctuation. Rather than introducing new asymptotic expansions, this work unifies existing variance identities, equation-of-state results, and WAIC complexity corrections under a single free-energy curvature framework.
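The curvature identity the abstract asserts — that under a tempered (Gibbs) posterior the singular fluctuation is the second derivative of the Bayesian free energy in inverse temperature, equivalently the posterior variance of the log-likelihood — can be checked numerically on a toy model. The sketch below is not from the paper: the one-dimensional Gaussian model, the quadrature grid, and the finite-difference step are illustrative assumptions. Writing H(w) for the negative log-likelihood and F(β) = −log ∫ exp(−βH(w)) φ(w) dw, the identity is F″(β) = −Var_β(H).

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.5, 1.0, size=20)  # toy data from N(0.5, 1)

# 1-D parameter grid; model N(w, 1) with standard normal prior phi(w)
w = np.linspace(-5.0, 5.0, 4001)
dw = w[1] - w[0]
log_prior = -0.5 * w**2 - 0.5 * np.log(2 * np.pi)

# H(w) = -sum_i log p(x_i | w), the "energy" of the Gibbs posterior
H = 0.5 * ((x[:, None] - w[None, :]) ** 2).sum(axis=0) \
    + len(x) * 0.5 * np.log(2 * np.pi)

def free_energy(beta):
    # F(beta) = -log Z(beta), Z(beta) = integral exp(-beta H(w)) phi(w) dw,
    # evaluated by grid quadrature with a log-sum-exp shift for stability
    logint = -beta * H + log_prior
    m = logint.max()
    return -(m + np.log(np.exp(logint - m).sum() * dw))

beta, eps = 1.0, 1e-3

# curvature of the free energy via central finite differences
curv = (free_energy(beta + eps) - 2 * free_energy(beta)
        + free_energy(beta - eps)) / eps**2

# posterior variance of H at the same inverse temperature
logpost = -beta * H + log_prior
post = np.exp(logpost - logpost.max())
post /= post.sum() * dw
EH = (post * H).sum() * dw
varH = (post * (H - EH) ** 2).sum() * dw

print(curv, -varH)  # the two agree: F''(beta) = -Var_beta(H)
```

The same variance observable, rescaled by β and summed per data point, is the fluctuation coefficient that WAIC's complexity term estimates; the quadrature here merely exhibits the specific-heat analogy in the smallest possible setting.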

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Using Statistical Mechanics to Improve Real-World Bayesian Inference: A New Method Combining Tempered Posteriors and Wang-Landau Sampling

    stat.ME · 2026-04 · unverdicted · novelty 5.0

    Tempered posteriors combined with Wang-Landau sampling identify transition temperatures that optimize predictive performance in Bayesian inference for real-world problems.