IBM Journal of Research and Development 5(3), 183–191 (Jul 1961)
6 Pith papers cite this work, alongside 4,171 external citations. Polarity classification is still indexing.
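The citing summaries below repeatedly invoke the Landauer principle from this 1961 paper. As a quick numeric reference (a minimal sketch of the bound itself, not code from any of the papers listed), the minimum heat dissipated per erased bit at temperature T is k_B·T·ln 2:

```python
# Minimal illustration of the Landauer bound: erasing one bit of
# information dissipates at least k_B * T * ln(2) of heat.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def landauer_bound(temperature_kelvin: float) -> float:
    """Minimum heat (J) dissipated per erased bit at the given temperature."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature the bound is a few zeptojoules per bit.
print(f"{landauer_bound(300.0):.3e} J/bit")  # → 2.871e-21 J/bit
```

At 300 K the bound works out to roughly 2.9 zeptojoules per erased bit, which is the figure the stability-analysis entries below lean on when they treat perturbations as measurable physical work.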
6 representative citing papers (unverdicted)
Citing papers explorer

- Prediction and Empowerment: A Theory of Agency through Bridge Interfaces
  In deterministic partially observable worlds, perfect prediction requires either identifying the relevant hidden quotient or achieving overwrite control; high empowerment alone is insufficient.
- Kardashev's Conundrum: Statistical Falsification of the Standard Kardashev Model and the Kardashev--Sagan--Nakamoto Resolution
  Statistical analysis of energy data falsifies the assumed 1% exponential growth in the Kardashev model, shows that linear extrapolation yields a 1.6×10^15-year Type II timescale, and introduces the KSN renormalization B(t) = P(t)/H(t), spanning 14 orders of magnitude.
- A Reversible Crumbling Abstract Machine for Plotkin's Call-by-Value
  A crumbling abstract machine yields a reversible Landauer embedding for the call-by-value lambda calculus with constant space overhead per step.
- Information as Maximum-Caliber Deviation: A bridge between Integrated Information Theory and the Free Energy Principle
  Defining information as maximum-caliber deviation derives IIT 3.0 cause-effect repertoires from constrained entropy maximization and equates information to prediction error under the CLT and LDT.
- An Information-Geometric Framework for Stability Analysis of Large Language Models under Entropic Stress
  A thermodynamics-inspired information-geometric framework defines a composite LLM stability score that outperforms a utility-entropy baseline by 0.0299 on average across 80 observations, with gains increasing at higher entropy.
- The Kerimov-Alekberli Model: An Information-Geometric Framework for Real-Time System Stability
  The Kerimov-Alekberli model combines KL divergence on a Riemannian manifold, a Fisher-derived threshold, and the Landauer principle to treat adversarial perturbations as measurable physical work for real-time AI system stability.
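The linear-versus-exponential contrast in the Kardashev entry above can be sketched numerically. Sagan's calibration puts a Type II civilization at 10^26 W; the present-day power figure and the annual increments below are illustrative assumptions chosen to reproduce the quoted order of magnitude, not values taken from the cited paper:

```python
# Hedged sketch: time to reach Type II power under linear vs exponential
# growth. P_NOW and the growth figures are illustrative assumptions.
import math

P_NOW = 2.0e13      # assumed present-day human power use, W
P_TYPE_II = 1.0e26  # Type II on Sagan's calibration, W

def years_linear(annual_increment_w: float) -> float:
    """Years to Type II if power grows by a fixed increment each year."""
    return (P_TYPE_II - P_NOW) / annual_increment_w

def years_exponential(rate: float) -> float:
    """Years to Type II under fixed fractional growth (0.01 = 1%/yr)."""
    return math.log(P_TYPE_II / P_NOW) / math.log(1.0 + rate)

# A fixed ~6.3e10 W/yr increment lands near the paper's 1.6e15-year figure,
# while 1% exponential growth collapses it to a few thousand years.
print(f"linear: {years_linear(6.3e10):.1e} yr")
print(f"exponential: {years_exponential(0.01):.0f} yr")
```

The thirteen-orders-of-magnitude gap between the two timescales is the point of the entry: the choice of growth model, not the energy data itself, dominates the Kardashev forecast.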