pith. machine review for the scientific record.

arxiv: 2605.04493 · v3 · submitted 2026-05-06 · ❄️ cond-mat.stat-mech · cs.IT · math.IT

Recognition: 2 theorem links


The unique, universal entropy for complex systems

Kenric P. Nelson

Pith reviewed 2026-05-12 02:28 UTC · model grok-4.3

classification ❄️ cond-mat.stat-mech · cs.IT · math.IT
keywords entropy · complex systems · coupled entropy · stretched exponential distributions · non-additive entropy · universality classes · long-range dependence · information thermodynamics

The pith

Coupled entropy is the unique universal entropy for complex systems because it measures uncertainty at the maximizing distribution's scale and is extensive across all scaling classes.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper builds an axiomatic foundation for entropy in complex systems by adding two requirements that earlier definitions missed. Entropy must quantify uncertainty at the specific informational scale of the distribution it maximizes, marked by a log-log slope of negative one. It must also remain extensive over the entire range of universality scaling classes identified by Hanel and Thurner. Under these conditions the coupled entropy, the form maximized by coupled stretched exponential distributions, is shown to be the only one that works. This directly ties the entropy's non-additivity to the system's long-range dependence and supplies a basis for thermodynamic relations in non-equilibrium settings.
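
To make the first requirement concrete, here is a small worked example that is not taken from the paper (the paper states the informational-scale condition for its own maximizing distributions, whose exact form is not reproduced on this page); for any smooth density the condition picks out a single characteristic point:

$$\frac{d\ln p}{d\ln x} = \frac{x\,p'(x)}{p(x)}, \qquad p(x)=\tfrac{1}{\sigma}\,e^{-x/\sigma}\ \Rightarrow\ \frac{d\ln p}{d\ln x} = -\frac{x}{\sigma} = -1 \ \text{at}\ x=\sigma,$$

$$p(x)\propto e^{-(x/\sigma)^{\eta}}\ \Rightarrow\ \frac{d\ln p}{d\ln x} = -\eta\left(\frac{x}{\sigma}\right)^{\eta} = -1 \ \text{at}\ x=\sigma\,\eta^{-1/\eta}.$$

The paper's requirement is that a universal entropy evaluate uncertainty at exactly this point for its own maximizing family, the coupled stretched exponentials.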

Core claim

The central claim is that the coupled entropy is the unique entropy satisfying the full set of axioms for complex systems. It measures uncertainty precisely at the informational scale of its maximizing distribution where the log-log slope equals negative one, and it is extensive across every Hanel-Thurner universality scaling class. The non-additivity parameter equals the long-range dependence or nonlinear statistical coupling, while the matched extensivity is fixed by the coupling, stretching parameter, and dimensions. This removes the misalignment produced by Tsallis q-statistics and supports consistent information-thermodynamic applications.
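
For orientation, the familiar Tsallis composition rule shows how a single parameter can encode non-additivity; this is a reference point rather than the paper's own composition law, which is not reproduced here. For independent subsystems $A$ and $B$ (with $k_B = 1$),

$$S_q(A+B) = S_q(A) + S_q(B) + (1-q)\,S_q(A)\,S_q(B),$$

so the departure from additivity is governed entirely by $1-q$. The paper's stronger claim is that the analogous parameter of the coupled entropy is not a free index but equals the system's long-range dependence.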

What carries the argument

The coupled entropy, maximized by the coupled stretched exponential distributions, enforces uncertainty measurement at the log-log slope of negative one and guarantees extensivity across all scaling classes.

If this is right

  • The non-additivity of the entropy equals the long-range dependence or nonlinear statistical coupling of the system.
  • Entropy-matched extensivity is a function of the coupling, stretching parameter, and dimensions (a toy extensivity check follows this list).
  • Tsallis q-statistics produces misalignment when used for physical modeling of complex systems.
  • The framework supports applications including complexity measurement, a zeroth law of temperature, thermodynamic consistency of the coupled free energy, and modeling intelligence in non-equilibrium conditions.
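
What "entropy-matched extensivity" means operationally can be shown with a toy check. The sketch below is not the paper's construction: it substitutes the Tsallis entropy and a hypothetical power-law phase-space growth $W(N) = N^b$ for the coupled entropy and the full Hanel-Thurner classes, purely to illustrate that $S/N$ approaches a constant only when the deformation parameter is matched to the growth law (here $q = 1 - 1/b$).

    import numpy as np

    def tsallis_entropy_uniform(W, q):
        # Tsallis entropy of a uniform distribution over W states (k_B = 1):
        # S_q = (W**(1 - q) - 1) / (1 - q), reducing to ln W as q -> 1.
        if np.isclose(q, 1.0):
            return np.log(W)
        return (W**(1.0 - q) - 1.0) / (1.0 - q)

    # Toy phase-space growth law: W(N) = N**b states for N elements.
    b = 2.0
    N = np.array([1e1, 1e2, 1e3, 1e4])
    W = N**b

    for q in (1.0, 1.0 - 1.0 / b):  # Boltzmann-Gibbs index vs. the matched index
        S = tsallis_entropy_uniform(W, q)
        # S/N settles to a constant only for the matched q: the entropy is
        # extensive only when its deformation parameter fits the growth law.
        print(f"q = {q:.2f}  S/N = {np.round(S / N, 4)}")

The paper's claim goes further: the coupled entropy is said to achieve this matching across every Hanel-Thurner class at once, with the matched value fixed by the coupling, stretching parameter, and dimensions.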

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The same scale and extensivity conditions could be used to test whether other generalized entropies map onto the coupled form through parameter redefinition.
  • Direct measurement of entropy extensivity in physical systems known to follow stretched exponential statistics would provide a concrete check on the claimed universality.
  • The link between non-additivity and coupling suggests a route to derive consistent temperature definitions in systems with long-range interactions beyond the cases examined.

Load-bearing premise

Entropy must measure uncertainty exactly at the informational scale of the maximizing distribution, where the log-log slope equals negative one, and it must be extensive across the full set of Hanel-Thurner universality scaling classes.
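
A minimal numerical reading of the scale half of this premise, assuming a generic stretched-exponential density rather than the paper's coupled family: the condition is a one-dimensional root-finding problem with a single solution.

    import numpy as np
    from scipy.optimize import brentq

    def loglog_slope(logpdf, x, eps=1e-6):
        # Numerical d(log p)/d(log x) via a symmetric difference in log x.
        lx = np.log(x)
        return (logpdf(np.exp(lx + eps)) - logpdf(np.exp(lx - eps))) / (2 * eps)

    # Unnormalized stretched-exponential density; normalization only shifts
    # log p by a constant and does not change the log-log slope.
    sigma, eta = 2.0, 0.7
    logpdf = lambda x: -(x / sigma)**eta

    # The informational scale: the x at which the log-log slope equals -1.
    x_star = brentq(lambda x: loglog_slope(logpdf, x) + 1.0, 1e-3, 1e3)
    print(x_star, sigma * eta**(-1.0 / eta))  # numerical root vs. analytic value

Whether uncertainty should be evaluated at this single scale, and whether the condition can be derived without already assuming the coupled family, is what the referee report and circularity check below press on.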

What would settle it

An explicit construction of a different entropy that satisfies both the scale-specific uncertainty condition and extensivity over all Hanel-Thurner classes, or experimental data from a system with known nonlinear coupling showing entropy values inconsistent with the coupled form.

read the original abstract

An axiomatic foundation regarding the entropy for complex systems is established. Missing from decades of research was the requirement that entropy must measure the uncertainty at the informational scale of the maximizing distribution, where the log-log slope equals $-1$. Additionally, entropy must be extensive across the full universality scaling classes defined by Hanel-Thurner. The coupled entropy, maximized by the coupled stretched exponential distributions, is proven to be the unique, universal entropy that satisfies these requirements. The non-additivity of the entropy is equal to the long-range dependence or nonlinear statistical coupling. The entropy-matched extensivity is a function of the coupling, stretching parameter, and dimensions. Evidence is provided that the Tsallis $q$-statistics creates misalignment in the physical modeling of complex systems. Information thermodynamic applications are reviewed, including measuring complexity, a zeroth law of temperature, the thermodynamic consistency of the coupled free energy, and a model of intelligence in non-equilibrium.

Editorial analysis

A structured set of objections, weighed in public.

A referee report, a simulated author's rebuttal, a circularity check, and an axiom and free-parameter ledger. Tearing a paper down is the easy half of reading it; the pith above is the substance, and this is the friction.

Referee Report

2 major / 1 minor

Summary. The manuscript establishes an axiomatic foundation for the entropy of complex systems. It asserts that two requirements were missing from prior work: entropy must measure uncertainty precisely at the informational scale of the maximizing distribution where the log-log slope equals -1, and entropy must be extensive across the full Hanel-Thurner universality scaling classes. The coupled entropy, maximized by coupled stretched exponential distributions, is claimed to be the unique universal entropy satisfying these axioms. Non-additivity is stated to equal long-range dependence or nonlinear statistical coupling, with entropy-matched extensivity depending on the coupling and stretching parameters plus dimensions. Applications to information thermodynamics, including complexity measurement, a zeroth law, and thermodynamic consistency of the coupled free energy, are reviewed, along with evidence that Tsallis q-statistics misaligns with physical modeling of complex systems.

Significance. If the uniqueness result is established via independent, non-circular derivations and the extensivity holds over the claimed scaling classes, the work could supply a principled entropy for systems with long-range dependence, potentially unifying aspects of non-extensive statistics with Hanel-Thurner classes. The explicit linkage of non-additivity to coupling and the reviewed thermodynamic applications would strengthen its utility in information thermodynamics. The absence of visible proof steps or checks against data in the abstract, however, leaves the significance conditional on verification of the central derivation.

major comments (2)
  1. [Abstract] The claim that the coupled entropy 'is proven to be the unique, universal entropy' is asserted without any derivation, key intermediate equations, or explicit verification that the 'informational scale where the log-log slope equals -1' is obtained independently of the functional form of the coupled stretched exponential distributions. This makes it impossible to assess whether the uniqueness follows from the axioms or is selected by construction.
  2. [Abstract] The assertion that 'the non-additivity of the entropy is equal to the long-range dependence' is presented as a direct consequence, yet no steps are shown establishing how this equality arises from the two new requirements rather than from the prior definition of the coupling parameter. If the coupling parameter is fitted or defined via the target distributions, the equality risks reducing to a tautology.
minor comments (1)
  1. The abstract states that 'evidence is provided' that Tsallis q-statistics creates misalignment but does not indicate the form of that evidence (analytic, numerical, or empirical) or the section in which it appears.

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for their careful reading and constructive feedback on our manuscript. The comments highlight opportunities to improve the clarity of the abstract, which we have addressed through targeted revisions while preserving the paper's core claims. The full derivations and proofs are contained in the main text; we agree that the abstract can better signpost these without expanding its length unduly. Below we respond point by point to the major comments.

read point-by-point responses
  1. Referee: [Abstract] The claim that the coupled entropy 'is proven to be the unique, universal entropy' is asserted without any derivation, key intermediate equations, or explicit verification that the 'informational scale where the log-log slope equals -1' is obtained independently of the functional form of the coupled stretched exponential distributions. This makes it impossible to assess whether the uniqueness follows from the axioms or is selected by construction.

    Authors: The abstract is a concise summary of the central result. The independent derivation of the informational scale condition (log-log slope exactly -1) from the two axiomatic requirements, without presupposing the functional form of the maximizing distributions, is carried out in Sections II and III via functional equations that first fix the scale and then impose extensivity across all Hanel-Thurner classes. This yields the coupled entropy as the unique solution. We have revised the abstract to reference these sections explicitly and to note that the scale condition is solved prior to identifying the distribution family, allowing readers to trace the non-circular logic. revision: yes

  2. Referee: [Abstract] The assertion that 'the non-additivity of the entropy is equal to the long-range dependence' is presented as a direct consequence, yet no steps are shown establishing how this equality arises from the two new requirements rather than from the prior definition of the coupling parameter. If the coupling parameter is fitted or defined via the target distributions, the equality risks reducing to a tautology.

    Authors: The equality is derived, not assumed a priori. The coupling parameter is obtained by enforcing extensivity over the full set of Hanel-Thurner scaling classes once the informational scale condition has fixed the log-log slope at -1; non-additivity then equals the resulting long-range dependence by construction of the entropy functional (detailed after Eq. (12) and in the extensivity analysis). The parameter is not fitted to any distribution but emerges as the unique value satisfying both axioms simultaneously. We have added a clarifying clause to the abstract and a short explanatory sentence in the introduction to make this derivation sequence explicit. revision: yes

Circularity Check

1 step flagged

Uniqueness proof defines informational scale using properties of the target coupled stretched-exponential distributions

specific steps
  1. self-definitional [Abstract]
    "Missing from decades of research was the requirement that entropy must measure the uncertainty at the informational scale of the maximizing distribution, where the log-log slope equals -1. ... The coupled entropy, maximized by the coupled stretched exponential distributions, is proven to be the unique, universal entropy that satisfies these requirements."

    The new requirement is phrased using the functional form and log-log behavior of the coupled stretched-exponential distributions themselves. The uniqueness proof then selects the entropy maximized by exactly those distributions, making the result hold by the choice of axiom rather than independent derivation.

full rationale

The paper introduces a new axiom requiring entropy to measure uncertainty precisely at the scale where the maximizing distribution has log-log slope -1, then proves the coupled entropy (maximized by coupled stretched exponentials) is the unique entropy satisfying this plus Hanel-Thurner extensivity. This creates a self-definitional loop: the scale condition is stated in terms of the very distributions that maximize the proposed entropy, so the uniqueness result holds by construction once the axiom is accepted. No independent derivation of the slope = -1 condition from more primitive axioms is evident in the provided text. The non-additivity claim is also tied directly to the coupling parameter without external justification shown. This matches a moderate circularity level (one load-bearing self-definitional step) but does not reduce the entire derivation to a tautology.

Axiom & Free-Parameter Ledger

2 free parameters · 2 axioms · 0 invented entities

The central claim rests on two new domain assumptions about what entropy must satisfy and on the coupling and stretching parameters, whose status as free or derived is not clarified in the abstract.

free parameters (2)
  • coupling parameter
    Non-additivity is stated to equal the nonlinear statistical coupling, implying this parameter is central and likely chosen or fitted to match system behavior.
  • stretching parameter
    Appears in the maximizing distributions and in the extensivity function, suggesting it is a key adjustable element.
axioms (2)
  • domain assumption Entropy must measure the uncertainty at the informational scale of the maximizing distribution where the log-log slope equals -1
    Presented as a previously missing requirement in the axiomatic foundation.
  • domain assumption Entropy must be extensive across the full universality scaling classes defined by Hanel-Thurner
    Second required property for the entropy to be universal.

pith-pipeline@v0.9.0 · 5451 in / 1579 out tokens · 59571 ms · 2026-05-12T02:28:38.875469+00:00 · methodology


Lean theorems connected to this paper

Citations machine-checked in the Pith Canon. Every link opens the source theorem in the public Lean library.

Reference graph

Works this paper leans on

97 extracted references · 97 canonical work pages

  1. Lewin, R. Complexity: Life at the Edge of Chaos (University of Chicago Press, 1999).
  2. Marino, S., Hogue, I. B., Ray, C. J. & Kirschner, D. E. A methodology for performing global uncertainty and sensitivity analysis in systems biology. Journal of Theoretical Biology 254, 178–196 (2008). URL https://www.sciencedirect.com/science/article/pii/S0022519308001896
  3. Eskov, V. V., Filatova, D. Y., Ilyashenko, L. K. & Vochmina, Y. V. Classification of Uncertainties in Modeling of Complex Biological Systems. Moscow University Physics Bulletin 74, 57–63 (2019). URL https://doi.org/10.3103/S0027134919010089
  4. Holland, J. H. Complex Adaptive Systems. Daedalus 121, 17–30 (1992). URL https://www.jstor.org/stable/20025416?seq=1
  5. Yoshida, Z. Nonlinear Science: The Challenge of Complex Systems (Springer Science & Business Media, 2010).
  6. Shneiderman, B. Human-Centered Artificial Intelligence: Reliable, Safe & Trustworthy. International Journal of Human–Computer Interaction 36, 495–504 (2020). URL https://doi.org/10.1080/10447318.2020.1741118
  7. Almeida, J. S. Predictive non-linear modeling of complex data by artificial neural networks. Current Opinion in Biotechnology 13, 72–76 (2002). URL https://www.sciencedirect.com/science/article/pii/S0958166902002884
  8. Benderskaya, E. N. in Nonlinear Trends in Modern Artificial Intelligence: A New Perspective (eds Kelemen, J., Romportl, J. & Zackova, E.) Beyond Artificial Intelligence: Contemplations, Expectations, Applications 113–124 (Springer, Berlin, Heidelberg, 2013). URL https://doi.org/10.1007/978-3-642-34422-0_8
  9. Dietterich, T. G. Steps Toward Robust Artificial Intelligence. AI Magazine 38, 3–24 (2017). URL https://ojs.aaai.org/aimagazine/index.php/aimagazine/article/view/2756
  10. Resnick, S. Heavy-Tail Phenomena: Probabilistic and Statistical Modeling (Springer, New York, NY, 2007). URL http://link.springer.com/10.1007/978-0-387-45024-7
  11. Casas-Vázquez, J. & Jou, D. Temperature in non-equilibrium states: a review of open problems and current proposals. Reports on Progress in Physics 66, 1937 (2003). URL https://doi.org/10.1088/0034-4885/66/11/R03
  12. Goldstein, S., Lebowitz, J. L., Tumulka, R. & Zanghì, N. in Gibbs and Boltzmann Entropy in Classical and Quantum Mechanics (ed. Allori, V.) Statistical Mechanics and Scientific Explanation 519–581 (World Scientific, Singapore, 2019). URL https://www.worldscientific.com/doi/abs/10.1142/9789811211720_0014
  13. Jaynes, E. T. Information Theory and Statistical Mechanics. Physical Review 106, 620–630 (1957).
  14. Arnold, B. C. in Pareto Distribution (ed. Balakrishnan) Wiley StatsRef: Statistics Reference Online 1–10 (John Wiley & Sons, Ltd, 2015). URL https://onlinelibrary.wiley.com/doi/abs/10.1002/9781118445112.stat01100.pub2
  15. Pickands III, J. Statistical Inference Using Extreme Order Statistics. The Annals of Statistics 3, 119–131 (1975). URL https://www.jstor.org/stable/2958083
  16. Pearson, E. S. & Wishart, J. "Student's" collected papers, issued by the Biometrika office. University College, London (1942).
  17. Rényi, A. On measures of entropy and information. Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability 1, 547–561 (1961). URL https://cir.nii.ac.jp/crid/1570291224329272960
  18. Sharma, B. D., Mitter, J. & Mohan, M. On measures of "useful" information. Information and Control 39, 323–336 (1978). URL https://www.sciencedirect.com/science/article/pii/S001999587890671X
  19. Sharma, B. D. & Mittal, D. P. New non-additive measures of entropy for discrete probability distributions. Journal of Combinatorics, Information & System Sciences 10, 28–40 (1975).
  20. Nielsen, F. & Nock, R. A closed-form expression for the Sharma–Mittal entropy of exponential families. Journal of Physics A: Mathematical and Theoretical 45, 032003 (2012). URL https://iopscience.iop.org/article/10.1088/1751-8113/45/3/032003
  21. Tsallis, C. Possible generalization of Boltzmann-Gibbs statistics. Journal of Statistical Physics 52, 479–487 (1988). URL https://doi.org/10.1007/BF01016429
  22. Tsallis, C. Nonadditive entropy: The concept and its use. The European Physical Journal A 40, 257 (2009). URL http://link.springer.com/10.1140/epja/i2009-10799-0
  23. Hanel, R. & Thurner, S. A comprehensive classification of complex statistical systems and an axiomatic derivation of their entropy and distribution functions. EPL (Europhysics Letters) 93, 20006 (2011).
  24. Tempesta, P. Beyond the Shannon–Khinchin formulation: The composability axiom and the universal-group entropy. Annals of Physics 365, 180–197 (2016). URL https://www.sciencedirect.com/science/article/pii/S0003491615003176
  25. Beck, C. Generalised information and entropy measures in physics. Contemporary Physics 50, 495–510 (2009). URL http://www.tandfonline.com/doi/abs/10.1080/00107510902823517
  26. Kaniadakis, G. Maximum entropy principle and power-law tailed distributions. The European Physical Journal B 70, 3–13 (2009). URL https://doi.org/10.1140/epjb/e2009-00161-0
  27. Cho, A. A Fresh Take on Disorder, Or Disorderly Science? Science 297, 1268–1269 (2002). URL https://www.science.org/doi/10.1126/science.297.5585.1268
  28. Salthe, S. N. in What is Infodynamics? (eds Ragsdell, G. & Wilby, J.) Understanding Complexity 31–38 (Springer US, Boston, MA, 2001). URL https://doi.org/10.1007/978-1-4615-1313-1_5
  29. Parrondo, J. M., Horowitz, J. M. & Sagawa, T. Thermodynamics of information. Nature Physics 11, 131–139 (2015). URL https://www.nature.com/articles/nphys3230
  30. Shannon, C. E. A Mathematical Theory of Communication. Bell System Technical Journal 27, 379–423 (1948). URL https://ieeexplore.ieee.org/document/6773024
  31. Fermi, E. Thermodynamics (Courier Corporation, 2012). URL https://books.google.com/books?hl=en&lr=&id=xCjDAgAAQBAJ&oi=fnd&pg=PP1&dq=thermodynamics&ots=Wed_YMSDWY&sig=ex7bCpsyEfHvuebB0SlaosX8MRA
  32. Lewis, G. N. & Randall, M. Thermodynamics (Courier Dover Publications, 2020). URL https://books.google.com/books?hl=en&lr=&id=CK7qDwAAQBAJ&oi=fnd&pg=PR5&dq=thermodynamics&ots=P67GjE-4uY&sig=D--pCO4WKqZ4KCnNGBt2cFWAxns
  33. Ma, S.-K. Statistical Mechanics (World Scientific Publishing Company, 1985). URL https://books.google.com/books?hl=en&lr=&id=ywc8DQAAQBAJ&oi=fnd&pg=PR7&dq=statistical+mechanics&ots=pvlUVrc8IS&sig=C35VHQ4h4a6RPOSYVuiNyRa6UUM
  34. Davidson, N. Statistical Mechanics (Courier Corporation, 2013). URL https://books.google.com/books?hl=en&lr=&id=-svCAgAAQBAJ&oi=fnd&pg=PA1&dq=statistical+mechanics&ots=llBQPvd52u&sig=P99pGBFXkJHn5n1x1smDAKWe5Go
  35. Tsallis, C. in Statistical mechanics for complex systems: On the structure of q-triplets (eds Duarte, S. et al.) Physical and Mathematical Aspects of Symmetries 51–59 (Springer International Publishing, Cham, 2017). URL http://link.springer.com/10.1007/978-3-319-69164-0_7
  36. Borges, E. P. A possible deformed algebra and calculus inspired in nonextensive thermostatistics. Physica A: Statistical Mechanics and its Applications 340, 95–101 (2004).
  37. Nelson, K. P., Umarov, S. R. & Kon, M. A. On the average uncertainty for systems with nonlinear coupling. Physica A: Statistical Mechanics and its Applications 468, 30–43 (2017). URL http://dx.doi.org/10.1016/j.physa.2016.09.046
  38. Ohara, A. Geometry of distributions associated with Tsallis statistics and properties of relative entropy minimization. Physics Letters A 370, 184–193 (2007). URL https://linkinghub.elsevier.com/retrieve/pii/S0375960107008250
  39. Amari, S.-i. & Ohara, A. Geometry of q-Exponential Family of Probability Distributions. Entropy 13, 1170–1185 (2011). URL https://www.mdpi.com/1099-4300/13/6/1170
  40. AL-Najafi, A., Tirnakli, U. & Nelson, K. P. Independent Approximates provide a maximum likelihood estimate for heavy-tailed distributions. Physica A: Statistical Mechanics and its Applications 690, 131442 (2026). URL https://linkinghub.elsevier.com/retrieve/pii/S0378437126001780
  41. Anteneodo, C. & Tsallis, C. Multiplicative noise: A mechanism leading to nonextensive statistical mechanics. Journal of Mathematical Physics 44, 5194–5203 (2003). URL https://doi.org/10.1063/1.1617365
  42. Shalizi, C. R. Tsallis Statistics, Statistical Mechanics for Non-extensive Systems and Long-Range Interactions (2021). URL http://bactra.org/notebooks/tsallis.html
  43. Pierrard, V. & Lazar, M. Kappa Distributions: Theory and Applications in Space Plasmas. Solar Physics 267, 153–174 (2010). URL https://doi.org/10.1007/s11207-010-9640-2
  44. Beck, C. & Schögl, F. Thermodynamics of Chaotic Systems: An Introduction (Cambridge University Press, 1993).
  45. Ferri, G. L., Martínez, S. & Plastino, A. The role of constraints in Tsallis' nonextensive treatment revisited. Physica A: Statistical Mechanics and its Applications 347, 205–220 (2005). URL https://www.sciencedirect.com/science/article/pii/S0378437104011446
  46. Nelson, K. P. Independent Approximates enable closed-form estimation of heavy-tailed distributions. Physica A: Statistical Mechanics and its Applications 601, 127574 (2022). URL https://www.sciencedirect.com/science/article/pii/S0378437122003983
  47. Nelson, K. P. & Umarov, S. Nonlinear statistical coupling. Physica A: Statistical Mechanics and its Applications 389, 2157–2163 (2010). URL https://linkinghub.elsevier.com/retrieve/pii/S0378437110000993
  48. Nelson, K. P. in Reduced Perplexity: A Simplified Perspective on Assessing Probabilistic Forecasts (eds Chen, M., Dunn, J. M., Golan, A. & Ullah, A.) Advances in Info-Metrics: Information and Information Processing across Disciplines (Oxford University Press, 2020). URL https://doi.org/10.1093/oso/9780190636685.003.0012
  49. Wang, Q. A. Probability distribution and entropy as a measure of uncertainty. Journal of Physics A: Mathematical and Theoretical 41, 065004 (2008). URL http://arxiv.org/abs/cond-mat/0612076
  50. Hanel, R., Thurner, S. & Gell-Mann, M. Generalized entropies and logarithms and their duality relations. Proceedings of the National Academy of Sciences 109, 19151–19154 (2012). URL https://www.pnas.org/doi/abs/10.1073/pnas.1216885109
  51. Tempesta, P. & Jensen, H. J. Universality Classes and Information-Theoretic Measures of Complexity via Group Entropies. Scientific Reports 10, 5952 (2020). URL https://www.nature.com/articles/s41598-020-60188-y
  52. Tempesta, P. Multivariate group entropies, super-exponentially growing complex systems, and functional equations. Chaos: An Interdisciplinary Journal of Nonlinear Science 30, 123119 (2020). URL https://pubs.aip.org/cha/article/30/12/123119/282743/Multivariate-group-entropies-super-exponentially
  53. Thurner, S. & Hanel, R. The entropy of non-ergodic complex systems — a derivation from first principles. International Journal of Modern Physics: Conference Series 16, 105–115 (2012). URL https://www.worldscientific.com/doi/abs/10.1142/S2010194512007817
  54. Kaniadakis, G. Relativistic Roots of κ-Entropy. Entropy 26, 406 (2024). URL https://www.mdpi.com/1099-4300/26/5/406
  55. Hanel, R. & Corominas-Murtra, B. The Typical Set and Entropy in Stochastic Systems with Arbitrary Phase Space Growth. Entropy 25, 350 (2023). URL https://www.mdpi.com/1099-4300/25/2/350
  56. Khinchin, A. Y. Mathematical Foundations of Information Theory (Dover Publications, New York, NY, 1957). URL http://archive.org/details/khinchin-mathematical-foundations-of-information-theory
  57. Suyari, H. Generalization of Shannon-Khinchin axioms to nonextensive systems and the uniqueness theorem for the nonextensive entropy. IEEE Transactions on Information Theory 50, 1783–1787 (2004). URL https://ieeexplore.ieee.org/abstract/document/1317122
  58. Shiner, J. S. Simple measure for complexity. Physical Review E 59, 1459–1464 (1999).
  59. Ruiz, R. L., Mancini, H. & Calbet, X. in A statistical measure of complexity (eds Kowalski, Rossignoli & Curado) Concepts and Recent Advances in Generalized Information Measures and Statistics 147–168 (Bentham Science Publishers, The Netherlands, 2013). URL https://www.benthamdirect.com/content/books/9781608057603.chapter-7?crawler=true&mimetype=application/pdf
  60. Martin, M. T., Plastino, A. & Rosso, O. A. Generalized statistical complexity measures: Geometrical and analytical properties. Physica A: Statistical Mechanics and its Applications 369, 439–462 (2006). URL https://www.sciencedirect.com/science/article/pii/S0378437106001324
  61. Rudnicki, L., Toranzo, I. V., Sánchez-Moreno, P. & Dehesa, J. S. Monotone measures of statistical complexity. Physics Letters A 380, 377–380 (2016). URL https://www.sciencedirect.com/science/article/pii/S0375960115009196
  62. Naudts, J. Generalized thermostatistics based on deformed exponential and logarithmic functions. Physica A: Statistical Mechanics and its Applications 340, 32–40 (2004). URL https://www.sciencedirect.com/science/article/pii/S0378437104003942
  63. Naudts, J. Generalised Thermostatistics (Springer, London, 2011). URL https://link.springer.com/10.1007/978-0-85729-355-8
  64. Curado, E. M. F. General aspects of the thermodynamical formalism. Brazilian Journal of Physics 29, 36–45 (1999). URL https://www.scielo.br/j/bjp/a/j8RxQBH9QrsrDLHN4M5C4fB/?lang=en
  65. Tsallis, C., Baldovin, F., Cerbino, R. & Pierobon, P. in Introduction to nonextensive statistical mechanics and thermodynamics (eds Mallamace, F. & Stanley, H. E.) The Physics of Complex Systems (New Advances and Perspectives), Vol. 155 of Proceedings of the International School of Physics "Enrico Fermi" 229–252 (IOS Press, Amsterdam, The Netherlands, 2004).
  66. Ferri, G. L., Martínez, S. & Plastino, A. Equivalence of the four versions of Tsallis's statistics. Journal of Statistical Mechanics: Theory and Experiment 2005, P04009 (2005). URL https://doi.org/10.1088/1742-5468/2005/04/P04009
  67. Nobre, F. D., Souza, A. M. C. & Curado, E. M. F. Effective-temperature concept: A physical application for nonextensive statistical mechanics. Physical Review E 86, 061113 (2012). URL https://link.aps.org/doi/10.1103/PhysRevE.86.061113
  68. Cleymans, J. & Worku, D. The Tsallis distribution in proton–proton collisions at $\sqrt{s}$ = 0.9 TeV at the LHC. Journal of Physics G: Nuclear and Particle Physics 39, 025006 (2012). URL https://iopscience.iop.org/article/10.1088/0954-3899/39/2/025006
  69. Cleymans, J. et al. Systematic properties of the Tsallis distribution: Energy dependence of parameters in high energy p–p collisions. Physics Letters B 723, 351–354 (2013). URL https://linkinghub.elsevier.com/retrieve/pii/S0370269313003985
  70. Friston, K. & Kiebel, S. Predictive coding under the free-energy principle. Philosophical Transactions of the Royal Society B: Biological Sciences 364, 1211–1221 (2009). URL https://royalsocietypublishing.org/doi/abs/10.1098/rstb.2008.0300
  71. Smith, R., Badcock, P. & Friston, K. J. Recent advances in the application of predictive coding and active inference models within clinical neuroscience. Psychiatry and Clinical Neurosciences 75, 3–13 (2021). URL https://onlinelibrary.wiley.com/doi/abs/10.1111/pcn.13138
  72. Millidge, B., Salvatori, T., Song, Y., Bogacz, R. & Lukasiewicz, T. Predictive Coding: Towards a Future of Deep Learning beyond Backpropagation? (2022). URL http://arxiv.org/abs/2202.09467
  73. Rosenbaum, R. On the relationship between predictive coding and backpropagation. PLOS ONE 17, e0266102 (2022). URL https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0266102
  74. Blei, D. M., Kucukelbir, A. & McAuliffe, J. D. Variational Inference: A Review for Statisticians. Journal of the American Statistical Association 112, 859–877 (2017). URL https://www.tandfonline.com/doi/full/10.1080/01621459.2017.1285773
  75. Goertzel, B. ActPC-Geom: Towards Scalable Online Neural-Symbolic Learning via Accelerating Active Predictive Coding with Information Geometry & Diverse Cognitive Mechanisms (2025). URL http://arxiv.org/abs/2501.04832
  76. Cao, S., Li, J., Nelson, K. P. & Kon, M. A. Coupled VAE: Improved Accuracy and Robustness of a Variational Autoencoder. Entropy 24, 423 (2022). URL https://www.mdpi.com/1099-4300/24/3/423
  77. Martins, A., Smith, N., Xing, E., Aguiar, P. & Figueiredo, M. Nonextensive information theoretic kernels on measures. Journal of Machine Learning Research (2009). URL http://www.jmlr.org/papers/v10/martins09a.html
  78. Li, Y. & Turner, R. E. Rényi Divergence Variational Inference. In Advances in Neural Information Processing Systems, Vol. 29 (eds Lee, Sugiyama, Luxburg, Guyon & Garnett) (Curran Associates, Inc., Barcelona, Spain, 2016). URL https://proceedings.neurips.cc/paper_files/paper/2016/hash/7750ca3559e5b8e1f44210283368fc...
  79. Kobayashis, T. q-VAE for Disentangled Representation Learning and Latent Dynamical Systems. IEEE Robotics and Automation Letters 5, 5669–5676 (2020). URL https://ieeexplore.ieee.org/abstract/document/9143393
  80. Nelson, K. P., Oliveira, I., Al-Najafi, A., Zhang, F. & Ng, H. K. T. Variational Inference Optimized Using the Curved Geometry of Coupled Free Energy. In Artificial General Intelligence: 18th International Conference, AGI 2025 (eds Iklé, M., Kolonin, A. & Bennett, M.) 433–445 (Springer Nature Switzerland, Reykjavik, I...

Showing first 80 references.