
IndisputableMonolith.Information.ShannonEntropy

view Lean formalization →

The ShannonEntropy module equips Recognition Science with discrete probability distributions and with Shannon entropy expressed as J-cost. Researchers deriving compression and error-correction bounds cite these primitives. The module supplies the ProbDist type, entropy functions, non-negativity results, and a direct equality between Shannon entropy and total J-cost, all built on the imported Cost definitions.
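For concreteness, here is a minimal Lean sketch of the shape these primitives could take. The field names, binders, and the use of Mathlib's `Real.logb` are assumptions for illustration, not the module's actual code.

```lean
import Mathlib

-- Hedged sketch: field names and signatures are assumptions, not the
-- actual code of IndisputableMonolith.Information.ShannonEntropy.

/-- A discrete probability distribution over `n` outcomes:
nonnegative weights that sum to 1. -/
structure ProbDist (n : ℕ) where
  p       : Fin n → ℝ
  nonneg  : ∀ i, 0 ≤ p i
  sum_one : ∑ i, p i = 1

/-- Shannon entropy `H(P) = -∑ pᵢ log₂ pᵢ`. Since `Real.logb 2 0 = 0`
in Mathlib, the usual `0 · log 0 = 0` convention holds term by term. -/
noncomputable def shannonEntropy {n : ℕ} (P : ProbDist n) : ℝ :=
  -∑ i, P.p i * Real.logb 2 (P.p i)
```

Carrying the proof obligations (`nonneg`, `sum_one`) inside the structure means every downstream lemma can assume a well-formed distribution without side conditions.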

claim

Let $P=(p_1,\dots,p_n)$ be a discrete probability distribution with $p_i \ge 0$ and $\sum_i p_i = 1$. Shannon entropy is $H(P) = -\sum_{i=1}^{n} p_i \log_2 p_i$, shown equal to the total J-cost of $P$.
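As a quick numerical check (illustrative, not part of the module): a fair coin carries exactly one bit, and the uniform distribution attains the maximum entropy noted in the background below:

$$H\!\left(\tfrac12,\tfrac12\right) = -\left(\tfrac12\log_2\tfrac12 + \tfrac12\log_2\tfrac12\right) = 1 \text{ bit}, \qquad H\!\left(\tfrac1n,\dots,\tfrac1n\right) = \log_2 n.$$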

background

The module sits in the Information domain and imports the RS time quantum τ₀=1 tick together with J-cost primitives. It defines ProbDist as a discrete probability distribution over n outcomes, introduces shannonEntropy, surprisal, and the relation shannon_equals_jcost. Additional results cover entropy non-negativity, maximality for the uniform distribution, and zero entropy for deterministic cases, all expressed in RS-native units.
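Continuing the sketch above (same caveats: everything beyond the names listed here is assumed), the surprisal and the headline relation might be stated as follows, with the J-cost side stubbed because its real definition is imported from the Cost module:

```lean
/-- Surprisal of outcome `i`: the per-outcome information `-log₂ pᵢ`. -/
noncomputable def surprisal {n : ℕ} (P : ProbDist n) (i : Fin n) : ℝ :=
  -Real.logb 2 (P.p i)

/-- Stub for the total J-cost of a distribution; the real definition is
imported from the Cost module, so only the type is sketched here. -/
noncomputable def totalJCost {n : ℕ} (_P : ProbDist n) : ℝ := sorry

/-- Statement shape of `shannon_equals_jcost`; the proof is omitted,
consistent with this being a definition module. -/
theorem shannon_equals_jcost {n : ℕ} (P : ProbDist n) :
    shannonEntropy P = totalJCost P := sorry

/-- Statement shape of entropy non-negativity, as named in the background. -/
theorem shannonEntropy_nonneg {n : ℕ} (P : ProbDist n) :
    0 ≤ shannonEntropy P := sorry
```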

proof idea

This is a definition module; it contains no proofs.

why it matters in Recognition Science

Definitions here supply the information measures required by the Compression module for INFO-003 (data compression limits from J-cost and Shannon source coding), by ErrorCorrectionBounds for INFO-005 (8-tick error correction), and by InformationIsLedger for IC-001 (information as the physical ledger).

scope and limits

used by (3)

From the project-wide theorem graph. These declarations reference this one in their bodies.

depends on (2)

Lean names referenced from this declaration's body.

declarations in this module (19)