pith. machine review for the scientific record.
def fairCoinEntropy

module: IndisputableMonolith.Information.Compression
domain: Information
line: 76 · github
papers citing: none yet

formal source

    H = -0.5 log₂(0.5) - 0.5 log₂(0.5) = 0.5 + 0.5 = 1 bit

    Can't compress below 1 bit per symbol! -/
noncomputable def fairCoinEntropy : ℝ :=
  -0.5 * log2 0.5 - 0.5 * log2 0.5

theorem fair_coin_one_bit :
    fairCoinEntropy = 1 := by
  unfold fairCoinEntropy
  simp only [show (0.5 : ℝ) = 1/2 from by norm_num]
  rw [log2_half]
  ring

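-- The proof above leans on `log2_half : log2 (1/2) = -1` (a lemma assumed
-- to come from this module); with it, the definition reduces to the
-- arithmetic identity checked below.
example : (-(1/2) : ℝ) * (-1) - (1/2) * (-1) = 1 := by norm_num
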
/-- Example: Biased coin (entropy < 1 bit).

    P(H) = 0.9, P(T) = 0.1
    H = -0.9 log₂(0.9) - 0.1 log₂(0.1)
      ≈ 0.137 + 0.332 ≈ 0.47 bits

    Can compress to ~0.47 bits per symbol! -/
noncomputable def biasedCoinEntropy : ℝ :=
  -0.9 * log2 0.9 - 0.1 * log2 0.1

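/-- Hypothetical generalization (not in the source file): entropy of a
    Bernoulli(p) source, of which `fairCoinEntropy` and `biasedCoinEntropy`
    above are the p = 0.5 and p = 0.9 instances. A sketch, assuming the
    module's `log2`. -/
noncomputable def bernoulliEntropy (p : ℝ) : ℝ :=
  -p * log2 p - (1 - p) * log2 (1 - p)

example : bernoulliEntropy 0.5 = fairCoinEntropy := by
  unfold bernoulliEntropy fairCoinEntropy
  norm_num
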
/-! ## J-Cost Connection -/

/-- In RS, compression is J-cost minimization:

    **Uncompressed data**: High redundancy = High J-cost
    **Compressed data**: No redundancy = Low J-cost
    **Perfect compression**: J-cost = entropy (minimum)

    Compression algorithms seek minimum J-cost! -/
theorem compression_is_jcost_minimization :
    -- Compression minimizes J-cost of representation