lemma log2_half
status: proved
module: IndisputableMonolith.Information.Compression
domain: Information
line: 45 · github
papers citing: none yet

formal source

lemma log2_two : log2 2 = 1 := Real.logb_self_eq_one (by norm_num : (1 : ℝ) < 2)

/-- log₂(1/2) = -1 -/
lemma log2_half : log2 (1/2 : ℝ) = -1 := by
  unfold log2
  simp only [one_div]
  rw [Real.logb_inv 2 2]
  rw [Real.logb_self_eq_one (by norm_num : (1 : ℝ) < 2)]
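
/- Editorial sanity check (not in the original file): assuming `log2 x`
   abbreviates `Real.logb 2 x`, as the `unfold log2` step above suggests,
   the lemma captures log₂(1/2) = log₂(2⁻¹) = -log₂ 2 = -1. The inverse
   form follows directly: -/
example : log2 (2⁻¹ : ℝ) = -1 := by
  rw [← one_div]; exact log2_half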

/-! ## Source Coding Theorem -/

/-- Shannon's source coding theorem (noiseless coding theorem):

    For a source with entropy H(X):
    - Average code length L ≥ H(X)
    - Equality achievable in the limit of long sequences

    This is the fundamental compression limit! -/
theorem source_coding_theorem :
    -- L ≥ H for any uniquely decodable code
    True := trivial
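
/- Editorial note, not in the source: the statement above is a `True`
   placeholder. The intended inequality, for a uniquely decodable binary
   code with codeword lengths ℓᵢ on symbols with probabilities pᵢ, is

     Kraft:  Σᵢ 2^(-ℓᵢ) ≤ 1
     Gibbs:  L - H(X) = Σᵢ pᵢ·ℓᵢ + Σᵢ pᵢ·log₂ pᵢ
                      = Σᵢ pᵢ·log₂ (pᵢ / 2^(-ℓᵢ))
                      ≥ -log₂ (Σᵢ 2^(-ℓᵢ)) ≥ 0,

   so L ≥ H(X), with equality iff pᵢ = 2^(-ℓᵢ) for every i. -/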

/-- The entropy of a source with symbol probabilities. -/
noncomputable def sourceEntropy (probs : List ℝ) : ℝ :=
  -probs.foldl (fun acc p => acc + p * log2 p) 0
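
/- Editorial usage sketch (not in the original source): for the fair coin
   treated below, `sourceEntropy` should give exactly one bit,
     sourceEntropy [1/2, 1/2] = -((1/2)·(-1) + (1/2)·(-1)) = 1,
   via `log2_half`. A plausible proof, assuming the `simp only` set
   unfolds the definition and the list fold as written: -/
example : sourceEntropy [1/2, 1/2] = 1 := by
  simp only [sourceEntropy, List.foldl_cons, List.foldl_nil, log2_half]
  norm_num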

/-! ## Compression Examples -/

/-- Example: Fair coin (entropy = 1 bit).

    P(H) = 0.5, P(T) = 0.5
    H = -0.5 log₂(0.5) - 0.5 log₂(0.5) = 0.5 + 0.5 = 1 bit

    Can't compress below 1 bit per symbol! -/