def

biasedCoinEntropy

module
IndisputableMonolith.Information.Compression
domain
Information
line
93 · github
papers citing
none yet

depends on

formal source

      ≈ 0.137 + 0.332 ≈ 0.47 bits

    Can compress to ~0.47 bits per symbol! -/
noncomputable def biasedCoinEntropy : ℝ :=
  -0.9 * log2 0.9 - 0.1 * log2 0.1

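/- Worked check (an editorial sketch, not part of the source file): with
   log2 0.9 ≈ -0.152 and log2 0.1 ≈ -3.322,
     -0.9 * log2 0.9 ≈ 0.9 × 0.152 ≈ 0.137
     -0.1 * log2 0.1 ≈ 0.1 × 3.322 ≈ 0.332
   so biasedCoinEntropy ≈ 0.469, the ≈ 0.47 bits per symbol quoted in
   the docstring. -/
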
/-! ## J-Cost Connection -/

/-- In RS, compression is J-cost minimization:

    **Uncompressed data**: High redundancy = High J-cost
    **Compressed data**: No redundancy = Low J-cost
    **Perfect compression**: J-cost = entropy (minimum)

    Compression algorithms seek minimum J-cost! -/
theorem compression_is_jcost_minimization :
    -- Compression minimizes J-cost of representation
    True := trivial

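/- Illustrative contrast (editorial sketch): storing 100 flips of the
   biased coin naively costs 100 bits (high redundancy, high J-cost);
   an entropy coder needs only about 47 bits, the minimum J-cost. -/
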
/-- The J-cost of a message:

    J(message) = length × (1 - redundancy)

    Maximum compression: J = entropy (no redundancy left).

    This explains why you can't compress random data:
    Random data already has minimum J-cost for its entropy! -/
noncomputable def messageJCost (length redundancy : ℝ) : ℝ :=
  length * (1 - redundancy)

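/- Usage sketch (editorial, not part of the source file): at maximum
   compression the redundancy of a source with per-symbol entropy H is
   1 - H, so the J-cost per symbol collapses to the entropy, as the
   docstring states. Both `example`s below assume the surrounding
   file's Mathlib context. -/
example (len H : ℝ) : messageJCost len (1 - H) = len * H := by
  unfold messageJCost; ring

/- Numeric check for the biased coin (H ≈ 0.47, redundancy ≈ 0.53):
   a 100-symbol message compresses to a J-cost of 47 bits. -/
example : messageJCost 100 0.53 = 47 := by
  unfold messageJCost; norm_num
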
/-! ## Lossless vs Lossy Compression -/

/-- Lossless compression:
    - Exact reconstruction possible