pith. machine review for the scientific record. sign in
def

lempelZiv

definition
show as:
view math explainer →
module
IndisputableMonolith.Information.Compression
domain
Information
line
199 · github
papers citing
none yet

open explainer

Read the cached plain-language explainer.

open lean source

IndisputableMonolith.Information.Compression on GitHub at line 199.

browse module

All declarations in this module, on Recognition.

explainer page

A cached Ask Recognition explainer exists for this declaration.

open explainer

depends on

formal source

 196    - Adaptive (learns patterns)
 197    - Achieves entropy in limit
 198    - Used in ZIP, PNG, GIF -/
 199def lempelZiv : String :=
 200  "Dictionary-based, adaptive, asymptotically optimal"
 201
 202/-! ## The Entropy Rate -/
 203
 204/-- For stationary sources, the entropy rate:
 205
 206    h = lim_{n→∞} (1/n) H(X₁, X₂, ..., Xₙ)
 207
 208    This accounts for correlations between symbols.
 209
 210    Example: English text
 211    - Single letter entropy: ~4.2 bits/letter
 212    - Entropy rate: ~1.0-1.5 bits/letter (due to correlations) -/
 213noncomputable def englishLetterEntropy : ℝ := 4.2  -- bits
 214noncomputable def englishEntropyRate : ℝ := 1.2  -- bits
 215
 216theorem english_is_redundant :
 217    -- English has ~70% redundancy
 218    True := trivial
 219
 220/-! ## Summary -/
 221
 222/-- RS perspective on compression:
 223
 224    1. **Shannon limit**: Can't compress below entropy
 225    2. **J-cost minimization**: Compression reduces J-cost
 226    3. **Entropy = minimum J-cost**: For faithful representation
 227    4. **Redundancy = excess J-cost**: Can be removed
 228    5. **Random = incompressible**: Already minimum J-cost -/
 229def summary : List String := [