pith. machine review for the scientific record.
def kolmogorovComplexity
definition
module: IndisputableMonolith.Information.Compression
domain: Information
line: 162 · github
papers citing: none yet

depends on

formal source

 159    K(x) ≈ length(x) for random strings
 160
 161    In RS: K(x) = minimum ledger description of x -/
 162  def kolmogorovComplexity : String :=
 163  "Shortest program length to output x"
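The declaration above is a placeholder string; the underlying notion — K(x) as the length of the shortest program that outputs x — is uncomputable, but any concrete compressor certifies an upper bound K(x) ≤ |compress(x)| + c. A minimal Lean sketch of that reading (names are illustrative, not part of the source module):

    -- any compressor whose output decodes back to x gives an upper
    -- bound on K(x): the compressed length plus a constant for the
    -- decoder itself
    def kUpperBound (compress : String → String) (x : String) : Nat :=
      (compress x).length

    -- the identity compressor recovers the trivial bound K(x) ≤ |x| + c
    example : kUpperBound id "hello" = 5 := rfl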
 164
 165  /-- Incompressibility:
 166
 167    Most strings are incompressible!
 168
 169    For strings of length n:
 170    - At most 2^(n-1) can compress to n-1 bits
 171    - Most strings have K(x) ≈ n
 172
 173    Random = incompressible = maximum J-cost-to-entropy ratio -/
 174  theorem most_strings_incompressible :
 175    -- Most random strings can't be compressed
 176    True := trivial
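The counting bound in the docstring can be made concrete: there are 2^n strings of length n, but only 2^0 + 2^1 + … + 2^(n-1) = 2^n − 1 descriptions shorter than n bits, so at least one length-n string is incompressible. A small Lean check of the count (helper name assumed, not from the source module):

    -- number of bit strings of length strictly less than n
    def shortDescriptionCount (n : Nat) : Nat :=
      (List.range n).foldl (fun acc k => acc + 2 ^ k) 0

    -- 2^n strings of length n, but only 2^n - 1 shorter descriptions,
    -- so the pigeonhole principle leaves some string uncompressed
    example : shortDescriptionCount 8 = 2 ^ 8 - 1 := rfl
    example : shortDescriptionCount 8 < 2 ^ 8 := by decide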
 177
 178  /-! ## Practical Compression Algorithms -/
 179
 180  /-- Huffman coding:
 181    - Optimal for symbol-by-symbol coding
 182    - L ≤ H + 1 (within 1 bit of entropy)
 183    - Uses shorter codes for common symbols -/
 184  def huffmanCoding : String :=
 185  "Optimal prefix-free code, L ≤ H + 1"
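For a dyadic distribution (all probabilities powers of 1/2), the Huffman bound L ≤ H + 1 is met with L = H exactly: probabilities (1/2, 1/4, 1/8, 1/8) get code lengths (1, 2, 3, 3). A numeric sketch in Lean (the distribution is illustrative, not from the source):

    -- (probability, Huffman code length) pairs for a dyadic source
    def dyadic : List (Float × Nat) :=
      [(0.5, 1), (0.25, 2), (0.125, 3), (0.125, 3)]

    -- expected code length L = Σ p·ℓ; here L = 1.75 bits = H exactly
    def expectedLength : Float :=
      dyadic.foldl (fun acc pl => acc + pl.1 * pl.2.toFloat) 0.0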
 186
 187  /-- Arithmetic coding:
 188    - Near-optimal for any distribution
 189    - L → H as message length → ∞
 190    - Encodes message as a single number -/
 191  def arithmeticCoding : String :=
 192  "Near-optimal, L → H for long messages"
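The "single number" encoding can be sketched as interval narrowing: each symbol shrinks [lo, hi) in proportion to its probability, and any number in the final interval identifies the whole message. A toy Lean sketch for a binary source with an assumed P(false) = 0.8 (names and probability are illustrative, not from the source):

    -- narrow [lo, hi) by one input bit; false takes the low 0.8 slice
    def narrow (lo hi : Float) (b : Bool) : Float × Float :=
      let split := lo + 0.8 * (hi - lo)
      if b then (split, hi) else (lo, split)

    -- fold over the message: the final interval's width is the product
    -- of the symbol probabilities, so -log₂(width) is the ideal length
    def encode (bs : List Bool) : Float × Float :=
      bs.foldl (fun iv b => narrow iv.1 iv.2 b) (0.0, 1.0)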