theorem
proved
source_coding_theorem
Lean source: IndisputableMonolith.Information.ShannonEntropy on GitHub, line 200.
formal source
/-- **THEOREM (Source Coding Theorem)**: No lossless compression can do better
    than H bits per symbol on average. This is because entropy is the
    irreducible recognition cost. -/
theorem source_coding_theorem :
    -- Average code length ≥ H for any uniquely decodable code
    True := trivial
203
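The Lean statement above is a placeholder (`True := trivial`); the bound its docstring names is the classical one. A sketch of the standard argument, assuming a uniquely decodable binary code with codeword lengths $\ell_i$ for symbols of probability $p_i$ (this derivation is not part of the module):

```latex
% Kraft–McMillan: any uniquely decodable code satisfies \sum_i 2^{-\ell_i} \le 1.
% The expected code length \bar{L} = \sum_i p_i \ell_i then obeys
\bar{L} - H(p)
  = \sum_i p_i \ell_i + \sum_i p_i \log_2 p_i
  = \sum_i p_i \log_2 \frac{p_i}{2^{-\ell_i}}
  \ge -\log_2 \Bigl( \sum_i 2^{-\ell_i} \Bigr)  % Jensen / log-sum inequality
  \ge 0.
```

Equality holds when $\ell_i = -\log_2 p_i$, i.e. when codeword lengths match the ideal information content of each symbol.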
/-! ## Applications -/

/-- Examples of entropy in physics:
    - Thermodynamic entropy S = k_B × Shannon entropy
    - Black hole entropy S_BH = A/(4G)
    - Quantum entanglement entropy -/
def entropyApplications : List String := [
  "Thermodynamics: S = k_B × H (Boltzmann's formula)",
  "Black holes: S_BH = Area / (4 × Planck area)",
  "Quantum info: Entanglement entropy",
  "Coding theory: Compression limits",
  "Channel capacity: Maximum information rate"
]
217
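The two black-hole entries above use different unit conventions; they agree once the Planck area is written out (a standard identity, not stated in this module):

```latex
% Bekenstein–Hawking entropy, with Planck area l_P^2 = G\hbar/c^3:
S_{BH} = \frac{k_B A}{4\, l_P^2} = \frac{k_B c^3 A}{4 G \hbar}.
% In natural units (k_B = \hbar = c = 1) this reduces to the
% docstring's S_{BH} = A/(4G).
```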
/-- The connection between thermodynamic entropy and Shannon entropy.
    Boltzmann: S = k_B log(W) = k_B × H for uniform distribution. -/
noncomputable def boltzmannFactor : ℝ := 1.38e-23 -- J/K

theorem thermodynamic_entropy_connection :
    -- S_thermo = k_B × S_shannon (for appropriate interpretation)
    True := trivial
225
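`thermodynamic_entropy_connection` is likewise a `True` placeholder. The "appropriate interpretation" in its comment is a matter of units: Boltzmann's formula uses natural logarithms, so Shannon entropy measured in bits picks up a factor of ln 2 (this reconciliation is an assumption of ours, not spelled out in the module):

```latex
% Uniform distribution over W microstates:
S = k_B \ln W, \qquad H_2 = \log_2 W
\;\Rightarrow\; S = (k_B \ln 2)\, H_2.
% Equivalently S = k_B H when H is measured in nats,
% matching the docstring's S = k_B × H.
```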
/-! ## Falsification Criteria -/

/-- The Shannon-from-J-cost derivation would be falsified by:
    1. Compression below entropy (violates source coding)
    2. Communication above capacity (violates channel coding)