pith. machine review for the scientific record.

messageJCost


messageJCost defines the J-cost of a message as its length multiplied by one minus its redundancy. Information theorists deriving compression bounds in Recognition Science would cite this expression when linking entropy to minimum organizational cost. The definition is a direct algebraic product with no lemmas or tactics applied.

claim

For a message of length $l$ and redundancy $r$, the J-cost is $J = l(1-r)$. Maximum compression sets $J$ equal to the source entropy when all redundancy is removed.

background

The Information.Compression module derives fundamental limits on data compression from J-cost. J-cost measures the organizational cost of a message, with compression acting as J-cost minimization while preserving faithful representation. The module restates Shannon's source coding theorem as the statement that average code length cannot fall below entropy $H(X)$ and identifies this bound with minimum J-cost in Recognition Science units.
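In the module's reading, the two statements can be combined. Writing $\bar{\ell}$ for the average code length (a shorthand introduced here, not a name from the module), the source coding bound and its J-cost identification become:

$$\bar{\ell} \;\ge\; H(X), \qquad J_{\min} \;=\; \ell_{\min}\,(1 - 0) \;=\; H(X),$$

where the second equality reflects the claim above: a maximally compressed message has redundancy $r = 0$ and length equal to the source entropy.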

proof idea

This is a one-line definition that directly computes the product of length and (1 minus redundancy). No upstream lemmas are invoked beyond the arithmetic operations on real numbers.
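As a sketch of how the definition unfolds on a concrete instance, the following example assumes the `messageJCost` definition from the formal statement below is in scope, along with Mathlib's `norm_num` tactic:

```lean
-- Worked instance (illustrative, not part of the module):
-- a message of length 100 with redundancy 0.25 has
-- J-cost 100 * (1 - 0.25) = 75.
example : messageJCost 100 0.25 = 75 := by
  norm_num [messageJCost]
```

Because the body is a bare product of real numbers, unfolding the definition and normalizing the arithmetic is all the proof requires.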

why it matters in Recognition Science

The definition supplies the explicit J-cost expression used to connect compression to the Recognition Composition Law and the reparametrized cost function $H(x) = J(x) + 1$. It supports the module claim that entropy equals minimum J-cost for lossless representation and feeds the sibling result that compression is J-cost minimization. The construction aligns with the broader framework in which information carries J-cost derived from the functional equation and the eight-tick octave structure.

scope and limits

formal statement (Lean)

noncomputable def messageJCost (length redundancy : ℝ) : ℝ :=

proof body

Definition body.

  length * (1 - redundancy)

/-! ## Lossless vs Lossy Compression -/

/-- Lossless compression:
    - Exact reconstruction possible
    - Limit: H(X) bits
    - Examples: ZIP, PNG, FLAC

    In RS: Preserves all ledger information -/

depends on (7)

Lean names referenced from this declaration's body.