1. Plain English Statement
The declaration states that the $N$-dimensional aggregate cost of the "all-ones" vector is exactly zero, regardless of the dimension size or the specific weights applied to each component.
2. Importance in Recognition Science
In Recognition Science, cost represents the structural "effort" of distinguishing states. The foundational scalar identity is that self-recognition (a ratio of 1) is cost-free. When the framework is extended to $N$-dimensional compound states, the JcostN function aggregates individual component costs. This theorem confirms that the multi-component aggregate preserves the baseline exactly: a state experiencing no change or distinction across any of its $N$ dimensions incurs zero cost, anchoring the ledger's absolute floor.
3. Reading the Formal Statement
```lean
theorem JcostN_unit {n : ℕ} (α : Vec n) :
    JcostN α (fun _ => 1) = 0
```
- `{n : ℕ}`: The theorem holds for any natural number $n$, representing the number of dimensions.
- `(α : Vec n)`: It accepts any vector $\alpha$ of $n$ real numbers, which act as the aggregation weights.
- `(fun _ => 1)`: This is Lean's way of writing a constant function returning `1` for every index (the all-ones vector).
- `JcostN α ... = 0`: The $N$-dimensional cost function, evaluated with weights $\alpha$ on the all-ones vector, equals `0`.
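The definitions of `Vec` and the all-ones vector are not shown in this excerpt. A minimal sketch, assuming `Vec n` is the function type `Fin n → ℝ` (this representation is an assumption, not confirmed by the source):

```lean
import Mathlib

-- Hypothetical sketch only: the project's actual definition may differ.
-- Assumption: `Vec n` is a function from indices to reals.
def Vec (n : ℕ) := Fin n → ℝ

-- Under this reading, the all-ones vector is the constant function
-- `fun _ => 1`: it ignores its index argument and returns 1 everywhere.
example {n : ℕ} (i : Fin n) : (fun _ : Fin n => (1 : ℝ)) i = 1 := rfl
```

On this reading, `JcostN α (fun _ => 1)` is the aggregate cost of the vector $(1, 1, \ldots, 1)$ with weights $\alpha$.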
4. Visible Dependencies
The supplied proof uses the `simp` tactic:

```lean
simp [JcostN, JlogN, dot, logVec, Jcost_unit0]
```
This reveals the mechanical steps:
- logVec takes the natural log of each component of the all-ones vector, yielding a vector of zeros.
- dot takes the weighted sum of these zeros with $\alpha$, yielding 0.
- JlogN exponentiates this sum ($\exp(0) = 1$) and passes it to the scalar cost function.
- The theorem relies on the externally defined scalar fact Jcost_unit0 (which states Jcost 1 = 0) to conclude the result is zero.
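Written out, the chain above collapses to a one-line calculation (here $J$ denotes the scalar cost function and $J_N$ the $N$-dimensional aggregate, with $J(1) = 0$ supplied by Jcost_unit0):

$$
J_N(\alpha, \mathbf{1}) \;=\; J\!\left(\exp\Big(\sum_{i=1}^{n} \alpha_i \ln 1\Big)\right) \;=\; J(\exp 0) \;=\; J(1) \;=\; 0.
$$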
5. What This Does Not Prove
This declaration establishes that the all-ones vector has zero cost, but it does not prove uniqueness. It does not assert that the all-ones vector is the only vector with zero cost for a given set of weights. In fact, `JcostN_eq_zero_iff` explicitly proves that `JcostN α x = 0` if and only if the weighted sum of the logarithms (`dot α (logVec x)`) is zero. If the weights $\alpha$ and the logs of the components of $x$ cancel each other out, non-unit vectors can also yield zero aggregate cost.
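A concrete instance of this cancellation (a worked example, not taken from the source): with $n = 2$, weights $\alpha = (1, 1)$, and components $x = (e, e^{-1})$, the weighted log-sum vanishes even though $x$ is not the all-ones vector:

$$
\langle \alpha, \log x \rangle \;=\; 1 \cdot \ln e + 1 \cdot \ln e^{-1} \;=\; 1 - 1 \;=\; 0,
$$

so the aggregate cost is $J(\exp 0) = J(1) = 0$ despite every component of $x$ differing from 1.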