pith. machine review for the scientific record.

Explain the Lean theorem `FApply_add` in module `IndisputableMonolith.Cost.Ndim.Projector`. Write for an educated reader who knows science and programming but may not know Lean. Cover: (1) what the declaration says in plain English, (2) why it matters in Recognition Science, (3) how to read the formal statement, (4) visible dependencies or certificates in the supplied source, and (5) what this declaration does not prove. Cite only declarations present in the supplied Recognition source.

Generated by Grok 4.3 from the supplied Lean canon (confidence: high; recognition cached).

1. Plain English Statement

The theorem FApply_add states that the almost-product operator $F$ distributes over vector addition. If you apply $F$ to the sum of two vectors $v$ and $w$, the result is exactly the same as applying $F$ to each vector individually and adding the outcomes: $F(v + w) = F(v) + F(w)$. In mathematical terms, this establishes the additivity property of the operator.

2. Role in Recognition Science

In Recognition Science, the geometry of cost induces a finite-dimensional operator algebra. Specifically, a rank-one operator $A$ leads to a normalized projector $P$. From this projector, the theory defines the almost-product operator $F = 2P - I$ via FApply.
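The exact Lean definition is not reproduced in this answer; based on the description $F = 2P - I$, FApply plausibly has a shape along the following lines. This is a hedged sketch: the names `Vec` and `PApply` are taken from the surrounding discussion, and the real definition in `IndisputableMonolith.Cost.Ndim.Projector` may differ in detail.

```lean
-- Sketch only; the real definitions live in
-- IndisputableMonolith.Cost.Ndim.Projector and may differ.
variable {n : ℕ}

-- Assume `Vec n` is componentwise (e.g. `Fin n → ℝ`) and that
-- `PApply lam hInv β v` applies the normalized projector P to v.
def FApply (lam : ℝ) (hInv : Fin n → Fin n → ℝ) (β : Vec n)
    (v : Vec n) : Vec n :=
  fun i => 2 * PApply lam hInv β v i - v i   -- F = 2P - I, per component
```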

The operator $F$ is the algebraic precursor to the golden operator ($G$) and the broader metallic family, which formalize the fundamental $\phi$-scaling found in the RS framework. For $F$ to function properly as a linear operator in these constructions, it must respect vector space operations. This theorem proves the necessary additivity, serving as an intermediate structural lemma to guarantee the linear behavior of the cost-induced tensors.
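On paper, additivity of $F$ is a two-line consequence of additivity of $P$ and the definition $F = 2P - I$:

```latex
\begin{align}
F(v+w) &= 2P(v+w) - (v+w) \\
       &= \bigl(2P(v) - v\bigr) + \bigl(2P(w) - w\bigr)
          && \text{using } P(v+w) = P(v) + P(w) \\
       &= F(v) + F(w).
\end{align}
```

The formal proof in section 4 is exactly this computation carried out component by component.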

3. Reading the Formal Statement

theorem FApply_add {n : ℕ}
    (lam : ℝ) (hInv : Fin n → Fin n → ℝ) (β : Vec n)
    (v w : Vec n) :
    FApply lam hInv β (v + w) = FApply lam hInv β v + FApply lam hInv β w
  • {n : ℕ}: The operation occurs in an $n$-dimensional vector space.
  • lam : ℝ: A real scalar parameter $\lambda$.
  • hInv : Fin n → Fin n → ℝ: The inverse metric kernel, represented as an $n \times n$ matrix.
  • β : Vec n: A distinguished one-form (or covector) in the space.
  • v w : Vec n: The two input vectors being added.
  • FApply lam hInv β (v + w): The operator $F$ applied to the sum $(v + w)$.
  • = FApply lam hInv β v + FApply lam hInv β w: Asserts equality with the sum of $F(v)$ and $F(w)$.
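As a usage sketch (assuming the names above; this example is not taken from the module), the lemma chains to longer sums, since `u + v + w` parses as `(u + v) + w`:

```lean
-- Hypothetical usage: extend additivity to three summands
-- by rewriting with FApply_add twice.
example (lam : ℝ) (hInv : Fin n → Fin n → ℝ) (β u v w : Vec n) :
    FApply lam hInv β (u + v + w)
      = FApply lam hInv β u + FApply lam hInv β v + FApply lam hInv β w := by
  rw [FApply_add, FApply_add]
```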

4. Visible Dependencies and Certificates

Within the provided source, the proof of FApply_add is a direct, component-wise computation. It relies on:

  • ext i: Proving equality for each component $i$ of the resulting vector.
  • PApply_add: The underlying additivity of the normalized projector $P$, which is required because $F(v) = 2P(v) - v$.
  • Lean's simp and ring tactics: Expanding definitions and verifying the ring arithmetic underlying vector addition.
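Putting those ingredients together, the proof plausibly has the following shape (a sketch; the tactic script in the actual module may differ):

```lean
-- Sketch of the proof shape described above; not the verbatim source.
theorem FApply_add {n : ℕ}
    (lam : ℝ) (hInv : Fin n → Fin n → ℝ) (β : Vec n)
    (v w : Vec n) :
    FApply lam hInv β (v + w) = FApply lam hInv β v + FApply lam hInv β w := by
  ext i                      -- reduce to each component i
  simp [FApply, PApply_add]  -- unfold F = 2P - I; use additivity of P
  ring                       -- close the remaining real arithmetic
```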

5. What This Declaration Does Not Prove

  • Full Linearity: Additivity is only half of linearity. The other half (scalar multiplication, $F(cv) = cF(v)$) is proved separately in FApply_smul.
  • Involutivity: This theorem does not show that $F$ is an almost-product structure (i.e., that $F^2 = I$). That critical geometric property is proven downstream in FApply_square.
  • Physical Interpretation: The declaration is purely algebraic and structural; it does not assign any physical meaning (such as spacetime or state space) to the $n$-dimensional space, nor does it connect to the 8-tick processing cadence or specific RS observables.
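For orientation, the companion results named above plausibly have signatures of this shape. Only the names FApply_smul and FApply_square come from the source; the statements below are hedged reconstructions from the prose, with `sorry` standing in for proofs that live elsewhere:

```lean
-- Hedged signature sketches for the companion results named above.
-- Homogeneity: the other half of linearity, F(c • v) = c • F(v).
theorem FApply_smul {n : ℕ} (lam : ℝ) (hInv : Fin n → Fin n → ℝ)
    (β : Vec n) (c : ℝ) (v : Vec n) :
    FApply lam hInv β (c • v) = c • FApply lam hInv β v := sorry

-- Involutivity: applying F twice returns the input (F² = I).
theorem FApply_square {n : ℕ} (lam : ℝ) (hInv : Fin n → Fin n → ℝ)
    (β : Vec n) (v : Vec n) :
    FApply lam hInv β (FApply lam hInv β v) = v := sorry
```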

cited recognition theorems

  • Projector.FApply_add: the target theorem requested by the user, proving additivity of the $F$ operator.
  • Projector.FApply: defines the almost-product operator $F = 2P - I$, the subject of the theorem.
  • Projector.PApply_add: provides the additivity of the projector $P$ on which the proof of FApply_add relies.

recognition modules consulted

The Recognition library is at github.com/jonwashburn/shape-of-logic. The model is restricted to the supplied Lean source and instructed not to invent theorem names. Treat the output as a starting point, not a verified proof.