dAlembert_to_ODE_general_theorem
Smooth solutions to the d'Alembert equation H(t + u) + H(t - u) = 2 H(t) H(u) satisfy the second-order linear ODE H''(t) = H''(0) H(t). Recognition theorists cite this result when reducing the Recognition Composition Law to its differential form, prior to solving for the cosh fixed point. The argument differentiates both sides of the functional equation twice with respect to u and evaluates at u = 0, after extracting the needed C^2 regularity from the C^∞ hypothesis.
claim
Let $H : ℝ → ℝ$ be infinitely differentiable. If $H(t + u) + H(t - u) = 2 H(t) H(u)$ holds for all real numbers $t$ and $u$, then the second derivative satisfies $H''(t) = H''(0) · H(t)$ for every $t$.
background
In the Recognition Science framework the cost function J obeys the Recognition Composition Law J(xy) + J(x/y) = 2 J(x) J(y) + 2 J(x) + 2 J(y). The reparametrization H = J + 1 converts this into the additive d'Alembert equation H(t + u) + H(t - u) = 2 H(t) H(u) after a logarithmic change of variables. The present module supplies auxiliary results for the T5 uniqueness proof that J is the unique solution of this form.
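The reparametrization can be checked by direct substitution (a sketch; H = J + 1 as stated above):

```latex
\begin{aligned}
J(xy) + J(x/y) &= 2J(x)J(y) + 2J(x) + 2J(y) \\
H(xy) + H(x/y) - 2 &= 2\bigl(H(x)-1\bigr)\bigl(H(y)-1\bigr) + 2\bigl(H(x)-1\bigr) + 2\bigl(H(y)-1\bigr) \\
 &= 2H(x)H(y) - 2, \\
H(xy) + H(x/y) &= 2H(x)H(y).
\end{aligned}
```

Setting t = log x and u = log y then turns the multiplicative arguments xy and x/y into t + u and t - u, giving the additive d'Alembert form.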
proof idea
The proof first extracts C^2 regularity from the C^∞ hypothesis via ContDiff lemmas. It constructs explicit HasDerivAt maps for the linear shifts u ↦ t + u and u ↦ t - u. Viewing the two sides of the functional equation as functions of u, it equates their second derivatives at u = 0. Direct computation shows the left-hand side equals 2 H''(t) while the right-hand side equals 2 H(t) H''(0); dividing by 2 yields the relation.
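In informal notation, the two second derivatives at u = 0 that the Lean proof computes are:

```latex
\begin{aligned}
\frac{\partial^2}{\partial u^2}\Bigl[H(t+u) + H(t-u)\Bigr]\Big|_{u=0}
  &= H''(t+0) + H''(t-0) = 2\,H''(t), \\
\frac{\partial^2}{\partial u^2}\Bigl[2\,H(t)\,H(u)\Bigr]\Big|_{u=0}
  &= 2\,H(t)\,H''(0).
\end{aligned}
```

Equating the two and cancelling the factor of 2 gives H''(t) = H''(0) H(t).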
why it matters in Recognition Science
This general ODE bridge is invoked by dAlembert_to_ODE_general in AxiomDischargePlan to handle the unnormalized case when discharging the Aczél–Kannappan axiom. It supports the T5 step that forces J-uniqueness by showing that smooth solutions to the cost equation must satisfy the linear ODE whose solutions are multiples of cosh. The result closes one link in the forcing chain from the Recognition Composition Law to the explicit form J(x) = cosh(log x) - 1.
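For reference, the explicit form cited above unwinds to an elementary expression:

```latex
J(x) \;=\; \cosh(\log x) - 1
     \;=\; \frac{e^{\log x} + e^{-\log x}}{2} - 1
     \;=\; \frac{1}{2}\left(x + \frac{1}{x}\right) - 1,
     \qquad x > 0.
```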
scope and limits
- Does not assume the normalization H''(0) = 1.
- Does not classify the solutions of the resulting ODE.
- Does not extend to merely continuous or measurable solutions.
- Does not address the multiplicative form of d'Alembert directly.
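As an illustration of the unnormalized case, the family H(t) = cosh(√c · t) has H''(0) = c for any c > 0 and satisfies both the functional equation and the ODE H''(t) = H''(0) H(t). A minimal numeric sanity check (illustrative only; the function name and tolerances are ad hoc, not part of the formal development):

```python
import math

def check_dalembert_and_ode(c, ts, us, h=1e-4, tol=1e-5):
    """Numerically check that H(t) = cosh(sqrt(c) * t) satisfies both
    the d'Alembert equation H(t+u) + H(t-u) = 2 H(t) H(u) and the
    unnormalized ODE H''(t) = H''(0) * H(t), where H''(0) = c > 0."""
    H = lambda t: math.cosh(math.sqrt(c) * t)
    # Central second difference approximates H''.
    d2 = lambda t: (H(t + h) - 2 * H(t) + H(t - h)) / (h * h)
    for t in ts:
        for u in us:
            if abs(H(t + u) + H(t - u) - 2 * H(t) * H(u)) > tol:
                return False
        if abs(d2(t) - d2(0.0) * H(t)) > tol:
            return False
    return True
```

With c = 1 this is the normalized cosh case; other values of c exercise the general statement, which is exactly why the theorem does not assume H''(0) = 1.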
formal statement (Lean)
theorem dAlembert_to_ODE_general_theorem (H : ℝ → ℝ)
    (h_smooth : ContDiff ℝ ⊤ H)
    (h_dAlembert : ∀ t u, H (t+u) + H (t-u) = 2 * H t * H u) :
    ∀ t, deriv (deriv H) t = deriv (deriv H) 0 * H t := by
proof body
Tactic-mode proof.
  have hCDiff2 : ContDiff ℝ 2 H := h_smooth.of_le le_top
  have hDiff : Differentiable ℝ H :=
    hCDiff2.differentiable (by decide : (2 : WithTop ℕ∞) ≠ 0)
  have hCDiff1_H' : ContDiff ℝ 1 (deriv H) := by
    rw [show (2 : WithTop ℕ∞) = 1 + 1 from rfl] at hCDiff2
    rw [contDiff_succ_iff_deriv] at hCDiff2
    exact hCDiff2.2.2
  have hDiffDeriv : Differentiable ℝ (deriv H) :=
    hCDiff1_H'.differentiable (by decide : (1 : WithTop ℕ∞) ≠ 0)
  have hsh_add : ∀ (s v : ℝ), HasDerivAt (fun u => s + u) (1 : ℝ) v := fun s v => by
    have h := (hasDerivAt_id v).add_const s; simp only [id] at h
    rwa [show (fun u : ℝ => u + s) = fun u => s + u from funext fun u => add_comm u s] at h
  have hsh_sub : ∀ (s v : ℝ), HasDerivAt (fun u => s - u) (-1 : ℝ) v := fun s v => by
    have h1 : HasDerivAt (fun u : ℝ => -u) (-1 : ℝ) v := by
      have := (hasDerivAt_id v).neg; simp only [id] at this; exact this
    have h2 := h1.const_add s
    rwa [show (fun u : ℝ => s + -u) = fun u => s - u from funext fun u => by ring] at h2
  intro t
  have h_feq : (fun u => H (t + u) + H (t - u)) = (fun u => 2 * H t * H u) :=
    funext (h_dAlembert t)
  have key : deriv (deriv (fun u => H (t + u) + H (t - u))) 0 =
      deriv (deriv (fun u => 2 * H t * H u)) 0 :=
    congr_arg (fun f => deriv (deriv f) 0) h_feq
  have lhs_eq : deriv (deriv (fun u => H (t + u) + H (t - u))) 0 = 2 * deriv (deriv H) t := by
    have h_plus : ∀ v, HasDerivAt (fun u => H (t + u)) (deriv H (t + v)) v := fun v => by
      have hH := (hDiff (t + v)).hasDerivAt
      have hcomp := hH.comp v (hsh_add t v)
      simp only [mul_one, Function.comp_apply] at hcomp; exact hcomp
    have h_minus : ∀ v, HasDerivAt (fun u => H (t - u)) (-deriv H (t - v)) v := fun v => by
      have hH := (hDiff (t - v)).hasDerivAt
      have hcomp := hH.comp v (hsh_sub t v)
      simp only [mul_neg, mul_one, Function.comp_apply] at hcomp; exact hcomp
    have hfirst_fun : deriv (fun u => H (t + u) + H (t - u)) =
        fun v => deriv H (t + v) - deriv H (t - v) := funext fun v => by
      have heq : (fun u => H (t + u)) + (fun u => H (t - u)) =
          fun u => H (t + u) + H (t - u) := by ext u; rfl
      have h12 : deriv (fun u => H (t + u) + H (t - u)) v = deriv H (t + v) + -deriv H (t - v) := by
        rw [← heq]; exact ((h_plus v).add (h_minus v)).deriv
      linarith [show deriv H (t + v) + -deriv H (t - v) =
        deriv H (t + v) - deriv H (t - v) from by ring]
    have hd2_plus : HasDerivAt (fun v => deriv H (t + v)) (deriv (deriv H) t) 0 := by
      have hDH : HasDerivAt (deriv H) (deriv (deriv H) (t + 0)) (t + 0) :=
        (hDiffDeriv (t + 0)).hasDerivAt
      have hcomp := hDH.comp 0 (hsh_add t 0)
      simp only [mul_one, add_zero, Function.comp_apply] at hcomp; exact hcomp
    have hd2_minus : HasDerivAt (fun v => deriv H (t - v)) (-deriv (deriv H) t) 0 := by
      have hDH : HasDerivAt (deriv H) (deriv (deriv H) (t - 0)) (t - 0) :=
        (hDiffDeriv (t - 0)).hasDerivAt
      have hcomp := hDH.comp 0 (hsh_sub t 0)
      simp only [mul_neg, mul_one, sub_zero, Function.comp_apply] at hcomp; exact hcomp
    rw [congr_fun (congr_arg deriv hfirst_fun) 0]
    have heq2 : (fun v => deriv H (t + v)) - (fun v => deriv H (t - v)) =
        fun v => deriv H (t + v) - deriv H (t - v) := by ext v; rfl
    have h : deriv (fun v => deriv H (t + v) - deriv H (t - v)) 0 =
        deriv (deriv H) t - -deriv (deriv H) t := by
      rw [← heq2]; exact (hd2_plus.sub hd2_minus).deriv
    linarith [show deriv (deriv H) t - -deriv (deriv H) t = 2 * deriv (deriv H) t from by ring]
  have rhs_eq : deriv (deriv (fun u => 2 * H t * H u)) 0 =
      2 * H t * deriv (deriv H) 0 := by
    have hfirst_fun : deriv (fun u => 2 * H t * H u) = fun v => 2 * H t * deriv H v :=
      funext fun v => ((hDiff v).hasDerivAt.const_mul (2 * H t)).deriv
    have hsecond := (hDiffDeriv 0).hasDerivAt.const_mul (2 * H t)
    rw [congr_fun (congr_arg deriv hfirst_fun) 0, hsecond.deriv]
  rw [lhs_eq, rhs_eq] at key
  linarith
/-- **Theorem (ODE Derivation)**: If H is C∞ and satisfies d'Alembert with H''(0) = 1,
    then H'' = H everywhere.

    Proof: Fix t. Define f(u) = H(t+u) + H(t-u) and g(u) = 2H(t)H(u).
    Since f = g, their second derivatives at 0 agree.
    Differentiating f twice and evaluating at 0 gives 2H''(t).
    Differentiating g twice and evaluating at 0 gives 2H(t)H''(0) = 2H(t).
    Hence 2H''(t) = 2H(t), so H''(t) = H(t). -/