theorem atomic_tick (proved)
Source: IndisputableMonolith.Foundation.Atomicity on GitHub, at line 258.
formal source
255
256 /-- Atomic tick (finite history): any finite recognition history admits a
257     serialization with exactly one posting per tick that respects precedence. -/
258 theorem atomic_tick
259     (prec : Precedence E) [DecidableRel prec] [DecidableEq E]
260     (wf : WellFounded prec) (H : Finset E) :
261     ∃ σ : Schedule E,
262       σ.order.toFinset = H ∧
263       ∀ {e₁ e₂}, e₁ ∈ H → e₂ ∈ H → prec e₁ e₂ →
264         σ.order.indexOf e₁ < σ.order.indexOf e₂ := by
265   classical
266   obtain ⟨σ, _nodup, hcover, hrespect⟩ :=
267     exists_sequential_schedule (E:=E) prec wf H
268   exact ⟨σ, hcover, hrespect⟩
269
270 /-! ## Countable serialization -/
271
272 section Countable
273
274 variable {E : DiscreteEventSystem}
275 variable (ev : EventEvolution E)
276 variable [DecidableEq E.Event]
277
278 open LedgerNecessity
279
280 local notation:max "Prec" => Precedence ev
281
282 noncomputable def enum : ℕ → E.Event :=
283   Classical.choose (Countable.exists_surjective_nat (E.countable))
284
285 lemma enum_surjective : Function.Surjective (enum (E:=E)) :=
286   Classical.choose_spec (Countable.exists_surjective_nat (E.countable))
287
288 /-- An event is eligible once all predecessors have been posted. -/
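Operationally, the serialization that `atomic_tick` guarantees is a topological sort of the finite history under the precedence relation. A minimal executable analogue in Python (not the Lean development's construction; `serialize`, the string events, and the pair-set encoding of `prec` are illustrative assumptions), with Kahn's algorithm playing the role of `exists_sequential_schedule`:

```python
from collections import deque

def serialize(history, prec):
    """Return a precedence-respecting order of `history`: one event per tick."""
    # `history` models the Finset H (assumed duplicate-free);
    # `prec` is a set of pairs (a, b) meaning "a precedes b".
    succs = {e: [] for e in history}
    indeg = {e: 0 for e in history}
    for a, b in prec:
        if a in indeg and b in indeg:   # ignore edges outside H
            succs[a].append(b)
            indeg[b] += 1
    ready = deque(e for e in history if indeg[e] == 0)
    order = []
    while ready:
        e = ready.popleft()             # exactly one posting this tick
        order.append(e)
        for f in succs[e]:
            indeg[f] -= 1
            if indeg[f] == 0:
                ready.append(f)
    if len(order) != len(history):      # ruled out when prec is well-founded
        raise ValueError("cyclic precedence: no serialization exists")
    return order

history = ["debit", "credit", "settle"]
prec = {("debit", "settle"), ("credit", "settle")}
order = serialize(history, prec)
assert set(order) == set(history)       # mirrors σ.order.toFinset = H
assert all(order.index(a) < order.index(b) for a, b in prec)
```

The two assertions mirror the theorem's conclusion: the schedule covers H exactly, and `indexOf e₁ < indexOf e₂` whenever `prec e₁ e₂`.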
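For the countable case, `enum` supplies a surjection ℕ → events, and the docstring's eligibility rule drives the schedule: an event may be posted only once all of its predecessors have been posted. A hedged Python sketch of that idea restricted to a finite prefix of the enumeration (again not the development's construction; `countable_serialize` and the predecessor map `preds` are illustrative):

```python
def countable_serialize(enum_prefix, preds):
    """Post events from a finite prefix of the enumeration, eligibility-first."""
    # `enum_prefix` models enum 0, enum 1, ... (repeats allowed: enum is only
    # required to be surjective); `preds[e]` lists the predecessors of e.
    posted = []
    remaining = [e for i, e in enumerate(enum_prefix) if e not in enum_prefix[:i]]
    while remaining:
        for e in remaining:
            if all(p in posted for p in preds.get(e, [])):  # e is eligible
                posted.append(e)        # one posting per tick
                remaining.remove(e)
                break
        else:
            raise ValueError("no eligible event: a predecessor lies outside the prefix")
    return posted

prefix = ["e0", "e1", "e0", "e2"]              # enum may repeat events
preds = {"e1": ["e0"], "e2": ["e0", "e1"]}     # predecessors under Prec
assert countable_serialize(prefix, preds) == ["e0", "e1", "e2"]
```

Well-foundedness of the precedence is what the error branch stands in for: with finitely many predecessors per event, a long enough prefix of the enumeration always contains an eligible event to post.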
Papers checked against this theorem (showing 23 of 23)

- FPGA SNN accelerator scales inference near-linearly with sparsity
  "The core operates in discrete timesteps... lookup table (LUT) with precalculated solutions to this term for n_max entries"
- TMPO lifts diffusion diversity 9.1% by matching trajectories to rewards
  "Dynamic Stochastic Tree Sampling … trajectories share denoising prefixes and branch at dynamically scheduled steps"
- Urea swaps water in BSA shell then lets it return
  "Hydrogen bonds were identified based on geometric criteria, with a donor–acceptor distance ≤ 0.35 nm and an acceptor–donor–hydrogen (A–D–H) angle ≤ 30°"
- Packet metadata switches resident BNN models in 0.005 µs
  "multiple Binary Neural Network (BNN) models ... packet metadata selects the active model at packet granularity with O(1) selection cost"
- Gaussian processes scale exactly to high dimensions on incomplete grids
  "Definition 3.2. The set M is closed under taking subsets (CUTS) if m′ ⊂ m ∈ M implies m′ ∈ M. ... Definition 3.3. A set of multi-indices bI is closed under decrements (CUD) if ..."
- Learned Lagrangians give stable long-term PDE forecasts
  "ELM is an optimization-based integrator... scales linearly with domain size via Jacobi iteration"
- Partitioning turns noisy EEG into self-supervised training data
  "intelligent partitioning... segments into non-overlapping local windows... learns to extract... subset I(k),l constrained to cardinality W/2"
- Multi-quark clusters form in neutron-star matter simulations
  "we adopt dcluster = 0.15 fm as the clustering threshold"
- MVP deconfliction keeps eVTOL energy overhead below 1.5% median
  "MVP iteratively computes minimal velocity adjustments... Δv = (rpz - ||dCPA||)/tCPA · n̂ (Eq. 7); simulations across N=10-60 aircraft, 71,767 transits"
- Condition number bounds clustering error from objective gap
  "Core(s) = {i : d(x_i,θ*_j) ≤ D_eff − s}; enhanced margin γ + 2s; zero-error cores when local κ(s)·δ < 1/n (Lemma 5.1, Proposition 5.2)"
- Sparse pinnings suffice for entropic independence with 1/c loss
  "Application: down-up walk on fixed size independent sets … bounded degree graphs."
- Exact protocol audits what LLM memory preserves under budget
  "We instantiate this protocol with a concave-over-modular semantic coverage objective under storage and one-representation-per-experience constraints, and compute exact package optima using branch-and-bound with MILP certification."
- LLM scaling cuts bits per character 38% and favors Huffman
  "D(n) = max(0, t_exit(n−1) − t_arr(n)) + b(x_n)/C, with t_exit(0) = 0. ... Lindley's recursion ... per-token delay depends on the mean and variance of the bit lengths and on the coder's algorithmic latency."
- Backward search plus distributed control yields simultaneous-arrival plans
  "Two-step method: time-reversed CBS+SIPP-IP, then receding-horizon NADMM-based distributed OCP improvement."
- Mutual information beats entropy in benchmark selection
  "Entropy greedy is pivoted Cholesky, runs in O(k²N) time... MI is non-monotone in general but empirically monotone for small k, so we use greedy as a heuristic rather than invoking the standard monotone-submodular guarantee."
- HADES agent generates mechanistic DILI hypotheses
  "An AOP represents how an initial biological perturbation propagates ... starts with a Molecular Initiating Event (MIE) and proceeds through one or more Key Events (KEs) to the AO"
- Risk budgets stabilize continuous edge inference scheduling
  "Asynchronous online algorithm with finite-step convergence via FIP of potential games (Monderer–Shapley)."
- Future-value particles improve training-free LLM reasoning
  "Auxiliary Particle Power Sampling (APPS), a blockwise particle algorithm... future-value-guided selection at resampling boundaries."
- Ancestor Hawkes process ties event impact to cluster origin
  "If ρ(L) < 1, the triggered branching process is subcritical ... r = (I − L)⁻¹ K μ"
- Greatest-solution method extends to symmetrized and supertropical semirings
  "The usual approach is to first find the greatest solution... and then to solve a much harder problem of finding all minimal solutions... equivalent to the hypergraph transversal enumeration problem."
- Event languages mapped to one Temporal Datalog engine for streams
  "Atomic tick (finite history): any finite recognition history admits a serialization with exactly one posting per tick that respects precedence (RS Foundation/Atomicity.lean, theorem atomic_tick)."
- Heuristic DSS raises retail ship-to-order ratio by 13.7 points
  "Sort eligible orders by warehouse rank, then business priority, then volume; cumulative filtering. O(N log N)."
- Pretrained LMs as energy functions boost text diffusion
  "p(x_k = σ | x_{∖k}) = min{1, exp(−f(x_{∖k}, σ) + f(x))}, with self-loops to ensure lazy chains."