Exhaustive enumeration of functions up to complexity k across operator bases shows that the integrability fraction declines with k but rises sharply when the basis includes logarithms, and the method discovers three integrals that resist SymPy, Mathematica, RUBI, FriCAS, Maxima, and Giac.
author: Charton, F.
4 Pith papers cite this work.
years: 2026
4 representative citing papers
citing papers explorer
-
Exhaustive Symbolic Integration: Integration by Differentiation and the Landscape of Symbolic Integrability
Exhaustive enumeration of functions up to complexity k across operator bases shows that the integrability fraction declines with k but rises sharply when the basis includes logarithms, and the method discovers three integrals that resist SymPy, Mathematica, RUBI, FriCAS, Maxima, and Giac.
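The enumeration-and-test idea in this summary can be sketched in a few lines of SymPy. This is an illustrative toy, not the paper's code: the operator basis (log, exp) and the complexity measure (nesting depth) are assumptions chosen for brevity, and "integrable" here means SymPy returns a result with no unevaluated `Integral` left inside.

```python
# Toy sketch of exhaustive integrability enumeration (not the paper's method).
# Assumed basis: {log, exp}; assumed complexity measure: nesting depth.
import sympy as sp

x = sp.symbols('x')

def enumerate_exprs(depth, basis):
    """Yield expressions in x of nesting depth <= depth built from the basis."""
    if depth == 0:
        yield x
        return
    for sub in enumerate_exprs(depth - 1, basis):
        yield sub                 # keep shallower expressions too
        for op in basis:
            yield op(sub)         # apply each unary operator from the basis
        yield sub * x             # one binary construction, for variety

def integrable_fraction(depth, basis):
    """Count how many enumerated expressions SymPy integrates in closed form."""
    exprs = set(enumerate_exprs(depth, basis))
    ok = 0
    for f in exprs:
        F = sp.integrate(f, x)
        if not F.has(sp.Integral):   # no leftover unevaluated integral
            ok += 1
    return ok, len(exprs)

ok, total = integrable_fraction(2, [sp.log, sp.exp])
print(f"{ok}/{total} expressions integrable in closed form")
```

Even at depth 2 this surfaces the summary's point: log-heavy expressions like log(log(x)) tend to integrate via special functions (li, Ei), while the fraction depends strongly on which operators the basis admits.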
-
Neuro-Symbolic ODE Discovery with Latent Grammar Flow
Latent Grammar Flow discovers ODEs by placing grammar-based equation representations in a discrete latent space, using a behavioral loss to cluster similar equations, and sampling via a discrete flow model guided by data fit and constraints.
-
$k$-server-bench: Automating Potential Discovery for the $k$-Server Conjecture
k-server-bench formulates potential-function discovery for the k-server conjecture as a code-based inequality-satisfaction task; current agents fully solve the resolved k=3 case and reduce violations on the open k=4 case.
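The "inequality-satisfaction" framing can be illustrated on a case where the answer is known. The sketch below is not k-server-bench's harness: it scores the classical Double-Coverage potential for k=2 servers on a line by counting violations of the per-step amortized inequality alg_cost + ΔΦ ≤ k · adv_cost, with a greedy adversary as a stand-in for OPT. All names and the setup are illustrative assumptions.

```python
# Illustrative inequality-satisfaction check (not the benchmark's interface):
# candidate potential Phi = k * matching(S, A) + spread(S) for k = 2 on a line.
import random

K = 2

def phi(servers, adversary):
    s, a = sorted(servers), sorted(adversary)
    matching = sum(abs(si - ai) for si, ai in zip(s, a))  # sorted matching is optimal on a line
    spread = abs(s[0] - s[1])
    return K * matching + spread

def dc_move(servers, r):
    """Double Coverage for 2 servers on a line; returns (new_servers, cost)."""
    s = sorted(servers)
    if r <= s[0]:                      # request outside the hull: nearest server goes
        return [r, s[1]], s[0] - r
    if r >= s[1]:
        return [s[0], r], r - s[1]
    d = min(r - s[0], s[1] - r)        # inside: both servers move d toward r
    return [s[0] + d, s[1] - d], 2 * d

def adv_move(adversary, r):
    """Greedy adversary: move the nearest adversary server onto the request."""
    a = sorted(adversary)
    i = 0 if abs(a[0] - r) <= abs(a[1] - r) else 1
    cost = abs(a[i] - r)
    a[i] = r
    return a, cost

random.seed(0)
servers, adversary = [0.0, 10.0], [0.0, 10.0]
violations = 0
for _ in range(2000):
    r = random.uniform(0.0, 10.0)
    before = phi(servers, adversary)
    adversary, adv_c = adv_move(adversary, r)
    servers, alg_c = dc_move(servers, r)
    after = phi(servers, adversary)
    if alg_c + (after - before) > K * adv_c + 1e-9:
        violations += 1
print("violations:", violations)
```

For this resolved case the classical analysis guarantees zero violations; the benchmark's open k=4 setting is exactly the regime where candidate potentials still incur some.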
-
The Long Delay to Arithmetic Generalization: When Learned Representations Outrun Behavior
The grokking delay in encoder-decoder models on one-step Collatz prediction stems from the decoder's inability to use parity and residue-structure representations the encoder learns early, with the numeral base acting as a strong inductive bias that can raise accuracy from failure to 99.8%.
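The numeral-base point is easy to see concretely. The sketch below is an illustration of the task framing, not the paper's data pipeline: one Collatz step rendered as digit strings in a chosen base. In base 2 the parity that selects the branch is simply the last digit, which is the kind of representational inductive bias the summary refers to.

```python
# Illustrative one-step Collatz examples in different numeral bases
# (not the paper's dataset code).
def collatz_step(n):
    return n // 2 if n % 2 == 0 else 3 * n + 1

def to_base(n, b):
    """Render a non-negative integer as a digit string in base b (b <= 10)."""
    if n == 0:
        return "0"
    digits = []
    while n:
        n, d = divmod(n, b)
        digits.append(str(d))
    return "".join(reversed(digits))

def make_example(n, base):
    """One (input, target) pair for one-step Collatz prediction in a given base."""
    return to_base(n, base), to_base(collatz_step(n), base)

print(make_example(27, 2))    # in base 2, the trailing '1' marks the 3n+1 branch
print(make_example(27, 10))
```

In base 10 the same parity information is spread across the digit pattern, so the decoder must learn a harder mapping from representation to branch choice.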