pith. machine review for the scientific record.

cs.SC

Symbolic Computation

Roughly includes material in ACM Subject Class I.1.

cs.SC 2026-05-11 · 1 theorem

Right-hand side regularity determines ODE solving difficulty

Relating the Computational and Logical Difficulty of Solving ODEs: From Polynomial to Discontinuous Right-Hand Sides

It assigns each initial value problem to a precise stratum, ranging from polynomial-time solvability to transfinite computation.

Abstract
When a computer algebra system fails to solve an Ordinary Differential Equation, is this a limitation of its implementation, or a genuine computational barrier? Three traditions bear on the question. Modern computer algebra algorithms can be extremely efficient: Newton-type methods solve polynomial ODEs over $\mathbb{Q}[[X]]$ in quasi-linear time. Analog models of computation have shown that polynomial ODEs and Turing machines are two presentations of the same phenomenon, with solution length acting as time and precision as space. Computable analysis shows that ODEs can be intrinsically hard -- undecidable, or even $\mathsf{PSPACE}$-complete, over compact domains. Comparing these traditions is natural and necessary, yet such comparisons routinely reduce to comparisons of encodings rather than of underlying algorithmic content. We argue that reverse mathematics provides a representation-invariant lens through which algorithmic content can be compared directly. We prove that every level of the Big Five hierarchy is inhabited, as an exact equivalence, by a natural statement from classical ODE theory: the regularity of $f$ is an intrinsic algorithmic invariant placing the initial value problem $y'(t)=f(t,y(t))$, $y(t_0)=y_0$, into one of several computational strata, ranging from polynomial-time solvability to transfinite computation. The resulting stratification acts as a practical diagnostic common to the three traditions. By abstracting from representation, it separates fundamental barriers from the technical shortcomings of symbolic solvers, the artefacts of analog encodings, and the effectivity constraints of computable analysis, identifying the intrinsic parameters (length bounds, radii of convergence, moduli of continuity) under which feasibility is restored.
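The quasi-linear Newton solvers the abstract mentions are beyond a short sketch, but the underlying idea of computing a power-series solution over $\mathbb{Q}[[X]]$ coefficient by coefficient can be illustrated with the much simpler Picard (fixed-point) iteration. The helper names `picard` and `mul` below are illustrative, not from the paper; exact rational arithmetic stands in for a real implementation:

```python
from fractions import Fraction

def mul(a, b):
    """Truncated product of two power series of equal length."""
    n = len(a)
    return [sum(a[i] * b[k - i] for i in range(k + 1)) for k in range(n)]

def picard(f, y0, order):
    """Iterate y <- y0 + integral of f(y), on truncated power series
    (lists of Fractions); each pass fixes one more coefficient."""
    y = [Fraction(y0)] + [Fraction(0)] * order
    for _ in range(order):
        fy = f(y)
        # integrate term by term: coefficient n of the integral is fy[n-1] / n
        y = [Fraction(y0)] + [fy[n - 1] / n for n in range(1, order + 1)]
    return y

# y' = y^2, y(0) = 1 has solution 1/(1-t): every series coefficient is 1.
coeffs = picard(lambda y: mul(y, y), 1, 8)
print(coeffs)  # nine coefficients, all equal to 1
```

Picard iteration gains only one correct coefficient per pass; the Newton-type methods in the paper double the number of correct coefficients per step, which is where the quasi-linear complexity comes from.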
cs.SC 2026-05-07

Logarithms triple the fraction of integrable expressions

Exhaustive Symbolic Integration: Integration by Differentiation and the Landscape of Symbolic Integrability

Exhaustive search up to a given complexity shows integrability depends strongly on operator basis and uncovers integrals missed by major CAS

Abstract
We introduce Exhaustive Symbolic Integration (ESI), a method that enumerates all symbolic functions up to a given complexity $k$ within a specified operator basis and determines which admit closed-form antiderivatives within the same class. This allows us to compute the "integrability fraction" $\rho(k)$ (the fraction of functions whose derivatives lie within the same class), which we do for five operator bases including combinations of rational functions, powers, exponentials, logarithms and trigonometric functions. We find that $\rho(k)$ declines at high complexity and that the operator basis has a dramatic effect -- in particular, adding the logarithm boosts $\rho(k)$ by a factor of $\sim$3 and produces or exacerbates a clear peak at $k=6$. We also deploy ESI as a novel integration algorithm, identifying three integrals that resist SymPy, Mathematica, RUBI, FriCAS, Maxima and Giac under all tested strategies. When an antiderivative can be found by multiple methods, ESI often returns the simplest form. These results reveal that the landscape of symbolic integrability is shaped primarily by the choice of operators, and that exhaustive enumeration can systematically discover integrable forms -- including novel ones -- that elude computer algebra systems.
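ESI itself enumerates genuine symbolic expressions; as a toy model of the integrability fraction, and of why adding the logarithm helps, consider the class of Laurent polynomials with unit coefficients: a sum of monomials $x^e$ has an antiderivative in the class exactly when no $e = -1$ term appears, and adding $\log$ to the basis removes that obstruction entirely. The names `rho` and `integrable_in_class` are illustrative, not the paper's:

```python
from fractions import Fraction
from itertools import combinations

# Toy class: Laurent polynomials with unit coefficients and exponents in EXPS.
EXPS = [-2, -1, 0, 1, 2]

def integrable_in_class(term_exps, with_log=False):
    """Antiderivative of sum of x^e stays in the class iff no e == -1;
    with log in the operator basis the obstruction disappears."""
    return with_log or -1 not in term_exps

def rho(k, with_log=False):
    """Integrability fraction over all k-term functions from the class."""
    funcs = list(combinations(EXPS, k))
    good = sum(integrable_in_class(f, with_log) for f in funcs)
    return Fraction(good, len(funcs))

print(rho(1), rho(1, with_log=True))  # 4/5 vs 1
print(rho(2), rho(2, with_log=True))  # 3/5 vs 1
```

Even in this minimal setting the fraction depends entirely on the operator basis, echoing the paper's finding that the landscape of integrability is shaped by the choice of operators.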
cs.SC 2026-05-07

Minimum adapted CAD exists for class of sets in R^3

On Minimum CADs for Algebraic Sets in Dimension Three

The class includes every algebraic set and gives the first existence result for a non-trivial collection beyond two dimensions

Abstract
Cylindrical Algebraic Decomposition (CAD) algorithms typically produce a decomposition adapted to a finite family of semi-algebraic sets $\mathcal{F}$ (i.e. every member of $\mathcal{F}$ is a union of cells). Different algorithms may produce different outputs, and may introduce unnecessary cell divisions. Recent work by Michel, Mathonet, and Zénaïdi in ISSAC 2024 formalised this issue by studying the refinement order on the set of all CADs adapted to $\mathcal{F}$ and analysing the existence of a minimum (coarsest) adapted CAD. It was shown that such a minimum adapted CAD always exists for subsets of $\mathbb{R}$ and $\mathbb{R}^2$, but not of $\mathbb{R}^n$ ($n \geqslant 3$) in general. It is therefore natural to seek classes of subsets of $\mathbb{R}^n$ that admit a minimum adapted CAD. In this paper, we identify a class of subsets of $\mathbb{R}^3$, containing all algebraic sets, for which minimum adapted CADs do exist. This provides the first positive existence theorem for minimum CADs for a non-trivial class of sets.
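The refinement order and the minimum element are easiest to see one dimension down, where existence always holds. The sketch below (hypothetical helper names; each set is given as a finite union of closed intervals of $\mathbb{R}$) builds the coarsest CAD of $\mathbb{R}$ adapted to a family, using only genuine boundary points so that no unnecessary cell divisions appear:

```python
def boundary(intervals):
    """Topological boundary of a finite union of closed intervals of R."""
    ivs = sorted(intervals)
    merged = [list(ivs[0])]
    for a, b in ivs[1:]:
        if a <= merged[-1][1]:                 # overlapping or touching: merge
            merged[-1][1] = max(merged[-1][1], b)
        else:
            merged.append([a, b])
    return {x for a, b in merged for x in (a, b)}

def minimum_cad(family):
    """Coarsest CAD of R adapted to every set in `family`: its point cells
    are exactly the boundary points of the sets, and nothing more."""
    pts = sorted(set().union(*(boundary(s) for s in family)))
    cells = [("interval", None, pts[0])]
    for p, q in zip(pts, pts[1:]):
        cells += [("point", p), ("interval", p, q)]
    cells += [("point", pts[-1]), ("interval", pts[-1], None)]
    return cells

# For the single set [0,1] u [1,2] only the points 0 and 2 matter: the
# touching endpoint 1 is interior to the union and induces no cell division.
cad = minimum_cad([[(0, 1), (1, 2)]])
print(cad)
```

From $\mathbb{R}^3$ onward this greedy picture breaks down in general, which is exactly why the paper's positive existence result for a class containing all algebraic sets is non-trivial.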
cs.SC 2026-05-06

LCM-lattice density in random monomial ideals jumps at sharp thresholds

Asymptotic properties of random monomial ideals

Data separate low-density Taylor regimes from high-density redundant ones by narrow transition windows whose location depends on the generator degree

Abstract
This paper focuses on asymptotic properties of random monomial ideals through a statistical viewpoint. It extends the study of redundancy in monomial ideals by analyzing the poset density of the LCM-lattice. We explore how this density behaves across random algebraic models and structured networks. Experimental data reveal that the LCM-lattice exhibits sharp threshold behavior rather than changing smoothly. We observe a strong negative correlation between the number of generators and LCM-lattice density, abruptly separating three distinct regimes: a low-density Taylor-like regime, a high-density redundant regime, and a narrow transition window. We show that increasing the generator degree causes this density drop to occur at lower probability thresholds. We conclude by conjecturing that for equigenerated squarefree ideals, the LCM-lattice density undergoes a sharp phase transition, analogous to the emergence of giant components in hypergraphs. This suggests that the classical, ideal-by-ideal role of the LCM-lattice as a combinatorial invariant also admits a statistical/asymptotic counterpart: in natural random families, redundancy and resolution-complexity indicators concentrate into distinct typical regimes separated by a narrow transition window.
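The paper's poset-density statistic is not defined in the abstract, so the sketch below uses an assumed proxy: generate squarefree monomials from an Erdős–Rényi-type model (each variable kept independently with probability $p$), form all subset LCMs (for squarefree monomials, the LCM of supports is their union), and measure how far the lattice falls short of the $2^g - 1$ distinct LCMs of a coincidence-free Taylor-like regime. All names and the statistic itself are illustrative, not the paper's:

```python
import random
from itertools import combinations

def random_squarefree(n, p, rng):
    """ER-type random squarefree monomial: each variable kept with prob p."""
    return frozenset(i for i in range(n) if rng.random() < p)

rng = random.Random(0)
n_vars, p, n_gens = 12, 0.3, 6
gens = [random_squarefree(n_vars, p, rng) for _ in range(n_gens)]

# All LCMs of nonempty subsets of the generators (unions of supports).
lattice = {frozenset().union(*subset)
           for k in range(1, n_gens + 1)
           for subset in combinations(gens, k)}

# Assumed proxy: ratio of distinct subset LCMs to the 2^g - 1 of a
# Taylor-like regime with no coincidences; coincidences signal redundancy.
density = len(lattice) / (2 ** n_gens - 1)
print(len(lattice), round(density, 3))
```

Sweeping $p$ (or the generator degree) in such a simulation is the kind of experiment behind the sharp-threshold observations reported above.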
cs.SC 2026-05-01

Order-3 Möbius map splits cube-root integrals into elementary and elliptic pieces

A Generalisation of Goursat's Algorithm for Integration in Finite Terms

Two eigencomponents reduce to rational curves and integrate elementarily; the third stays on y³ = x(x-K) and is generically transcendental.

Abstract
We give a self-contained, modern exposition of Édouard Goursat's 1887 theorem on pseudo-elliptic integrals -- those integrals of the form $\int F(t)\,dt/\sqrt{R(t)}$ with $R$ a cubic or quartic polynomial that, despite living on a genus-$1$ algebraic curve, admit elementary antiderivatives. After reviewing integration in finite terms and Liouville's theorem, we present Goursat's two main theorems with proofs phrased in the language of Möbius automorphisms of the underlying hyperelliptic curve. We then develop a cube-root analog: for integrals of the form $\int F(t)\,dt/\sqrt[3]{R(t)}$ with $R$ cubic, an order-$3$ Möbius substitution cyclically permuting the roots of $R$ induces an eigendecomposition into three pieces. Two of the three eigenpieces (eigenvalues $1$ and $\omega^2$, where $\omega = e^{2\pi i/3}$) descend through a chain of substitutions to genus-$0$ curves and yield elementary antiderivatives; the middle eigenpiece (eigenvalue $\omega$) descends only to the genus-$1$ curve $y^3 = x(x-K)$ and is generically transcendental.
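The eigendecomposition mechanism can be verified numerically for any order-$3$ Möbius map. The sketch below uses the sample map $\sigma(t) = 1/(1-t)$ (not Goursat's root-permuting substitution for a specific $R$) and a sample integrand; it checks that $\sigma^3 = \mathrm{id}$, that the three eigencomponents sum back to $F$, and that the $j$-th component is scaled by $\omega^j$ under $t \mapsto \sigma(t)$:

```python
import cmath

w = cmath.exp(2j * cmath.pi / 3)               # primitive cube root of unity
sigma = lambda t: 1 / (1 - t)                  # sample Mobius map of order 3

def sig_iter(t, m):
    """Apply sigma m times."""
    for _ in range(m):
        t = sigma(t)
    return t

def eig_component(F, j):
    """j-th eigencomponent F_j = (1/3) sum_m w^(-j m) (F o sigma^m),
    which satisfies F_j o sigma = w^j * F_j."""
    return lambda t: sum(w ** (-j * m) * F(sig_iter(t, m)) for m in range(3)) / 3

F = lambda t: t ** 3 / (t - 2)                 # sample rational integrand
t0 = 0.37 + 0.11j                              # generic test point, away from poles

assert abs(sig_iter(t0, 3) - t0) < 1e-12                           # sigma^3 = id
assert abs(sum(eig_component(F, j)(t0) for j in range(3)) - F(t0)) < 1e-12
for j in range(3):
    Fj = eig_component(F, j)
    assert abs(Fj(sigma(t0)) - w ** j * Fj(t0)) < 1e-12            # eigenvalue w^j
print("order-3 eigendecomposition verified")
```

The paper's content lies in tracking which of these eigenpieces descend to genus-$0$ curves; the decomposition itself is the elementary averaging over the cyclic group shown here.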
cs.SC 2026-04-30

Complex quantifier elimination reduces to real QE plus heuristic reinterpretation

Pseudo-Complex Quantifier Elimination

Formulas over the complexes with i, Re, Im and conjugates are first mapped to real problems, solved there, and then rewritten back into the richer complex language.

Abstract
We describe the design of a quantifier elimination framework for the complex numbers in the language of ordered rings supplemented with symbols for the imaginary unit, real parts, imaginary parts, and conjugates. Technically, we use a reduction to real quantifier elimination followed by a heuristic reinterpretation of the results within our complex framework. We present computational examples using a prototypical implementation of our approach in our Python-based open-source system Logic1.
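The forward direction of such a reduction is mechanical: substitute $z = x + iy$ and split each polynomial condition into real and imaginary parts, yielding a purely real problem for a real QE engine. The sketch below shows only this encoding step for univariate polynomials in $z$ (the heuristic back-translation into the complex language, and the Logic1 API, are not attempted; `to_real` is a hypothetical name):

```python
from math import comb

def to_real(poly):
    """Split p(z) = sum_k c_k z^k (dict k -> complex c_k) into real and
    imaginary polynomials in (x, y) under z = x + i*y; keys are (i, j)
    exponent pairs for x^i * y^j."""
    re_part, im_part = {}, {}
    for k, c in poly.items():
        for a in range(k + 1):                 # binomial expansion of (x + iy)^k
            coeff = c * comb(k, a) * 1j ** (k - a)
            key = (a, k - a)
            re_part[key] = re_part.get(key, 0) + coeff.real
            im_part[key] = im_part.get(key, 0) + coeff.imag
    return ({k: v for k, v in re_part.items() if v},
            {k: v for k, v in im_part.items() if v})

# z^2 + 1 = 0  becomes  x^2 - y^2 + 1 = 0  and  2xy = 0  over the reals.
re_part, im_part = to_real({2: 1, 0: 1})
print(re_part, im_part)
```

One complex equation thus becomes a conjunction of two real equations, doubling both the number of variables and the number of constraints handed to the real QE backend.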
cs.SC 2026-04-29

Haskell package manipulates tree and graph algebras symbolically

Arboretum.hs: Symbolic manipulation for algebras of graphs

Arboretum.hs follows mathematical definitions directly to enable flexible extensions and compile-time safety for algebraic combinatorics.

Abstract
We design the Arboretum.hs package for symbolic computations with algebras of trees and more general graphs in Haskell. Thanks to the declarative nature of functional programming, the package's implementation closely follows mathematical definitions, making the code intuitive and transparent for users working with algebraic and combinatorial structures. To assist with current mathematical research, Arboretum.hs supports experimentation by facilitating the introduction of new algebraic operations, as well as providing functionality for rendering trees and forests through LaTeX integration. Compared to recent imperative implementations in languages such as Julia or Python, Arboretum.hs offers greater flexibility for manipulating and extending tree-based structures. Its use of Haskell enables safe programming and strong compile-time guarantees. The package thus serves both as a practical computational tool and as a foundation for further research in algebraic combinatorics, going beyond the tree setting usually considered in implementations of Butcher series, a fundamental tool for the analysis of numerical integrators.
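Arboretum.hs is a Haskell package; as a language-neutral illustration of the objects it manipulates, the sketch below enumerates unordered rooted trees, the index set of Butcher series. A tree is identified with the sorted multiset (here, a sorted tuple) of its root's subtrees, so trees on $n$ nodes are exactly forests on $n-1$ nodes; this is a generic construction, not the package's representation:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def forests(n):
    """All unordered forests with n nodes in total, each forest a sorted
    tuple of trees; a tree is the forest of its root's subtrees, so the
    rooted trees on n nodes are exactly forests(n - 1)."""
    if n == 0:
        return frozenset({()})
    out = set()
    for k in range(1, n + 1):
        for tree in forests(k - 1):        # each rooted tree on k nodes
            for rest in forests(n - k):
                out.add(tuple(sorted(rest + (tree,))))
    return frozenset(out)

# Counts of rooted trees on 1..6 nodes (OEIS A000081).
counts = [len(forests(n - 1)) for n in range(1, 7)]
print(counts)  # [1, 1, 2, 4, 9, 20]
```

In Haskell the same definition becomes an algebraic data type plus a recursive generator, which is exactly the "implementation follows the mathematical definition" style the abstract describes.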
cs.SC 2026-04-28

Hybrid method gives complete check for quantum circuit equivalence

Equivalence Checking of Quantum Circuits via Path-Sum and Weighted Model Counting

Path-sum reductions plus weighted model counting decide equivalence up to global phase without full simulation.

Abstract
Equivalence checking of quantum circuits is a central verification task in quantum computing, ensuring the correctness of circuit optimizations, hardware mappings, and compilation pipelines. Among the primary symbolic methods for this purpose, the path-sum formalism provides a compact representation with powerful reduction rules that yield a canonical form for the classically simulable Clifford fragment; beyond that fragment, however, confluence fails. We introduce a new weighted model counting (WMC) encoding for path-sums and combine it with the existing path-sum reductions to obtain a verifier that is both complete and efficient. Our method applies reductions whenever possible and invokes the WMC-based decision procedure on the residual path-sum, yielding a complete semantic check up to a global phase. We implement the approach and evaluate it on standard benchmarks. Results show that the hybrid method outperforms either component in isolation and competes with state-of-the-art tools.
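The specification being decided, equivalence up to a global phase, is easy to state concretely. The brute-force matrix check below (nothing like the path-sum/WMC method, which avoids building matrices at all) makes the notion precise on single-qubit gates; all names are illustrative:

```python
import math

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def equal_up_to_global_phase(U, V, tol=1e-9):
    """True iff U == exp(i*phi) * V for some real phi."""
    n = len(U)
    for i in range(n):
        for j in range(n):
            if abs(V[i][j]) > tol:             # fix the phase at a nonzero entry
                phase = U[i][j] / V[i][j]
                if abs(abs(phase) - 1) > tol:
                    return False
                return all(abs(U[a][b] - phase * V[a][b]) < tol
                           for a in range(n) for b in range(n))
    return all(abs(U[a][b]) < tol for a in range(n) for b in range(n))

s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]                          # Hadamard
X = [[0, 1], [1, 0]]
Z = [[1, 0], [0, -1]]
ID = [[1, 0], [0, 1]]

assert equal_up_to_global_phase(matmul(H, H), ID)            # H^2 = I
assert equal_up_to_global_phase(matmul(X, Z), matmul(Z, X))  # XZ = -ZX: phase -1
assert not equal_up_to_global_phase(X, Z)
print("equivalence checks passed")
```

Explicit matrices grow as $2^n \times 2^n$, which is exactly why symbolic representations such as path-sums, and decision procedures such as WMC, are needed for circuits of realistic width.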
cs.SC 2026-04-27

Multiple equations let CAD quantifier elimination partition parameters finely

Enhanced CAD-Based Quantifier Elimination With Multiple Equational Constraints

The method identifies regions with finite or infinite unknowns and drops more polynomials from the second projection than prior theory.

Abstract
This paper presents two enhancements to cylindrical algebraic decomposition (CAD) based quantifier elimination (QE) for cases in which multiple equational constraints are present in the given input formula $\phi^*$. The first enhancement provides more detail in the output when there is a conceptual partition of the set of variables of $\phi^*$ into parameters and unknowns. In such cases, we describe how to partition the parameter space so that: (1) in each open set of the partition the number $\nu$ of associated unknowns is a finite constant or is infinite; and (2) for each such open set for which $\nu$ is finite, an expression for the unknowns in terms of the parameters is provided. The second enhancement is an efficiency gain achievable in certain situations. Indeed, when certain conditions are met, the second CAD equational projection step can be reduced more significantly than is supported by the prior existing theory. Relevant theorems and worked examples for both enhancements are provided. Application areas include approximation theory, cuspidal manipulator classification, and biological/chemical systems.
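The first enhancement's output, a partition of the parameter space by the number of associated unknowns, has a familiar linear shadow. The toy below (a linear system in place of the paper's polynomial equational constraints and CAD machinery; `classify` is a hypothetical name) partitions the six-dimensional parameter space of a $2\times 2$ linear system into a generic stratum with one solution, given explicitly in terms of the parameters, and degenerate strata with infinitely many or none:

```python
from fractions import Fraction

def classify(a, b, c, d, e, f):
    """Partition parameter space by the number of solutions (x, y) of the
    equational constraints {a x + b y = e, c x + d y = f}."""
    det = a * d - b * c
    if det != 0:                               # generic stratum: Cramer's rule
        x = Fraction(d * e - b * f, det)
        y = Fraction(a * f - c * e, det)
        return "one solution", (x, y)
    # det == 0: rows dependent; consistency decides the remaining strata
    consistent = (a * f == c * e and b * f == d * e
                  and ((a, b) != (0, 0) or e == 0)
                  and ((c, d) != (0, 0) or f == 0))
    return ("infinitely many" if consistent else "no solution"), None

print(classify(1, 0, 0, 1, 5, 7))   # one solution: (5, 7)
print(classify(1, 1, 2, 2, 2, 4))   # infinitely many
print(classify(1, 1, 2, 2, 2, 5))   # no solution
```

In the paper the strata are semi-algebraic cells produced by CAD and the "expression for the unknowns in terms of the parameters" generalises the Cramer-style formula of the generic branch.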
cs.SC 2026-04-27

Probabilistic model recognizes goals via hierarchical task networks

A Probabilistic Framework for Hierarchical Goal Recognition

Three-stage generative model with HTN planner yields posteriors over goals and outperforms prior HTN recognizers on benchmarks.

Abstract
Goal recognition aims to infer an agent's goal from observations of its behaviour. In realistic settings, recognition can benefit from exploiting hierarchical task structure and reasoning under uncertainty. Planning-based goal recognition has made substantial progress over the past decade, but to the best of our knowledge no existing approach jointly integrates hierarchical task structure with probabilistic inference. In this paper, we introduce the first planning-based probabilistic framework for hierarchical goal recognition over Hierarchical Task Networks (HTNs). We instantiate the framework by exploiting an HTN planner with a three-stage generative model for likelihood estimation, yielding posterior distributions over goal hypotheses. Empirical results show improved recognition performance over the existing HTN-based recognizer on HTN benchmarks. Overall, the framework lays a foundation for probabilistic goal recognition grounded in hierarchical planning structure, moving goal recognition toward more practical settings.
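The probabilistic core, a posterior over goal hypotheses, can be sketched independently of the HTN machinery. In the paper the observation likelihoods come from a three-stage generative model over an HTN planner; here they are hard-coded toy numbers, and all names are illustrative:

```python
def posterior(goals, prior, likelihood, observations):
    """P(g | obs) proportional to P(obs | g) * P(g), with P(obs | g)
    a product of per-step observation likelihoods."""
    scores = {}
    for g in goals:
        p = prior[g]
        for o in observations:
            p *= likelihood[g].get(o, 1e-9)    # smoothing for unseen actions
        scores[g] = p
    z = sum(scores.values())
    return {g: s / z for g, s in scores.items()}

goals = ["make-tea", "make-coffee"]
prior = {"make-tea": 0.5, "make-coffee": 0.5}
likelihood = {                                 # hypothetical toy numbers
    "make-tea":    {"boil-water": 0.9, "get-teabag": 0.8, "grind-beans": 0.01},
    "make-coffee": {"boil-water": 0.9, "grind-beans": 0.8, "get-teabag": 0.01},
}
post = posterior(goals, prior, likelihood, ["boil-water", "grind-beans"])
print(post)                                    # mass concentrates on make-coffee
```

Replacing the hard-coded table with likelihoods derived from plans an HTN planner would generate for each goal is, in essence, the framework's instantiation.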
cs.SC 2026-04-22

New package computes path signature tensors in Julia

SignatureTensors.jl: A Package for Signature Tensors in Julia

SignatureTensors.jl supports exact calculations with OSCAR and numerical approximations for paths.

Abstract
We introduce SignatureTensors.jl, a new package for computing signature tensors of paths in Julia. We present its core functionality and demonstrate its use through illustrative examples. The package is compatible with the computer algebra system OSCAR, enabling both exact and numerical computations with signatures.
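SignatureTensors.jl is a Julia package; as a language-neutral illustration of what a signature tensor is, the sketch below computes the first two signature levels of a piecewise-linear path via Chen's identity (for a concatenation, $S^1 = S^1_a + S^1_b$ and $S^2 = S^2_a + S^2_b + S^1_a \otimes S^1_b$, with a linear segment of increment $v$ contributing $S^1 = v$, $S^2 = v \otimes v / 2$). The function name is illustrative, not the package's API:

```python
def signature2(points):
    """Level-1 and level-2 signature tensors of the piecewise-linear path
    through `points` in R^d, accumulated segment by segment via Chen's
    identity."""
    d = len(points[0])
    S1 = [0.0] * d
    S2 = [[0.0] * d for _ in range(d)]
    for p, q in zip(points, points[1:]):
        v = [q[i] - p[i] for i in range(d)]    # segment increment
        # Chen: S2 <- S2 + S1 (x) v + v (x) v / 2, then S1 <- S1 + v
        for i in range(d):
            for j in range(d):
                S2[i][j] += S1[i] * v[j] + v[i] * v[j] / 2
        S1 = [S1[i] + v[i] for i in range(d)]
    return S1, S2

# Unit steps right then up: S2[0][1] = integral of x dy = 1, S2[1][0] = 0;
# the antisymmetric part (1 - 0)/2 is the signed (Levy) area of the path.
S1, S2 = signature2([(0, 0), (1, 0), (1, 1)])
print(S1, S2)
```

Level $k$ works the same way with $k$-fold tensors; packages like SignatureTensors.jl additionally support exact coefficient arithmetic through a computer algebra backend such as OSCAR.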