Recognition: 2 theorem links
· Lean Theorem · EqOD: Symmetry-Informed Stability Selection for PDE Identification
Pith reviewed 2026-05-13 01:26 UTC · model grok-4.3
The pith
EqOD detects Galilean invariance in trajectory data and uses it to shrink the candidate operator library, so that stability selection recovers the true PDE even under high noise.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
EqOD is a fully automatic pipeline. It first applies a weak-form structural test to decide whether the data exhibit Galilean invariance; if so, it substitutes a provably reduced library that excludes impossible terms; otherwise, it performs randomized LASSO stability selection. In either case a residual check prevents degradation below the full library. Across thirty-two controlled experiment cells this yields F1 equal to 1.000 on the heat equation at 20 percent noise, wins or ties every cell against WF-LASSO under a strict criterion, and outperforms official PySINDy 2.0 in 23 of 32 cells; a WSINDy reimplementation is also compared.
What carries the argument
Galilean library reduction triggered by a weak-form invariance detector, combined with randomized LASSO stability selection that prunes the space of differential operators while controlling false positives.
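The control flow described here can be made concrete. Below is a minimal, hedged sketch of the detect-reduce-or-select-then-fallback logic under stated assumptions: the subsample-and-threshold selector stands in for the paper's randomized-LASSO stability selection, and every function name, threshold, and toy library is illustrative rather than the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def stability_select(Theta, y, n_runs=50, coef_thresh=0.1, freq_thresh=0.6):
    """Subsample-and-threshold selection, a simplified stand-in for
    randomized-LASSO stability selection: keep a library term when its
    selection frequency across random half-subsamples exceeds freq_thresh."""
    n, p = Theta.shape
    counts = np.zeros(p)
    for _ in range(n_runs):
        idx = rng.choice(n, n // 2, replace=False)
        w = np.linalg.lstsq(Theta[idx], y[idx], rcond=None)[0]
        counts += np.abs(w) > coef_thresh
    support = counts / n_runs > freq_thresh
    w = np.zeros(p)
    if support.any():
        w[support] = np.linalg.lstsq(Theta[:, support], y, rcond=None)[0]
    return w, support

def eqod_sketch(Theta, y, excluded_cols, invariant):
    """Hedged sketch of the EqOD control flow: if the invariance detector
    fired, fit on the symmetry-reduced library (excluded columns dropped);
    otherwise run stability selection. A residual check falls back to the
    full-library least-squares fit if the chosen model fits much worse."""
    p = Theta.shape[1]
    if invariant:
        keep = np.array([j for j in range(p) if j not in excluded_cols])
        w = np.zeros(p)
        w[keep] = np.linalg.lstsq(Theta[:, keep], y, rcond=None)[0]
    else:
        w, _ = stability_select(Theta, y)
    w_full = np.linalg.lstsq(Theta, y, rcond=None)[0]
    if np.linalg.norm(y - Theta @ w) > 1.5 * np.linalg.norm(y - Theta @ w_full):
        w = w_full  # fallback: never degrade below the full-library baseline
    return w

# toy demo: the target depends only on columns 0 and 3 of a 5-term "library"
Theta = rng.normal(size=(200, 5))
y = 2.0 * Theta[:, 0] - 1.0 * Theta[:, 3] + 0.01 * rng.normal(size=200)
w, support = stability_select(Theta, y)
print(support.tolist())  # → [True, False, False, True, False]
```

On this toy problem the selector recovers the true two-term support, and the reduced-library path recovers the coefficients 2.0 and -1.0; the real method's selector, detector, and fallback criterion are of course more involved.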
If this is right
- Accurate recovery of the governing equation without manual library pruning for any autonomous PDE that satisfies the symmetry test.
- Consistent outperformance over full-library sparse regression on both clean and noisy data across multiple equation families.
- Automatic fallback that guarantees the method is never worse than the unreduced baseline.
- Successful extension to two-dimensional, nonlinear-Schrödinger, coupled, and wake-flow problems without changing the core pipeline.
Where Pith is reading between the lines
- The same symmetry-first reduction strategy could be applied to other invariances such as rotational or scaling symmetries once corresponding structural tests are developed.
- Real sensor data from fluid or material experiments would be a natural next testbed where the noise-robustness gain matters most.
- Integration with physics-informed neural networks could use the reduced library as a prior, further tightening the discovery loop.
- Formal finite-sample guarantees for the stability-selection step under the correlated design matrices typical of PDE libraries remain open and would strengthen the method.
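For context on the last point: the classical false-positive bound the paper leans on is, as far as this reading can tell, the one from Meinshausen and Bühlmann's stability selection, which under an exchangeability assumption on the noise variables and a not-worse-than-random selection condition bounds the expected number of falsely selected terms $V$ by

```latex
\mathbb{E}[V] \;\le\; \frac{1}{2\pi_{\mathrm{thr}} - 1}\,\frac{q^{2}}{p},
\qquad \tfrac{1}{2} < \pi_{\mathrm{thr}} < 1,
```

where $q$ is the average number of terms selected per subsample, $p$ is the library size, and $\pi_{\mathrm{thr}}$ is the selection-frequency threshold. The correlated design matrices produced by PDE libraries generally violate the exchangeability assumption, which is exactly why the finite-sample question flagged above remains open.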
Load-bearing premise
The weak-form test must correctly identify Galilean invariance from finite noisy trajectories, and the true equation must obey the autonomy and library assumptions used in the exclusion proof.
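The structural fact the detector targets can be verified numerically on a known Galilean-invariant PDE. The sketch below uses inviscid Burgers, $u_t + u u_x = 0$, with the exact solution $u = x/(1+t)$, and checks that the boosted field $v(x,t) = u(x - ct, t) + c$ satisfies the same equation; this is the invariance property itself, not the paper's weak-form test, and the time and boost speed are arbitrary illustrative choices.

```python
import numpy as np

# exact solution of inviscid Burgers u_t + u*u_x = 0, a Galilean-invariant PDE
x = np.linspace(-1.0, 1.0, 101)
t, c = 0.5, 0.7  # sample time and boost speed (illustrative)

u   = x / (1 + t)
u_t = -x / (1 + t) ** 2
u_x = np.full_like(x, 1 / (1 + t))

# Galilean boost: v(x, t) = u(x - c*t, t) + c
xb  = x - c * t
v   = xb / (1 + t) + c
v_t = -c / (1 + t) - xb / (1 + t) ** 2   # chain rule on u(x - c*t, t)
v_x = np.full_like(x, 1 / (1 + t))

r_u = u_t + u * u_x   # PDE residual of the original field
r_v = v_t + v * v_x   # PDE residual of the boosted field

print(np.abs(r_u).max() < 1e-12, np.abs(r_v).max() < 1e-12)  # → True True
```

Both residuals vanish to floating-point precision: the boosted trajectory obeys the same equation, which is the signature a finite-data detector has to distinguish from noise.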
What would settle it
A Galilean-invariant PDE at 20 percent noise on which the invariance detector returns negative and the subsequent model either misses a true term or includes a provably excluded term.
read the original abstract
Data-driven identification of partial differential equations (PDEs) relies on sparse regression over a candidate library of differential operators, where larger libraries inflate false positives under observation noise and smaller libraries risk missing true terms. We introduce Equivariant Operator Discovery (EqOD), a fully automatic method combining two library reduction mechanisms. When Galilean invariance is detected from trajectory data via a weak-form structural test, EqOD uses the symmetry-reduced library, eliminating terms that our Galilean exclusion result proves to be absent from the governing equation. Otherwise, it applies randomized LASSO stability selection guided by classical false-positive bounds. A residual-based fallback prevents degradation below the full-library baseline. On 8 PDEs at 4 noise levels, EqOD attains $F_1 = 1.000 \pm 0.000$ on Heat at $20\%$ noise, where WF-LASSO obtains $0.475 \pm 0.181$, official PySINDy 2.0 obtains $0.000$, and the WSINDy reimplementation obtains $0.789$. Under the strict criterion that the mean F1 difference exceeds the larger of the two standard deviations, EqOD wins 7 of 32 cells. WF-LASSO wins none, and the remaining 25 cells are ties. Across all 32 cells, EqOD outperforms PySINDy 2.0.0 in 23 of 32 cells, and all 5 PySINDy wins occur on reaction PDEs. External validation on WeakIdent and PINN-SR datasets gives $F_1 = 1.000$ on all 5 clean benchmarks. NLS, 2D, coupled-system, and cylinder-wake extensions are reported. The Galilean library reduction is proved under explicit autonomy and library assumptions. The stability-selection step is motivated by classical false-positive bounds, while formal guarantees for correlated PDE design matrices remain open.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The paper introduces Equivariant Operator Discovery (EqOD) for data-driven PDE identification from noisy trajectories. It detects Galilean invariance via a weak-form structural test on data and, when detected, applies a symmetry-reduced candidate library whose exclusion of absent terms is proved under autonomy and library assumptions; otherwise it falls back to randomized LASSO stability selection with classical false-positive bounds and a residual-based safeguard. Experiments on 8 PDEs at 4 noise levels report F1 scores superior to WF-LASSO, PySINDy 2.0, and a WSINDy reimplementation (including F1=1.000 on Heat at 20% noise), with 7 strict wins under a mean-difference criterion, plus external validation on clean benchmarks and extensions to NLS, 2D, coupled, and wake flows.
Significance. If the weak-form test reliably identifies Galilean invariance from finite noisy data, EqOD offers a principled way to prune libraries using physical symmetries, reducing false positives in sparse regression while preserving completeness via the fallback. The explicit proof of term exclusion and the robustness mechanism are clear strengths that could improve reproducibility and accuracy in equation discovery pipelines.
major comments (2)
- [§3.2 and Table 2] §3.2 (weak-form structural test) and Table 2 (Heat 20% noise row): the reported F1=1.000±0.000 (vs. 0.475 WF-LASSO, 0.000 PySINDy) is load-bearing for the symmetry-reduction claim, yet no precision/recall, detection-failure rates, or ablation (with vs. without the test) are supplied across the 8 PDEs and 4 noise levels; without these it is unclear whether the 7 strict wins arise from successful library reduction or from the fallback and stability-selection components.
- [§2.3] §2.3 (Galilean exclusion result): the proof eliminates absent terms only under explicit autonomy and library assumptions; the manuscript does not state how these assumptions are verified or relaxed for the experimental PDEs (e.g., Heat, NLS) when the test is applied to 20% noisy finite trajectories.
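For readers unfamiliar with the metric behind these comments: the F1 scores quoted throughout are computed on the recovered support, i.e. the set of library terms selected, not on coefficient values. A minimal sketch (the term names are illustrative):

```python
def support_f1(true_terms, found_terms):
    """F1 on the recovered PDE support (the set of selected library terms)."""
    tp = len(true_terms & found_terms)
    if tp == 0:
        return 0.0
    precision = tp / len(found_terms)
    recall = tp / len(true_terms)
    return 2 * precision * recall / (precision + recall)

# heat equation u_t = u_xx: the true support is the single term u_xx
print(support_f1({"u_xx"}, {"u_xx"}))                      # exact recovery → 1.0
print(round(support_f1({"u_xx"}, {"u_xx", "u*u_x"}), 3))   # one false positive → 0.667
```

This also makes the referee's request concrete: precision and recall are already computed inside this score, so reporting them separately costs nothing.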
minor comments (2)
- [Abstract and §4] Abstract and §4: the strict win criterion (mean F1 difference exceeds the larger standard deviation) and the definition of ties should be restated explicitly when reporting the 7/32 and 23/32 counts.
- [§5] §5 (extensions): the NLS, 2D, coupled-system, and cylinder-wake results are mentioned only briefly; adding a short table or paragraph on how the structural test and fallback behaved on these cases would improve completeness.
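The strict win criterion requested in the first minor comment is simple enough to state in code; the sketch below uses only the Heat-at-20%-noise numbers quoted in the abstract, and the second call uses made-up numbers purely to illustrate a tie.

```python
def strict_outcome(mean_a, std_a, mean_b, std_b):
    """'Strict win' per the stated criterion: the mean F1 difference must
    exceed the larger of the two standard deviations; otherwise a tie."""
    gap = mean_a - mean_b
    bar = max(std_a, std_b)
    if gap > bar:
        return "A wins"
    if -gap > bar:
        return "B wins"
    return "tie"

# Heat at 20% noise, as quoted: EqOD 1.000 ± 0.000 vs WF-LASSO 0.475 ± 0.181
print(strict_outcome(1.000, 0.000, 0.475, 0.181))  # → A wins

# hypothetical near-equal cell: gap 0.05 does not exceed the larger std 0.10
print(strict_outcome(0.90, 0.05, 0.85, 0.10))      # → tie
```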
Simulated Author's Rebuttal
We thank the referee for the constructive feedback and for highlighting the potential of EqOD's symmetry-informed approach. We address each major comment below with clarifications and commitments to strengthen the presentation of results and assumptions.
read point-by-point responses
-
Referee: [§3.2 and Table 2] §3.2 (weak-form structural test) and Table 2 (Heat 20% noise row): the reported F1=1.000±0.000 (vs. 0.475 WF-LASSO, 0.000 PySINDy) is load-bearing for the symmetry-reduction claim, yet no precision/recall, detection-failure rates, or ablation (with vs. without the test) are supplied across the 8 PDEs and 4 noise levels; without these it is unclear whether the 7 strict wins arise from successful library reduction or from the fallback and stability-selection components.
Authors: We agree that additional diagnostics would strengthen the claim that symmetry reduction, rather than the fallback mechanism, drives the reported gains. The manuscript states that the weak-form test triggers the reduced library when Galilean invariance is detected and otherwise falls back to randomized LASSO stability selection with the residual safeguard; the F1=1.000 result on Heat at 20% noise occurs under successful detection. However, we do not currently report per-PDE detection success rates, precision/recall of the test, or an explicit ablation. In the revision we will add a supplementary table listing detection outcomes across all 8 PDEs and 4 noise levels, together with an ablation comparing EqOD against its stability-selection-only variant. This will make the source of the 7 strict wins transparent. revision: yes
-
Referee: [§2.3] §2.3 (Galilean exclusion result): the proof eliminates absent terms only under explicit autonomy and library assumptions; the manuscript does not state how these assumptions are verified or relaxed for the experimental PDEs (e.g., Heat, NLS) when the test is applied to 20% noisy finite trajectories.
Authors: The Galilean exclusion theorem is stated under the explicit assumptions of autonomy (no explicit time dependence) and a library that is closed under the relevant differential operations. All PDEs in the experimental suite (Heat, NLS, Burgers, etc.) are autonomous, and the candidate libraries are constructed to contain all monomials up to the maximum order appearing in the true equation, satisfying the library assumption. The weak-form test is applied directly to the observed trajectories; because the test itself is derived from the weak form of the invariance condition, it does not require exact verification of the assumptions on noisy data. We will revise §2.3 to state these facts explicitly and to note that the assumptions hold for the autonomous PDE classes considered, while acknowledging that the proof does not cover non-autonomous or library-incomplete cases. revision: yes
Circularity Check
No significant circularity in derivation chain
full rationale
The Galilean reduction is presented as a proved result under explicit autonomy and library assumptions rather than a fitted or self-referential construction. Stability selection is motivated by classical false-positive bounds external to the paper. No equations or steps reduce by construction to their own inputs, fitted parameters renamed as predictions, or load-bearing self-citations. Empirical F1 comparisons are reported directly from experiments on benchmark PDEs and do not depend on tautological renaming or ansatz smuggling. The weak-form detection test is an assumption whose validity is separate from circularity analysis.
Axiom & Free-Parameter Ledger
axioms (2)
- domain assumption: The governing PDE is autonomous.
- domain assumption: The candidate library satisfies closure properties under Galilean transformations.
Lean theorems connected to this paper
- IndisputableMonolith/Cost/FunctionalEquation.lean : washburn_uniqueness_aczel · tag: unclear
  unclear: the relation between the paper passage and the cited Recognition theorem is ambiguous.
  Paper passage: "Stability-selected support estimation. We use randomized LASSO stability selection as a non-Galilean library-pruning gate."
What do these tags mean?
- matches: The paper's claim is directly supported by a theorem in the formal canon.
- supports: The theorem supports part of the paper's argument, but the paper may add assumptions or extra steps.
- extends: The paper goes beyond the formal theorem; the theorem is a base layer rather than the whole result.
- uses: The paper appears to rely on the theorem as machinery.
- contradicts: The paper's claim conflicts with a theorem or certificate in the canon.
- unclear: Pith found a possible connection, but the passage is too broad, indirect, or ambiguous to say the theorem truly supports the claim.
Reference graph
Works this paper leans on
- [1] G. I. Sivashinsky. Nonlinear analysis of hydrodynamic instability in laminar flames—I. Derivation of basic equations. pages 459–488, 1988.
- [2] Markus Bär, Rainer Hegger, and Holger Kantz. Fitting partial differential equations to space-time dynamics. Physical Review E, 59(1):337, 1999.
- [3] Jens Berg and Kaj Nyström. Data-driven discovery of PDEs in complex datasets. J. Comput. Phys., 384:239–252, 2019.
- [4] George W. Bluman and Sukeyuki Kumei. Symmetries and Differential Equations, volume 81. 1989.
- [5] Gert-Jan Both, Subham Choudhury, Pierre Sens, and Remy Kusters. DeepMoD: Deep learning for model discovery in noisy data. J. Comput. Phys., 428:109985, 2021.
- [6] Steven L. Brunton, Joshua L. Proctor, and J. Nathan Kutz. Discovering governing equations from data by sparse identification of nonlinear dynamical systems. Proceedings of the National Academy of Sciences, 113(15):3932–3937, 2016.
- [7] Peter Bühlmann and Sara van de Geer. Statistics for High-Dimensional Data: Methods, Theory and Applications. 2011.
- [8] Johannes Martinus Burgers. A mathematical model illustrating the theory of turbulence. Advances in Applied Mechanics, 1:171–199, 1948.
- [9] Kathleen Champion, Bethany Lusch, J. Nathan Kutz, and Steven L. Brunton. Data-driven discovery of coordinates and governing equations. Proceedings of the National Academy of Sciences, 116(45):22445–22451, 2019.
- [10] Kathleen P. Champion, Peng Zheng, Aleksandr Y. Aravkin, Steven L. Brunton, and J. Nathan Kutz. A unified sparse optimization framework to learn parsimonious physics-informed models from data. IEEE Access, 8:169259–169271, 2020.
- [11] Yuntian Chen, Yingtao Luo, Qiang Liu, Hao Xu, and Dongxiao Zhang. Symbolic genetic algorithm for discovering open-form partial differential equations (SGA-PDE). Physical Review Research, 4(2):023174, 2022.
- [12] Zhao Chen, Yang Liu, and Hao Sun. Physics-informed learning of governing equations from scarce data. Nature Communications, 12(1):6136, 2021.
- [13] Brian de Silva, Kathleen P. Champion, Markus Quade, Jean-Christophe Loiseau, J. Nathan Kutz, and Steven L. Brunton. PySINDy: A Python package for the sparse identification of nonlinear dynamical systems from data. J. Open Source Softw., 5(49):2104, 2020.
- [14] Urban Fasel, J. Nathan Kutz, Bingni W. Brunton, and Steven L. Brunton. Ensemble-SINDy: Robust sparse model discovery in the low-data, high-noise limit, with active learning and control. Proceedings of the Royal Society A, 478(2260), 2022.
- [15] Mark Fels and Peter J. Olver. Moving coframes: II. Regularization and theoretical foundations. Acta Applicandae Mathematica, 55(2):127–208, 1999.
- [16] Ronald Aylmer Fisher. The wave of advance of advantageous genes. Annals of Eugenics, 7(4):355–369, 1937.
- [17] Daniel R. Gurevich, Patrick A. K. Reinbold, and Roman O. Grigoriev. Robust and optimal sparse regression for nonlinear PDE models. Chaos: An Interdisciplinary Journal of Nonlinear Science, 29(10), 2019.
- [18] Lexiang Hu, Yikang Li, and Zhouchen Lin. Governing equation discovery from data based on differential invariants. CoRR, abs/2505.18798, 2025.
- [19] Kadierdan Kaheman, J. Nathan Kutz, and Steven L. Brunton. SINDy-PI: A robust algorithm for parallel implicit sparse identification of nonlinear dynamics. Proceedings of the Royal Society A, 476(2242), 2020.
- [20] Alan A. Kaptanoglu, Brian de Silva, Urban Fasel, Kadierdan Kaheman, Andy Goldschmidt, Jared Callaham, Charles B. Delahunt, Zachary Nicolaou, Kathleen P. Champion, Jean-Christophe Loiseau, J. Nathan Kutz, and Steven L. Brunton. PySINDy: A comprehensive Python package for robust sparse system identification. J. Open Source Softw., 7(69):3994, 2022.
- [21] Gyeonghoon Ko, Hyunsu Kim, and Juho Lee. Learning infinitesimal generators of continuous symmetries from data. In NeurIPS, 2024.
- [22] Andrei Nikolaevitch Kolmogorov. A study of the equation of diffusion with increase in the quantity of matter, and its application to a biological problem. Moscow University Bulletin of Mathematics, 1:1–25, 1937.
- [23] Diederik Johannes Korteweg and Gustav de Vries. XLI. On the change of form of long waves advancing in a rectangular canal, and on a new type of long stationary waves. The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science, 39(240):422–443, 1895.
- [24] Yoshiki Kuramoto and Toshio Tsuzuki. Persistent propagation of concentration waves in dissipative media far from thermal equilibrium. Progress of Theoretical Physics, 55(2):356–369, 1976.
- [25] Ziming Liu and Max Tegmark. Machine learning conservation laws from trajectories. Physical Review Letters, 126(18):180604, 2021.
- [26] Zichao Long, Yiping Lu, Xianzhong Ma, and Bin Dong. PDE-Net: Learning PDEs from data. In ICML, pages 3214–3222, 2018.
- [27] Zichao Long, Yiping Lu, and Bin Dong. PDE-Net 2.0: Learning PDEs from data with a numeric-symbolic hybrid deep network. J. Comput. Phys., 399, 2019.
- [28] Suryanarayana Maddu, Bevan L. Cheeseman, Ivo F. Sbalzarini, and Christian L. Müller. Stability selection enables robust learning of differential equations from limited noisy data. Proceedings of the Royal Society A, 478(2262), 2022.
- [29] Niall M. Mangan, J. Nathan Kutz, Steven L. Brunton, and Joshua L. Proctor. Model selection for dynamical systems via sparse regression and information criteria. Proceedings of the Royal Society A, 473(2204), 2017.
- [30] Nicolai Meinshausen and Peter Bühlmann. Stability selection. Journal of the Royal Statistical Society Series B: Statistical Methodology, 72(4):417–473, 2010.
- [31] Daniel A. Messenger and David M. Bortz. Weak SINDy for partial differential equations. J. Comput. Phys., 443:110525, 2021.
- [32] Hadrien Montanelli and Yuji Nakatsukasa. Fourth-order time-stepping for stiff PDEs on the sphere. SIAM J. Sci. Comput., 40(1), 2018.
- [33] Peter J. Olver. Applications of Lie Groups to Differential Equations, volume 107. 1993.
- [34] Maziar Raissi and George E. Karniadakis. Hidden physics models: Machine learning of nonlinear partial differential equations. J. Comput. Phys., 357:125–141, 2018.
- [35] Maziar Raissi, Paris Perdikaris, and George E. Karniadakis. Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations. J. Comput. Phys., 378:686–707, 2019.
- [36] Patrick A. K. Reinbold, Daniel R. Gurevich, and Roman O. Grigoriev. Using noisy or incomplete data to discover models of spatiotemporal dynamics. Physical Review E, 101(1):010203, 2020.
- [37] Samuel H. Rudy, Steven L. Brunton, Joshua L. Proctor, and J. Nathan Kutz. Data-driven discovery of partial differential equations. Science Advances, 3(4):e1602614, 2017.
- [38] Hayden Schaeffer. Learning partial differential equations via data discovery and sparse optimization. Proceedings of the Royal Society A, 473(2197), 2017.
- [39] Hayden Schaeffer and Scott G. McCalla. Sparse model selection via integral terms. Physical Review E, 96(2):023302, 2017.
- [40] Michael Schmidt and Hod Lipson. Distilling free-form natural laws from experimental data. Science, 324(5923):81–85, 2009.
- [41] Rajen D. Shah and Richard J. Samworth. Variable selection with error control: another look at stability selection. Journal of the Royal Statistical Society Series B: Statistical Methodology, 75(1):55–80, 2013.
- [42] Makoto Takamoto, Timothy Praditia, Raphael Leiteritz, Daniel MacKinlay, Francesco Alesiani, Dirk Pflüger, and Mathias Niepert. PDEBench: An extensive benchmark for scientific machine learning. In NeurIPS, 2022.
- [43] Mengyi Tang, Wenjing Liao, Rachel Kuske, and Sung Ha Kang. WeakIdent: Weak formulation for identifying differential equation using narrow-fit and trimming. J. Comput. Phys., 483:112069, 2023.
- [44] Robert Tibshirani. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society Series B: Statistical Methodology, 58(1):267–288, 1996.
- [45] Lloyd N. Trefethen. Spectral Methods in MATLAB. 2000.
- [46] Tycho F. A. van der Ouderaa, Mark van der Wilk, and Pim de Haan. Noether's Razor: Learning conserved quantities. In NeurIPS, 2024.
- [47] H. Voss, M. J. Bünner, and Markus Abel. The identification of continuous, spatiotemporal systems. arXiv preprint chao-dyn/9907010, 1999.
- [48] Martin J. Wainwright. Sharp thresholds for high-dimensional and noisy sparsity recovery using l1-constrained quadratic programming (Lasso). IEEE Trans. Inf. Theory, 55(5):2183–2202, 2009.
- [49] Hao Xu, Haibin Chang, and Dongxiao Zhang. DL-PDE: Deep-learning based data-driven discovery of partial differential equations from discrete and noisy data. arXiv preprint arXiv:1908.04463, 2019.
- [50] Hao Xu, Haibin Chang, and Dongxiao Zhang. DLGA-PDE: Discovery of PDEs with incomplete candidate library via combination of deep learning and genetic algorithm. J. Comput. Phys., 418:109584, 2020.
- [51] Hao Xu, Dongxiao Zhang, and Nanzhe Wang. Deep-learning based discovery of partial differential equations in integral form from sparse and noisy data. J. Comput. Phys., 445:110592, 2021.
- [52] Hao Xu, Junsheng Zeng, and Dongxiao Zhang. Discovery of partial differential equations from highly noisy and sparse data with physics-informed information criterion. Research, 6:0147, 2023.
- [53] Jianke Yang, Robin Walters, Nima Dehmamy, and Rose Yu. Generative adversarial symmetry discovery. In ICML, pages 39488–39508, 2023.
- [54] Jianke Yang, Wang Rao, Nima Dehmamy, Robin Walters, and Rose Yu. Symmetry-informed governing equation discovery. In NeurIPS, 2024.
- [55] Jianke Yang, Manu Bhat, Bryan Hu, Yadi Cao, Nima Dehmamy, Robin Walters, and Rose Yu. Discovering symbolic differential equations with symmetry invariants. Trans. Mach. Learn. Res., 2026.
- [56] Sheng Zhang and Guang Lin. Robust data-driven discovery of governing physical laws with error bars. Proceedings of the Royal Society A, 474(2217), 2018.
- [57] Peng Zhao and Bin Yu. On model selection consistency of Lasso. J. Mach. Learn. Res., 7:2541–2563, 2006.
- [58] Hui Zou and Trevor Hastie. Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society Series B: Statistical Methodology, 67(2):301–320, 2005.
- [59] ...introduced the SINDy framework for ordinary differential equation discovery, using STLSQ. Schaeffer [38] and Rudy et al. [37] extended sparse-regression ideas to PDEs, computing spatial derivatives and applying sparse optimization over candidate libraries. These methods are effective on clean data but degrade under noise because pointwise derivatives ampl...
- [60] ...use ensemble bagging to estimate inclusion probabilities under low-data and high-noise regimes. Unified sparse-optimization frameworks broaden the optimizer family available for parsimonious model discovery [10]. de Silva et al. [13] provide the PySINDy library, and the later comprehensive PySINDy release adds PDE, implicit, integral, constrained, and ens...
- [61] The WF-LASSO baseline is EqOD without Stages 1 and 2, so it does not use symmetry detection or stability selection and isolates the contribution of library reduction. PySINDy 2.0.0 baseline: we use the official pip package pysindy==2.0.0, installed via pip install pysindy==2.0.0. The setup uses the STLSQ optimizer with the best threshold from a grid searc...