pith. machine review for the scientific record.

arXiv: 2605.04337 · v1 · submitted 2026-05-05 · 🧮 math.DS · eess.SP · stat.ML

Recognition: unknown

Symbolic Regression via Neural Networks

Jeff Moehlis, Nibodh Boddupalli, Timothy Matchen

Pith reviewed 2026-05-08 16:47 UTC · model grok-4.3

classification 🧮 math.DS · eess.SP · stat.ML
keywords symbolic regression · neural networks · dynamical systems · governing equations · deep learning · model discovery

The pith

A deep neural network generates symbolic expressions for the governing equations of dynamical systems from data.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper develops a neural network that outputs symbolic mathematical expressions describing a dynamical system's evolution, rather than numerical predictions alone. Traditional deep learning provides accurate forecasts but limited insight into the rules, while fixed-dictionary symbolic methods typically demand prior system knowledge or produce overly complex results. By training the network end-to-end on trajectory data, the approach seeks to recover concise, interpretable equations without those limitations. A sympathetic reader would care because recovering explicit governing laws from observations alone would improve analysis, prediction, and control across fields that rely on dynamical models.
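One way to make "a network that outputs symbolic expressions" concrete is an EQL-style layer whose units apply fixed symbolic operations (identity, sine, cosine, products) to linear combinations of the state. The sketch below illustrates that general idea in NumPy; it is a hypothetical construction, not the authors' architecture, and the choice of units, weights, and shapes are illustrative assumptions.

```python
import numpy as np

# EQL-style symbolic layer: a linear map followed by a bank of fixed
# symbolic units. Hypothetical sketch of the general idea; the paper's
# actual architecture is not specified in the material above.

rng = np.random.default_rng(0)

def symbolic_layer(x, W, b):
    """Linear combination of inputs, then one symbolic unit per slot."""
    z = x @ W + b                       # shape (n_samples, 5)
    return np.column_stack([
        z[:, 0],                        # identity unit
        np.sin(z[:, 1]),                # sine unit
        np.cos(z[:, 2]),                # cosine unit
        z[:, 3] * z[:, 4],              # product (binary) unit
    ])

x = rng.normal(size=(8, 3))             # 8 samples of a 3-d state
W = rng.normal(size=(3, 5))
b = np.zeros(5)
out = symbolic_layer(x, W, b)
print(out.shape)                        # (8, 4)
```

Training such a layer end to end, typically with sparsity pressure on the weights, is what allows the fitted network to be read off as a compact formula rather than an opaque map.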

Core claim

The authors describe a deep neural network architecture that, when trained on data from a dynamical system, produces a symbolic expression for its governing equations. They demonstrate the model's performance by recovering accurate symbolic forms across several classical dynamical systems.

What carries the argument

A deep neural network architecture trained to map observed trajectories directly to symbolic differential equations.

Load-bearing premise

A neural network can be trained to output valid, parsimonious symbolic expressions for governing equations without prior knowledge of the system or overfitting to the data.
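Parsimony in this premise can be made operational with an information criterion such as AIC, which trades residual fit against the number of symbolic terms. Whether the paper uses AIC is not stated in the material above, so the sketch below, including the target function, noise level, and candidate set, is purely illustrative.

```python
import numpy as np

# Parsimony made operational: among candidate symbolic models, prefer the
# lowest AIC, which penalizes each extra term. Illustrative sketch only;
# the target function, noise level, and candidate set are assumptions.

rng = np.random.default_rng(1)
x = np.linspace(-2.0, 2.0, 200)
y = 3.0 * x - 0.5 * x**3 + rng.normal(scale=0.05, size=x.size)

def aic(residuals, k):
    """AIC for a least-squares fit with k terms (Gaussian errors)."""
    n = residuals.size
    return n * np.log(np.sum(residuals**2) / n) + 2 * k

candidates = {
    "a*x": np.column_stack([x]),
    "a*x + b*x**3": np.column_stack([x, x**3]),
    "a*x + b*x**3 + c*x**5": np.column_stack([x, x**3, x**5]),
}
scores = {}
for name, D in candidates.items():
    coef, *_ = np.linalg.lstsq(D, y, rcond=None)
    scores[name] = aic(y - D @ coef, D.shape[1])

# The under-fitted one-term model scores far worse than the true two-term form.
print({name: round(s, 1) for name, s in scores.items()})
```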

What would settle it

Train the network on data generated from the Lorenz system and verify whether it outputs the exact symbolic equations x' = σ(y - x), y' = x(r - z) - y, z' = xy - bz.
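That settling test can be set up directly: integrate the Lorenz system, then check whether a recovered model reproduces the stated coefficients. Since the paper's network is not available here, a plain least-squares fit over a small monomial dictionary stands in for the recovery step; the dictionary, tolerances, and parameter values (σ = 10, r = 28, b = 8/3) are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generate Lorenz trajectory data, then check whether a recovered model
# matches x' = sigma(y - x), y' = x(r - z) - y, z' = xy - bz.
# A least-squares fit over a monomial dictionary stands in for the
# paper's network, which is not reproduced here.

sigma, r, b = 10.0, 28.0, 8.0 / 3.0

def lorenz(t, s):
    x, y, z = s
    return [sigma * (y - x), x * (r - z) - y, x * y - b * z]

t = np.linspace(0.0, 20.0, 8000)
sol = solve_ivp(lorenz, (0.0, 20.0), [1.0, 1.0, 1.0],
                t_eval=t, rtol=1e-9, atol=1e-9)
X = sol.y.T
dX = np.array([lorenz(0.0, s) for s in X])  # exact derivatives, for illustration

# Dictionary of candidate terms: [x, y, z, xy, xz, yz]
D = np.column_stack([X[:, 0], X[:, 1], X[:, 2],
                     X[:, 0] * X[:, 1], X[:, 0] * X[:, 2], X[:, 1] * X[:, 2]])
coefs, *_ = np.linalg.lstsq(D, dX, rcond=None)

# Recovered coefficients should match the true equations, e.g. for x':
print(np.round(coefs[:, 0], 3))  # approx [-10, 10, 0, 0, 0, 0]
```

A full version of the test would also estimate derivatives numerically from noisy trajectories, which is where recovery typically gets hard.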

Figures

Figures reproduced from arXiv: 2605.04337 by Jeff Moehlis, Nibodh Boddupalli, Timothy Matchen.

Figure 1. Network architecture of …
Figure 2. Architecture of the …
Figure 3. (a) State-space comparison of the trajectory generated …
Figure 4. (a) State-space comparison of the trajectory generated …
Figure 5. (a) State-space visualization of the trajectory generated by …
Figure 6. (a) State-space visualization of the trajectory generated by …
Figure 8. (a) State-space comparison of trajectory generated (dashed …
Figure 9. State-space trajectory generated by Eqs. (31a-31b) (dashed …
Figure 10. (a) State-space comparison of the trajectory generated …
(Captions truncated in the source; the full figures are in the arXiv version.)
Original abstract

Identifying governing equations for a dynamical system is a topic of critical interest across an array of disciplines, from mathematics to engineering to biology. Machine learning -- specifically deep learning -- techniques have shown their capabilities in approximating dynamics from data, but a shortcoming of traditional deep learning is that there is little insight into the underlying mapping beyond its numerical output for a given input. This limits their utility in analysis beyond simple prediction. Simultaneously, a number of strategies exist which identify models based on a fixed dictionary of basis functions, but most either require some intuition or insight about the system, or are susceptible to overfitting or a lack of parsimony. Here we present a novel approach that combines the flexibility and accuracy of deep learning approaches with the utility of symbolic solutions: a deep neural network that generates a symbolic expression for the governing equations. We first describe the architecture for our model, then show the accuracy of our algorithm across a range of classical dynamical systems.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it: the pith above is the substance; this is the friction.

Referee Report

1 major / 0 minor

Summary. The manuscript proposes a novel deep neural network architecture that generates symbolic expressions for the governing equations of dynamical systems from data. It describes the model architecture and reports empirical accuracy on a range of classical dynamical systems, aiming to combine the flexibility of deep learning with the interpretability of symbolic models while avoiding the limitations of dictionary-based approaches.

Significance. If the method reliably produces accurate, parsimonious symbolic expressions without requiring prior system insight or suffering from overfitting, it would represent a meaningful advance in data-driven modeling of dynamical systems by bridging numerical approximation and symbolic insight. However, the abstract provides no quantitative metrics, error measures, data generation details, or baseline comparisons, so the significance cannot be assessed from the available information.

major comments (1)
  1. [Abstract] The assertion that the algorithm demonstrates 'accuracy ... across a range of classical dynamical systems' supplies no metrics, error measures, data generation protocol, or comparisons to dictionary-based methods, rendering it impossible to evaluate whether the central claim of combining flexibility and utility is supported.

Simulated Author's Rebuttal

1 response · 0 unresolved

We thank the referee for their review and constructive feedback on our manuscript. We address the single major comment below and will revise the manuscript to improve clarity.

Point-by-point responses
  1. Referee: [Abstract] The assertion that the algorithm demonstrates 'accuracy ... across a range of classical dynamical systems' supplies no metrics, error measures, data generation protocol, or comparisons to dictionary-based methods, rendering it impossible to evaluate whether the central claim of combining flexibility and utility is supported.

    Authors: We agree that the abstract is insufficiently specific and does not allow readers to evaluate the strength of the results. The body of the manuscript contains quantitative metrics, error measures, and data generation details for the classical systems tested, along with discussion of how the approach avoids the need for a predefined dictionary. We will revise the abstract to summarize these elements (including key error metrics and data protocols) and to briefly note the advantages relative to dictionary-based methods, thereby making the central claim more readily assessable. revision: yes

Circularity Check

0 steps flagged

No significant circularity detected

Full rationale

The abstract and context describe a neural network trained on data from classical dynamical systems to output symbolic governing equations, with subsequent empirical accuracy checks. No load-bearing derivations, self-citations, fitted parameters renamed as predictions, or ansatzes imported via prior work are present in the provided material. The central claim rests on external data and architecture design rather than reducing to its own inputs by construction.

Axiom & Free-Parameter Ledger

0 free parameters · 0 axioms · 0 invented entities

No specific free parameters, axioms, or invented entities are identifiable from the abstract alone; the method is described at a high level without detailing training losses, network hyperparameters, or symbolic grammar constraints.

pith-pipeline@v0.9.0 · 5460 in / 1131 out tokens · 46308 ms · 2026-05-08T16:47:11.752450+00:00 · methodology

