pith. machine review for the scientific record.

arxiv: 2604.25157 · v2 · submitted 2026-04-28 · 🧮 math.NA · cs.NA · cs.SY · eess.SY · physics.ao-ph · stat.ME · stat.ML

Recognition: unknown

A Continuous-Time Ensemble Kalman-Bucy Smoother for Causal Inference and Model Discovery

Marios Andreou (1), Nan Chen (1), Sebastian Reich (2), Zhang Jiang (1) ((1) University of Wisconsin-Madison, (2) University of Potsdam)

Authors on Pith · no claims yet

Pith reviewed 2026-05-07 15:43 UTC · model grok-4.3

classification 🧮 math.NA · cs.NA · cs.SY · eess.SY · physics.ao-ph · stat.ME · stat.ML
keywords ensemble Kalman-Bucy smoother · continuous-time data assimilation · causal inference · model discovery · nonlinear dynamical systems · derivative-free methods · Bayesian inference · reduced-order models

The pith

An ensemble Kalman-Bucy smoother integrates future observations to enable causal inference and model discovery in nonlinear systems without derivatives.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper develops a continuous-time ensemble Kalman-Bucy smoother that reconstructs conditional distributions from ensemble moments rather than linearizations. This produces a derivative-free method for nonlinear dynamics that converges to the exact smoother solution as the ensemble grows large. Future observations are incorporated to perform retrospective state updates that expose underlying causal mechanisms and their time scales. Demonstrations on a dyadic trigger-feedback system and a reduced-order atmospheric model show that the approach recovers causal links and hidden parameters even with ensembles of size ten under partial observations.

Core claim

The ensemble Kalman-Bucy smoother reconstructs the conditional distributions of the smoother using ensemble moments for continuous-time nonlinear dynamical systems. This yields a derivative-free framework that converges to the exact smoother solution in the infinite-ensemble limit for a wide class of systems. By integrating future observations that reveal causal mechanisms for retrospective updates, the smoother supports Bayesian inference of causal relationships and their temporal influence ranges in a dyadic trigger-feedback model, and it enables a causality-driven iterative learning algorithm that identifies structure and recovers hidden parameters of a nonlinear reduced-order model mimicking midlatitude atmospheric circulation.

What carries the argument

The ensemble Kalman-Bucy smoother, which approximates conditional distributions via ensemble moments and assimilates future observations for retrospective state updates.

Load-bearing premise

Ensemble moments sufficiently reconstruct the conditional distributions of the smoother for nonlinear systems, and standard regularization preserves this reconstruction under partial observations in high dimensions.
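The weakest checkable link in this premise is that empirical ensemble moments converge to the true moments as the ensemble grows (the strong law of large numbers), even for non-Gaussian ensembles. A minimal sketch of that step, with an invented lognormal example — note that convergence of moments does not by itself recover the full non-Gaussian conditional distribution:

```python
import numpy as np

# Empirical ensemble means of a deliberately non-Gaussian (lognormal)
# distribution, averaged over repeated trials, converge to the true mean
# as the ensemble size m grows. The distribution and sizes are invented
# for illustration; they are not from the paper.
rng = np.random.default_rng(1)
true_mean = np.exp(0.5)                       # E[exp(Z)] for Z ~ N(0, 1)
err = {m: np.mean([abs(np.exp(rng.standard_normal(m)).mean() - true_mean)
                   for _ in range(200)])
       for m in (10, 1_000, 100_000)}
# err[m] shrinks roughly like 1/sqrt(m).
```

The same argument applies entrywise to ensemble covariances, which is why the moment-based reconstruction has a well-defined infinite-ensemble limit; whether that limit equals the exact non-Gaussian smoother is a separate question.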

What would settle it

Run the EnKBS alongside a reference exact smoother (such as a particle smoother with a very large particle count) on a known nonlinear test system and verify whether the difference in estimated states or causal strengths approaches zero as ensemble size tends to infinity.
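A scaled-down version of that experiment can be sketched with a single stochastic ensemble-Kalman analysis step on a toy nonlinear forecast map, using a very large ensemble as the stand-in reference. This is not the paper's EnKBS or its test systems; the forecast map, observation value, and noise levels below are invented for illustration:

```python
import numpy as np

def enkf_analysis(ens, y, r, rng):
    """One stochastic EnKF analysis step with perturbed observations.
    Uses only ensemble moments (derivative-free); scalar state,
    identity observation operator."""
    p = ens.var(ddof=1)                        # forecast variance from the ensemble
    k = p / (p + r)                            # Kalman gain built from moments
    y_pert = y + rng.normal(0.0, np.sqrt(r), ens.size)
    return ens + k * (y_pert - ens)

def analysis_mean(m, rng):
    """Prior ensemble -> toy nonlinear forecast -> one analysis step -> mean."""
    prior = rng.normal(1.0, 0.5, m)
    forecast = np.sin(prior) + 0.1 * rng.normal(size=m)   # invented toy model
    return enkf_analysis(forecast, y=0.9, r=0.05, rng=rng).mean()

rng = np.random.default_rng(0)
ref = analysis_mean(500_000, rng)              # large-ensemble stand-in reference
err = {m: np.mean([abs(analysis_mean(m, rng) - ref) for _ in range(200)])
       for m in (10, 100, 1000)}
# err[m] should shrink roughly like 1/sqrt(m): the Monte Carlo error of the
# moment-based update relative to its large-ensemble limit.
```

For the paper's claim, one would replace the toy analysis step with the EnKBS forward-backward pass, replace the reference with a large-particle smoother, and track the same error curve in the ensemble size m.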

Figures

Figures reproduced from arXiv: 2604.25157 by Marios Andreou (1), Nan Chen (1), Sebastian Reich (2), Zhang Jiang (1) ((1) University of Wisconsin-Madison, (2) University of Potsdam).

Figure 1. Lorenz-96 reference trajectory over t ∈ [75,100]. The horizontal axis is time, the vertical axis is the state index j ∈ {1,...,40}, and the color indicates the value of x_j(t).

Figure 2. Trajectories of x_1(t) over t ∈ [75,100] for Lorenz-96. The black curve is the reference solution; the red and green curves are the EnKBF and EnKBS ensemble means, respectively. Shaded bands show the respective ±2 standard-deviation intervals. Parameters: m = 10, r0 = 3, and δ² = 1.005.

Figure 3. v → u. CIR and ACI metric computed using the EnKBS with ensemble size m = 50. Top: true trajectories of u (magenta) and v (blue). The dashed line is the anti-damping threshold du/c. Middle: the CIR length for v → u. Bottom: the ACI metric as a function of time.

Figure 4. RMSE comparison between EnKBF/EnKBS and CGNS exact filter/smoother as a function of ensemble size m. Red circle: EnKBF. Green square: EnKBS. Red dashed: optimal filter. Green dashed: optimal smoother. The RMSE between the filter/smoother estimates and the truth of v is evaluated over a time window of [0,500].

Figure 5. v → u. CIR and ACI metric computed using the EnKBS with ensemble size m = 10. Top: true trajectories of u (magenta) and v (blue). The dashed line is the anti-damping threshold du/c. Middle: the CIR length for v → u. Bottom: the ACI metric as a function of time.

Figure 6. u → v. EnKBF vs. EnKBS diagnostics with ensemble size m = 50. Top: true trajectories of u (magenta) and v (blue). Middle: estimation errors of u for the filter (red) and smoother (green), computed as u_ref − u_f and u_ref − u_s, respectively. Values closer to 0 (dashed) indicate smaller errors. Bottom: empirical standard deviation of u for the filter (red) and smoother (green).

Figure 7. u → v. CIR and ACI metric computed using the EnKBS with ensemble size m = 50. Top: true trajectories of u (magenta) and v (blue). The dashed line is the anti-damping threshold du/c. Middle: the CIR length for u → v. Bottom: the ACI metric as a function of time.

Figure 8. Learning progress for Lorenz-84 using EnKBS-based sampling. (a) Sampled trajectories of the hidden state x at selected iterations compared with the truth; iteration 0 corresponds to the initial guess. (b) Structure error measured by the Frobenius norm ∥C − C_stable∥_F. (c) Evolution of the estimated parameter b̂ toward the true value b.

Figure 9. EnKBS-based sampling. Trajectory and statistics comparison between the true Lorenz-84 system (blue) and the identified model (red). In addition to short-term path-wise agreement, the identified model reproduces the PDF and the temporal ACF of the truth.

Figure 10. Learning progress for Lorenz-84 using CGNS optimal sampling (benchmark). (a) Sampled trajectories of the hidden state x at selected iterations compared with the truth; iteration 0 corresponds to the initial guess. (b) Structure error measured by the Frobenius norm ∥C − C_stable∥_F. (c) Evolution of the estimated parameter b̂ toward the true value b.

Figure 11. CGNS optimal sampling. Trajectory and statistics comparison between the true Lorenz-84 system (blue) and the identified model (red). In addition to short-term path-wise agreement, the identified model reproduces the PDF and the temporal ACF of the truth.
read the original abstract

Data assimilation (DA) integrates observational information with model predictions to improve state estimation in complex systems. While filtering provides the basis for online forecasts by using only past and present observations, it can exhibit delays and biases when the underlying dynamics evolve rapidly or undergo regime transitions. Smoothing, which additionally incorporates future observations, provides a natural pipeline for hindcasting and reanalysis that yields an uncertainty reduction beyond the filter. This paper introduces an ensemble Kalman-Bucy smoother (EnKBS) for continuous-time DA of nonlinear dynamical systems, where the smoother's conditional distributions are reconstructed using ensemble moments. The result is a derivative-free framework that does not require explicit computation of tangent-linear or adjoint models, which converges to the exact smoother solution at the infinite-ensemble limit for a wide class of complex systems. Incorporating standard regularization techniques for high-dimensional systems, such as covariance localization and inflation, the skill of the EnKBS is demonstrated in various important scientific problems. By integrating future observations, which reveal the underlying causal mechanisms for retrospective state updates, the EnKBS is used for Bayesian-based inference of causal relationships and their temporal influence range in a dyadic trigger-feedback model and the development of a causality-driven iterative learning algorithm that identifies the structure and recovers the hidden parameters of a nonlinear reduced-order model mimicking midlatitude atmospheric circulation. Notably, both tasks remain effective with an ensemble size of $O(10)$ under partial observations, suggesting that EnKBS can support the instantaneous discovery of high-dimensional complex systems over time.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, and this is the friction.

Referee Report

3 major / 1 minor

Summary. The paper introduces a continuous-time ensemble Kalman-Bucy smoother (EnKBS) for nonlinear dynamical systems in data assimilation. It reconstructs the smoother's conditional distributions from ensemble moments in a derivative-free manner (no tangent-linear or adjoint models required) and claims convergence to the exact smoother solution in the infinite-ensemble limit for a wide class of complex systems. Incorporating localization and inflation for high dimensions, the approach is demonstrated on Bayesian causal inference (including temporal influence range) in a dyadic trigger-feedback model and on a causality-driven iterative algorithm for structure identification and parameter recovery in a nonlinear reduced-order model of midlatitude atmospheric circulation. Both tasks are reported to remain effective with ensemble sizes of O(10) under partial observations.

Significance. If the convergence claim and the small-ensemble performance on the two example problems hold with rigorous justification, the EnKBS could provide a practical, adjoint-free smoothing tool for retrospective analysis and reanalysis in high-dimensional nonlinear systems, with direct extensions to causal discovery and iterative model learning in geophysics and related fields.

major comments (3)
  1. [Abstract] The central claim that the EnKBS 'converges to the exact smoother solution at the infinite-ensemble limit for a wide class of complex systems' is load-bearing for the entire contribution yet lacks any derivation, theorem statement, or proof sketch showing that ensemble-estimated moments recover the true (generally non-Gaussian) conditional distributions of the smoother. For nonlinear dynamics the Kalman-Bucy update with empirical moments converges at best to the Gaussian smoother approximation, not the exact Bayesian smoother.
  2. [Abstract, applications sections] No quantitative error metrics, baseline comparisons (e.g., against ensemble Kalman filters, variational smoothers, or particle smoothers), or sensitivity studies with respect to ensemble size, localization radius, or inflation factor are referenced for the dyadic causal-inference task or the reduced-order model discovery task. This makes it impossible to assess whether the reported effectiveness with O(10) members under partial observations is robust or merely illustrative.
  3. [Abstract] The incorporation of covariance localization and inflation is stated to stabilize high-dimensional runs, but no analysis is given on whether these regularization heuristics commute with the infinite-ensemble limit or preserve the second-moment structure required for convergence to the exact smoother; the skeptic note correctly flags that they generally alter the covariance in a non-convergent manner.
minor comments (1)
  1. [Abstract] The abstract would benefit from a concise statement of the precise function space or Lipschitz-type assumptions under which the 'wide class of complex systems' convergence is asserted.

Simulated Author's Rebuttal

3 responses · 0 unresolved

We thank the referee for their detailed and constructive comments, which help clarify the scope and limitations of our contribution. We address each major comment point by point below, indicating planned revisions where appropriate.

read point-by-point responses
  1. Referee: [Abstract] The central claim that the EnKBS 'converges to the exact smoother solution at the infinite-ensemble limit for a wide class of complex systems' is load-bearing for the entire contribution yet lacks any derivation, theorem statement, or proof sketch showing that ensemble-estimated moments recover the true (generally non-Gaussian) conditional distributions of the smoother. For nonlinear dynamics the Kalman-Bucy update with empirical moments converges at best to the Gaussian smoother approximation, not the exact Bayesian smoother.

    Authors: We appreciate the referee's emphasis on rigor for this central claim. The EnKBS is obtained by substituting ensemble-estimated moments into the continuous-time Kalman-Bucy smoother equations. As the ensemble size tends to infinity, the empirical moments converge to the true moments almost surely by the strong law of large numbers, so the ensemble solution converges to the solution of the Kalman-Bucy equations. For nonlinear dynamics this solution is the Gaussian approximation furnished by the Kalman-Bucy filter/smoother rather than the full non-Gaussian Bayesian smoother; our phrasing 'exact smoother solution' refers to the exact solution of those moment equations. We will add a concise theorem statement together with a proof sketch in the revised manuscript to make this distinction explicit and to document the convergence argument. revision: yes

  2. Referee: [Abstract, applications sections] No quantitative error metrics, baseline comparisons (e.g., against ensemble Kalman filters, variational smoothers, or particle smoothers), or sensitivity studies with respect to ensemble size, localization radius, or inflation factor are referenced for the dyadic causal-inference task or the reduced-order model discovery task. This makes it impossible to assess whether the reported effectiveness with O(10) members under partial observations is robust or merely illustrative.

    Authors: We agree that quantitative diagnostics would strengthen the empirical sections. The present manuscript emphasizes qualitative demonstration of the EnKBS capabilities for causal inference and iterative model discovery, supported by visual state reconstructions and identified structures. In the revision we will incorporate root-mean-square error metrics for state estimates, direct comparisons against ensemble Kalman filters and particle smoothers on the same tasks, and sensitivity plots showing performance as a function of ensemble size (5–100 members), localization radius, and inflation factor for both the dyadic and reduced-order model examples. revision: yes

  3. Referee: [Abstract] The incorporation of covariance localization and inflation is stated to stabilize high-dimensional runs, but no analysis is given on whether these regularization heuristics commute with the infinite-ensemble limit or preserve the second-moment structure required for convergence to the exact smoother; the skeptic note correctly flags that they generally alter the covariance in a non-convergent manner.

    Authors: The referee correctly observes that localization and inflation modify the covariance and therefore do not in general commute with the infinite-ensemble limit. The convergence statement in the manuscript applies to the unregularized EnKBS; localization and inflation are presented solely as practical finite-ensemble stabilisers for high-dimensional regimes, consistent with standard ensemble data-assimilation practice. We will revise the text to state this separation explicitly and to include a short discussion of how these heuristics affect the second-moment structure relative to the theoretical limit. revision: partial
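The point at issue, that localization modifies the covariance rather than vanishing in the limit, can be seen directly by tapering a noisy sample covariance with the standard Gaspari-Cohn correlation function (Gaspari and Cohn, 1999). The grid, length scales, and ensemble size below are invented for illustration, not taken from the paper:

```python
import numpy as np

def gaspari_cohn(d, c):
    """Gaspari-Cohn fifth-order compactly supported correlation function
    (Gaspari and Cohn, 1999); zero beyond distance 2c."""
    r = np.abs(d) / c
    out = np.zeros_like(r)
    near = r <= 1.0
    far = (r > 1.0) & (r < 2.0)
    x = r[near]
    out[near] = 1 - (5/3)*x**2 + (5/8)*x**3 + (1/2)*x**4 - (1/4)*x**5
    x = r[far]
    out[far] = (-(2/3)/x + 4 - 5*x + (5/3)*x**2
                + (5/8)*x**3 - (1/2)*x**4 + (1/12)*x**5)
    return out

# Rank-deficient sample covariance of a smooth field: state dim > ensemble size.
rng = np.random.default_rng(2)
n, m = 40, 10
grid = np.arange(n)
true_cov = np.exp(-((grid[:, None] - grid[None, :]) ** 2) / 18.0)
L = np.linalg.cholesky(true_cov + 1e-6 * np.eye(n))
ens = L @ rng.standard_normal((n, m))
P = np.cov(ens)                                  # noisy, rank <= m - 1
dist = np.abs(grid[:, None] - grid[None, :])
P_loc = gaspari_cohn(dist, c=5.0) * P            # Schur (elementwise) product
# Localization zeroes spurious long-range correlations (pure sampling noise
# here) but also biases mid-range entries: the taper does not disappear in
# the infinite-ensemble limit.
err_raw = np.linalg.norm(P - true_cov)
err_loc = np.linalg.norm(P_loc - true_cov)
```

At finite m the taper typically reduces the covariance error (err_loc < err_raw here), which is exactly the finite-ensemble stabilizer role the rebuttal describes; as m grows the raw sample covariance converges to the truth while the tapered one converges to the tapered truth.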

Circularity Check

0 steps flagged

No significant circularity; the derivation relies on standard ensemble convergence properties and numerical demonstrations.

full rationale

The paper extends ensemble Kalman-Bucy filtering to a continuous-time smoother by reconstructing conditionals from ensemble moments, claiming convergence to the exact solution at infinite ensemble size. This is the standard limiting behavior of ensemble Kalman methods (exact for linear-Gaussian systems, Gaussian approximation otherwise) and is not shown to reduce to a self-definition or fitted input. Applications to causal inference are presented as empirical demonstrations on specific models rather than tautological predictions. No load-bearing self-citations, ansatz smuggling, or renaming of known results are identified that collapse the central claims. The derivation chain remains self-contained against external benchmarks.

Axiom & Free-Parameter Ledger

0 free parameters · 2 axioms · 0 invented entities

The framework rests on the standard ensemble approximation of conditional distributions and the applicability of localization/inflation regularization; no new free parameters, invented entities, or ad-hoc axioms are introduced in the abstract.

axioms (2)
  • domain assumption Ensemble moments can reconstruct the conditional distributions of the Kalman-Bucy smoother for nonlinear systems
    Invoked when stating that the smoother's conditional distributions are reconstructed using ensemble moments.
  • domain assumption Standard covariance localization and inflation preserve the smoother properties in high-dimensional partial-observation settings
    Mentioned as incorporated techniques for high-dimensional systems.

pith-pipeline@v0.9.0 · 5622 in / 1447 out tokens · 37553 ms · 2026-05-07T15:43:29.593288+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

71 extracted references · 71 canonical work pages

  1. [1]

    I., Nœvdal, G., Oliver, D

    Aanonsen, S. I., Nœvdal, G., Oliver, D. S., Reynolds, A. C., and Vallès, B.: The ensemble Kalman filter in reservoir engineering—a review, SPE J., 14, 393–412, https://doi.org/10.2118/117274-PA,

  2. [2]

    Amezcua, J., Ide, K., Kalnay, E., and Reich, S.: Ensemble transform Kalman–Bucy filters, Q. J. R. Meteorol. Soc., 140, 995–1004, https://doi.org/10.1002/qj.2186,

  3. [3]

    APACrefauthors \ 2001

    Anderson, J. L.: An ensemble adjustment Kalman filter for data assimilation, Mon. Weather Rev., 129, 2884–2903, https://doi.org/10.1175/1520-0493(2001)129<2884:AEAKFF>2.0.CO;2,

  4. [4]

    Anderson, J. L. and Anderson, S. L.: A Monte Carlo Implementation of the Nonlinear Filtering Problem to Produce Ensemble Assimilations and Forecasts, Mon. Weather Rev., 127, 2741–2758, https://doi.org/10.1175/1520-0493(1999)127<2741:amciot>2.0.co;2,

  5. [5]

    and Chen, N.: A Martingale-Free Introduction to Conditional Gaussian Nonlinear Systems, Entropy, 27, 2, https://doi.org/10.3390/e27010002, 2024a

    Andreou, M. and Chen, N.: A Martingale-Free Introduction to Conditional Gaussian Nonlinear Systems, Entropy, 27, 2, https://doi.org/10.3390/e27010002, 2024a. Andreou, M. and Chen, N.: Statistical response of ENSO complexity to initial condition and model parameter perturbations, J. Climate, 37, 5629–5651, https://doi.org/10.1175/JCLI-D-24-0017.1, 2024b. A...

  6. [6]

    Andreou, M., Chen, N., and Li, Y .: An Adaptive Online Smoother with Closed-Form Solutions and Information-Theoretic Lag Selection for Conditional Gaussian Nonlinear Systems, arXiv [preprint], arXiv:2411.05870, https://doi.org/10.48550/arXiv.2411.05870, 7 November,

  7. [7]

    Commun., 17, 1854, https://doi.org/10.1038/s41467-026-68568-0,

    Andreou, M., Chen, N., and Bollt, E.: Assimilative causal inference, Nat. Commun., 17, 1854, https://doi.org/10.1038/s41467-026-68568-0,

  8. [8]

    Asch, M., Bocquet, M., and Nodet, M.: Data assimilation: methods, algorithms, and applications, SIAM, https://doi.org/10.1137/1.9781611974546,

  9. [9]

    Bar-Yam, Y .: Dynamics of complex systems, CRC Press, https://doi.org/10.1201/9780429034961,

  10. [10]

    Bauer, P., Thorpe, A., and Brunet, G.: The quiet revolution of numerical weather prediction, Nature, 525, 47–55, https://doi.org/10.1038/nature14956,

  11. [11]

    and Reich, S.: A localization technique for ensemble Kalman filters, Q

    Bergemann, K. and Reich, S.: A localization technique for ensemble Kalman filters, Q. J. R. Meteorol. Soc., 136, 701–707, https://doi.org/10.1002/qj.591, 2010a. Bergemann, K. and Reich, S.: A mollified ensemble Kalman filter, Q. J. R. Meteorol. Soc., 136, 1636–1643, https://doi.org/10.1002/qj.672, 2010b. Bergemann, K. and Reich, S.: An Ensemble Kalman-Buc...

  12. [12]

    and Carrassi, A.: Four-dimensional ensemble variational data assimilation and the unstable subspace, Tellus A, 69, 1304 504, https://doi.org/10.1080/16000870.2017.1304504,

    Bocquet, M. and Carrassi, A.: Four-dimensional ensemble variational data assimilation and the unstable subspace, Tellus A, 69, 1304 504, https://doi.org/10.1080/16000870.2017.1304504,

  13. [13]

    Abhimanyu Das, Weihao Kong, Rajat Sen, and Yichen Zhou

    Brunton, S. L., Proctor, J. L., and Kutz, J. N.: Discovering governing equations from data by sparse identification of nonlinear dynamical systems, Proc. Natl. Acad. Sci. U.S.A., 113, 3932–3937, https://doi.org/10.1073/pnas.1517384113,

  14. [14]

    Weather Rev., 145, 617–635, https://doi.org/10.1175/MWR-D-16-0106.1,

    27 Buehner, M., McTaggart-Cowan, R., and Heilliette, S.: An ensemble Kalman filter for numerical weather prediction based on variational data assimilation: VarEnKF, Mon. Weather Rev., 145, 617–635, https://doi.org/10.1175/MWR-D-16-0106.1,

  15. [16]

    M.: Ensemble Kalman methods: A mean-field perspective, Acta Numer., 34, 123–291, https://doi.org/10.1017/S0962492924000060,

    Calvello, E., Reich, S., and Stuart, A. M.: Ensemble Kalman methods: A mean-field perspective, Acta Numer., 34, 123–291, https://doi.org/10.1017/S0962492924000060,

  16. [17]

    Chen, N.: Stochastic methods for modeling and predicting complex dynamical systems, Synthesis Lectures on Mathematics & Statistics, Springer Nature Switzerland, https://doi.org/10.1007/978-3-031-81924-7,

  17. [18]

    and Majda, A

    Chen, N. and Majda, A. J.: Filtering nonlinear turbulent dynamical systems through conditional Gaussian statistics, Mon. Weather Rev., 144, 4885–4917, https://doi.org/10.1175/MWR-D-15-0437.1,

  18. [19]

    and Majda, A

    Chen, N. and Majda, A. J.: Conditional Gaussian Systems for Multiscale Nonlinear Stochastic Systems: Prediction, State Estimation and Uncertainty Quantification, Entropy, 20, 509, https://doi.org/10.3390/e20070509,

  19. [20]

    and Majda, A

    Chen, N. and Majda, A. J.: Predicting Observed and Hidden Extreme Events in Complex Nonlinear Dynamical Systems with Partial Obser- vations and Short Training Time Series, Chaos, 30, 033 101, https://doi.org/10.1063/1.5122199,

  20. [21]

    Chen, N. and Zhang, Y .: A causality-based learning approach for discovering the underlying dynamics of complex systems from partial observations with stochastic parameterization, Physica D, 449, 133 743, https://doi.org/10.1016/j.physd.2023.133743,

  21. [22]

    Chen, N., Wiggins, S., and Andreou, M.: Taming uncertainty in a complex world: The rise of uncertainty quantification-a tutorial for begin- ners, Notices Am. Math. Soc., 72, 250–260, https://doi.org/10.1090/noti3120,

  22. [23]

    Cosme, E., Brankart, J.-M., Verron, J., Brasseur, P., and Krysta, M.: Implementation of a reduced rank square-root smoother for high resolu- tion ocean data assimilation, Ocean Model., 33, 87–100, https://doi.org/10.1016/j.ocemod.2009.12.004,

  23. [24]

    de Wiljes, J., Pathiraja, S., and Reich, S.: Ensemble transform algorithms for nonlinear smoothing problems, SIAM J. Sci. Comput., 42, A87–A114, https://doi.org/10.1137/19M1239544,

  24. [25]

    2019 , PAGES =

    Durrett, R.: Probability: theory and examples, vol. 49, Cambridge University Press, https://doi.org/10.1017/9781108591034,

  25. [26]

    and Rogers, J.: Causation Entropy Method for Covariate Selection in Dynamic Models, in: 2021 American Control Conference (ACC), pp

    Elinger, J. and Rogers, J.: Causation Entropy Method for Covariate Selection in Dynamic Models, in: 2021 American Control Conference (ACC), pp. 2842–2847, ISSN 2378-5861, https://doi.org/10.23919/ACC50511.2021.9483371,

  26. [27]

    Evensen, G.: Sequential data assimilation with a nonlinear quasi-geostrophic model using Monte Carlo methods to forecast error statistics, J. Geophys. Res.-Oceans, 99, 10 143–10 162, https://doi.org/10.1029/94JC00572,

  27. [28]

    Evensen, G.: The ensemble Kalman filter: Theoretical formulation and practical implementation, Ocean Dyn., 53, 343–367, https://doi.org/10.1007/s10236-003-0036-9,

  28. [29]

    Evensen, G.: Data assimilation: the ensemble Kalman filter, Springer, https://doi.org/10.1007/978-3-642-03711-5,

  29. [30]

    M., & Hallberg, R

    Evensen, G. and Van Leeuwen, P. J.: An ensemble Kalman smoother for nonlinear dynamics, Mon. Weather Rev., 128, 1852–1867, https://doi.org/10.1175/1520-0493(2000)128<1852:aeksfn>2.0.co;2,

  30. [31]

    Frisch, U.: Turbulence: The Legacy of A. N. Kolmogorov, Cambridge University Press, https://doi.org/10.1017/CBO9781139170666,

  31. [32]

    and Cohn, S

    Gaspari, G. and Cohn, S. E.: Construction of Correlation Functions in Two and Three Dimensions, Q. J. R. Meteorol. Soc., 125, 723–757, https://doi.org/10.1002/qj.49712555417,

  32. [33]

    J.: An Ensemble Kalman Filter for Statistical Estimation of Physics Constrained Nonlinear Regression Models, J

    28 Harlim, J., Mahdi, A., and Majda, A. J.: An Ensemble Kalman Filter for Statistical Estimation of Physics Constrained Nonlinear Regression Models, J. Comput. Phys., 257, 782–812, https://doi.org/10.1016/j.jcp.2013.10.025,

  33. [34]

    L., Berkooz, G., and Rowley, C

    Holmes, P., Lumley, J. L., Berkooz, G., and Rowley, C. W.: Turbulence, coherent structures, dynamical systems and symmetry, Cambridge University Press, 2 edn., https://doi.org/10.1017/CBO9780511919701,

  34. [35]

    Horn, R. A. and Johnson, C. R.: Matrix Analysis, Cambridge University Press, Cambridge, ISBN 978-0521305860, https://doi.org/10.1017/CBO9780511810817,

  35. [36]

    Houtekamer, P. L. and Mitchell, H. L.: Data assimilation using an ensemble Kalman filter technique, Mon. Weather Rev., 126, 796–811, https://doi.org/10.1175/1520-0493(1998)126<0796:DAUAEK>2.0.CO;2,

  36. [37]

    Houtekamer, P. L. and Zhang, F.: Review of the ensemble Kalman filter for atmospheric data assimilation, Mon. Weather Rev., 144, 4489– 4532, https://doi.org/10.1175/MWR-D-15-0440.1,

  37. [38]

    A., Law, K

    Iglesias, M. A., Law, K. J., and Stuart, A. M.: Ensemble Kalman methods for inverse problems, Inverse Probl., 29, 045 001, https://doi.org/10.1088/0266-5611/29/4/045001,

  38. [39]

    M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting, The MIT Press, ISBN 9780262276078, https://doi.org/10.7551/mitpress/2526.001.0001,

    Izhikevich, E. M.: Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting, The MIT Press, ISBN 9780262276078, https://doi.org/10.7551/mitpress/2526.001.0001,

  39. [40]

    H.: Stochastic Processes and Filtering Theory, vol

    Jazwinski, A. H.: Stochastic Processes and Filtering Theory, vol. 64 ofMathematics in Science and Engineering, Academic Press, New York, https://doi.org/10.1016/S0076-5392(09)X6022-4,

  40. [41]

    A New Approach to Linear Filtering and Prediction Problems

    Kalman, R. E.: A New Approach to Linear Filtering and Prediction Problems, J. Basic Eng., 82, 35–45, https://doi.org/10.1115/1.3662552,

  41. [42]

    Kalman, R. E. and Bucy, R. S.: New Results in Linear Filtering and Prediction Theory, J. Basic Eng., 83, 95–108, https://doi.org/10.1115/1.3658902,

  42. [43]

    Kalnay, E.: Atmospheric modeling, data assimilation and predictability, Cambridge University Press, https://doi.org/10.1017/CBO9780511802270,

  43. [44]

    Kalnay, E., Li, H., Miyoshi, T., Yang, S.-C., and Ballabrera-Poy, J.: 4-D-Var or ensemble Kalman filter?, Tellus A, 59, 758–773, https://doi.org/10.1111/j.1600-0870.2007.00261.x,

  44. [45]

    P., Anderson, J

    Khare, S. P., Anderson, J. L., Hoar, T. J., and Nychka, D.: An investigation into the application of an ensemble Kalman smoother to high- dimensional geophysical systems, Tellus A, 60, 97–112, https://doi.org/10.1111/j.1600-0870.2007.00281.x,

  45. [46]

    Kloeden, P. E. and Platen, E.: Numerical Solution of Stochastic Differential Equations, Springer, Berlin, Heidelberg, ISBN 978-3-642-08107- 1 978-3-662-12616-5, https://doi.org/10.1007/978-3-662-12616-5,

  46. [47]

    62 ofTexts in Applied Mathematics, Springer Cham, https://doi.org/10.1007/978-3-319-20325-6,

    Law, K., Stuart, A., and Zygalakis, K.: Data Assimilation: A Mathematical Introduction, vol. 62 ofTexts in Applied Mathematics, Springer Cham, https://doi.org/10.1007/978-3-319-20325-6,

  47. [48]

    Liptser, R. S. and Shiryayev, A. N.: Statistics of Random Processes II, Springer New York, New York, NY , ISBN 978-1-4757-4295-4 978-1- 4757-4293-0, https://doi.org/10.1007/978-1-4757-4293-0,

  48. [49]

    Analysis methods for numerical weather prediction

    Lorenc, A. C.: Analysis methods for numerical weather prediction, Q. J. R. Meteorol. Soc., 112, 1177–1194, https://doi.org/10.1002/qj.49711247414,

  49. [50]

    C.: The potential of the ensemble Kalman filter for NWP—A comparison with 4D-Var, Q

    Lorenc, A. C.: The potential of the ensemble Kalman filter for NWP—A comparison with 4D-Var, Q. J. R. Meteorol. Soc., 129, 3183–3203, https://doi.org/10.1256/qj.02.132,

  50. [51]

    The predictability of a flow which possesses many scales of motion,

    Lorenz, E. N.: The predictability of a flow which possesses many scales of motion, Tellus, 21, 289–307, https://doi.org/10.3402/tellusa.v21i3.10086,

  51. [52]

    N.: Irregularity: A Fundamental Property of the Atmosphere*, Tellus A, 36, 98–110, https://doi.org/10.3402/tellusa.v36i2.11473,

    29 Lorenz, E. N.: Irregularity: A Fundamental Property of the Atmosphere*, Tellus A, 36, 98–110, https://doi.org/10.3402/tellusa.v36i2.11473,

    Lorenz, E. N.: Predictability - a problem partly solved, in: Predictability of Weather and Climate, edited by Palmer, T. and Hagedorn, R., pp. 40–58, Cambridge University Press, ISBN 9781107414853, https://doi.org/10.1017/cbo9780511617652.004.

    Lorenz, E. N. and Emanuel, K. A.: Optimal Sites for Supplementary Weather Observations: Simulation with a Small Model, J. Atmos. Sci., 55, 399–414, https://doi.org/10.1175/1520-0469(1998)055<0399:osfswo>2.0.co;2.

    Majda, A. J.: Introduction to turbulent dynamical systems in complex systems, Springer, https://doi.org/10.1007/978-3-319-32217-9.

    Majda, A. J. and Harlim, J.: Filtering complex turbulent systems, Cambridge University Press, https://doi.org/10.1017/CBO9781139061308, 2012a.

    Majda, A. J. and Harlim, J.: Physics Constrained Nonlinear Regression Models for Time Series, Nonlinearity, 26, 201, https://doi.org/10.1088/0951-7715/26/1/201, 2012b.

    Navon, I. M.: Practical and Theoretical Aspect...

    Øksendal, B.: Stochastic Differential Equations: An Introduction with Applications, Universitext, Springer Berlin Heidelberg, ISBN 9783642143946, https://doi.org/10.1007/978-3-642-14394-6.

    Rabier, F.: Overview of global data assimilation developments in numerical weather-prediction centres, Q. J. R. Meteorol. Soc., 131, 3215–3233, https://doi.org/10.1256/qj.05.129.

    Raissi, M., Perdikaris, P., and Karniadakis, G. E.: Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations, J. Comput. Phys., 378, 686–707, https://doi.org/10.1016/j.jcp.2018.10.045.

    Rauch, H. E., Tung, F., and Striebel, C. T.: Maximum likelihood estimates of linear dynamic systems, AIAA J., 3, 1445–1450, https://doi.org/10.2514/3.3166.

    Reich, S.: A Dynamical Systems Framework for Intermittent Data Assimilation, BIT Numer. Math., 51, 235–249, https://doi.org/10.1007/s10543-010-0302-4.

    Rozovsky, B. L. and Lototsky, S. V.: Stochastic Evolution Systems: Linear Theory and Applications to Non-Linear Filtering, Probability Theory and Stochastic Modelling, Springer International Publishing, ISBN 9783319948935, https://doi.org/10.1007/978-3-319-94893-5.

    Sakov, P. and Oke, P.: A Deterministic Formulation of the Ensemble Kalman Filter: An Alternative to Ensemble Square Root Filters, Tellus A, 60, 361–371, https://doi.org/10.1111/j.1600-0870.2007.00299.x.

    Särkkä, S. and Solin, A.: Applied Stochastic Differential Equations, Cambridge University Press, 1 edn., ISBN 978-1-108-18673-5, 978-1-316-51008-7, 978-1-316-64946-6, https://doi.org/10.1017/9781108186735.

    Särkkä, S. and Svensson, L.: Bayesian Filtering and Smoothing, Institute of Mathematical Statistics Textbooks, Cambridge University Press, 2 edn., ISBN 9781108926645, https://doi.org/10.1017/9781108917407.

    Simon, D.: Optimal state estimation: Kalman, H infinity, and nonlinear approaches, John Wiley & Sons, https://doi.org/10.1002/0470045345.

    Snyder, C., Bengtsson, T., Bickel, P., and Anderson, J.: Obstacles to high-dimensional particle filtering, Mon. Weather Rev., 136, 4629–4640, https://doi.org/10.1175/2008MWR2529.1.

    Strogatz, S. H.: Nonlinear dynamics and chaos: with applications to physics, biology, chemistry, and engineering, Chapman and Hall/CRC, https://doi.org/10.1201/9780429398490.

    Sun, J. and Bollt, E. M.: Causation Entropy Identifies Indirect Influences, Dominance of Neighbors and Anticipatory Couplings, Physica D, 267, 49–57, https://doi.org/10.1016/j.physd.2013.07.001.

    Särkkä, S.: Continuous-time and continuous-discrete-time unscented Rauch-Tung-Striebel smoothers, Signal Process., 90, 225–235, https://doi.org/10.1016/j.sigpro.2009.06.012.

    Tarantola, A.: Inverse problem theory and methods for model parameter estimation, SIAM, https://doi.org/10.1137/1.9780898717921.

    Yang, X. and Delsole, T.: Using the Ensemble Kalman Filter to Estimate Multiplicative Model Parameters, Tellus A, 61, 601–609, https://doi.org/10.1111/j.1600-0870.2009.00407.x.