pith. machine review for the scientific record. sign in

arxiv: 2605.08550 · v2 · submitted 2026-05-08 · 💻 cs.LG · stat.ML

Recognition: no theorem link

A Call to Lagrangian Action: Learning Population Mechanics from Temporal Snapshots

Authors on Pith no claims yet

Pith reviewed 2026-05-14 20:46 UTC · model grok-4.3

classification 💻 cs.LG stat.ML
keywords Wasserstein Lagrangian Mechanicspopulation dynamicssecond-order dynamicsgradient flowsoptimal transportHamiltonian equationsforecasting marginalsinterpolation
0
0 comments X

The pith

Wasserstein Lagrangian Mechanics learns second-order population dynamics directly from observed marginals without specifying the Lagrangian.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

This paper shifts population dynamics modeling from Wasserstein gradient flows, which minimize free energy and miss periodic behavior, to dynamics that minimize a population-level action under a damped Wasserstein Lagrangian. The authors derive the corresponding Hamiltonian equations and formalize Wasserstein Lagrangian Mechanics as a broad class that includes classical mechanics, quantum mechanics, and gradient flows as special cases. They introduce the WLM algorithm that learns these second-order dynamics straight from sequences of marginal snapshots. Readers should care because the approach supports forecasting and interpolating unseen population states and shows better performance than prior methods on vortex dynamics, embryonic development, and flocking.

Core claim

Wasserstein Lagrangian Mechanics formalizes second-order dynamics in which populations evolve to minimize an action integral under a damped Wasserstein Lagrangian. Deriving the Hamiltonian equations from this principle produces a structured family that recovers gradient flows when inertia vanishes yet also permits periodic and inertial motions. The WLM algorithm learns the governing mechanics from observed marginals alone, without any pre-specified form for the Lagrangian, and thereby enables accurate forecasting and interpolation of future marginals.

What carries the argument

The damped Wasserstein Lagrangian, which defines the action integral whose minimization yields the Hamiltonian equations of motion for the evolving population.

If this is right

  • Periodic and inertial behaviors in populations become directly modelable without additional terms.
  • Forecasting of future marginal distributions is possible from temporal snapshots alone.
  • Interpolation between observed time points improves for dynamics beyond pure gradient flows.
  • A single learning procedure unifies classical mechanics, quantum mechanics, and gradient flows for population data.
  • Outperformance holds across vortex dynamics, embryonic development, and flocking examples.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The same action-minimization view could be tested on high-dimensional single-cell data where cell-cycle periodicity is known to occur.
  • Neural-network parametrizations of more general Lagrangians might extend WLM to non-Euclidean population spaces.
  • Direct comparison against datasets with independently measured forces would clarify when the Wasserstein action assumption is realistic.

Load-bearing premise

That the population dynamics of interest arise from minimizing some action under a damped Wasserstein Lagrangian.

What would settle it

WLM producing inaccurate forecasts or interpolations on a dataset of known periodic population motion whose true underlying forces do not minimize any such action would falsify the central claim.

Figures

Figures reproduced from arXiv: 2605.08550 by Kirill Neklyudov, Lazar Atanackovic, Vincent Guan.

Figure 1
Figure 1. Figure 1: Wasserstein gradient flows describe first-order popu￾lation dynamics that minimize the free energy F[ρt]. We pro￾pose Wasserstein Lagrangian mechanics (WLM), which describe a richer class of damped second-order dynamics, based on the population-level potential energy U[ρt]. Given the same quadratic functional, gradient flows dissipate until equilibrium, while WLM produces oscillating dynamics if γ = 0, and… view at source ↗
Figure 2
Figure 2. Figure 2: Learning population mechanics with WLM: In (a), we illustrate the principle of least Wasserstein action (3): given observed marginals p0 and p1, the true interpolants form a minimal action curve in the space of densities, with respect to a population-level Lagrangian action. Alternative curves of densities have higher action and are drawn in red. Wasserstein least action induces Hamiltonian mechanics on th… view at source ↗
Figure 3
Figure 3. Figure 3: We visualize Proposition 2.4, which shows that Wasser￾stein gradient flows admit two characterizations under WLM. The implication is from the principle of superposition (Am￾brosio et al., 2008, Theorem 8.2.1). It follows that the over￾damped characterization produces gradient flow dynamics for t > 0, no matter what v0 is initialized as. Then, to prove the second description, we note that since a gradient f… view at source ↗
Figure 4
Figure 4. Figure 4: WLM’s predictions (×) for unseen interpolants ( ) are visualized for the (a) Ocean vortex (in spatial coordinates) and (b) Embryoid body (in UMAP coordinates) datasets. Marginals that are used to train WLM’s mechanics model are plotted in gray [PITH_FULL_IMAGE:figures/full_fig_p007_4.png] view at source ↗
Figure 5
Figure 5. Figure 5: Learning Boids: We plot the ground truth Boids (first row) against the population dynamics learned by our method WLM (second row), and the gradient flow baselines JKONET∗ (third row) and NN-APPEX (fourth row). All methods were trained on 50 marginals of size 1000 within t ∈ [0, 24.5]. The rollouts of each method are simulated from time 0 (with 5000 samples drawn) until the additional forecast horizon [25.0… view at source ↗
Figure 6
Figure 6. Figure 6: WLM Learned Friction (γ) curve for each SDEs over 100, 000 training epochs. Minimum learned friction at 100k epochs is γ = 5.28 (Styblinski-Tang paired setting), which collapses to the equilibrium distribution almost instantaneously. 19 [PITH_FULL_IMAGE:figures/full_fig_p019_6.png] view at source ↗
Figure 7
Figure 7. Figure 7: Learnable friction on the Embryoid Body (EB) dataset for different holdout times. 128 256 512 1024 2048 full Particles per batch 0.68 0.70 0.72 0.74 0.76 0.78 W (lower is better) EB: Mini-batch Size vs. Performance & Runtime OT-CFM (0.790) WLF-UOT (0.738) OT-MFM (0.713) WLM (Ours) Time per epoch (ms) 100 200 300 400 500 600 Time per epoch (ms) [PITH_FULL_IMAGE:figures/full_fig_p020_7.png] view at source ↗
Figure 8
Figure 8. Figure 8: Runtime vs. performance when performing mini-batching with WLM averaged over 3 random seeds. C.4. Boids We implement the Boids algorithm using a Vicsek-style interacting particle system based on the classic Boids algorithm. By default, we simulate N = 1000 agents in R 2 adhering to local interaction rules, which is essentially the three Boids interaction rules with a boundary condition. In particular, at e… view at source ↗
Figure 9
Figure 9. Figure 9: Predicting Boids on unseen dynamics: Qualitative comparison of ground truth Boids dynamics (top row of each panel) versus the predicted WLM dynamics (bottom row of each panel) for three unseen Gaussian mixture initial distributions. WLM was trained on 50 frames whose population was a centered Gaussian (see [PITH_FULL_IMAGE:figures/full_fig_p022_9.png] view at source ↗
read the original abstract

The population dynamics of molecules, cells, and organisms are governed by a number of unknown forces. In the last decade, population dynamics have predominantly been modeled with Wasserstein gradient flows. However, since gradient flows minimize free energy, they fail to capture important dynamical properties, such as periodicity. In this work, we propose a change in perspective by considering dynamics that minimize a population-level action under a damped Wasserstein Lagrangian. By deriving the corresponding Hamiltonian equations of motion, we formalize Wasserstein Lagrangian Mechanics, a structured class of second-order dynamics that encompasses classical mechanics, quantum mechanics, and gradient flows. We then propose WLM as the first algorithm that learns these second-order dynamics from observed marginals, without specifying the Lagrangian. By directly learning the population mechanics, WLM can both forecast and interpolate unseen marginals, and outperforms existing gradient flow and flow matching methods across a wide range of dynamics, including vortex dynamics, embryonic development, and flocking.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 2 minor

Summary. The paper introduces Wasserstein Lagrangian Mechanics (WLM) by shifting from Wasserstein gradient flows to dynamics that minimize a population-level action under a damped Wasserstein Lagrangian. It derives the corresponding Hamiltonian equations of motion, formalizes a class of second-order dynamics encompassing classical mechanics, quantum mechanics, and gradient flows, and proposes the WLM algorithm to learn these dynamics directly from observed marginal snapshots without specifying the Lagrangian form. The method is claimed to enable forecasting and interpolation of unseen marginals and to outperform gradient flow and flow matching baselines on vortex dynamics, embryonic development, and flocking datasets.

Significance. If the inverse problem of recovering the Lagrangian from marginals is well-posed and the empirical gains hold under controlled conditions, the work would meaningfully extend Wasserstein-based population modeling beyond first-order gradient flows to capture periodic and inertial behaviors. The explicit derivation of Hamiltonian equations and the data-driven learning procedure without a pre-specified functional form are strengths that could support broader applications in biology and physics if identifiability is established.

major comments (2)
  1. [§3] §3 (derivation of Hamiltonian equations): The manuscript derives the equations of motion from the damped Wasserstein Lagrangian via standard variational calculus, but provides no analysis or theorem establishing uniqueness of the recovered Lagrangian (or its parameters) from discrete-time marginal observations. This leaves the central claim—that WLM recovers the true population mechanics—vulnerable to the identifiability issue that multiple choices of potential or damping can produce statistically indistinguishable marginal trajectories.
  2. [§5] §5 (experimental evaluation): The reported outperformance on forecasting and interpolation tasks (e.g., vortex and flocking examples) relies on a marginal-matching loss; however, no ablation or sensitivity analysis is given for snapshot sparsity, noise levels, or damping coefficient, which directly tests whether the learned object is uniquely determined or merely one of several plausible second-order flows consistent with the data.
minor comments (2)
  1. [§2] Notation for the damped Wasserstein Lagrangian (kinetic plus damping terms) should be introduced with an explicit equation number in §2 to improve readability when comparing to classical mechanics.
  2. [§1] The abstract and introduction claim WLM is the 'first' such algorithm; a more precise statement of novelty relative to prior Lagrangian or action-based learning methods in the related-work section would strengthen the positioning.

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for the detailed and constructive report. We address each major comment below and describe the revisions planned for the next version of the manuscript.

read point-by-point responses
  1. Referee: [§3] §3 (derivation of Hamiltonian equations): The manuscript derives the equations of motion from the damped Wasserstein Lagrangian via standard variational calculus, but provides no analysis or theorem establishing uniqueness of the recovered Lagrangian (or its parameters) from discrete-time marginal observations. This leaves the central claim—that WLM recovers the true population mechanics—vulnerable to the identifiability issue that multiple choices of potential or damping can produce statistically indistinguishable marginal trajectories.

    Authors: We acknowledge that §3 presents the derivation via standard variational calculus without a formal uniqueness theorem for the Lagrangian recovered from discrete marginals. This is a valid point regarding identifiability. In the revision we will add a dedicated remark in §3 discussing identifiability conditions (e.g., sufficient temporal density to resolve inertial effects and the regularizing role of the damping term) and note that the data-driven procedure can converge to consistent dynamics across initializations, while clarifying that this does not replace a rigorous uniqueness result. revision: partial

  2. Referee: [§5] §5 (experimental evaluation): The reported outperformance on forecasting and interpolation tasks (e.g., vortex and flocking examples) relies on a marginal-matching loss; however, no ablation or sensitivity analysis is given for snapshot sparsity, noise levels, or damping coefficient, which directly tests whether the learned object is uniquely determined or merely one of several plausible second-order flows consistent with the data.

    Authors: We agree that the current experimental section would benefit from explicit sensitivity checks. In the revised manuscript we will add an ablation study in §5 that varies the number of observed snapshots, injects controlled noise into the marginals, and sweeps the damping coefficient, reporting both forecasting and interpolation metrics to assess robustness and the degree to which the recovered dynamics are uniquely determined by the data. revision: yes

Circularity Check

0 steps flagged

Derivation chain is self-contained; no circular reductions identified

full rationale

The paper derives Hamiltonian equations from a damped Wasserstein Lagrangian via standard variational mechanics, which does not reduce to fitted quantities or self-referential definitions by construction. The WLM algorithm learns the Lagrangian parameters directly from observed marginal snapshots through data-driven optimization, without renaming inputs as predictions or smuggling ansatzes via self-citations. No load-bearing uniqueness theorems or self-citation chains are invoked to force the central claims; the approach remains externally falsifiable against marginal trajectories and independent benchmarks.

Axiom & Free-Parameter Ledger

0 free parameters · 1 axioms · 0 invented entities

The central claim rests on the domain assumption that population dynamics admit a variational description via a damped Wasserstein Lagrangian; no free parameters or invented entities are identifiable from the abstract alone.

axioms (1)
  • domain assumption Population dynamics minimize a population-level action under a damped Wasserstein Lagrangian
    Stated as the core modeling choice that replaces free-energy minimization.

pith-pipeline@v0.9.0 · 5467 in / 1163 out tokens · 37749 ms · 2026-05-14T20:46:01.086949+00:00 · methodology

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read and Pith papers without signing in.

Reference graph

Works this paper leans on

260 extracted references · 260 canonical work pages · 7 internal anchors

  1. [1]

    Advances in Neural Information Processing Systems , volume=

    Variational inference via Wasserstein gradient flows , author=. Advances in Neural Information Processing Systems , volume=

  2. [2]

    Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences , volume=

    Large deviations and gradient flows , author=. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences , volume=. 2013 , publisher=

  3. [3]

    arXiv preprint arXiv:1909.12077 , year=

    Symplectic ode-net: Learning hamiltonian dynamics with control , author=. arXiv preprint arXiv:1909.12077 , year=

  4. [4]

    Handbook of Numerical Analysis , volume=

    Lagrangian schemes for Wasserstein gradient flows , author=. Handbook of Numerical Analysis , volume=. 2021 , publisher=

  5. [5]

    arXiv preprint arXiv:2410.07550 , year=

    Conditional Lagrangian Wasserstein Flow for Time Series Imputation , author=. arXiv preprint arXiv:2410.07550 , year=

  6. [6]

    arXiv preprint arXiv:2505.11823 , year=

    Variational Regularized Unbalanced Optimal Transport: Single Network, Least Action , author=. arXiv preprint arXiv:2505.11823 , year=

  7. [7]

    arXiv preprint arXiv:2504.18506 (2025)

    Action-Minimization Meets Generative Modeling: Efficient Transition Path Sampling with the Onsager-Machlup Functional , author=. arXiv preprint arXiv:2504.18506 , year=

  8. [8]

    arXiv preprint arXiv:2204.04853 , year=

    Neural lagrangian schr " odinger bridge: Diffusion modeling for population dynamics , author=. arXiv preprint arXiv:2204.04853 , year=

  9. [9]

    Advances in Neural Information Processing Systems , volume=

    Unsupervised learning of lagrangian dynamics from images for prediction and control , author=. Advances in Neural Information Processing Systems , volume=

  10. [10]

    Numerische Mathematik , volume=

    A computational fluid mechanics solution to the Monge-Kantorovich mass transfer problem , author=. Numerische Mathematik , volume=. 2000 , publisher=

  11. [11]

    SIAM journal on mathematical analysis , volume=

    The variational formulation of the Fokker--Planck equation , author=. SIAM journal on mathematical analysis , volume=. 1998 , publisher=

  12. [12]

    Partial differential equations and applications , pages=

    An extended variational principle , author=. Partial differential equations and applications , pages=. 2017 , publisher=

  13. [13]

    Journal of mathematics of Kyoto University , volume=

    The Onsager-Machlup function for diffusion processes , author=. Journal of mathematics of Kyoto University , volume=. 1982 , publisher=

  14. [14]

    Econometrica: Journal of the Econometric Society , pages=

    The dimensionality of the aliasing problem in models with rational spectral densities , author=. Econometrica: Journal of the Econometric Society , pages=. 1983 , publisher=

  15. [15]

    Statistical inference for stochastic processes , volume=

    Identification and inference for multivariate cointegrated and ergodic Gaussian diffusions , author=. Statistical inference for stochastic processes , volume=. 2004 , publisher=

  16. [16]

    Econometric Theory , volume=

    Identifying restrictions for finite parameter continuous time models with discrete time data , author=. Econometric Theory , volume=. 2017 , publisher=

  17. [17]

    Stochastic analysis and applications , volume=

    Construction of equivalent stochastic differential equation models , author=. Stochastic analysis and applications , volume=. 2008 , publisher=

  18. [18]

    IEEE Transactions on Signal Processing , volume=

    Identifying latent stochastic differential equations , author=. IEEE Transactions on Signal Processing , volume=. 2021 , publisher=

  19. [19]

    Advances in Neural Information Processing Systems , volume=

    Generator Identification for Linear SDEs with Additive and Multiplicative Noise , author=. Advances in Neural Information Processing Systems , volume=

  20. [20]

    Journal of the Royal Society Interface , volume=

    Identifiability analysis for stochastic differential equation models in systems biology , author=. Journal of the Royal Society Interface , volume=. 2020 , publisher=

  21. [21]

    Identifiability of SDEs for reaction networks

    Identifiability of SDEs for reaction networks , author=. arXiv preprint arXiv:2505.07638 , year=

  22. [22]

    Cell Systems , volume=

    Noise distorts the epigenetic landscape and shapes cell-fate decisions , author=. Cell Systems , volume=. 2022 , publisher=

  23. [23]

    Proceedings of the National Academy of Sciences , volume=

    Fundamental limits on dynamic inference from single-cell snapshots , author=. Proceedings of the National Academy of Sciences , volume=. 2018 , publisher=

  24. [24]

    2025 , school=

    On Qualitative Experimental Design for PDE Parameter Identification Inverse Problems , author=. 2025 , school=

  25. [25]

    Journal of Mathematical Biology , volume=

    Structural identifiability of linear-in-parameter parabolic PDEs through auxiliary elliptic operators , author=. Journal of Mathematical Biology , volume=. 2025 , publisher=

  26. [26]

    Mathematical biosciences , volume=

    On structural identifiability , author=. Mathematical biosciences , volume=. 1970 , publisher=

  27. [27]

    PloS one , volume=

    On finding and using identifiable parameter combinations in nonlinear dynamic systems biology models and COMBOS: a novel web implementation , author=. PloS one , volume=. 2014 , publisher=

  28. [28]

    Computer methods and programs in biomedicine , volume=

    DAISY: A new software tool to test global identifiability of biological and physiological systems , author=. Computer methods and programs in biomedicine , volume=. 2007 , publisher=

  29. [29]

    Bioinformatics , volume=

    GenSSI: a software toolbox for structural identifiability analysis of biological models , author=. Bioinformatics , volume=. 2011 , publisher=

  30. [30]

    Bioinformatics , volume=

    Structural and practical identifiability analysis of partially observed dynamical models by exploiting the profile likelihood , author=. Bioinformatics , volume=. 2009 , publisher=

  31. [31]

    SIAM Journal on Applied Dynamical Systems , volume=

    Identifiability of linear and linear-in-parameters dynamical systems from a single trajectory , author=. SIAM Journal on Applied Dynamical Systems , volume=. 2014 , publisher=

  32. [32]

    Bioinformatics , volume=

    Benchmarking tools for a priori identifiability analysis , author=. Bioinformatics , volume=. 2023 , publisher=

  33. [33]

    arXiv preprint arXiv:2409.06879 , year=

    Joint trajectory and network inference via reference fitting , author=. arXiv preprint arXiv:2409.06879 , year=

  34. [34]

    International Conference on Machine Learning , pages=

    Learning population-level diffusions with generative RNNs , author=. International Conference on Machine Learning , pages=. 2016 , organization=

  35. [35]

    Flow Matching for Generative Modeling

    Flow matching for generative modeling , author=. arXiv preprint arXiv:2210.02747 , year=

  36. [36]

    International conference on machine learning , pages=

    Trajectorynet: A dynamic optimal transport network for modeling cellular dynamics , author=. International conference on machine learning , pages=. 2020 , organization=

  37. [37]

    arXiv preprint arXiv:2307.03672 , year=

    Simulation-free schr " odinger bridges via score and flow matching , author=. arXiv preprint arXiv:2307.03672 , year=

  38. [38]

    arXiv preprint arXiv:2002.02798 , volume=

    How to train your neural ode , author=. arXiv preprint arXiv:2002.02798 , volume=

  39. [39]

    Improving and generalizing flow-based generative models with minibatch optimal transport

    Improving and generalizing flow-based generative models with minibatch optimal transport , author=. arXiv preprint arXiv:2302.00482 , year=

  40. [40]

    Advances in Neural Information Processing Systems , volume=

    Metric flow matching for smooth interpolations on the data manifold , author=. Advances in Neural Information Processing Systems , volume=

  41. [41]

    arXiv preprint arXiv:2408.14608 , year=

    Meta flow matching: Integrating vector fields on the wasserstein manifold , author=. arXiv preprint arXiv:2408.14608 , year=

  42. [42]

    arXiv preprint arXiv:2510.26645 , year=

    Curly flow matching for learning non-gradient field dynamics , author=. arXiv preprint arXiv:2510.26645 , year=

  43. [43]

    Nature communications , volume=

    Generative modeling of single-cell time series with PRESCIENT enables prediction of cell trajectories with interventions , author=. Nature communications , volume=. 2021 , publisher=

  44. [44]

    International conference on machine learning , pages=

    Action matching: Learning stochastic dynamics from samples , author=. International conference on machine learning , pages=. 2023 , organization=

  45. [45]

    arXiv preprint arXiv:2507.05107 , year=

    DICE: Discrete inverse continuity equation for learning population dynamics , author=. arXiv preprint arXiv:2507.05107 , year=

  46. [46]

    Multi-marginal

    Shen, Yunyi and Berlinghieri, Renato and Broderick, Tamara , journal=. Multi-marginal

  47. [47]

    arXiv preprint arXiv:2505.16082 , year=

    Oh SnapMMD! Forecasting Stochastic Dynamics Beyond the Schr " odinger Bridge's End , author=. arXiv preprint arXiv:2505.16082 , year=

  48. [48]

    arXiv preprint arXiv:2505.15987 , year=

    Towards Identifiability of Interventional Stochastic Differential Equations , author=. arXiv preprint arXiv:2505.15987 , year=

  49. [49]

    arXiv preprint arXiv:2410.22729 , year=

    Identifying Drift, Diffusion, and Causal Structure from Temporal Snapshots , author=. arXiv preprint arXiv:2410.22729 , year=

  50. [50]

    The 29th International Conference on Artificial Intelligence and Statistics , year=

    Gradient-flow SDEs have unique transient population dynamics , author=. The 29th International Conference on Artificial Intelligence and Statistics , year=

  51. [51]

    arXiv preprint arXiv:2406.12616 , year=

    Learning Diffusion at Lightspeed , author=. arXiv preprint arXiv:2406.12616 , year=

  52. [52]

    arXiv preprint arXiv:2506.01502 , year=

    Learning of Population Dynamics: Inverse Optimization Meets JKO Scheme , author=. arXiv preprint arXiv:2506.01502 , year=

  53. [53]

    arXiv preprint arXiv:2411.01982 , year=

    Learning Controlled Stochastic Differential Equations , author=. arXiv preprint arXiv:2411.01982 , year=

  54. [54]

    Trajectory inference via mean-field

    Chizat, L. Trajectory inference via mean-field. Advances in Neural Information Processing Systems , volume=

  55. [55]

    Cell , volume=

    Optimal-transport analysis of single-cell gene expression identifies developmental trajectories in reprogramming , author=. Cell , volume=. 2019 , publisher=

  56. [56]

    arXiv preprint arXiv:2102.09204 , year=

    Towards a mathematical theory of trajectory inference , author=. arXiv preprint arXiv:2102.09204 , year=

  57. [57]

    2007 , publisher=

    Parameter estimation in stochastic differential equations , author=. 2007 , publisher=

  58. [58]

    2011 , journal =

    Parameter Estimation for Rough Differential Equations , author =. 2011 , journal =. doi:10.1214/11-AOS893 , urldate =

  59. [59]

    17 Generalized method of moments: Econometric applications , series =

    Masao Ogaki , abstract =. 17 Generalized method of moments: Econometric applications , series =. 1993 , booktitle =. doi:https://doi.org/10.1016/S0169-7161(05)80052-5 , url =

  60. [60]

    Annual Reviews in Control , volume=

    Parameter estimation in stochastic differential equations: an overview , author=. Annual Reviews in Control , volume=. 2000 , publisher=

  61. [61]

    Journal of Sound and Vibration , volume=

    Stochastic parameter estimation in nonlinear time-delayed vibratory systems with distributed delay , author=. Journal of Sound and Vibration , volume=. 2013 , publisher=

  62. [62]

    InterStat , volume=

    Maximum likelihood estimation for short time series with replicated observations: A simulation study , author=. InterStat , volume=

  63. [63]

    Journal of the American Statistical Association , volume=

    A multiresolution method for parameter estimation of diffusion processes , author=. Journal of the American Statistical Association , volume=. 2012 , publisher=

  64. [64]

    Automatica , volume=

    Parameter estimation in stochastic grey-box models , author=. Automatica , volume=. 2004 , publisher=

  65. [65]

    Identification of continuous-time models from sampled data , pages=

    Estimation of continuous-time stochastic system parameters , author=. Identification of continuous-time models from sampled data , pages=. 2008 , publisher=

  66. [66]

    The Annals of Statistics , volume=

    NONPARAMETRIC STATISTICAL INFERENCE FOR DRIFT VECTOR FIELDS OF MULTI-DIMENSIONAL DIFFUSIONS , author=. The Annals of Statistics , volume=. 2020 , publisher=

  67. [67]

    arXiv preprint arXiv:2312.08493 , year=

    Deep learning-based estimation of time-dependent parameters in Markov models with application to nonlinear regression and SDEs , author=. arXiv preprint arXiv:2312.08493 , year=

  68. [68]

    SIAM Journal on Matrix Analysis and Applications , volume=

    Identifiability in continuous Lyapunov models , author=. SIAM Journal on Matrix Analysis and Applications , volume=. 2023 , publisher=

  69. [69]

    Bernoulli , volume=

    Graphical modeling of stochastic processes driven by correlated noise , author=. Bernoulli , volume=. 2022 , publisher=

  70. [70]

    Electronic Journal of Probability , year=

    Causal interpretation of stochastic differential equations , author=. Electronic Journal of Probability , year=

  71. [71]

    Engineering , volume=

    Data-driven discovery of stochastic differential equations , author=. Engineering , volume=. 2022 , publisher=

  72. [72]

    Journal of the Royal Statistical Society Series B: Statistical Methodology , volume=

    Graphical models for marked point processes based on local independence , author=. Journal of the Royal Statistical Society Series B: Statistical Methodology , volume=. 2008 , publisher=

  73. [73]

    The Annals of Statistics , volume=

    Markov equivalence of marginalized local independence graphs , author=. The Annals of Statistics , volume=. 2020 , publisher=

  74. [74]

    arXiv preprint arXiv:2406.01161 , year=

    Dynamic Structural Causal Models , author=. arXiv preprint arXiv:2406.01161 , year=

  75. [75]

    arXiv preprint arXiv:2402.18477 , year=

    Signature Kernel Conditional Independence Tests in Causal Discovery for Stochastic Processes , author=. arXiv preprint arXiv:2402.18477 , year=

  76. [76]

    Entropy , volume=

    Kernel-based independence tests for causal structure learning on functional data , author=. Entropy , volume=. 2023 , publisher=

  77. [77]

    arXiv preprint arXiv:2310.02366 , year=

    Stochastic force inference via density estimation , author=. arXiv preprint arXiv:2310.02366 , year=

  78. [78]

    International Conference on Artificial Intelligence and Statistics , pages=

    Causal Modeling with Stationary Diffusions , author=. International Conference on Artificial Intelligence and Statistics , pages=. 2024 , organization=

  79. [79]

    Journal of Public Administration Research and Theory , volume=

    Causal inference methods: Lessons from applied microeconomics , author=. Journal of Public Administration Research and Theory , volume=. 2019 , publisher=

  80. [80]

    Frontiers in pharmacology , volume=

    Propensity score methods in health technology assessment: principles, extended applications, and recent advances , author=. Frontiers in pharmacology , volume=. 2019 , publisher=

Showing first 80 references.