Optimal Linear Interpolation under Differential Information: application to the prediction of perfect flows
Pith reviewed 2026-05-10 04:29 UTC · model grok-4.3
The pith
Constrained Kriging uses Lagrange multipliers to strongly enforce linear PDE constraints at prediction points.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
The constrained Kriging optimization problem, solved via a Lagrangian formulation, strongly satisfies linear PDE constraints at the points of prediction while remaining optimal among linear interpolators given the primary observations and differential information.
What carries the argument
Lagrangian formulation of the constrained Kriging optimization that enforces linear PDE constraints exactly at prediction collocation points.
Load-bearing premise
Linear PDE information supplied at a finite set of collocation points can be added without creating inconsistencies with the primary field observations or prohibitive scaling costs in higher dimensions.
What would settle it
A numerical example in three or more dimensions where the constrained predictor violates the PDE at a chosen prediction point or produces values inconsistent with the given observations.
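The Lagrangian machinery behind the core claim can be sketched as follows; the notation is ours, and the paper's exact constraint set may differ:

```latex
% Linear predictor at x_0 from primary observations Y(x_1), ..., Y(x_n):
%   \hat{Y}(x_0) = \sum_{i=1}^{n} \lambda_i \, Y(x_i).
% Constrained Kriging chooses the weights \lambda to minimize the prediction
% variance subject to unbiasedness and a linear PDE constraint L u = f
% enforced at the prediction point:
\begin{aligned}
\min_{\lambda \in \mathbb{R}^n} \quad
  & \operatorname{Var}\!\left[ \hat{Y}(x_0) - Y(x_0) \right] \\
\text{s.t.} \quad
  & \sum_{i=1}^{n} \lambda_i = 1,
  \qquad
  \mathcal{L}_x \hat{Y}(x) \Big|_{x = x_0} = f(x_0).
\end{aligned}
% Stationarity of the Lagrangian in (\lambda, \mu) yields a bordered linear
% system: the usual Kriging normal equations augmented with one row and
% column per enforced constraint.
```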
original abstract
Approximation of functions satisfying partial differential equations (PDEs) is paramount for simulation of physical fluid flows and other problems in physics. Recently, physics-informed machine learning approaches have proven useful as a data-driven complement to numerical models for partial differential equations, bringing faster responses and allowing us to capitalize on past observations. However, their efficiency and convergence depend on the availability of vast training datasets. For sparse observations, Gaussian process regression or Kriging has emerged as a powerful interpolation model, offering principled estimates and uncertainty quantification. Several attempts have been made to condition Gaussian processes on linear PDEs via artificial or collocation observations and kernel design. These methods suffer from scalability issues in higher dimensions and limited generalizability. The aim of this study is to explore the extension of the Kriging predictor in the presence of linear PDE information at a finite number of collocation points. Two approaches are proposed: 1) A collocated co-Kriging with primary observations of the physical field and auxiliary differential observations; 2) A constrained Kriging optimization problem strongly satisfying linear PDE constraints at the points of prediction through a Lagrangian formulation. Numerical experiments are given for ordinary differential equations, 2D harmonic PDEs and an application to perfect flows around a cylinder. This work highlights a trade-off between the computational efficiency of the Lagrange multipliers approach and the strict interpolation of observations.
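The first proposed approach, treating differential quantities as auxiliary co-Kriging observations, can be illustrated with a minimal 1D Gaussian-process sketch. Everything below is an illustrative reconstruction under a squared-exponential kernel; the paper's kernels, data, and settings are not reproduced here.

```python
import numpy as np

def k(x, xp, ell=1.0):
    """Squared-exponential kernel Cov(f(x), f(xp))."""
    d = x[:, None] - xp[None, :]
    return np.exp(-d**2 / (2 * ell**2))

def k_fd(x, xp, ell=1.0):
    """Cross-covariance Cov(f(x), f'(xp)) = d/dxp k(x, xp)."""
    d = x[:, None] - xp[None, :]
    return k(x, xp, ell) * d / ell**2

def k_dd(x, xp, ell=1.0):
    """Cov(f'(x), f'(xp)) = d^2/(dx dxp) k(x, xp)."""
    d = x[:, None] - xp[None, :]
    return k(x, xp, ell) * (1.0 / ell**2 - d**2 / ell**4)

# Primary observations of the field f(x) = sin(x)
xf = np.array([0.0, 1.5, 3.0]); yf = np.sin(xf)
# Auxiliary differential observations f'(x) = cos(x) at collocation points
xd = np.array([0.5, 2.0]); yd = np.cos(xd)

# Joint covariance of the stacked observation vector [f(xf), f'(xd)]
K = np.block([[k(xf, xf),      k_fd(xf, xd)],
              [k_fd(xf, xd).T, k_dd(xd, xd)]])
y = np.concatenate([yf, yd])

# Predict f at a new point using both kinds of observations
xs = np.array([2.5])
kstar = np.concatenate([k(xs, xf), k_fd(xs, xd)], axis=1)
alpha = np.linalg.solve(K + 1e-10 * np.eye(len(y)), y)
pred = (kstar @ alpha).item()
```

The derivative observations enter through the cross-covariances obtained by differentiating the kernel, which is the standard mechanism behind derivative-informed Gaussian processes.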
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The manuscript proposes two extensions of Kriging interpolation to incorporate linear PDE information at collocation points: (1) a collocated co-Kriging formulation treating differential quantities as auxiliary observations, and (2) a constrained optimization that enforces the linear PDE constraints exactly at prediction points via Lagrange multipliers. These are applied to ODEs, 2D harmonic equations, and perfect flows around a cylinder, with the abstract noting a computational-efficiency versus strict-interpolation trade-off for the Lagrangian approach.
Significance. If the central constructions are shown to be consistent with Kriging exactness and supported by error analysis, the work could offer a principled route to physics-constrained linear predictors for sparse fluid data, complementing existing co-Kriging and kernel-design methods. The explicit Lagrangian formulation for strong PDE satisfaction at prediction sites and the cylinder-flow example are potentially useful, but the absence of convergence results or quantitative validation in the current description limits immediate impact.
major comments (2)
- [Abstract] The claim of 'optimal linear interpolation under differential information' is qualified by an acknowledged trade-off with 'the strict interpolation of observations' in the Lagrange-multipliers approach. Standard Kriging optimality rests on exact reproduction of the primary observations. It is unclear whether the Lagrangian system augments or replaces this requirement; if the latter, the resulting predictor is no longer guaranteed to interpolate the data, undermining the optimality claim. A derivation showing the modified normal equations, and the conditions under which exact interpolation is retained, is required.
- [Numerical experiments] The abstract states that experiments are performed for ODEs, 2D harmonic PDEs, and perfect flows, yet reports no error norms, convergence rates, or comparisons against standard Kriging or physics-informed baselines. Without quantitative metrics, and without an assessment of how the PDE constraints affect prediction accuracy relative to the acknowledged trade-off, it is impossible to evaluate whether the methods deliver the promised improvement for the cylinder-flow application.
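The distinction the first comment asks about can be made concrete with a toy bordered (KKT) system. This is a generic constrained-Kriging sketch, not the paper's derivation: the single constraint here is ordinary-Kriging unbiasedness, and an enforced linear PDE constraint at the prediction point would contribute rows of the same form.

```python
import numpy as np

# Squared-exponential covariances among 4 sample sites and a prediction site
x = np.array([0.0, 1.0, 2.0, 3.0])
K = np.exp(-0.5 * (x[:, None] - x[None, :])**2)
x0 = 1.4
k0 = np.exp(-0.5 * (x - x0)**2)

# Equality constraints A @ lam = b on the Kriging weights.
# Here: unbiasedness (weights sum to one); each additional enforced linear
# PDE constraint would add one more row to A and one more entry to b.
A = np.ones((1, len(x)))
b = np.array([1.0])

# Lagrangian stationarity for  min lam^T K lam - 2 lam^T k0  s.t.  A lam = b
# gives the bordered system  [[K, A^T], [A, 0]] [lam; mu] = [k0; b].
n, m = len(x), A.shape[0]
KKT = np.block([[K, A.T], [A, np.zeros((m, m))]])
sol = np.linalg.solve(KKT, np.concatenate([k0, b]))
lam, mu = sol[:n], sol[n:]
```

Whether PDE rows are added on top of the unbiasedness row (augmenting) or substituted for it (replacing, at the cost of exact data interpolation) is precisely what the requested derivation should settle.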
minor comments (1)
- [Abstract] The abstract would be strengthened by a single sentence summarizing the key mathematical distinction between the two proposed approaches and one quantitative highlight from the experiments.
Simulated Author's Rebuttal
Thank you for the referee's constructive comments. We will revise the manuscript to address the concerns on the optimality claim and to include quantitative validation. Point-by-point responses are below.
point-by-point responses
- Referee: [Abstract] The claim of 'optimal linear interpolation under differential information' is qualified by an acknowledged trade-off with 'the strict interpolation of observations' in the Lagrange-multipliers approach. Standard Kriging optimality rests on exact reproduction of the primary observations. It is unclear whether the Lagrangian system augments or replaces this requirement; if the latter, the resulting predictor is no longer guaranteed to interpolate the data, undermining the optimality claim. A derivation showing the modified normal equations, and the conditions under which exact interpolation is retained, is required.
Authors: The Lagrangian formulation replaces the standard unbiasedness constraint with exact enforcement of the linear PDE constraints at the prediction points. This yields a minimum-variance predictor under the PDE constraints, but does not guarantee exact interpolation of the primary observations, which is the acknowledged trade-off. The collocated co-Kriging approach retains the standard exact-interpolation property. We will add a derivation of the modified normal equations in the revision, together with the conditions (e.g., consistency of observations with the PDE) under which exact data interpolation is recovered. revision: yes
- Referee: [Numerical experiments] The abstract states that experiments are performed for ODEs, 2D harmonic PDEs, and perfect flows, yet reports no error norms, convergence rates, or comparisons against standard Kriging or physics-informed baselines. Without quantitative metrics, and without an assessment of how the PDE constraints affect prediction accuracy relative to the acknowledged trade-off, it is impossible to evaluate whether the methods deliver the promised improvement for the cylinder-flow application.
Authors: We agree that quantitative metrics are essential. The revised manuscript will include relative L2 error norms, convergence rates with respect to the number of collocation points, and comparisons against standard Kriging and other physics-informed baselines for the ODE, harmonic PDE, and cylinder-flow cases. These additions will quantify accuracy improvements and clarify the efficiency-versus-interpolation trade-off. revision: yes
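The metrics promised in this response are standard; a minimal sketch of what could be reported (function names are ours, data synthetic):

```python
import numpy as np

def rel_l2(pred, ref):
    """Relative L2 error over a fixed set of evaluation points."""
    return np.linalg.norm(pred - ref) / np.linalg.norm(ref)

def convergence_rate(ns, errs):
    """Empirical rate r in err ~ C * n**(-r), via log-log least squares."""
    slope, _ = np.polyfit(np.log(ns), np.log(errs), 1)
    return -slope

# Example: errors decaying like n**-2 give an empirical rate of ~2
ns = np.array([10, 20, 40, 80])
errs = ns.astype(float) ** -2
rate = convergence_rate(ns, errs)
```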
Circularity Check
No circularity; the derivation is self-contained, building on standard Kriging and constrained optimization.
full rationale
The paper proposes two extensions of Kriging: collocated co-Kriging incorporating auxiliary differential observations and a Lagrangian-constrained optimization enforcing linear PDE conditions at prediction points. These follow directly from established Gaussian process regression, co-Kriging formulations, and standard constrained optimization without any reduction of predictions to fitted inputs by construction, self-definitional loops, or load-bearing self-citations. The abstract explicitly notes the trade-off with strict interpolation of observations, confirming the construction does not claim exactness where it is relaxed. No ansatzes are smuggled via prior work, and no uniqueness theorems or renamings of known results are invoked as derivations. The approach remains externally falsifiable against PDE solutions and standard Kriging benchmarks.
Axiom & Free-Parameter Ledger
axioms (2)
- domain assumption: The unknown function satisfies a given linear partial differential equation at the collocation points.
- standard math: Standard Gaussian process regression assumptions hold (stationarity, positive-definiteness of the kernel).
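The positive-definiteness part of the second assumption is easy to sanity-check numerically for any candidate kernel: attempt a Cholesky factorization of its Gram matrix. A small sketch (the kernel choice is illustrative, not the paper's):

```python
import numpy as np

def se_kernel(x, xp, ell=1.0):
    # Squared-exponential kernel, strictly positive definite on distinct points
    d = x[:, None] - xp[None, :]
    return np.exp(-d**2 / (2 * ell**2))

x = np.linspace(0.0, 3.0, 8)
K = se_kernel(x, x) + 1e-12 * np.eye(len(x))  # small jitter for numerical safety

# Cholesky succeeds iff the (jittered) Gram matrix is positive definite;
# np.linalg.cholesky raises LinAlgError otherwise.
L = np.linalg.cholesky(K)
```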