Neural operator: Learning maps between function spaces
6 Pith papers cite this work.
Citing papers explorer
- Quantitative Sobolev Approximation Bounds for Neural Operators with Empirical Validation on Burgers Equation
  Neural operators approximate continuous operators from H^s to H^t with O(N^{-s/d}) error in the H^t norm; FNOs trained on the Burgers equation reach H^1 errors down to 10^{-7} and follow a power-law scaling with exponent ~1.4.
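Scaling exponents like the ~1.4 reported here are typically read off as the slope of a log-log fit of error against N. A minimal sketch on synthetic data (the grid sizes and constant are illustrative, not the paper's measurements):

```python
import numpy as np

# Synthetic convergence data: error ~ C * N^(-1.4), mimicking a
# power-law decay of the approximation error with size N.
N = np.array([64, 128, 256, 512, 1024], dtype=float)
errors = 3.0 * N ** -1.4

# The exponent is the (negated) slope of log(error) vs. log(N).
slope, intercept = np.polyfit(np.log(N), np.log(errors), 1)
print(f"fitted scaling exponent: {-slope:.2f}")  # -> 1.40
```

On real measurements the fit is noisy, so the exponent is usually quoted with a tilde, as in the summary above.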
- Hybrid Fourier Neural Operator-Lattice Boltzmann Method
  A hybrid FNO-LBM scheme accelerates convergence of porous-media flow simulations by up to 70% through neural initialization, and stabilizes unsteady simulations through embedded FNO rollouts, allowing small models to match larger ones in accuracy.
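The neural-initialization idea, starting a classical iterative solver from a learned prediction instead of zeros, can be illustrated without any network. A minimal sketch with Jacobi iteration on a 1D Poisson problem, where a near-solution initial guess stands in for an FNO prediction (the problem and all numbers are illustrative):

```python
import numpy as np

def jacobi_iters(u0, f, h, tol=1e-8, max_iters=200_000):
    """Jacobi iterations on -u'' = f (zero BCs) until the residual drops below tol."""
    u = u0.copy()
    for it in range(max_iters):
        r = u[:-2] - 2.0 * u[1:-1] + u[2:] + h * h * f  # discrete residual
        if np.max(np.abs(r)) < tol:
            return it
        u[1:-1] = 0.5 * (u[:-2] + u[2:] + h * h * f)
    return max_iters

n = 30                          # interior grid points
h = 1.0 / (n + 1)
x = np.linspace(0.0, 1.0, n + 2)
f = np.ones(n)                  # -u'' = 1  ->  u(x) = x(1-x)/2
exact = 0.5 * x * (1.0 - x)

cold = np.zeros(n + 2)          # conventional zero initial guess
warm = 0.999 * exact            # stand-in for a neural-network prediction

# The warm start reaches the tolerance in far fewer iterations.
print(jacobi_iters(cold, f, h), jacobi_iters(warm, f, h))
```

The percentage saved depends entirely on how close the learned guess is; the 70% figure above is the paper's reported best case, not a general property.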
- Learning Neural Operator Surrogates for the Black Hole Accretion Code
  Physics-informed Fourier neural operators recover plasmoid formation in sparse SRRMHD vortex data where purely data-driven models fail, and transformer operators approximate AMR jet evolution, marking the first reported uses of either in these relativistic MHD settings.
- Physics-Informed Reduced-Order Operator Learning for Hyperelasticity in Continuum Micromechanics
  EquiNO with Q-DEIM builds reduced-order physics-informed surrogates for 3D hyperelastic RVEs that enforce equilibrium and periodicity by construction, achieve ~10^3 speedups, and accurately interpolate and extrapolate stresses from only a few snapshots.
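Q-DEIM itself is a small, generic algorithm: select interpolation rows by a column-pivoted QR of the transposed POD basis. A minimal sketch on a synthetic snapshot family (the snapshots and mode count are made up for illustration):

```python
import numpy as np
from scipy.linalg import qr

# Synthetic snapshot matrix: each column is f(x; mu) = sin(mu * x) on a grid.
x = np.linspace(0.0, np.pi, 200)
mus = np.linspace(1.0, 3.0, 20)
snapshots = np.sin(np.outer(x, mus))          # shape (200, 20)

# POD basis from the SVD, truncated to r modes.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 3
Ur = U[:, :r]

# Q-DEIM: column-pivoted QR of Ur^T; the first r pivots are the
# selected interpolation (sampling) indices on the grid.
_, _, piv = qr(Ur.T, pivoting=True)
points = piv[:r]
print(sorted(points))                          # r distinct grid indices
```

The reduced model then only ever evaluates the nonlinear terms at these few indices, which is where the large speedups come from.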
- Large-eddy simulation nets (LESnets) based on physics-informed neural operator for wall-bounded turbulence
  LESnets integrate the LES equations and the law of the wall into an F-FNO, enabling data-free, stable long-term predictions of wall-bounded turbulence at Re_tau up to 1000 on coarse grids while matching traditional LES accuracy at lower cost.
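The data-free ingredient is a physics-informed loss: instead of matching reference data, training penalizes the PDE residual of the model output, with derivatives of the periodic fields evaluated spectrally. A minimal sketch of such a residual, using a field that satisfies u_xx + u = 0 as a stand-in for a network prediction (the loss definition here is illustrative, not LESnets' actual objective):

```python
import numpy as np

n = 64
x = 2.0 * np.pi * np.arange(n) / n      # periodic grid on [0, 2*pi)
u = np.sin(x)                           # stand-in for a model's output field

# Spectral differentiation: multiply Fourier coefficients by (i*k)^2.
k = np.fft.fftfreq(n, d=1.0 / n)        # integer wavenumbers
u_xx = np.fft.ifft((1j * k) ** 2 * np.fft.fft(u)).real

# Physics-informed loss: mean squared residual of u_xx + u = 0,
# evaluated without any reference data.
residual = u_xx + u
loss = np.mean(residual ** 2)
print(f"{loss:.2e}")                    # near machine precision for this field
```

For a band-limited field on a uniform grid the spectral derivative is exact to roundoff, so the residual vanishes only when the field actually satisfies the equation.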
- Optimal Linear Interpolation under Differential Information: application to the prediction of perfect flows
  Two Kriging variants add linear PDE constraints at collocation points to better interpolate functions that satisfy those equations, and are tested on ODEs, harmonic PDEs, and cylinder flows.
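The collocation construction is concrete enough to sketch: augment the Kriging system with zero-valued "observations" of the differential residual at collocation points, using derivatives of the kernel for the cross-covariances. A minimal 1D sketch for the ODE constraint u' - u = 0 with a squared-exponential kernel (the problem and all parameters are illustrative, not the paper's setup):

```python
import numpy as np

ell = 1.0  # kernel length scale

def k(a, b):
    """Squared-exponential kernel k(a, b) = exp(-(a-b)^2 / (2*ell^2))."""
    d = a[:, None] - b[None, :]
    return np.exp(-d ** 2 / (2.0 * ell ** 2))

def cov_u_Au(a, z):
    """Cov(u(a), Au(z)) for Au(z) = u'(z) - u(z):  d/dz k(a,z) - k(a,z)."""
    d = a[:, None] - z[None, :]
    return k(a, z) * (d / ell ** 2 - 1.0)

def cov_Au_Au(z, zp):
    """Cov(Au(z), Au(zp)); the first-derivative cross terms cancel."""
    d = z[:, None] - zp[None, :]
    return k(z, zp) * (1.0 + 1.0 / ell ** 2 - d ** 2 / ell ** 4)

# Two data points on u(x) = exp(x), plus collocation points enforcing u' = u.
X = np.array([0.0, 1.0]); y = np.exp(X)
Z = np.linspace(0.0, 1.0, 5)
obs = np.concatenate([y, np.zeros(len(Z))])   # constraint "observations" are zero

# Augmented covariance of the joint vector [u(X), Au(Z)].
K = np.block([[k(X, X),          cov_u_Au(X, Z)],
              [cov_u_Au(X, Z).T, cov_Au_Au(Z, Z)]])
K += 1e-8 * np.eye(len(K))                    # jitter for conditioning

# Predict u at x*: cross-covariances against both blocks, then the usual
# Kriging (GP posterior-mean) formula.
xs = np.array([0.5])
c = np.concatenate([k(xs, X), cov_u_Au(xs, Z)], axis=1)
pred = (c @ np.linalg.solve(K, obs))[0]
print(pred, np.exp(0.5))                      # constrained prediction tracks exp(x)
```

Without the collocation block the same two data points give a noticeably worse interpolant; the constraints inject the differential information the paper's title refers to.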