Fourier Neural Operator for Parametric Partial Differential Equations
88 Pith papers cite this work. Polarity classification is still indexing.
abstract
The classical development of neural networks has primarily focused on learning mappings between finite-dimensional Euclidean spaces. Recently, this has been generalized to neural operators that learn mappings between function spaces. For partial differential equations (PDEs), neural operators directly learn the mapping from any functional parametric dependence to the solution. Thus, they learn an entire family of PDEs, in contrast to classical methods which solve one instance of the equation. In this work, we formulate a new neural operator by parameterizing the integral kernel directly in Fourier space, allowing for an expressive and efficient architecture. We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation. The Fourier neural operator is the first ML-based method to successfully model turbulent flows with zero-shot super-resolution. It is up to three orders of magnitude faster compared to traditional PDE solvers. Additionally, it achieves superior accuracy compared to previous learning-based solvers under fixed resolution.
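The kernel parameterization the abstract describes can be sketched as the core of one Fourier layer. This is a minimal pure-Python illustration under simplifying assumptions, not the paper's implementation: a real FNO uses FFTs, multi-channel complex weight tensors, a pointwise linear residual path, and nonlinearities between layers. The function name `spectral_conv_1d` is illustrative.

```python
import cmath
import math

def spectral_conv_1d(u, weights):
    """Core of one Fourier layer: transform to Fourier space, keep only the
    lowest `len(weights)` modes, multiply each retained mode by a learnable
    complex weight, and transform back to the grid.
    u: real samples on a uniform periodic grid; weights: one complex value per kept mode.
    """
    n = len(u)
    modes = len(weights)
    # Forward DFT, truncated to the lowest frequencies (the kernel lives here).
    u_hat = [sum(u[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
             for k in range(modes)]
    # Pointwise multiplication in Fourier space = global convolution in physical space.
    v_hat = [w * c for w, c in zip(weights, u_hat)]
    # Inverse transform back onto the grid; for real input the conjugate
    # negative-frequency modes are implied, hence the doubled real part.
    def inv(x):
        s = v_hat[0].real
        for k in range(1, modes):
            s += 2 * (v_hat[k] * cmath.exp(2j * math.pi * k * x / n)).real
        return s / n
    return [inv(j) for j in range(n)]

# The learned weights index Fourier modes, not grid points, so the same layer
# evaluates at any resolution -- the property behind zero-shot super-resolution.
coarse = spectral_conv_1d([math.cos(2 * math.pi * j / 8) for j in range(8)], [1 + 0j, 1 + 0j])
fine = spectral_conv_1d([math.cos(2 * math.pi * j / 32) for j in range(32)], [1 + 0j, 1 + 0j])
```

With identity weights and a single-frequency input, the layer reproduces the input signal at whatever grid size it is given, since the retained mode passes through unchanged.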
co-cited works
fields
cs.LG 44, math.NA 9, physics.flu-dyn 6, cs.CE 5, cs.AI 2, cs.CL 2, cs.CV 2, cs.RO 2, math.OC 2, physics.comp-ph 2

roles
other 1

polarities
unclear 1

representative citing papers
Hodge Spectral Duality provides a topology-preserving neural operator by isolating unlearnable topological components via Hodge orthogonality and operator splitting.
Local neural operators on 3x3x3 patches, composed via Schwarz iteration, solve large-scale nonlinear elasticity on arbitrary geometries without domain-specific retraining.
Laplacian eigenfunction-based neural operators approximate the solution operator of the generalized Gierer-Meinhardt reaction-diffusion system with error bounds that imply only polynomial growth in parameters as accuracy improves.
A single-network fixed-point formulation for neural optimal transport eliminates adversarial min-max optimization and implicit differentiation while enforcing dual feasibility exactly.
A latent Structured Spectral Propagator enables stable autoregressive PDE forecasting by decoupling spatial details from recurrent modal dynamics.
A new Intent Fidelity Score and refinement loop verify that LLM-generated simulation code matches the intended PDEs, improving performance on a 220-case benchmark where execution alone fails to ensure correctness.
CATO learns a continuous latent chart for efficient axial attention on PDE meshes and adds derivative-aware supervision to improve accuracy and reduce oversmoothing on general geometries.
Spatio-Temporal MeanFlow adapts MeanFlow to PDEs by replacing the generative velocity field with the physical operator and extending the integral constraint to the spatio-temporal domain, yielding a unified solver for time-dependent and stationary equations with improved accuracy and generalization.
Commutativity regularization on Jacobians reduces transient error amplification in neural simulators, enabling stable rollouts over thousands of steps on physical and climate data.
PerFlow embeds physics constraints into rectified flow sampling through guidance-free conditioning and constraint-preserving projections, achieving efficient sparse reconstruction and uncertainty quantification for spatiotemporal dynamics.
PODiff performs conditional diffusion in a fixed, variance-ordered POD latent space to enable efficient probabilistic super-resolution of high-dimensional scientific fields with lower memory and better-calibrated uncertainty than pixel-space or dropout baselines.
Neural operators approximate continuous operators from H^s to H^t with O(N^{-s/d}) error in H^t norm; FNOs on Burgers achieve H^1 errors to 10^{-7} and follow a power-law scaling with exponent ~1.4.
Isotropic Fourier Neural Operators enforce spatial symmetries in Fourier layers, improving PDE-solving performance while reducing parameters by up to 16x in 2D and 96x in 3D.
A horizon-agnostic neural operator paired with a boundary control barrier function creates a real-time safety filter that raises safe trajectory rates by up to 22% on fluid manipulation tasks in simulation.
A physics-encoded Fourier neural operator (PeFNO) uses stress potentials to enforce divergence-free stress fields by architecture design, yielding better equilibrium satisfaction than physics-informed or physics-guided FNOs for polycrystalline materials under uniaxial tension.
Hybrid FNO-LBM accelerates porous media flow convergence by up to 70% via neural initialization and stabilizes unsteady simulations through embedded FNO rollouts, allowing small models to match larger ones in accuracy.
A safeguarded hybrid of Levenberg-Marquardt and learned operators achieves equivalent reconstruction quality for PGET in roughly one-third the iterations, with architecture-dependent robustness.
Physics-informed Fourier neural operators recover plasmoid formation in sparse SRRMHD vortex data where data-only models fail, and transformer operators approximate AMR jet evolution, marking first reported uses in these relativistic MHD settings.
PI-LNO is a physics-informed neural operator that uses Laplace transforms and fluid physics constraints to accurately and rapidly predict droplet spreading dynamics on complex surfaces.
AI models of viscous fingering exhibit hallucinations from spectral bias; DeepFingers combines FNO and DeepONet with time-contrast conditioning to predict accurate finger dynamics while preserving mixing metrics.
A graph-based neural operator trained on expert-validated race-car CFD data reaches accuracy levels usable for early-stage interactive aerodynamic design exploration.
A DeepRitzSplit neural operator trained on energy-split variational forms enforces dissipation in phase-field models and outperforms data-driven training in generalization while running faster than Fourier spectral methods on Allen-Cahn and dendritic growth cases.
VS-GNO delivers 0.71-1.04% reconstruction error at 15-24.5% spiking rates versus 0.4% for a non-spiking baseline in sparse-to-dense virtual sensing.
citing papers explorer
- Fixed-Point Neural Optimal Transport without Implicit Differentiation
  A single-network fixed-point formulation for neural optimal transport eliminates adversarial min-max optimization and implicit differentiation while enforcing dual feasibility exactly.
- Man, Machine, and Mathematics
  A high-level outline of a unified theory that reduces learning to a small set of ideas from dynamical systems, geometry, and physics, via definitions of solvable problems and parametrized methods.