Recognition: 3 Lean theorem links
FourCastNet: A Global Data-driven High-resolution Weather Model using Adaptive Fourier Neural Operators
Pith reviewed 2026-05-12 09:59 UTC · model grok-4.3
The pith
FourCastNet matches ECMWF IFS forecasting accuracy at short lead times for large-scale variables, outperforms it on fine-scale features such as precipitation, and generates a week-long global forecast in under two seconds.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
FourCastNet is a global data-driven weather forecasting model that provides accurate short- to medium-range global predictions at 0.25° resolution using Adaptive Fourier Neural Operators. It accurately forecasts high-resolution, fast-timescale variables such as surface wind speed, precipitation, and atmospheric water vapor. FourCastNet matches the forecasting accuracy of the ECMWF Integrated Forecasting System (IFS) at short lead times for large-scale variables, while outperforming IFS for variables with complex fine-scale structure, including precipitation. FourCastNet generates a week-long forecast in less than two seconds, orders of magnitude faster than IFS.
What carries the argument
Adaptive Fourier Neural Operators that learn resolution-invariant mappings between atmospheric state functions to enable efficient high-resolution forecasting.
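The mechanism can be sketched in a few lines. The block below is a much-simplified NumPy illustration of the AFNO idea, not the paper's implementation: tokens are mixed in the Fourier domain with channel weights shared across all modes (which is what makes the mapping resolution-invariant), and modes are soft-thresholded for adaptivity. The real AFNO uses block-diagonal complex MLP weights and learned parameters; all names and shapes here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_shrink(z, lam):
    """Complex soft-thresholding: shrink mode magnitudes toward zero."""
    mag = np.abs(z)
    return np.where(mag > lam, (1.0 - lam / np.maximum(mag, 1e-12)) * z, 0.0)

def afno_mix(x, w1, b1, w2, b2, lam=0.05):
    """One simplified AFNO token-mixing step. x: (H, W, C) real field.
    The channel MLP acts per Fourier mode with weights shared across
    modes, so the same layer applies at any grid resolution."""
    xf = np.fft.rfft2(x, axes=(0, 1))            # (H, W//2+1, C) complex modes
    z = xf @ w1 + b1                             # shared channel mixing
    h = np.maximum(z.real, 0) + 1j * np.maximum(z.imag, 0)  # split-ReLU
    yf = soft_shrink(h @ w2 + b2, lam)           # adaptive mode sparsification
    return x + np.fft.irfft2(yf, s=x.shape[:2], axes=(0, 1))  # residual

# The same weights apply at any resolution:
C, Ch = 4, 8
w1, b1 = 0.1 * rng.standard_normal((C, Ch)), np.zeros(Ch)
w2, b2 = 0.1 * rng.standard_normal((Ch, C)), np.zeros(C)
coarse = afno_mix(rng.standard_normal((16, 16, C)), w1, b1, w2, b2)
fine = afno_mix(rng.standard_normal((64, 64, C)), w1, b1, w2, b2)
```

Because `w1` and `w2` never depend on the number of Fourier modes, the coarse and fine grids reuse identical parameters, which is the resolution-invariance property the claim rests on.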
If this is right
- Enables creation of rapid and inexpensive large-ensemble forecasts with thousands of members to improve probabilistic forecasting.
- Supports better prediction and planning for extreme weather events such as tropical cyclones, extra-tropical cyclones, and atmospheric rivers.
- Provides accurate surface wind speed forecasts that aid planning of wind energy resources.
- Acts as a valuable addition to the meteorology toolkit that can augment rather than replace traditional NWP models.
Where Pith is reading between the lines
- The approach could be hybridized with physics-based constraints to enforce conservation laws and improve stability at longer lead times.
- Low computational cost opens possibilities for real-time ensemble nowcasting at scales not feasible with IFS.
- The same operator framework might accelerate ensemble climate projections by running far more members than current physics models allow.
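The large-ensemble possibility in the points above is cheap to sketch: perturb the analysis state and roll every member forward with the fast surrogate. In the sketch below, `step_fn` is a hypothetical stand-in for one FourCastNet time step, and the damped toy surrogate exists only to make the example runnable; none of this is the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def ensemble_forecast(step_fn, x0, n_members=100, n_steps=28, sigma=0.01):
    """Perturbed-initial-condition ensemble: add small Gaussian noise to
    the analysis state x0, then advance every member with the surrogate.
    28 six-hour steps corresponds to a one-week forecast."""
    members = x0[None] + sigma * rng.standard_normal((n_members,) + x0.shape)
    for _ in range(n_steps):
        members = step_fn(members)               # batched model step
    return members.mean(axis=0), members.std(axis=0)  # mean and spread

# Toy surrogate (simple damping), standing in for the learned model:
mean, spread = ensemble_forecast(lambda x: 0.95 * x, np.ones((8, 8)))
```

With a two-second week-long forecast per member, a thousand-member version of this loop is minutes of compute, which is the economics behind the ensemble claim.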
Load-bearing premise
The neural operator trained on historical reanalysis data will generalize accurately to future unseen weather states, including rare extreme events, without explicit enforcement of physical conservation laws or stability constraints.
What would settle it
A side-by-side evaluation on an independent extreme precipitation event or tropical cyclone where FourCastNet errors exceed those of IFS at the same lead time would show the generalization claim does not hold.
Original abstract
FourCastNet, short for Fourier Forecasting Neural Network, is a global data-driven weather forecasting model that provides accurate short to medium-range global predictions at $0.25^{\circ}$ resolution. FourCastNet accurately forecasts high-resolution, fast-timescale variables such as the surface wind speed, precipitation, and atmospheric water vapor. It has important implications for planning wind energy resources, predicting extreme weather events such as tropical cyclones, extra-tropical cyclones, and atmospheric rivers. FourCastNet matches the forecasting accuracy of the ECMWF Integrated Forecasting System (IFS), a state-of-the-art Numerical Weather Prediction (NWP) model, at short lead times for large-scale variables, while outperforming IFS for variables with complex fine-scale structure, including precipitation. FourCastNet generates a week-long forecast in less than 2 seconds, orders of magnitude faster than IFS. The speed of FourCastNet enables the creation of rapid and inexpensive large-ensemble forecasts with thousands of ensemble-members for improving probabilistic forecasting. We discuss how data-driven deep learning models such as FourCastNet are a valuable addition to the meteorology toolkit to aid and augment NWP models.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The paper introduces FourCastNet, a global data-driven weather forecasting model based on Adaptive Fourier Neural Operators (AFNO) trained on ERA5 reanalysis. It claims accurate 0.25° resolution short-to-medium range forecasts that match ECMWF IFS accuracy for large-scale variables at short lead times, outperform IFS on fine-scale variables including precipitation, and enable week-long forecasts in under 2 seconds, with implications for ensemble forecasting and extreme weather prediction.
Significance. If the results hold under rigorous validation, this work establishes that neural operator models can achieve competitive skill with operational NWP systems without explicit physics constraints, while offering orders-of-magnitude speedups that enable large ensembles. This positions data-driven approaches as a practical complement to traditional models for probabilistic forecasting and rapid scenario generation.
major comments (2)
- §4 (Results): Performance comparisons to IFS report RMSE and ACC metrics but omit details on the exact held-out validation years from ERA5, statistical error bars or significance tests on the metrics, and explicit checks for distribution shift. This directly affects the load-bearing claim that the model matches or exceeds IFS on held-out data for key variables.
- §3 (Methods) and §4.3: No verification is provided that forecasts conserve quantities such as total column water vapor, mass, or energy to within observational uncertainty, nor are there tests on out-of-distribution extremes (e.g., record events after the training cutoff). These omissions are load-bearing for the generalization assumption underlying multi-day accuracy claims.
minor comments (2)
- Abstract: Mentions implications for atmospheric rivers and tropical cyclones, but the results include no dedicated metrics or case studies for these phenomena.
- Figure captions and legends: Some figures comparing FourCastNet and IFS forecasts would benefit from explicit lead-time annotations and variable units to improve clarity.
Simulated Author's Rebuttal
We thank the referee for their constructive feedback, which has helped us identify areas to improve the clarity and rigor of our manuscript. Below, we provide point-by-point responses to the major comments.
Point-by-point responses
- Referee [§4 (Results)]: Performance comparisons to IFS report RMSE and ACC metrics but omit details on the exact held-out validation years from ERA5, statistical error bars or significance tests on the metrics, and explicit checks for distribution shift. This directly affects the load-bearing claim that the model matches or exceeds IFS on held-out data for key variables.
  Authors: We thank the referee for highlighting these omissions. The manuscript specifies in Section 3 that the model is trained on ERA5 data from 1979 to 2017 and evaluated on 2018-2020. We will revise the results section to explicitly state the validation years, include statistical error bars (e.g., via bootstrapping) on the reported RMSE and ACC values, perform significance tests (e.g., paired t-tests) to compare FourCastNet and IFS, and add checks for distribution shift by comparing means and variances of key variables between training and validation periods. These additions will be incorporated in the revised manuscript. (revision: yes)
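The error bars the rebuttal promises follow a standard recipe. The sketch below shows a percentile bootstrap over forecast initializations for RMSE; this is a common recipe, not the authors' exact procedure, and the data here are synthetic. The same loop applies unchanged to ACC or any other scalar verification metric.

```python
import numpy as np

rng = np.random.default_rng(0)

def rmse(forecast, truth):
    """Root-mean-square error over all grid points and cases."""
    return float(np.sqrt(np.mean((forecast - truth) ** 2)))

def bootstrap_rmse_ci(forecasts, truths, n_boot=1000, alpha=0.05):
    """Percentile-bootstrap confidence interval, resampling the N
    independent forecast cases (leading axis) with replacement."""
    n = forecasts.shape[0]
    stats = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)         # resample case indices
        stats[b] = rmse(forecasts[idx], truths[idx])
    lo, hi = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    return rmse(forecasts, truths), (lo, hi)

# Synthetic check: 50 cases with known noise level 0.3
truth = rng.standard_normal((50, 32, 32))
fc = truth + 0.3 * rng.standard_normal((50, 32, 32))
point, (lo, hi) = bootstrap_rmse_ci(fc, truth)   # point near 0.3
```

Resampling whole initializations (rather than grid points) respects the fact that errors within one forecast are spatially correlated.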
- Referee [§3 (Methods) and §4.3]: No verification is provided that forecasts conserve quantities such as total column water vapor, mass, or energy to within observational uncertainty, nor are there tests on out-of-distribution extremes (e.g., record events after the training cutoff). These omissions are load-bearing for the generalization assumption underlying multi-day accuracy claims.
  Authors: We agree that verifying physical consistency and robustness to extremes is important. Although FourCastNet is data-driven and does not hard-code conservation, we will add in the revised paper an evaluation of approximate conservation by tracking the evolution of integrated quantities like total column water vapor, total mass, and energy over forecast lead times and comparing deviations to ERA5 observational uncertainties. For out-of-distribution extremes, we will include additional analysis of performance on record events in the test period (post-2017), such as the most extreme precipitation events or temperature records in 2018-2020. This will be added to Section 4.3 to support the generalization claims. (revision: yes)
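The promised conservation diagnostic reduces to tracking a global area-weighted integral across lead times. The sketch below is illustrative (cos-latitude weighting on a regular lat-lon grid, relative drift against the initial state), not the authors' code, and the grid is coarser than FourCastNet's 0.25° for brevity.

```python
import numpy as np

def global_mean(field, lats):
    """Area-weighted (cos-latitude) global mean of a (lat, lon) field on
    a regular lat-lon grid; the standard weighting for global integrals
    such as total column water vapor."""
    w = np.cos(np.deg2rad(lats))[:, None]        # (lat, 1) area weights
    return float((field * w).sum() / (w.sum() * field.shape[1]))

def conservation_drift(rollout, lats):
    """Relative drift of the global integral at each lead time versus
    t=0. rollout: (T, lat, lon) trajectory of one integrated quantity;
    a physically consistent forecast keeps this near zero."""
    base = global_mean(rollout[0], lats)
    return np.array([global_mean(f, lats) / base - 1.0 for f in rollout])

lats = np.linspace(-90.0, 90.0, 181)             # 1-degree grid for the sketch
drift = conservation_drift(np.ones((5, 181, 360)), lats)  # constant field: no drift
```

Comparing this drift curve against ERA5 observational uncertainty, as the rebuttal proposes, turns "approximate conservation" into a quantitative pass/fail check per lead time.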
Circularity Check
No circularity detected; claims rest on empirical training and external benchmark comparison.
full rationale
The paper describes a data-driven AFNO-based neural network trained on ERA5 reanalysis and evaluated via direct RMSE/ACC metrics against held-out years and the independent ECMWF IFS model. No derivation chain reduces predictions to fitted parameters by construction, no self-citation load-bearing uniqueness theorems are invoked to force the architecture, and performance claims are falsifiable against external operational forecasts rather than tautological. Generalization risk to extremes is a standard ML limitation but does not create circularity in the reported results.
Axiom & Free-Parameter Ledger
free parameters (1)
- neural network weights
axioms (1)
- Domain assumption: Historical reanalysis data sufficiently represents the distribution of future atmospheric states for the forecast horizons considered.
Lean theorems connected to this paper
- Cost.FunctionalEquation washburn_uniqueness_aczel · unclear · "FourCastNet... is a global data-driven weather forecasting model... trained on historical reanalysis data... without explicit enforcement of physical conservation laws or stability constraints."
- Foundation.HierarchyEmergence hierarchy_emergence_forces_phi · unclear · "FourCastNet matches the forecasting accuracy of the ECMWF Integrated Forecasting System (IFS)... at short lead times for large-scale variables, while outperforming IFS for variables with complex fine-scale structure, including precipitation."
- Foundation.DimensionForcing dimension_forced · unclear · "FourCastNet generates a week-long forecast in less than 2 seconds, orders of magnitude faster than IFS."
Forward citations
Cited by 33 Pith papers
- Convergent Stochastic Training of Attention and Understanding LoRA
  Attention and LoRA regression losses induce Poincaré inequalities under mild regularization, so SGD-mimicking SDEs converge to minimizers with no assumptions on data or model size.
- Stable Attention Response for Reliable Precipitation Nowcasting
  HARECast stabilizes cross-sample variance in attention-response energy via group-wise regularization to reduce prediction errors in precipitation nowcasting.
- WindINR: Latent-State INR for Fast Local Wind Query and Correction in Complex Terrain
  WindINR achieves continuous high-resolution local wind queries and sparse-observation correction in complex terrain by updating only a compact latent state, delivering 2.6x speedup over full-network fine-tuning in OSS...
- CATO: Charted Attention for Neural PDE Operators
  CATO learns a continuous latent chart for efficient axial attention on PDE meshes and adds derivative-aware supervision to improve accuracy and reduce oversmoothing on general geometries.
- Controlling Transient Amplification Improves Long-horizon Rollouts
  Commutativity regularization on Jacobians reduces transient error amplification in neural simulators, enabling stable rollouts over thousands of steps on physical and climate data.
- QuadNorm: Resolution-Robust Normalization for Neural Operators
  QuadNorm uses quadrature-based moments instead of uniform averaging in normalization layers, achieving O(h²) consistency across resolutions and better cross-resolution transfer in neural operators.
- GPROF-IR: An Improved Single-Channel Infrared Precipitation Retrieval for Merged Satellite Precipitation Products
  GPROF-IR is a CNN-based retrieval that uses temporal context in geostationary IR observations to produce precipitation estimates with lower error than prior IR methods and climatological consistency with PMW retrieval...
- PODiff: Latent Diffusion in Proper Orthogonal Decomposition Space for Scientific Super-Resolution
  PODiff performs conditional diffusion in a fixed, variance-ordered POD latent space to enable efficient probabilistic super-resolution of high-dimensional scientific fields with lower memory and better-calibrated unce...
- Faster by Design: Interactive Aerodynamics via Neural Surrogates Trained on Expert-Validated CFD
  A graph-based neural operator trained on expert-validated race-car CFD data reaches accuracy levels usable for early-stage interactive aerodynamic design exploration.
- Conflated Inverse Modeling to Generate Diverse and Temperature-Change Inducing Urban Vegetation Patterns
  A diffusion generative inverse model conditioned on temperature targets produces diverse, physically plausible urban vegetation patterns that achieve specified regional temperature shifts.
- McCast: Memory-Guided Latent Drift Correction for Long-Horizon Precipitation Nowcasting
  McCast uses a Drift-Corrective Memory Bank to actively correct latent drift in autoregressive precipitation nowcasting for more coherent long-horizon forecasts.
- Multi-Quantile Regression for Extreme Precipitation Downscaling
  Q-SRDRN multi-quantile network with pinball loss and per-quantile heads detects extreme precipitation events up to 18 times more effectively than deterministic baselines while preserving augmentation benefits for the median.
- UFO: A Domain-Unification-Free Operator Framework for Generalized Operator Learning
  UFO is a cross-domain neural operator framework that achieves discretization decoupling via adaptive jointly-conditioned interactions among distinct domain representations.
- Don't Fix the Basis -- Learn It: Spectral Representation with Adaptive Basis Learning for PDEs
  ABLE learns a spatially adaptive Parseval frame from data via an ancillary density to replace fixed bases in spectral neural operators for PDEs.
- PnP-Corrector: A Universal Correction Framework for Coupled Spatiotemporal Forecasting
  PnP-Corrector decouples physics simulation from error correction to counter reciprocal error amplification in coupled spatiotemporal forecasting, cutting error by 29% in a 300-day ocean-atmosphere test.
- PnP-Corrector: A Universal Correction Framework for Coupled Spatiotemporal Forecasting
  PnP-Corrector decouples physics simulation from error correction via a plug-and-play agent, cutting error by 29% in 300-day global ocean-atmosphere forecasts.
- WeatherSyn: An Instruction Tuning MLLM For Weather Forecasting Report Generation
  WeatherSyn is the first instruction-tuned MLLM for weather forecasting report generation, outperforming closed-source models on a new dataset of 31 US cities across 8 weather aspects.
- Sparse Random-Feature Neural Networks with Krylov-Based SVD for Singularly Perturbed ODE
  Sparse RFNNs with sSVD via Lanczos-Golub-Kahan bidiagonalization maintain accuracy while improving efficiency and robustness for 1D steady convection-diffusion equations with strong advection.
- Tyche: One Step Flow for Efficient Probabilistic Weather Forecasting
  Tyche achieves competitive probabilistic weather forecasting skill and calibration using a single-step flow model with JVP-regularized training and rollout finetuning.
- Towards accurate extreme event likelihoods from diffusion model climate emulators
  Diffusion model climate emulators provide probability density estimates that allow likelihood calculations and odds-ratio-based importance sampling for extreme events such as tropical cyclones.
- M-CaStLe: Uncovering Local Causal Structures in Multivariate Space-Time Gridded Data
  M-CaStLe generalizes local stencil-based causal discovery to the multivariate case and decomposes resulting graphs into reaction and spatial components for interpretation in space-time gridded data.
- Skillful Global Ocean Emulation and the Role of Correlation-Aware Loss
  A GraphCast-based ocean emulator achieves skillful 10-15 day forecasts, with a Mahalanobis loss that accounts for variable correlations improving performance over MSE and acting as a statistical-dynamical regularizer.
- Earth System Foundation Model (ESFM): A unified framework for heterogeneous data integration and forecasting
  ESFM is a single open foundation model that unifies heterogeneous Earth data sources and forecasts missing regions while preserving inter-variable physical relationships.
- Global Attention with Linear Complexity for Exascale Generative Data Assimilation in Earth System Prediction
  STORM achieves linear-complexity global attention in a generative DA framework, scaling to 20 billion tokens and 1.6 ExaFLOPs on 32k GPUs for km-scale Earth modeling.
- Neural Stochastic Processes for Satellite Precipitation Refinement
  NSP model fuses satellite and gauge data with neural processes and SDEs, outperforming 13 baselines and JAXA's operational product on a new 43k-sample US benchmark across six metrics.
- A PMP-inspired Evaluation Framework for Assessing Deep-Learning Earth System Models
  The paper presents a PMP-based evaluation framework to test deep-learning Earth system models on climatology and modes of variability using observational data.
- Neural Operators for Multi-Task Control and Adaptation
  Neural operators approximate the solution operator for multi-task optimal control, generalizing to new tasks and enabling efficient adaptation via branch-trunk structure and meta-training.
- A Multimodal Vision Transformer-based Modeling Framework for Prediction of Fluid Flows in Energy Systems
  A multimodal SwinV2-UNet vision transformer conditioned on data modality and time predicts spatiotemporal fluid flows and reconstructs unobserved fields from limited views using CFD data of argon jet injection.
- QuantWeather: Quantile-Aware Probabilistic Forecasting for Subseasonal Precipitation
  QuantWeather is an end-to-end dual-head neural network that produces calibrated quantile-based probabilistic forecasts for subseasonal precipitation, achieving higher skill and lower inference cost than ensemble metho...
- PINN-Cast: Exploring the Role of Continuous-Depth NODE in Transformers and Physics Informed Loss as Soft Physical Constraints in Short-term Weather Forecasting
  PINN-Cast combines continuous-depth Neural ODEs inside transformer blocks with a two-branch attention module and physics-informed loss to produce short-term weather forecasts that respect governing physical principles.
- PDE-regularized Dynamics-informed Diffusion with Uncertainty-aware Filtering for Long-Horizon Dynamics
  PDYffusion combines PDE-regularized diffusion interpolation with UKF-based uncertainty-aware forecasting to deliver more stable and accurate long-horizon dynamical predictions than standard approaches.
- Regimes of Scale in AI Meteorology
  AI/ML weather tools face integration challenges from mismatched 'regimes of scale' in how data and models are organized compared to traditional meteorology practices.
- What if AI systems weren't chatbots?
  Chatbot AI systems often fail complex needs while projecting authority, contributing to deskilling, labor displacement, economic concentration, and high environmental costs, so alternative pluralistic and task-specifi...
Reference graph
Works this paper leans on
[1] Peter Bauer, Alan Thorpe, and Gilbert Brunet. The quiet revolution of numerical weather prediction. Nature, 525(7567):47-55, 2015.
[2] Richard B. Alley, Kerry A. Emanuel, and Fuqing Zhang. Advances in weather prediction. Science, 363(6425):342-344, 2019.
[3] Lewis Fry Richardson. Weather prediction by numerical process. Cambridge University Press, 2007.
[4] M. G. Schultz, C. Betancourt, B. Gong, F. Kleinert, M. Langguth, L. H. Leufen, Amirpasha Mozaffari, and S. Stadtler. Can deep learning beat numerical weather prediction? Philosophical Transactions of the Royal Society A, 379(2194):20200097, 2021.
[5] V. Balaji. Climbing down Charney's ladder: machine learning and the post-Dennard era of computational climate science. Philosophical Transactions of the Royal Society A, 379(2194):20200085, 2021.
[6] Christopher Irrgang, Niklas Boers, Maike Sonnewald, Elizabeth A. Barnes, Christopher Kadow, Joanna Staneva, and Jan Saynisch-Wagner. Towards neural Earth system modelling by integrating artificial intelligence in Earth system science. Nature Machine Intelligence, 3(8):667-674, 2021.
[7] Markus Reichstein, Gustau Camps-Valls, Bjorn Stevens, Martin Jung, Joachim Denzler, Nuno Carvalhais, et al. Deep learning and process understanding for data-driven Earth system science. Nature, 566(7743):195-204, 2019.
[8] Sebastian Scher and Gabriele Messori. Predicting weather forecast uncertainty with machine learning. Quarterly Journal of the Royal Meteorological Society, 144(717):2830-2841, 2018.
[9] Sebastian Scher and Gabriele Messori. Weather and climate forecasting with neural networks: using general circulation models (GCMs) with different complexity as a study ground. Geoscientific Model Development, 12(7):2797-2809, 2019.
[10] Ashesh Chattopadhyay, Ebrahim Nabizadeh, and Pedram Hassanzadeh. Analog forecasting of extreme-causing weather patterns using deep learning. Journal of Advances in Modeling Earth Systems, 12(2):e2019MS001958, 2020a.
[11] Jonathan A. Weyn, Dale R. Durran, and Rich Caruana. Can machines learn to predict weather? Using deep learning to predict gridded 500-hPa geopotential height from historical weather data. Journal of Advances in Modeling Earth Systems, 11(8):2680-2693, 2019.
[12] Jonathan A. Weyn, Dale R. Durran, and Rich Caruana. Improving data-driven global weather prediction using deep convolutional neural networks on a cubed sphere. Journal of Advances in Modeling Earth Systems, 12(9):e2020MS002109, 2020.
[13] Jonathan A. Weyn, Dale R. Durran, Rich Caruana, and Nathaniel Cresswell-Clay. Sub-seasonal forecasting with a large ensemble of deep-learning weather prediction models. arXiv preprint arXiv:2102.05107, 2021.
[14] Stephan Rasp, Peter D. Dueben, Sebastian Scher, Jonathan A. Weyn, Soukayna Mouatadid, and Nils Thuerey. WeatherBench: a benchmark data set for data-driven weather forecasting. Journal of Advances in Modeling Earth Systems, 12(11):e2020MS002203, 2020.
[15] Stephan Rasp and Nils Thuerey. Data-driven medium-range weather prediction with a ResNet pretrained on climate simulations: a new model for WeatherBench. Journal of Advances in Modeling Earth Systems, 13(2):e2020MS002405, 2021a.
[16] Stephan Rasp and Nils Thuerey. Purely data-driven medium-range weather forecasting achieves comparable skill to physical models at similar resolution. arXiv preprint arXiv:2008.08626, 2020.
[17] Ashesh Chattopadhyay, Mustafa Mustafa, Pedram Hassanzadeh, Eviatar Bach, and Karthik Kashinath. Towards physically consistent data-driven weather forecasting: integrating data assimilation with equivariance-preserving spatial transformers in a case study with ERA5. Geoscientific Model Development Discussions, pages 1-23, 2021.
[18] Troy Arcomano, Istvan Szunyogh, Jaideep Pathak, Alexander Wikner, Brian R. Hunt, and Edward Ott. A machine learning-based global atmospheric forecast model. Geophysical Research Letters, 47(9):e2020GL087776, 2020.
[19] Matthew Chantry, Hannah Christensen, Peter Dueben, and Tim Palmer. Opportunities and challenges for machine learning in weather and climate modelling: hard, medium and soft AI. Philosophical Transactions of the Royal Society A, 379(2194):20200083, 2021.
[20] Peter Grönquist, Chengyuan Yao, Tal Ben-Nun, Nikoli Dryden, Peter Dueben, Shigang Li, and Torsten Hoefler. Deep learning for post-processing ensemble weather forecasts. Philosophical Transactions of the Royal Society A, 379(2194):20200092, 2021.
[21] Stephan Rasp and Nils Thuerey. Data-driven medium-range weather prediction with a ResNet pretrained on climate simulations: a new model for WeatherBench. Journal of Advances in Modeling Earth Systems, page e2020MS002405, 2021b.
[22] John Guibas, Morteza Mardani, Zongyi Li, Andrew Tao, Anima Anandkumar, and Bryan Catanzaro. Adaptive Fourier Neural Operators: efficient token mixers for transformers. International Conference on Learning Representations (to appear), April 2022.
[23] Alexey Dosovitskiy, Lucas Beyer, Alexander Kolesnikov, Dirk Weissenborn, Xiaohua Zhai, Thomas Unterthiner, Mostafa Dehghani, Matthias Minderer, Georg Heigold, Sylvain Gelly, Jakob Uszkoreit, and Neil Houlsby. An image is worth 16x16 words: transformers for image recognition at scale, 2021.
[24] Zongyi Li, Nikola Kovachki, Kamyar Azizzadenesheli, Burigede Liu, Kaushik Bhattacharya, Andrew Stuart, and Anima Anandkumar. Fourier neural operator for parametric partial differential equations. In International Conference on Learning Representations (ICLR), 2021a.
[25] Hans Hersbach, Bill Bell, Paul Berrisford, Shoji Hirahara, András Horányi, Joaquín Muñoz-Sabater, Julien Nicolas, Carole Peubey, Raluca Radu, Dinand Schepers, Adrian Simmons, Cornel Soci, Saleh Abdalla, Xavier Abellan, Gianpaolo Balsamo, Peter Bechtold, Gionata Biavati, Jean Bidlot, Massimo Bonavita, Giovanna De Chiara, Per Dahlgren, Dick ...
[26] Eugenia Kalnay, Masao Kanamitsu, Robert Kistler, William Collins, Dennis Deaven, Lev Gandin, Mark Iredell, Suranjana Saha, Glenn White, John Woollen, et al. The NCEP/NCAR 40-year reanalysis project. Bulletin of the American Meteorological Society, 77(3):437-472, 1996.
[27] Eugenia Kalnay. Atmospheric modeling, data assimilation and predictability. Cambridge University Press, 2003.
[28] J. L. Beven II, R. Berg, and A. Hagen. Tropical cyclone report: Hurricane Michael, April 2019.
[29] Tim Palmer. The ECMWF ensemble prediction system: looking back (more than) 25 years and projecting forward 25 years. Quarterly Journal of the Royal Meteorological Society, 145:12-24, 2019.
[30] Peter Bauer, Tiago Quintino, Nils Wedi, Antonio Bonanni, Marcin Chrust, Willem Deconinck, Michail Diamantakis, Peter Düben, Stephen English, Johannes Flemming, et al. The ECMWF scalability programme: progress and plans. European Centre for Medium-Range Weather Forecasts, 2020. doi:10.21957/gdit22ulm. URL https://www.ecmwf.int/node/19380
[31] Geir Evensen. The ensemble Kalman filter: theoretical formulation and practical implementation. Ocean Dynamics, 53(4):343-367, 2003.
[32] Benjamin Fildier, William D. Collins, and Caroline Muller. Distortions of the rain distribution with warming, with and without self-aggregation. Journal of Advances in Modeling Earth Systems, 13(2):e2020MS002256, 2021. doi:10.1029/2020MS002256. URL https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1029/2020MS002256
[33] Ashesh Chattopadhyay, Mustafa Mustafa, Pedram Hassanzadeh, and Karthik Kashinath. Deep spatial transformers for autoregressive data-driven forecasting of geophysical turbulence. In Proceedings of the 10th International Conference on Climate Informatics, pages 106-112, 2020b.
[34] Samyam Rajbhandari, Jeff Rasley, Olatunji Ruwase, and Yuxiong He. ZeRO: memory optimizations toward training trillion parameter models. In SC20: International Conference for High Performance Computing, Networking, Storage and Analysis, pages 1-16. IEEE, 2020.
[35] DeepSpeed: accelerating large-scale model inference and training via system optimizations and compression, 2021. URL https://www.microsoft.com/en-us/research/blog/deepspeed-accelerating-large-scale-model-inference-and-training-via-system-optimizatio/ns-and-compression/
[36] K. Kashinath, M. Mustafa, A. Albert, J. L. Wu, C. Jiang, S. Esmaeilzadeh, K. Azizzadenesheli, R. Wang, A. Chattopadhyay, A. Singh, et al. Physics-informed machine learning: case studies for weather and climate modelling. Philosophical Transactions of the Royal Society A, 379(2194):20200093, 2021.
[37] Zongyi Li, Hongkai Zheng, Nikola Kovachki, David Jin, Haoxuan Chen, Burigede Liu, Kamyar Azizzadenesheli, and Anima Anandkumar. Physics-informed neural operator for learning partial differential equations, 2021b.
[38] Troy Arcomano, Istvan Szunyogh, Alexander Wikner, Jaideep Pathak, Brian R. Hunt, and Edward Ott. A hybrid approach to atmospheric modeling that combines machine learning with a physics-based numerical model. Journal of Advances in Modeling Earth Systems, 2021.