pith. machine review for the scientific record.

arxiv: 2604.13992 · v1 · submitted 2026-04-15 · 💻 cs.LG

Recognition: unknown

Physics-Informed Neural Networks for Methane Sorption: Cross-Gas Transfer Learning, Ensemble Collapse Under Physics Constraints, and Monte Carlo Dropout Uncertainty Quantification


Pith reviewed 2026-05-10 14:00 UTC · model grok-4.3

classification 💻 cs.LG
keywords physics-informed neural networks · methane sorption · transfer learning · uncertainty quantification · coal · Monte Carlo dropout · ensemble methods · thermodynamic consistency

The pith

A physics-informed neural network transfers hydrogen sorption knowledge to predict methane uptake in coal with R² = 0.932, while Monte Carlo dropout supplies calibrated uncertainty.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper establishes a transfer learning framework that starts with a hydrogen sorption PINN and adapts it to methane via Elastic Weight Consolidation, coal-specific features, and a staged curriculum that balances preservation of prior knowledge with thermodynamic fine-tuning. Trained on 993 equilibrium points from 114 coal experiments spanning lignite to anthracite, the model reaches R² = 0.932 on held-out samples, a 227 percent improvement over pressure-only classical isotherms, with hydrogen pre-training cutting RMSE by 18.9 percent and accelerating convergence by 19.4 percent. Among five Bayesian uncertainty methods, Monte Carlo dropout remains well-calibrated at low cost, whereas deep ensembles degrade because the shared physics constraints shrink the range of allowable solutions. SHAP and ALE analyses show that the learned features stay physically interpretable, with moisture-volatile interactions and pressure-temperature coupling emerging as dominant.

Core claim

Trained on 993 equilibrium measurements from 114 independent coal experiments spanning lignite to anthracite, the physics-informed transfer learning framework achieves R² = 0.932 on held-out coal samples, a 227% improvement over pressure-only classical isotherms, while hydrogen pre-training delivers 18.9% lower RMSE and 19.4% faster convergence than random initialization. Monte Carlo Dropout achieves well-calibrated uncertainty at minimal overhead, while deep ensembles exhibit performance degradation because shared physics constraints narrow the admissible solution manifold.

What carries the argument

The physics-informed transfer learning framework that adapts a hydrogen sorption PINN to methane via Elastic Weight Consolidation, coal-specific feature engineering, and a three-phase curriculum.
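The EWC component of that framework can be sketched as a quadratic anchor pulling the methane-task weights toward the hydrogen pre-trained ones, scaled by parameter importance. This is a minimal illustration assuming a diagonal Fisher approximation; the names `ewc_penalty`, `fisher`, and `lam` are hypothetical, not the paper's code.

```python
def ewc_penalty(theta, theta_star, fisher, lam):
    """Quadratic EWC penalty: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

    theta      -- current (methane-task) parameters
    theta_star -- parameters of the hydrogen pre-trained model
    fisher     -- diagonal Fisher information: how important each parameter
                  was for the hydrogen task
    lam        -- regularization coefficient (a free parameter of the framework)
    """
    return 0.5 * lam * sum(
        f * (t - ts) ** 2 for f, t, ts in zip(fisher, theta, theta_star)
    )

# Total loss = data loss + physics loss + EWC anchor toward hydrogen weights.
loss = 1.0  # placeholder for the data and thermodynamic-consistency terms
loss += ewc_penalty([0.9, 0.4], [1.0, 0.5], [2.0, 0.1], lam=10.0)
```

Parameters with large Fisher values are held close to their hydrogen-trained values, which is how the curriculum can fine-tune thermodynamics without erasing the transferred knowledge.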

If this is right

  • Methane sorption can be predicted more accurately across heterogeneous coal ranks without collecting large new datasets for each gas.
  • Monte Carlo dropout emerges as the preferred uncertainty method when physics constraints are enforced in neural network architectures.
  • Learned representations remain aligned with known coal sorption mechanisms such as moisture-volatile interactions and pressure-temperature coupling.
  • Cross-gas transfer learning offers a data-efficient route for modeling other geological materials where direct measurements are scarce.
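For context on the second bullet, Monte Carlo dropout keeps dropout active at prediction time and reads the spread of stochastic forward passes as uncertainty. The toy linear "network" and all names below are illustrative assumptions, not the paper's model.

```python
import random
import statistics

def forward(x, weights, p_drop, rng):
    # Inverted dropout: each weight survives with probability (1 - p_drop),
    # and survivors are rescaled so the expected output is unchanged.
    keep = 1.0 - p_drop
    return sum((w / keep) * xi for w, xi in zip(weights, x)
               if rng.random() < keep)

def mc_dropout_predict(x, weights, p_drop=0.1, n_samples=200, seed=0):
    # Many stochastic forward passes with dropout still on; the sample mean
    # is the prediction and the sample spread is the uncertainty estimate.
    rng = random.Random(seed)
    draws = [forward(x, weights, p_drop, rng) for _ in range(n_samples)]
    return statistics.mean(draws), statistics.stdev(draws)

mean, std = mc_dropout_predict([1.0, 2.0, 3.0], [0.5, -0.2, 0.1])
# mean is close to the deterministic output 0.5*1 - 0.2*2 + 0.1*3 = 0.4;
# std is the Monte Carlo uncertainty around it.
```

The appeal noted in the paper is the overhead: uncertainty comes from repeated forward passes through one trained model, not from training an ensemble.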

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The same transfer approach could be tested on sorption of other gases or in different porous media such as shales or zeolites.
  • Hybrid uncertainty methods that combine dropout with limited ensemble diversity might avoid the observed collapse while retaining calibration.
  • Integration of the trained model into reservoir simulators could reduce uncertainty in methane storage and recovery estimates.

Load-bearing premise

The enforced thermodynamic consistency and coal-specific feature engineering capture the domain shift from hydrogen to methane sorption without introducing systematic biases that affect generalization across coal ranks.
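A minimal sketch of the kind of soft consistency constraint this premise invokes: penalize predicted uptake that decreases with pressure at fixed temperature, a property classical isotherms such as Langmuir satisfy. The finite-difference hinge form is an assumption for illustration, not the paper's loss.

```python
def monotonicity_penalty(model, pressures, eps=1e-3):
    """Mean squared hinge on negative dq/dP: only decreasing uptake is penalized."""
    penalty = 0.0
    for p in pressures:
        # Central finite difference of predicted uptake with respect to pressure.
        dq_dp = (model(p + eps) - model(p - eps)) / (2 * eps)
        penalty += max(0.0, -dq_dp) ** 2
    return penalty / len(pressures)

langmuir = lambda p: 10.0 * p / (1.0 + p)            # consistent: uptake rises with P
bad_fit = lambda p: 10.0 * p / (1.0 + p) - 0.5 * p   # unphysical: uptake falls at high P

zero = monotonicity_penalty(langmuir, [0.5, 1.0, 5.0, 10.0])
positive = monotonicity_penalty(bad_fit, [0.5, 1.0, 5.0, 10.0])
```

Under the premise, terms like this steer the network toward thermodynamically plausible isotherms without biasing it toward any one coal rank.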

What would settle it

New methane sorption measurements on coal samples from ranks or conditions outside the 114-experiment training distribution would settle it. If those measurements produce RMSE substantially higher than the reported value, or uncertainty intervals that fail to cover the observed scatter, the generalization and calibration claims are falsified.
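The calibration half of that test can be sketched as an empirical coverage check on nominal 95% intervals; the Gaussian z = 1.96 factor, the names, and the toy numbers are illustrative assumptions.

```python
def empirical_coverage(y_true, y_pred, y_std, z=1.96):
    """Fraction of observations inside the nominal ~95% interval mu +/- z*sigma."""
    hits = sum(1 for y, mu, s in zip(y_true, y_pred, y_std)
               if mu - z * s <= y <= mu + z * s)
    return hits / len(y_true)

y_true = [1.0, 2.0, 3.0, 4.0]   # hypothetical new out-of-distribution measurements
y_pred = [1.1, 1.8, 3.3, 3.9]   # model predictive means
y_std  = [0.2, 0.2, 0.1, 0.2]   # model total-uncertainty standard deviations
cov = empirical_coverage(y_true, y_pred, y_std)
# Coverage far below the nominal 0.95 on such samples would falsify the
# calibration claim; RMSE far above the reported value would falsify generalization.
```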

Figures

Figures reproduced from arXiv: 2604.13992 by Mohammad Nooraiepour, Sarah Perez, Wei Li, Zezhang Song.

Figure 1. Pairwise relationships reveal nonlinear compositional controls on methane adsorption capacity.
Figure 2. Physics-informed engineered features span physically meaningful ranges and encode …
Figure 3. Robust scaling and log-transformation satisfy the statistical assumptions of the heteroscedastic …
Figure 4. Classical pressure-only isotherms reach a universal performance ceiling of R …
Figure 5. Within-rank stratification progressively improves isotherm performance, confirming that …
Figure 6. Three-phase training curriculum achieves stable, non-overfitting convergence in 1,129 epochs, …
Figure 7. The transfer-learned PINN achieves R² = 0.932 on held-out coals with near-Gaussian, unbiased residuals across the full rank spectrum. (A) Training predictions in log-transformed space (R² = 0.964, RMSE = 0.131, n = 794). (B) Test predictions in log space (R² = 0.953, RMSE = 0.153, n = 199), showing tight clustering around the 1:1 line with ΔR² < 0.011 between train and test, confirming generalization rather…
Figure 8. Residual diagnostics confirm homoscedasticity, approximate normality, and absence of …
Figure 9. H2 transfer learning outperforms all baselines in accuracy, calibration, and convergence speed, while deep ensembles provide no benefit over a single physics-constrained model. All variants were trained under identical conditions; test set n = 199. (A) Validation loss trajectories: transfer learning achieves consistently lower loss from the first epoch. (B) Validation R² progression: transfer exceeds 0.95 …
Figure 10. Aleatoric uncertainty dominates (98.3%) and calibrated total uncertainty reliably identifies …
Figure 11. SHAP attribution confirms that engineered compositional and thermodynamic features …
Figure 12. ALE causal effect analysis confirms non-monotonic feature effects and corroborates SHAP …
Original abstract

Accurate methane sorption prediction across heterogeneous coal ranks requires models that combine thermodynamic consistency, efficient knowledge transfer across data-scarce geological systems, and calibrated uncertainty estimates, capabilities that are rarely addressed together in existing frameworks. We present a physics-informed transfer learning framework that adapts a hydrogen sorption PINN to methane sorption prediction via Elastic Weight Consolidation, coal-specific feature engineering, and a three-phase curriculum that progressively balances transfer preservation with thermodynamic fine-tuning. Trained on 993 equilibrium measurements from 114 independent coal experiments spanning lignite to anthracite, the framework achieves R2 = 0.932 on held-out coal samples, a 227% improvement over pressure-only classical isotherms, while hydrogen pre-training delivers 18.9% lower RMSE and 19.4% faster convergence than random initialization. Five Bayesian uncertainty quantification approaches reveal a systematic divergence in performance across physics-constrained architectures. Monte Carlo Dropout achieves well-calibrated uncertainty at minimal overhead, while deep ensembles, regardless of architectural diversity or initialization strategy, exhibit performance degradation because shared physics constraints narrow the admissible solution manifold. SHAP and ALE analyses confirm that learned representations remain physically interpretable and aligned with established coal sorption mechanisms: moisture-volatile interactions are most influential, pressure-temperature coupling captures thermodynamic co-dependence, and features exhibit non-monotonic effects. These results identify Monte Carlo Dropout as the best-performing UQ method in this physics-constrained transfer learning framework, and demonstrate cross-gas transfer learning as a data-efficient strategy for geological material modeling.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

1 major / 2 minor

Summary. The manuscript proposes a physics-informed neural network (PINN) framework for methane sorption prediction in coal that uses transfer learning from a hydrogen pre-trained model via Elastic Weight Consolidation, coal-specific feature engineering, and a three-phase curriculum. Trained on 993 equilibrium measurements from 114 independent coal experiments, it reports R²=0.932 on held-out samples (227% improvement over pressure-only isotherms), 18.9% RMSE reduction and faster convergence from hydrogen pre-training, and compares five Bayesian UQ methods, concluding that Monte Carlo Dropout is well-calibrated while deep ensembles degrade due to physics constraints narrowing the solution space. SHAP/ALE analyses confirm physical interpretability of learned features.

Significance. If the reported gains are robust to proper experiment-level partitioning, the work demonstrates a practical route to data-efficient, thermodynamically consistent modeling for heterogeneous geological materials, with the ensemble-collapse observation under shared physics constraints offering a useful caution for PINN design. The combination of cross-gas transfer, curriculum balancing, and UQ calibration addresses multiple gaps in existing sorption modeling.

major comments (1)
  1. [Abstract and Methods] Abstract and Methods (data partitioning description): The held-out evaluation yielding R²=0.932 and the transfer-learning gains are described only as 'on held-out coal samples' drawn from the same 114 experiments. No explicit statement confirms that the split is performed at the experiment or coal-rank level rather than the individual measurement level. If intra-experiment replicates leak across train/test, the performance lift and cross-gas benefit could be inflated by memorization of experiment-specific offsets rather than by the physics-informed features or EWC. Please provide the exact splitting protocol (e.g., by experiment ID) and, if necessary, re-evaluate with experiment-level cross-validation.
minor comments (2)
  1. [Abstract] The abstract mentions 'five Bayesian uncertainty quantification approaches' but does not list them explicitly; the main text should enumerate the exact methods compared (MC Dropout, deep ensembles, etc.) with their hyperparameters.
  2. [Methods] Notation for the curriculum phase-transition thresholds and EWC regularization coefficient should be defined at first use and collected in a table of hyperparameters for reproducibility.

Simulated Author's Rebuttal

1 response · 0 unresolved

We thank the referee for their detailed and constructive review. The concern regarding data partitioning is well-taken, and we address it directly below. We will revise the manuscript to make the splitting protocol explicit.

Point-by-point responses
  1. Referee: [Abstract and Methods] Abstract and Methods (data partitioning description): The held-out evaluation yielding R²=0.932 and the transfer-learning gains are described only as 'on held-out coal samples' drawn from the same 114 experiments. No explicit statement confirms that the split is performed at the experiment or coal-rank level rather than the individual measurement level. If intra-experiment replicates leak across train/test, the performance lift and cross-gas benefit could be inflated by memorization of experiment-specific offsets rather than by the physics-informed features or EWC. Please provide the exact splitting protocol (e.g., by experiment ID) and, if necessary, re-evaluate with experiment-level cross-validation.

    Authors: We agree that an explicit description of the splitting protocol is necessary to rule out leakage. The 993 measurements come from 114 independent coal experiments, and the train/test split was performed at the experiment level: each experiment (i.e., all replicate measurements from a single coal sample under its experimental conditions) was assigned wholly to either the training or the held-out test set. No measurements from the same experiment appear in both sets. This was done to ensure generalization across distinct coal samples rather than memorization of experiment-specific offsets. We will add a clear statement of this protocol, including the use of experiment ID for partitioning, to the Methods section of the revised manuscript. Because the split is already experiment-level, no re-evaluation is required. revision: yes
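The protocol the authors describe can be sketched as a grouped split keyed on experiment ID, so replicate measurements never straddle train and test; the 80/20 ratio, field names, and seed are illustrative assumptions.

```python
import random

def split_by_experiment(records, test_frac=0.2, seed=42):
    """Assign whole experiments (all replicates sharing an exp_id) to train or test."""
    exp_ids = sorted({r["exp_id"] for r in records})
    rng = random.Random(seed)
    rng.shuffle(exp_ids)
    n_test = max(1, int(round(test_frac * len(exp_ids))))
    test_ids = set(exp_ids[:n_test])
    train = [r for r in records if r["exp_id"] not in test_ids]
    test = [r for r in records if r["exp_id"] in test_ids]
    return train, test

# Toy dataset: 10 experiments with 3 replicate measurements each.
records = [{"exp_id": i // 3, "uptake": 0.0} for i in range(30)]
train, test = split_by_experiment(records)
# No experiment ID appears in both sets, so replicates cannot leak.
assert not ({r["exp_id"] for r in train} & {r["exp_id"] for r in test})
```

An experiment-level k-fold variant of the same idea would give the cross-validation the referee requests as a fallback.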

Circularity Check

0 steps flagged

No significant circularity; claims rest on held-out empirical evaluation

full rationale

The paper reports R² = 0.932 on held-out coal samples drawn from 114 independent experiments, with hydrogen pre-training from a separate domain and comparisons to classical isotherms. No derivation step, equation, or performance metric reduces by construction to its own fitted inputs or self-citations. The transfer-learning gains, UQ divergence, and SHAP/ALE interpretability are presented as observational results on independent test data rather than self-definitional or fitted-input predictions. The framework is self-contained against external benchmarks with no load-bearing self-citation chains or ansatz smuggling evident in the provided text.

Axiom & Free-Parameter Ledger

2 free parameters · 1 axiom · 0 invented entities

The framework rests on standard neural network optimization plus domain assumptions about thermodynamic consistency in the loss and the validity of feature engineering for coal properties; no new physical entities are postulated.

free parameters (2)
  • EWC regularization coefficient
    Controls the strength of transfer preservation versus fine-tuning; value chosen during training.
  • Curriculum phase transition thresholds
    Determine when to shift emphasis from transfer to thermodynamic fine-tuning; tuned for the dataset.
axioms (1)
  • domain assumption Enforcing thermodynamic consistency via physics-informed loss terms produces physically plausible sorption predictions
    Invoked to justify the PINN architecture and curriculum design.
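A hypothetical sketch of how the phase-transition thresholds in the ledger might gate the loss weights across the three-phase curriculum; the epochs and weight values below are invented for illustration, not the paper's tuned values.

```python
def curriculum_weights(epoch, phase_ends=(300, 700)):
    """Return (w_ewc, w_physics) loss weights for the current epoch.

    phase_ends are the hypothetical phase-transition thresholds: the free
    parameters the ledger flags as tuned for the dataset.
    """
    if epoch < phase_ends[0]:      # phase 1: preserve hydrogen knowledge
        return 1.0, 0.1
    elif epoch < phase_ends[1]:    # phase 2: balance transfer and physics
        return 0.5, 0.5
    else:                          # phase 3: thermodynamic fine-tuning
        return 0.1, 1.0

w_ewc, w_phys = curriculum_weights(450)
```

Collecting these thresholds and the EWC coefficient in one hyperparameter table, as the referee's second minor comment asks, would make such a schedule reproducible.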

pith-pipeline@v0.9.0 · 5584 in / 1367 out tokens · 62083 ms · 2026-05-10T14:00:56.903683+00:00 · methodology


Reference graph

Works this paper leans on

80 extracted references · 40 canonical work pages · 1 internal anchor

  1. [1]

    Investigation into the variation characteristics and influencing factors of coalbed methane gas content in deep coal seams

    Q. Zhu, X. Du, X. Liu, T. Zhang, H. Yu, and L. Xiaobo. “Investigation into the variation characteristics and influencing factors of coalbed methane gas content in deep coal seams”. In:Scientific Reports14 (2024), p. 27360.doi:10.1038/s41598-024-66011-2

  2. [2]

    Integrated pore structure analysis and methane adsorption and desorption investigation in deep multi seam coal systems

    A. Yuan, R. Wang, X. Yang, and P. Wang. “Integrated pore structure analysis and methane adsorption and desorption investigation in deep multi seam coal systems”. In:Scientific Reports15.1 (2025), p. 34102.doi:10.1038/s41598-025-19347-2

  3. [3]

    Research on the molecular dynamics of coalbed methane diffusion and adsorption in reservoir pores under different factors

    X. Wang, J. Zhong, Z. Zhang, X. Zeng, R. Shen, Z. Deng, and N. Wang. “Research on the molecular dynamics of coalbed methane diffusion and adsorption in reservoir pores under different factors”. In:Energy Exploration & Exploitation43.1 (2025), pp. 105–117.doi: 10.1177/01445987241272563

  4. [4]

    Methane Adsorption in Heteroge- neous Potential Wells of Coal: Characterization Model and Applications

    D. Zhou, G. Wang, J. Wang, Y. Ji, X. Zheng, and Z. Feng. “Methane Adsorption in Heteroge- neous Potential Wells of Coal: Characterization Model and Applications”. In:Langmuir41.1 (2025), pp. 450–461.doi:10.1021/acs.langmuir.4c03721

  5. [5]

    Mini-Review on Influence of CO2-Enhanced Coalbed Methane Recovery and CO2 Geological Storage on Physical Properties of Coal Reservoir

    B. Cao, X. Fu, J. Kang, J. Lu, P. Tang, H. Xu, and M. Huang. “Mini-Review on Influence of CO2-Enhanced Coalbed Methane Recovery and CO2 Geological Storage on Physical Properties of Coal Reservoir”. In:Energy & Fuels38.24 (2024), pp. 23268–23280.doi:10.1021/ acs.energyfuels.4c04394

  6. [6]

    Hybrid Chemisorption–Physisorption of Subcritical CO2 on Coals: Implications for Safe and Long- Term Underground CO2 Sequestration

    M. Safaei-Farouji, D. Misch, R. F. Sachsenhofer, F. Knabl, and N. Kostoglou. “Hybrid Chemisorption–Physisorption of Subcritical CO2 on Coals: Implications for Safe and Long- Term Underground CO2 Sequestration”. In:Energy & Fuels39.25 (2025), pp. 12054–12063.doi: 10.1021/acs.energyfuels.5c02015

  7. [7]

    Research advances in enhanced coal seam gas extraction by controllable shock wave fracturing

    C. Fan, H. Sun, S. Li, L. Yang, B. Xiao, Z. Yang, M. Luo, X. Jiang, and L. Zhou. “Research advances in enhanced coal seam gas extraction by controllable shock wave fracturing”. In: InternationalJournalofCoalScience&Technology11.1(2024).doi: 10.1007/s40789-024-00680- 2

  8. [8]

    Theaccuracyofhydrogensorptionmeasurementsonpotentialstoragematerials

    D.Broom.“Theaccuracyofhydrogensorptionmeasurementsonpotentialstoragematerials”. In:International Journal of Hydrogen Energy32.18 (2007), pp. 4871–4888

  9. [9]

    Revisiting Hydrogen Sorption–Desorption in Natural Rocks

    M. Masoudi, A. Meyra, M. Nooraiepour, M. Mousavi Nezhad, A. Hassanpouryouzband, and H. Hellevang. “Revisiting Hydrogen Sorption–Desorption in Natural Rocks”. In:Industrial & Engineering Chemistry Research (in-press)(2026)

  10. [10]

    A review of common practices in gravimetric and volumetric adsorption kinetic experiments

    J.-Y. Wang, E. Mangano, S. Brandani, and D. M. Ruthven. “A review of common practices in gravimetric and volumetric adsorption kinetic experiments”. In:Adsorption27.3 (2021), pp. 295–318

  11. [11]

    Impactofhydration on hydrogen sorption in clay minerals and shale caprocks: implications for hydrogen energy and waste storage

    M.Masoudi,M.Nooraiepour,R.Blom,K.Xi,P.Cerasi,andH.Hellevang.“Impactofhydration on hydrogen sorption in clay minerals and shale caprocks: implications for hydrogen energy and waste storage”. In:International Journal of Hydrogen Energy99 (2025), pp. 661–670.doi: 10.1016/j.ijhydene.2024.12.247

  12. [12]

    Supercritical methane diffusion in shale nanopores: effects of pressure, mineral types, and moisture content

    S. Wang, Q. Feng, M. Zha, F. Javadpour, and Q. Hu. “Supercritical methane diffusion in shale nanopores: effects of pressure, mineral types, and moisture content”. In:Energy & fuels32.1 (2018), pp. 169–180

  13. [13]

    Physics-InformedNeuralNetworks forPredictingHydrogenSorptioninGeologicalFormations:ThermodynamicallyConstrained Deep Learning Integrating Classical Adsorption Theory

    M.Nooraiepour,M.Masoudi,Z.Song,andH.Hellevang.“Physics-InformedNeuralNetworks forPredictingHydrogenSorptioninGeologicalFormations:ThermodynamicallyConstrained Deep Learning Integrating Classical Adsorption Theory”. In: (2026). arXiv:2603.28328 [cs.LG].url:https://arxiv.org/abs/2603.28328

  14. [14]

    Insights into the modeling of adsorption isotherm systems

    K. Y. Foo and B. H. Hameed. “Insights into the modeling of adsorption isotherm systems”. In:Chemical engineering journal156.1 (2010), pp. 2–10. 40

  15. [15]

    Modeling of experimental adsorption isotherm data

    X. Chen. “Modeling of experimental adsorption isotherm data”. In:information6.1 (2015), pp. 14–22

  16. [16]

    AdaptivePhysics-InformedNeural Networks with Multi-Category Feature Engineering for Hydrogen Sorption Prediction in Clays, Shales, and Coals

    M.Nooraiepour,M.Masoudi,Z.Song,andH.Hellevang.“AdaptivePhysics-InformedNeural Networks with Multi-Category Feature Engineering for Hydrogen Sorption Prediction in Clays, Shales, and Coals”. In: (2025). arXiv:2509.00049 [cs.LG].url: https://arxiv.org/ abs/2509.00049

  17. [17]

    Machinelearningfordata-driven discovery in solid Earth geoscience

    K.J.Bergen,P.A.Johnson,M.V.deHoop,andG.C.Beroza.“Machinelearningfordata-driven discovery in solid Earth geoscience”. In:Science363.6433 (2019), eaau0323

  18. [18]

    Machine learning for the geosciences: Challenges and opportunities

    A. Karpatne, I. Ebert-Uphoff, S. Ravela, H. A. Babaie, and V. Kumar. “Machine learning for the geosciences: Challenges and opportunities”. In:IEEE Transactions on Knowledge and Data Engineering31.8 (2018), pp. 1544–1554

  19. [19]

    PartialDifferentialEquations in the Age of Machine Learning: A Critical Synthesis of Classical, Machine Learning, and Hybrid Methods

    M.Nooraiepour,J.W.Both,T.Kadeethum,andS.Sadeghnejad.“PartialDifferentialEquations in the Age of Machine Learning: A Critical Synthesis of Classical, Machine Learning, and Hybrid Methods”. In: (2026). arXiv:2603.07655 [cs.LG].url: https://arxiv.org/abs/ 2603.07655

  20. [20]

    Machinelearningmethodforshalegasadsorption capacity prediction and key influencing factors evaluation

    Y.Zhou,B.Hui,J.Shi,H.Shi,andD.Jing.“Machinelearningmethodforshalegasadsorption capacity prediction and key influencing factors evaluation”. In:Physics of Fluids36.1 (2024)

  21. [21]

    Machine Learning Algorithm to Predict Methane Adsorption Capacity of Coal

    W. Li, W. Li, A. Busch, L. Wang, F. Anggara, and S. Yang. “Machine Learning Algorithm to Predict Methane Adsorption Capacity of Coal”. In:Energy & Fuels38.24 (2024), pp. 23422– 23432

  22. [22]

    Adsorption characteristics of supercritical CO2/CH4 on different types of coal and a machine learning approach

    M. Meng, Z. Qiu, R. Zhong, Z. Liu, Y. Liu, and P. Chen. “Adsorption characteristics of supercritical CO2/CH4 on different types of coal and a machine learning approach”. In: Chemical Engineering Journal368 (2019), pp. 847–864

  23. [23]

    Data-driven framework for predicting the sorption capacity of carbon dioxide and methane in tight reservoirs

    F. M. Alqahtani, M. R. Youcefi, H. Djema, M. Nait Amar, and M. Ghasemi. “Data-driven framework for predicting the sorption capacity of carbon dioxide and methane in tight reservoirs”. In:Greenhouse Gases: Science and Technology14.6 (2024), pp. 1092–1112

  24. [24]

    Data-driven predictive model of coal permeability based on microscopic fracture structure characterization

    T. Yan, X. Xu, J. Liu, Y. Zhang, M. Arif, X. Xu, and Q. Wang. “Data-driven predictive model of coal permeability based on microscopic fracture structure characterization”. In:Journal of Rock Mechanics and Geotechnical Engineering(2025).doi:10.1016/j.jrmge.2024.11.056

  25. [25]

    Fitting elephants in modern machine learning by statistically consistent interpo- lation

    P. P. Mitra. “Fitting elephants in modern machine learning by statistically consistent interpo- lation”. In:Nature Machine Intelligence3.5 (2021), pp. 378–386

  26. [26]

    P. P. Angelov and X. Gu.Empirical approach to machine learning. Springer, 2019

  27. [27]

    A comprehensive review of advances in physics-informed neural networks and their applications in complex fluid dynamics

    C. Zhao, F. Zhang, W. Lou, X. Wang, and J. Yang. “A comprehensive review of advances in physics-informed neural networks and their applications in complex fluid dynamics”. In: Physics of Fluids36.10 (2024)

  28. [28]

    Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations

    M. Raissi, P. Perdikaris, and G. E. Karniadakis. “Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations”. In:Journal of Computational Physics378 (2019), pp. 686–707.doi: 10.1016/j.jcp.2018.10.045

  29. [29]

    Scientific machine learning through physics-informed neural networks: where we are and what’s next.Journal of Scientific Computing, 92(3):88, 2022

    S.Cuomo,V.S.DiCola,F.Giampaolo,G.Rozza,M.Raissi,andF.Piccialli.“Scientificmachine learning through physics-informed neural networks: Where we are and what’s next”. In: Journal of Scientific Computing92.3 (2022), p. 88.doi:10.1007/s10915-022-01939-z

  30. [30]

    Physics- informed machine learning

    G. E. Karniadakis, I. G. Kevrekidis, L. Lu, P. Perdikaris, S. Wang, and L. Yang. “Physics- informed machine learning”. In:Nature Reviews Physics3.6 (2021), pp. 422–440.doi:10.1038/ s42254-021-00314-5

  31. [31]

    Physics-informed neural networks (pinns) for fluid mechanics: a review

    S. Cai, Z. Mao, Z. Wang, M. Yin, and G. E. Karniadakis. “Physics-informed neural networks (PINNs) for fluid mechanics: A review”. In:Acta Mechanica Sinica37.12 (2021), pp. 1727–1738. doi:10.1007/s10409-021-01148-1. 41

  32. [32]

    IEEE Transactions on Knowledge and Data Engineering 22, 1345–1359

    S. J. Pan and Q. Yang. “A survey on transfer learning”. In:IEEE Transactions on Knowledge and Data Engineering22.10 (2009), pp. 1345–1359.doi:10.1109/TKDE.2009.191

  33. [33]

    Transfer learning

    L. Torrey and J. Shavlik. “Transfer learning”. In:Handbook of research on machine learning applications and trends: algorithms, methods, and techniques. IGI Global Scientific Publishing, 2010, pp. 242–264.doi:10.4018/978-1-60566-766-9.ch011

  34. [34]

    A comprehensive survey on transfer learning

    F. Zhuang, Z. Qi, K. Duan, D. Xi, Y. Zhu, H. Zhu, H. Xiong, and Q. He. “A comprehensive survey on transfer learning”. In:Proceedings of the IEEE109.1 (2020), pp. 43–76.doi:10.1109/ JPROC.2020.3004555

  35. [35]

    Molecular simulation on H2 adsorption in nanopores and effects of cushion gas: implications for underground hydrogen storageinshalereservoirs

    M. Zhang, Y. Yang, B. Pan, Z. Liu, Z. Jin, and S. Iglauer. “Molecular simulation on H2 adsorption in nanopores and effects of cushion gas: implications for underground hydrogen storageinshalereservoirs”.In:Fuel361(2024),p.130621.doi: 10.1016/j.fuel.2023.130621

  36. [36]

    Molecular simulation of CO2/CH4/H2O competitiveadsorptionanddiffusioninbrowncoal

    W. Zhou, H. Wang, Z. Zhang, H. Chen, and X. Liu. “Molecular simulation of CO2/CH4/H2O competitiveadsorptionanddiffusioninbrowncoal”.In:RSCAdvances9(2019),pp.3004–3011. doi:10.1039/c8ra10243k

  37. [37]

    Thermodynamicsofadsorptioninporousmaterials

    A.Myers.“Thermodynamicsofadsorptioninporousmaterials”.In:AIChEjournal48.1(2002), pp. 145–160

  38. [38]

    Adaptive transfer learning for PINN

    Y. Liu, W. Liu, X. Yan, S. Guo, and C. Zhang. “Adaptive transfer learning for PINN”. In: Journal of Computational Physics490 (2023), p. 112291.doi:10.1016/j.jcp.2023.112291

  39. [39]

    Modeling unobserved geothermal structures using a physics-informed neural network with transfer learning of prior knowledge

    A. Shima, K. Ishitsuka, W. Lin, E. K. Bjarkason, and A. Suzuki. “Modeling unobserved geothermal structures using a physics-informed neural network with transfer learning of prior knowledge”. In:Geothermal Energy12.1 (2024), p. 38.doi:10.1186/s40517-024-00312- 7

  40. [40]

    Machinelearningthroughphysics–informed neural networks: Progress and challenges

    X.W.KlapaAntonion,M.Raissi,andL.Joshie.“Machinelearningthroughphysics–informed neural networks: Progress and challenges”. In:Academic Journal of Science and Technology9.1 (2024), p. 2024

  41. [41]

    A survey on machine learning approaches for uncertainty quantification of engineering systems

    Y. Shi, P. Wei, K. Feng, D.-C. Feng, and M. Beer. “A survey on machine learning approaches for uncertainty quantification of engineering systems”. In:Machine Learning for Computational Science and Engineering1.1 (2025), p. 11.doi:10.1007/s44379-024-00011-x

  42. [42]

    Bayesian calibration of computer models

    M. C. Kennedy and A. O’Hagan. “Bayesian calibration of computer models”. In:Journal of the Royal Statistical Society: Series B (Statistical Methodology)63.3 (2001), pp. 425–464.doi: 10.1111/1467-9868.00294

  43. [43]

    In: Handbook of Uncertainty Quantification, pp

    R. Ghanem, D. Higdon, and H. Owhadi.Handbook of uncertainty quantification. Cham, Switzerland: Springer, 2017.doi:10.1007/978-3-319-12385-1

  44. [44]

    BayesianParametricMatrixModels:PrincipledUncertaintyQuantification for Spectral Learning

    M.Nooraiepour.“BayesianParametricMatrixModels:PrincipledUncertaintyQuantification for Spectral Learning”. In: (2025). arXiv:2509.12406 [cs.LG].url: https://arxiv.org/ abs/2509.12406

  45. [45]

    Dropout as a Bayesian approximation: Representing model uncertainty in deep learning

    Y. Gal and Z. Ghahramani. “Dropout as a Bayesian approximation: Representing model uncertainty in deep learning”. In:International Conference on Machine Learning (ICML). 2016, pp. 1050–1059

  46. [46]

    Simpleandscalablepredictiveuncertainty estimation using deep ensembles

    B.Lakshminarayanan,A.Pritzel,andC.Blundell.“Simpleandscalablepredictiveuncertainty estimation using deep ensembles”. In:Advances in Neural Information Processing Systems (NeurIPS). 2017, pp. 6402–6413

  47. [47]

    Learnable uncertainty under laplace approximations

    A. Kristiadi, M. Hein, and P. Hennig. “Learnable uncertainty under laplace approximations”. In:Uncertainty in Artificial Intelligence. PMLR. 2021, pp. 344–353

  48. [48]

    B-PINNs: Bayesian physics-informed neural networksforforwardandinversePDEproblemswithnoisydata

    L. Yang, X. Meng, and G. E. Karniadakis. “B-PINNs: Bayesian physics-informed neural networksforforwardandinversePDEproblemswithnoisydata”.In:JournalofComputational Physics425 (2021), p. 109913.doi:https://doi.org/10.1016/j.jcp.2020.109913. 42

  49. [49]

    Bayesian Physics Informed Neural Networks for real-world nonlinear dynamical systems

    K. Linka, A. Schäfer, X. Meng, Z. Zou, G. E. Karniadakis, and E. Kuhl. “Bayesian Physics Informed Neural Networks for real-world nonlinear dynamical systems”. In:Computer Methods in Applied Mechanics and Engineering402 (2022), p. 115346.doi:https://doi.org/ 10.1016/j.cma.2022.115346

  50. [50]

    2018.10.045

    S. Perez, S. Maddu, I. F. Sbalzarini, and P. Poncet. “Adaptive weighting of Bayesian physics informed neural networks for multitask and multiscale forward and inverse problems”. In: Journal ofComputational Physics491 (2023),p. 112342.doi:https://doi.org/10.1016/j.jcp. 2023.112342

  51. [51] S. Perez and P. Poncet. “Auto-weighted Bayesian Physics-Informed Neural Networks and robust estimations for multitask inverse problems in pore-scale imaging of dissolution”. In: Computational Geosciences 28.6 (2024), pp. 1175–1215. doi: 10.1007/s10596-024-10313-x

  52. [52] P. Li, D. Grana, and M. Liu. “Bayesian neural network and Bayesian physics-informed neural network via variational inference for seismic petrophysical inversion”. In: Geophysics 89.6 (2024), pp. M185–M196. doi: 10.1190/geo2023-0737.1

  53. [53] S. Perez, F. Doster, J. Maes, H. Menke, A. El Sheikh, and A. Busch. “When Cubic Law and Darcy Fail: Bayesian Correction of Model Misspecification in Fracture Conductivities”. In: Geophysical Research Letters 52.18 (2025), e2025GL117776. doi: 10.1029/2025GL117776

  54. [54] A. Pensoneault and X. Zhu. “Efficient Bayesian Physics Informed Neural Networks for inverse problems via Ensemble Kalman Inversion”. In: Journal of Computational Physics 508 (2024), p. 113006. doi: 10.1016/j.jcp.2024.113006

  55. [55] B. Jacob, A. S. Nair, A. A. Howard, J. Drgona, and P. Stinis. “E-PINNs: Epistemic physics-informed neural networks”. In: arXiv preprint arXiv:2503.19333 (2025)

  56. [56] S. E. Park, P. Harris, and B. Ostdiek. “Neural embedding: learning the embedding of the manifold of physics data”. In: Journal of High Energy Physics 2023.7 (2023), pp. 1–38. doi: 10.1007/JHEP07(2023)108

  57. [57] M. Panahi, G. M. Porta, M. Riva, and A. Guadagnini. “Modeling parametric uncertainty in PDEs models via Physics-Informed Neural Networks”. In: Advances in Water Resources 195 (2025), p. 104870

  58. [58] S. Kaufman, S. Rosset, C. Perlich, and O. Stitelman. “Leakage in data mining: Formulation, detection, and avoidance”. In: ACM Transactions on Knowledge Discovery from Data (TKDD) 6.4 (2012), pp. 1–21

  59. [59] S. Kapoor and A. Narayanan. “Leakage and the reproducibility crisis in machine-learning-based science”. In: Patterns 4.9 (2023)

  60. [60] D. R. Roberts, V. Bahn, S. Ciuti, M. S. Boyce, J. Elith, G. Guillera-Arroita, S. Hauenstein, J. J. Lahoz-Monfort, B. Schröder, W. Thuiller, et al. “Cross-validation strategies for data with temporal, spatial, hierarchical, or phylogenetic structure”. In: Ecography 40.8 (2017), pp. 913–929

  61. [61] C. Laxminarayana and P. J. Crosdale. “Role of coal type and rank on methane sorption characteristics of Bowen Basin, Australia coals”. In: International Journal of Coal Geology 40.4 (1999), pp. 309–325

  62. [62] G. Seber and A. J. Lee. “Departures from assumptions: diagnosis and remedies”. In: Linear Regression Analysis. Vol. 329. John Wiley & Sons, 2012

  63. [63] W. Plazinski, W. Rudzinski, and A. Plazinska. “Theoretical models of sorption kinetics including a surface reaction mechanism: a review”. In: Advances in Colloid and Interface Science 152.1-2 (2009), pp. 2–13. doi: 10.1016/j.cis.2009.07.009

  64. [64] G. Limousin, J.-P. Gaudet, L. Charlet, S. Szenknect, V. Barthès, and M. Krimissa. “Sorption isotherms: A review on physical bases, modeling and measurement”. In: Applied Geochemistry 22.2 (2007), pp. 249–275

  65. [65] J. Kirkpatrick et al. “Overcoming catastrophic forgetting in neural networks”. In: Proceedings of the National Academy of Sciences 114.13 (2017), pp. 3521–3526. doi: 10.1073/pnas.1611835114

  66. [66] I. Loshchilov and F. Hutter. “Fixing Weight Decay Regularization in Adam”. In: CoRR abs/1711.05101 (2017). arXiv:1711.05101. url: http://arxiv.org/abs/1711.05101

  67. [67] Y. Bengio, J. Louradour, R. Collobert, and J. Weston. “Curriculum learning”. In: Proceedings of the 26th Annual International Conference on Machine Learning. 2009, pp. 41–48

  68. [68] J. Yosinski, J. Clune, Y. Bengio, and H. Lipson. “How transferable are features in deep neural networks?” In: Advances in Neural Information Processing Systems 27 (2014)

  69. [69] J. Schwarz, W. Czarnecki, J. Luketina, A. Grabska-Barwinska, Y. W. Teh, R. Pascanu, and R. Hadsell. “Progress & compress: A scalable framework for continual learning”. In: International Conference on Machine Learning. PMLR. 2018, pp. 4528–4537

  70. [70] C. Guo, G. Pleiss, Y. Sun, and K. Q. Weinberger. “On calibration of modern neural networks”. In: International Conference on Machine Learning (ICML). 2017, pp. 1321–1330

  71. [71] S. M. Lundberg and S.-I. Lee. “A unified approach to interpreting model predictions”. In: Advances in Neural Information Processing Systems 30 (2017)

  72. [72] D. W. Apley and J. Zhu. “Visualizing the effects of predictor variables in black box supervised learning models”. In: Journal of the Royal Statistical Society Series B: Statistical Methodology 82.4 (2020), pp. 1059–1086

  73. [73] C. Molnar. Interpretable Machine Learning. Leanpub, 2020. url: https://books.google.no/books?id=jBm3DwAAQBAJ

  74. [74] A. F. Psaros, X. Meng, Z. Zou, L. Guo, and G. E. Karniadakis. “Uncertainty quantification in scientific machine learning: Methods, metrics, and comparisons”. In: Journal of Computational Physics 477 (2023), p. 111902

  75. [75] T. Wang, Y. Wang, J. Zhou, B. Peng, X. Song, C. Zhang, X. Sun, Q. Niu, J. Liu, S. Chen, et al. “From aleatoric to epistemic: Exploring uncertainty quantification techniques in artificial intelligence”. In: arXiv preprint arXiv:2501.03282 (2025)

  76. [76] V. Fortuin, A. Garriga-Alonso, S. W. Ober, F. Wenzel, G. Rätsch, R. E. Turner, M. van der Wilk, and L. Aitchison. “Bayesian Neural Network Priors Revisited”. In: 2022. arXiv:2102.06571 [stat.ML]. url: https://arxiv.org/abs/2102.06571

  77. [77] Z.-H. Zhou. Ensemble Methods: Foundations and Algorithms. CRC Press, 2012

  78. [78] S. Fort, H. Hu, and B. Lakshminarayanan. “Deep ensembles: A loss landscape perspective”. In: arXiv preprint arXiv:1912.02757 (2019)

  79. [79] A. G. Wilson and P. Izmailov. “Bayesian deep learning and a probabilistic perspective of generalization”. In: Advances in Neural Information Processing Systems (NeurIPS) 33 (2020), pp. 4697–4708

  80. [80] E. Daxberger, A. Kristiadi, A. Immer, R. Eschenhagen, M. Bauer, and P. Hennig. “Laplace redux – Effortless Bayesian deep learning”. In: Advances in Neural Information Processing Systems (NeurIPS) 34 (2021), pp. 20089–20103

    E. Daxberger, A. Kristiadi, A. Immer, R. Eschenhagen, M. Bauer, and P. Hennig. “Laplace redux–EffortlessBayesiandeeplearning”.In:AdvancesinNeuralInformationProcessingSystems (NeurIPS)34 (2021), pp. 20089–20103. 44