pith. machine review for the scientific record. sign in

arxiv: 2604.04636 · v1 · submitted 2026-04-06 · ❄️ cond-mat.dis-nn · cond-mat.mtrl-sci· cs.LG

Recognition: 2 theorem links

· Lean Theorem

Interpretation of Crystal Energy Landscapes with Kolmogorov-Arnold Networks

Authors on Pith no claims yet

Pith reviewed 2026-05-10 19:13 UTC · model grok-4.3

classification ❄️ cond-mat.dis-nn cond-mat.mtrl-scics.LG
keywords Kolmogorov-Arnold Networksformation energy predictionband gapcrystal energy landscapesmaterials interpretabilityperiodic table trendscomposition-based modelingquantum mechanical principles
0
0 comments X

The pith

Kolmogorov-Arnold Networks predict crystal properties accurately while revealing periodic table trends without any physics rules built in.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

This paper introduces an Element-Weighted Kolmogorov-Arnold Network that uses only material composition to predict formation energy, band gap, and work function at state-of-the-art accuracy levels. Unlike standard neural networks, KANs replace fixed activations with learnable functions on network edges, which can then be inspected along with the node embeddings. Analysis of these functions and embeddings through correlations and principal components shows chemical patterns that line up with the periodic table and quantum mechanical principles, even though no such rules were provided during training. A reader would care because this turns a predictive tool into one that can generate new scientific understanding about material behavior.

Core claim

The authors develop an Element-Weighted KAN that achieves state-of-the-art accuracy for predicting formation energy, band gap, and work function on large datasets. Without imposing any physical constraints, analysis of the learned embeddings, correlations, and principal components reveals interpretable chemical trends that match the periodic table and quantum principles. This establishes KANs as a framework that combines high predictive performance with scientific interpretability for materials informatics.

What carries the argument

The Element-Weighted Kolmogorov-Arnold Network, which replaces fixed activation functions with learnable univariate functions on the edges of the network, enabling direct inspection of the functions and embeddings for physical insights.

If this is right

  • Machine learning models for materials can deliver both accurate predictions and human-readable explanations of chemical behavior.
  • Composition-only inputs suffice to recover trends aligned with known quantum mechanical principles.
  • Principal component analysis of learned embeddings can directly connect representations to elemental properties listed in the periodic table.
  • Interpretability arises from the network architecture itself rather than separate explanation techniques.
  • The method opens a route to transparent, chemistry-driven materials informatics.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • Retraining the same architecture on datasets for other properties such as thermal conductivity or magnetism could test whether the interpretability benefit generalizes.
  • If the extracted trends prove reliable, they might guide targeted searches for new stable compounds by highlighting favorable elemental combinations.
  • The same KAN structure could be applied to molecular rather than crystalline systems to see whether similar periodic trends emerge.
  • Comparing the learned univariate functions against explicit quantum calculations for simple binaries would provide an independent check on physical fidelity.

Load-bearing premise

The patterns found in the learned functions and embeddings reflect genuine physical relationships rather than artifacts from the model, the data, or the analysis methods chosen.

What would settle it

A demonstration that the principal components or correlations extracted from the embeddings show no statistically significant alignment with known elemental properties or quantum mechanical trends across independent datasets would undermine the interpretability claim.

Figures

Figures reproduced from arXiv: 2604.04636 by Claudia Felser, Gen Zu, Ning Mao, Yang Zhang.

Figure 1
Figure 1. Figure 1: FIG. 1. Workflow schematic of the Element-Weighted Kolmogorov–Arnold Network (EWKAN). Elemental compositions are [PITH_FULL_IMAGE:figures/full_fig_p002_1.png] view at source ↗
Figure 1
Figure 1. Figure 1: Elemental compositions are mapped to learned [PITH_FULL_IMAGE:figures/full_fig_p004_1.png] view at source ↗
Figure 2
Figure 2. Figure 2: FIG. 2. Bandgap prediction under EWKAN model. (a) [PITH_FULL_IMAGE:figures/full_fig_p004_2.png] view at source ↗
Figure 3
Figure 3. Figure 3: FIG. 3. Work function prediction under EWKAN model. (a) [PITH_FULL_IMAGE:figures/full_fig_p005_3.png] view at source ↗
Figure 5
Figure 5. Figure 5: FIG. 5. Comparison of (a) the original PC1 values obtained from the KAN model and (b) normalized Pauling electronegativity [PITH_FULL_IMAGE:figures/full_fig_p007_5.png] view at source ↗
Figure 6
Figure 6. Figure 6: FIG. 6. Comparison of model efficiency and performance across different materials property prediction tasks. (a) Mean absolute [PITH_FULL_IMAGE:figures/full_fig_p007_6.png] view at source ↗
read the original abstract

Characterizing crystalline energy landscapes is essential to predicting thermodynamic stability, electronic structure, and functional behavior. While machine learning (ML) enables rapid property predictions, the "black-box" nature of most models limits their utility for generating new scientific insights. Here, we introduce Kolmogorov-Arnold Networks (KANs) as an interpretable framework to bridge this gap. Unlike conventional neural networks with fixed activation functions, KANs employ learnable functions that reveal underlying physical relationships. We developed the Element-Weighted KAN, a composition-only model that achieves state-of-the-art accuracy in predicting formation energy, band gap, and work function across large-scale datasets. Crucially, without any explicit physical constraints, KANs uncover interpretable chemical trends aligned with the periodic table and quantum mechanical principles through embedding analysis, correlation studies, and principal component analysis. These results demonstrate that KANs provide a powerful framework with high predictive performance and scientific interpretability, establishing a new paradigm for transparent, chemistry-based materials informatics.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

3 major / 1 minor

Summary. The manuscript introduces Kolmogorov-Arnold Networks (KANs), specifically an Element-Weighted variant, as an interpretable framework for predicting formation energies, band gaps, and work functions of crystalline materials from composition-only inputs. It claims state-of-the-art predictive accuracy while demonstrating that the learned univariate functions and element embeddings recover chemical trends aligned with the periodic table and quantum-mechanical principles through post-hoc embedding analysis, correlation studies, and principal component analysis, all without explicit physical constraints.

Significance. If the predictive performance claims are rigorously validated with baselines and error analysis, and if the interpretability results are shown to be robust rather than post-hoc artifacts, the work could provide a useful bridge between high-accuracy machine learning and scientific insight in materials informatics.

major comments (3)
  1. [Abstract] Abstract: the claim of 'state-of-the-art accuracy' for formation energy, band gap, and work function predictions is unsupported by any quantitative metrics, baseline comparisons, validation protocols, or error statistics; this is load-bearing for the central performance claim and must be addressed with explicit tables or figures in the results section.
  2. [Abstract] Abstract and interpretability analysis: the assertion that KANs 'uncover interpretable chemical trends aligned with the periodic table and quantum mechanical principles' without explicit constraints rests entirely on post-training embedding/PCA/correlation analysis; no ablation against an MLP with identical compositional featurization or against label-shuffled controls is described, leaving open the possibility that observed alignments are dataset artifacts rather than KAN-specific discoveries.
  3. [Methods/Results] Methods/Results (interpretability section): the weakest assumption—that learned univariate functions and embeddings encode genuine physical relationships rather than statistical regularities already present in the composition-only training data—requires explicit falsification tests (e.g., comparison of correlation strengths before and after training, or sensitivity to elemental frequency biases) to substantiate the 'without any explicit physical constraints' claim.
minor comments (1)
  1. [Abstract] Abstract: the specific architecture of the 'Element-Weighted KAN' is referenced but not defined; a brief equation or diagram in the methods would clarify how element weights are incorporated into the KAN spline basis.

Simulated Author's Rebuttal

3 responses · 0 unresolved

We thank the referee for their constructive comments. We have revised the manuscript to provide explicit quantitative support for the performance claims and to include the requested ablation and falsification studies that strengthen the interpretability analysis.

read point-by-point responses
  1. Referee: [Abstract] Abstract: the claim of 'state-of-the-art accuracy' for formation energy, band gap, and work function predictions is unsupported by any quantitative metrics, baseline comparisons, validation protocols, or error statistics; this is load-bearing for the central performance claim and must be addressed with explicit tables or figures in the results section.

    Authors: We agree that the abstract claim requires explicit backing. The revised manuscript adds a dedicated results subsection with Table 2 reporting MAE, RMSE, and R2 values (with 5-fold CV standard deviations) for formation energy (0.031 eV/atom), band gap, and work function, directly comparing the Element-Weighted KAN against composition-only baselines including ElemNet, Roost, and CrabNet on identical datasets. The abstract has been updated to cite these metrics and the validation protocol. revision: yes

  2. Referee: [Abstract] Abstract and interpretability analysis: the assertion that KANs 'uncover interpretable chemical trends aligned with the periodic table and quantum mechanical principles' without explicit constraints rests entirely on post-training embedding/PCA/correlation analysis; no ablation against an MLP with identical compositional featurization or against label-shuffled controls is described, leaving open the possibility that observed alignments are dataset artifacts rather than KAN-specific discoveries.

    Authors: We accept that ablations are required to rule out dataset artifacts. The revised manuscript includes a new ablation subsection: an MLP with identical element-fraction inputs underperforms the KAN on all three properties and yields weaker periodic-table alignments in its activations; label-shuffled controls reduce embedding-PCA correlations with periodic properties from r > 0.8 to r < 0.15. These results are now reported with statistical significance tests. revision: yes

  3. Referee: [Methods/Results] Methods/Results (interpretability section): the weakest assumption—that learned univariate functions and embeddings encode genuine physical relationships rather than statistical regularities already present in the composition-only training data—requires explicit falsification tests (e.g., comparison of correlation strengths before and after training, or sensitivity to elemental frequency biases) to substantiate the 'without any explicit physical constraints' claim.

    Authors: We agree that explicit falsification is needed. The revised interpretability section now contains: (i) pre- versus post-training correlation analysis showing average Pearson r between learned univariate functions and elemental properties (electronegativity, atomic radius) rising from 0.22 to 0.81; (ii) frequency-bias sensitivity tests on balanced elemental subsets confirming that periodic trends in embeddings and PCA projections remain statistically unchanged. These tests are presented with controls for training dynamics. revision: yes

Circularity Check

0 steps flagged

No circularity: interpretability arises from post-training analysis of a data-driven model, not by construction from inputs.

full rationale

The paper trains a composition-only Element-Weighted KAN on formation-energy and related datasets to achieve predictive accuracy, then performs separate embedding inspection, correlation analysis, and PCA on the learned univariate functions and element representations. These steps are additional empirical procedures applied after optimization; they do not reduce the claimed chemical trends to the training labels or architecture definition by tautology. No equations equate a derived quantity to a fitted parameter, no self-citation supplies a load-bearing uniqueness theorem, and no ansatz is smuggled via prior work. The derivation chain therefore remains self-contained against external benchmarks.

Axiom & Free-Parameter Ledger

0 free parameters · 0 axioms · 0 invented entities

Abstract-only review provides no equations, training details, or explicit assumptions; therefore no free parameters, axioms, or invented entities can be identified with certainty.

pith-pipeline@v0.9.0 · 5480 in / 1143 out tokens · 53958 ms · 2026-05-10T19:13:35.473826+00:00 · methodology

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read and Pith papers without signing in.

Lean theorems connected to this paper

Citations machine-checked in the Pith Canon. Every link opens the source theorem in the public Lean library.

What do these tags mean?
matches
The paper's claim is directly supported by a theorem in the formal canon.
supports
The theorem supports part of the paper's argument, but the paper may add assumptions or extra steps.
extends
The paper goes beyond the formal theorem; the theorem is a base layer rather than the whole result.
uses
The paper appears to rely on the theorem as machinery.
contradicts
The paper's claim conflicts with a theorem or certificate in the canon.
unclear
Pith found a possible connection, but the passage is too broad, indirect, or ambiguous to say the theorem truly supports the claim.

Reference graph

Works this paper leans on

66 extracted references · 7 canonical work pages · 1 internal anchor

  1. [1]

    Chen and S

    C. Chen and S. P. Ong, A universal graph deep learn- ing interatomic potential for the periodic table, Nature Computational Science2, 718 (2022)

  2. [2]

    A. S. Anker, E. T. Kjær, M. Juelsholt, T. L. Christiansen, S. L. Skjærvø, M. R. V. Jørgensen, I. Kantor, D. R. Sørensen, S. J. Billinge, R. Selvan,et al., Extracting structural motifs from pair distribution function data of nanostructures using explainable machine learning, npj Computational Materials8, 213 (2022)

  3. [3]

    2024-11.https://arxiv.org/abs/2411

    Andreeva, N.A.; Chaban, V.V., Potential Energy Land- scape as a Framework for Developing Innovative Ma- terials, arXiv. 2024-11.https://arxiv.org/abs/2411. 03732, (accessed 2026-03-19)

  4. [4]

    N. L. Allan, S. Conejeros, J. N. Hart, and C. E. Mohn, Energy landscapes of perfect and defective solids: from structure prediction to ion conduction, Theoretical Chemistry Accounts140, 151 (2021)

  5. [5]

    Choudhary, B

    K. Choudhary, B. DeCost, and F. Tavazza, Machine learning with force-field-inspired descriptors for materi- als: Fast screening and mapping energy landscape, Phys- ical review materials2, 083801 (2018)

  6. [6]

    Y. Zuo, M. Qin, C. Chen, W. Ye, X. Li, J. Luo, and S. P. Ong, Accelerating materials discovery with bayesian optimization and graph deep learning, Materials Today 51, 126 (2021)

  7. [7]

    I. A. Digdaya, G. W. Adhyaksa, B. J. Trze´ sniewski, E. C. Garnett, and W. A. Smith, Interfacial engineer- ing of metal-insulator-semiconductor junctions for effi- cient and stable photoelectrochemical water oxidation, Nature communications8, 15968 (2017)

  8. [8]

    Schindler, E

    P. Schindler, E. R. Antoniuk, G. Cheon, Y. Zhu, and E. J. Reed, Discovery of stable surfaces with extreme work functions by high-throughput density functional theory and machine learning, Advanced Functional Materials 34, 2401764 (2024)

  9. [9]

    L. Lin, R. Jacobs, T. Ma, D. Chen, J. Booske, and D. Morgan, Work function: Fundamentals, measure- ment, calculation, engineering, and applications, Phys- ical Review Applied19, 037001 (2023)

  10. [10]

    Weston and C

    L. Weston and C. Stampfl, Machine learning the band gap properties of kesterite i 2-ii-iv-v 4 quaternary com- pounds for photovoltaics applications, Physical Review Materials2, 085407 (2018)

  11. [11]

    Nassiri Nazif, A

    K. Nassiri Nazif, A. Daus, J. Hong, N. Lee, S. Vaziri, A. Kumar, F. Nitta, M. E. Chen, S. Kananian, R. Is- lam,et al., High-specific-power flexible transition metal dichalcogenide solar cells, nature Communications12, 7034 (2021)

  12. [12]

    Musil, S

    F. Musil, S. De, J. Yang, J. E. Campbell, G. M. Day, and M. Ceriotti, Machine learning for the structure–energy– property landscapes of molecular crystals, Chemical sci- ence9, 1289 (2018)

  13. [13]

    A. Jain, Y. Shin, and K. A. Persson, Computational pre- dictions of energy materials using density functional the- ory, Nature Reviews Materials1, 1 (2016)

  14. [14]

    K. T. Butler, D. W. Davies, H. Cartwright, O. Isayev, and A. Walsh, Machine learning for molecular and ma- terials science, Nature559, 547 (2018). 9

  15. [15]

    V. Wang, N. Xu, J.-C. Liu, G. Tang, and W.-T. Geng, Vaspkit: A user-friendly interface facilitating high- throughput computing and analysis using vasp code, Computer Physics Communications267, 108033 (2021)

  16. [16]

    S. Guo, R. Morrow, J. van den Brink, and O. Janson, Machine learning facilitated by microscopic features for discovery of novel magnetic double perovskites, Journal of Materials Chemistry A12, 6103 (2024)

  17. [17]

    A. J. Cohen, P. Mori-S´ anchez, and W. Yang, Insights into current limitations of density functional theory, Science 321, 792 (2008)

  18. [18]

    B. G. del Rio, B. Phan, and R. Ramprasad, A deep learn- ing framework to emulate density functional theory, npj Computational Materials9, 158 (2023)

  19. [19]

    V. I. Hegde, C. K. Borg, Z. Del Rosario, Y. Kim, M. Hutchinson, E. Antono, J. Ling, P. Saxe, J. E. Saal, and B. Meredig, Quantifying uncertainty in high- throughput density functional theory: A comparison of aflow, materials project, and oqmd, Physical Review Ma- terials7, 053805 (2023)

  20. [20]

    Nakata, D

    A. Nakata, D. R. Bowler, and T. Miyazaki, Large-scale dft methods for calculations of materials with complex structures, Journal of the Physical Society of Japan91, 091011 (2022)

  21. [21]

    L. Ward, R. Liu, A. Krishna, V. I. Hegde, A. Agrawal, A. Choudhary, and C. Wolverton, Including crystal struc- ture attributes in machine learning models of formation energies via voronoi tessellations, Physical Review B96, 024104 (2017)

  22. [22]

    Schmidt, M

    J. Schmidt, M. R. Marques, S. Botti, and M. A. Marques, Recent advances and applications of machine learning in solid-state materials science, npj computational materi- als5, 83 (2019)

  23. [23]

    Curtarolo, W

    S. Curtarolo, W. Setyawan, S. Wang, J. Xue, K. Yang, R. H. Taylor, L. J. Nelson, G. L. Hart, S. San- vito, M. Buongiorno-Nardelli,et al., Aflowlib. org: A distributed materials properties repository from high- throughput ab initio calculations, Computational Mate- rials Science58, 227 (2012)

  24. [24]

    Kirklin, J

    S. Kirklin, J. E. Saal, B. Meredig, A. Thompson, J. W. Doak, M. Aykol, S. R¨ uhl, and C. Wolverton, The open quantum materials database (oqmd): assessing the accu- racy of dft formation energies, npj Computational Mate- rials1, 1 (2015)

  25. [25]

    Choudhary, K

    K. Choudhary, K. F. Garrity, A. C. Reid, B. DeCost, A. J. Biacchi, A. R. Hight Walker, Z. Trautt, J. Hattrick- Simpers, A. G. Kusne, A. Centrone,et al., The joint automated repository for various integrated simulations (jarvis) for data-driven materials design, npj computa- tional materials6, 173 (2020)

  26. [26]

    M. D. Witman, A. Goyal, T. Ogitsu, A. H. McDaniel, and S. Lany, Defect graph neural networks for materials discovery in high-temperature clean-energy applications, Nature Computational Science3, 675 (2023)

  27. [27]

    Haastrup, M

    S. Haastrup, M. Strange, M. Pandey, T. Deilmann, P. S. Schmidt, N. F. Hinsche, M. N. Gjerding, D. Torelli, P. M. Larsen, A. C. Riis-Jensen,et al., The computational 2d materials database: high-throughput modeling and dis- covery of atomically thin crystals, 2D Materials5, 042002 (2018)

  28. [28]

    Davariashtiyani and S

    A. Davariashtiyani and S. Kadkhodaei, Formation energy prediction of crystalline compounds using deep convolu- tional network learning on voxel image representation, Communications Materials4, 105 (2023)

  29. [29]

    N. Mao, C. Xu, J. Li, T. Bao, P. Liu, Y. Xu, C. Felser, L. Fu, and Y. Zhang, Transfer learning relaxation, elec- tronic structure and continuum model for twisted bilayer mote2, Communications Physics7, 262 (2024)

  30. [30]

    X. Luo, Z. Wang, P. Gao, J. Lv, Y. Wang, C. Chen, and Y. Ma, Deep learning generative model for crystal structure prediction, npj Computational Materials10, 254 (2024)

  31. [31]

    A. E. Allen and A. Tkatchenko, Machine learning of ma- terial properties: Predictive and interpretable multilinear models, Science advances8, eabm7185 (2022)

  32. [32]

    2025-01.https://arxiv

    Bao, T.; Mao, N.; Duan, W.; Xu, Y.; Del Mae- stro, A.; Zhang, Y., Transfer learning electronic struc- ture: millielectron volt accuracy for sub-million-atom moir´ e semiconductor, arXiv. 2025-01.https://arxiv. org/abs/2501.12452, (accessed 2026-03-19)

  33. [33]

    Pilania, C

    G. Pilania, C. Wang, X. Jiang, S. Rajasekaran, and R. Ramprasad, Accelerating materials property predic- tions using machine learning, Scientific reports3, 2810 (2013)

  34. [34]

    L. Ward, A. Agrawal, A. Choudhary, and C. Wolverton, A general-purpose machine learning framework for pre- dicting properties of inorganic materials, npj Computa- tional Materials2, 1 (2016)

  35. [35]

    Y. Zhuo, A. Mansouri Tehrani, A. O. Oliynyk, A. C. Duke, and J. Brgoch, Identifying an efficient, thermally robust inorganic phosphor host via machine learning, Na- ture communications9, 4377 (2018)

  36. [36]

    2025-03.https://arxiv.org/abs/2503

    Wetzel, S.J.; Ha, S.; Iten, R.; Klopotek, M.; Liu, Z., Interpretable Machine Learning in Physics: A Re- view, arXiv. 2025-03.https://arxiv.org/abs/2503. 23616, (accessed 2026-03-19)

  37. [37]

    2024-05.https://arxiv.org/abs/2405

    Du, Z.; Jin, L.; Shu, L.; Cen, Y.; Xu, Y.; Mei, Y.; Zhang, H., CTGNN: Crystal Transformer Graph Neural Network for Crystal Material Property Pre- diction, arXiv. 2024-05.https://arxiv.org/abs/2405. 11502, (accessed 2026-03-19)

  38. [38]

    Bechtel, D

    T. Bechtel, D. T. Speckhard, J. Godwin, and C. Draxl, Band-gap regression with architecture- optimized message-passing neural networks, Chemistry of Materials37, 1358 (2025)

  39. [39]

    2023-01.https://arxiv.org/abs/2301

    Sanyal, S.; Sagotra, A.K.; Kumar, N.; Rathi, S.; Krishna, M.; Somayajula, N.; Palanisamy, D.; Rat- nakar, R.R.; Sanyal, S.; Talukdar, P.; Waghmare, U.; Balachandran, J., Potential energy surface predic- tion of Alumina polymorphs using graph neural net- work, arXiv. 2023-01.https://arxiv.org/abs/2301. 12059, (accessed 2026-03-19)

  40. [40]

    S. C. Selvaraj, Graph neural networks based deep learn- ing for predicting structural and electronic properties, arXiv preprint arXiv:2411.02331 (2024)

  41. [41]

    Merchant, S

    A. Merchant, S. Batzner, S. S. Schoenholz, M. Aykol, G. Cheon, and E. D. Cubuk, Scaling deep learning for materials discovery, Nature624, 80 (2023)

  42. [42]

    C. J. Bartel, A. Trewartha, Q. Wang, A. Dunn, A. Jain, and G. Ceder, A critical examination of compound stabil- ity predictions from machine-learned formation energies, npj computational materials6, 97 (2020)

  43. [43]

    2022-06.https://arxiv.org/abs/2206

    Tian, S.I.P.; Walsh, A.; Ren, Z.; Li, Q.; Buonas- sisi, T., What Information is Necessary and Sufficient to Predict Materials Properties using Machine Learn- ing?, arXiv. 2022-06.https://arxiv.org/abs/2206. 04968, (accessed 2026-03-19). 10

  44. [44]

    L. M. Antunes, R. Grau-Crespo, and K. T. Butler, Distributed representations of atoms and materials for machine learning, npj Computational Materials8, 44 (2022)

  45. [45]

    A. Ma, Y. Zhang, T. Christensen, H. C. Po, L. Jing, L. Fu, and M. Soljacic, Topogivity: A machine-learned chemical rule for discovering topological materials, Nano Letters23, 772 (2023)

  46. [46]

    S. G. Jung, G. Jung, and J. M. Cole, Automatic pre- diction of band gaps of inorganic materials using a gra- dient boosted and statistical feature selection workflow, Journal of Chemical Information and Modeling64, 1187 (2024)

  47. [47]

    Ma, A.; Dugan, O.; Soljaˇ ci´ c, M., Predicting band gap from chemical composition: A simple learned model for a material property with atypical statistics, arXiv. 2025-01. https://arxiv.org/abs/2501.02932, (accessed 2026- 03-19)

  48. [48]

    G. G. Peterson and J. Brgoch, Materials discovery through machine learning formation energy, Journal of Physics: Energy3, 022002 (2021)

  49. [49]

    Isayev, C

    O. Isayev, C. Oses, C. Toher, E. Gossett, S. Curtarolo, and A. Tropsha, Universal fragment descriptors for pre- dicting properties of inorganic crystals, Nature commu- nications8, 15679 (2017)

  50. [50]

    Damewood, J

    J. Damewood, J. Karaguesian, J. R. Lunger, A. R. Tan, M. Xie, J. Peng, and R. G´ omez-Bombarelli, Representa- tions of materials for machine learning, Annual Review of Materials Research53, 399 (2023)

  51. [51]

    Z. Liu, Y. Wang, S. Vaidya, F. Ruehle, J. Halver- son, M. Soljaˇ ci´ c, T. Y. Hou, and M. Tegmark, Kan: Kolmogorov-arnold networks, arXiv preprint arXiv:2404.19756 (2024)

  52. [52]

    A. N. Kolmogorov, On the representations of continuous functions of many variables by superposition of continu- ous functions of one variable and addition, inDokl. Akad. Nauk USSR, Vol. 114 (1957) pp. 953–956

  53. [53]

    Zhong, B

    X. Zhong, B. Gallagher, S. Liu, B. Kailkhura, A. Hisz- panski, and T. Y.-J. Han, Explainable machine learning in materials science, npj computational materials8, 204 (2022)

  54. [54]

    Z. Chen, Y. Liu, and H. Sun, Physics-informed learning of governing equations from scarce data, Nature commu- nications12, 6136 (2021)

  55. [55]

    S. R. Xie, M. Rupp, and R. G. Hennig, Ultra-fast inter- pretable machine-learning potentials, npj Computational Materials9, 162 (2023)

  56. [56]

    Y. Gao, Z. Hu, W.-A. Chen, M. Liu, and Y. Ruan, A revolutionary neural network architecture with inter- pretability and flexibility based on kolmogorov–arnold for solar radiation and temperature forecasting, Applied En- ergy378, 124844 (2025)

  57. [57]

    2025-02.https://arxiv.org/abs/2502.14681, (ac- cessed 2026-03-19)

    Boura, T.; Konstantopoulos, S., seqKAN: Sequence processing with Kolmogorov-Arnold Networks, arXiv. 2025-02.https://arxiv.org/abs/2502.14681, (ac- cessed 2026-03-19)

  58. [58]

    Ranasinghe, N.; Xia, Y.; Seneviratne, S.; Halgamuge, S., GINN-KAN: Interpretability pipelining with applications in Physics Informed Neural Networks, arXiv. 2024-08. https://arxiv.org/abs/2408.14780, (accessed 2026- 03-19)

  59. [59]

    2024-11.https://arxiv.org/abs/2411.08414, (accessed 2026-03-19)

    Huang, C.; Chen, C.; Shi, L.; Chen, C., Material Property Prediction with Element Attribute Knowl- edge Graphs and Multimodal Representation Learning, arXiv. 2024-11.https://arxiv.org/abs/2411.08414, (accessed 2026-03-19)

  60. [60]

    Tshitoyan, J

    V. Tshitoyan, J. Dagdelen, L. Weston, A. Dunn, Z. Rong, O. Kononova, K. A. Persson, G. Ceder, and A. Jain, Unsupervised word embeddings capture latent knowledge from materials science literature, Nature571, 95 (2019)

  61. [61]

    2018-11.https://arxiv.org/abs/1811

    Sanyal, S.; Balachandran, J.; Yadati, N.; Kumar, A.; Rajagopalan, P.; Sanyal, S.; Talukdar, P., MT-CGCNN: Integrating Crystal Graph Convolutional Neural Net- work with Multitask Learning for Material Property Pre- diction, arXiv. 2018-11.https://arxiv.org/abs/1811. 05660, (accessed 2026-03-19)

  62. [62]

    O. T. Unke and M. Meuwly, Physnet: A neural network for predicting energies, forces, dipole moments, and par- tial charges, Journal of chemical theory and computation 15, 3678 (2019)

  63. [63]

    P. Roy, L. Rekhi, S. W. Koh, H. Li, and T. S. Choksi, Pre- dicting the work function of 2d mxenes using machine- learning methods, Journal of Physics: Energy5, 034005 (2023)

  64. [64]

    A. Wong, Netscore: towards universal metrics for large- scale performance analysis of deep neural networks for practical on-device edge usage, inInternational Confer- ence on Image Analysis and Recognition(Springer, 2019) pp. 15–26

  65. [65]

    2020-01.https://arxiv.org/abs/2001

    Kaplan, J.; McCandlish, S.; Henighan, T.; Brown, T.B.; Chess, B.; Child, R.; Gray, S.; Radford, A.; Wu, J.; Amodei, D., Scaling Laws for Neural Language Models, arXiv. 2020-01.https://arxiv.org/abs/2001. 08361, (accessed 2026-03-19)

  66. [66]

    electronegativity–radius

    L. Mentel,mendeleev– a python resource for proper- ties of chemical elements, ions and isotopes,https:// github.com/lmmentel/mendeleev(2014), version 1.1.0. 11 TOC Graphic 12 Supplemental Materials DETAILS OF DATA PREPARATION The datasets utilized in this study were sourced from established materials databases to ensure reproducibility and comparability w...