Pith · machine review for the scientific record

arxiv: 2605.02267 · v1 · submitted 2026-05-04 · ❄️ cond-mat.mtrl-sci · physics.comp-ph

Recognition: 3 theorem links


Composition-Weighted Symbolic Regression for General-Purpose Property Prediction

Authors on Pith: no claims yet

Pith reviewed 2026-05-08 18:50 UTC · model grok-4.3

classification ❄️ cond-mat.mtrl-sci physics.comp-ph
keywords symbolic regression · materials property prediction · chemical composition · interpretable machine learning · MatBench benchmarks · elemental weighting · property screening

The pith

A composition-weighted symbolic regression method predicts materials properties from chemical formulas by jointly learning analytical expressions and elemental weights.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper introduces a symbolic regression approach that weights elements according to composition to predict material properties directly from stoichiometry. It learns both the functional form of each property and task-specific elemental weights at the same time, without relying on hand-crafted descriptors. Max and min operators are built in to respect physical limits such as non-negative band gaps. On standard MatBench benchmarks the resulting models reach accuracy levels comparable to black-box machine learning while returning explicit, human-readable equations. When applied to III-V semiconductor alloys the expressions produce smooth composition trends and elemental weights that follow periodic chemical patterns.
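A minimal sketch of that structure, assuming a linear inner form and invented weight values (the paper learns both the expression and the weights per task; nothing below is the authors' actual model):

```python
# Minimal sketch: evaluating a composition-weighted expression.
# The elemental weights and the inner linear form are invented for
# illustration; the paper learns both jointly for each property task.

def predict_band_gap(composition, weights, a, b):
    """Toy expression max(0, a*x + b), where x is the weighted sum of
    atomic fractions. The max(0, .) clamp keeps band gaps non-negative."""
    total = sum(composition.values())
    x = sum(weights[el] * n / total for el, n in composition.items())
    return max(0.0, a * x + b)

# Hypothetical weights for a III-V task (not learned values).
weights = {"Al": 1.8, "Ga": 1.0, "As": 0.4}

# Sweeping AlxGa1-xAs gives a smooth, monotone composition trend.
for frac in (0.0, 0.5, 1.0):
    comp = {"Al": frac, "Ga": 1.0 - frac, "As": 1.0}
    print(predict_band_gap(comp, weights, a=2.0, b=0.1))
```

The same clamp makes a classification task expressible in the identical form, which is how max/min operators can unify the two settings.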

Core claim

We introduce a composition-weighted symbolic regression framework for interpretable prediction of materials properties directly from chemical composition. The method jointly learns analytical functional forms and task-dependent elemental weightings without predefined descriptors. By incorporating max/min operators, it naturally enforces constraints such as non-negative band gaps and bounded classification probabilities, unifying regression and classification tasks. Efficient search is achieved through a hybrid Monte Carlo tree search–genetic programming algorithm with gradient-based refinement and parallel computation. Benchmarks on MatBench tasks show competitive accuracy relative to state-of-the-art black-box models while yielding explicit analytical expressions.

What carries the argument

Composition-weighted symbolic regression, which embeds learned elemental weightings inside symbolic expressions and uses max/min operators to enforce physical constraints.
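To make the refinement step concrete: the continuous parameters inside such a clamped expression remain piecewise differentiable, so plain (sub)gradient descent can fit them. This is a hedged sketch, not the authors' optimizer (the paper couples gradient-based refinement with a hybrid MCTS-GP search); the one-variable form, data, and learning rate are invented:

```python
# Sketch of gradient-based refinement through a max(0, .) clamp.
# The objective is piecewise differentiable: zero gradient on the
# clamped branch, the usual least-squares gradient elsewhere.

def fit(xs, ys, a=1.0, b=0.0, lr=0.1, steps=20000):
    """(Sub)gradient descent on mean squared error of max(0, a*x + b)."""
    n = len(xs)
    for _ in range(steps):
        ga = gb = 0.0
        for x, y in zip(xs, ys):
            pred = max(0.0, a * x + b)
            if a * x + b > 0.0:  # active (unclamped) branch only
                ga += 2.0 * (pred - y) * x / n
                gb += 2.0 * (pred - y) / n
        a -= lr * ga
        b -= lr * gb
    return a, b

# Synthetic data from max(0, 2x - 1); the fit recovers a ~ 2, b ~ -1.
xs = [i / 10 for i in range(11)]
ys = [max(0.0, 2.0 * x - 1.0) for x in xs]
print(fit(xs, ys))
```

Note that the all-clamped region is flat (zero gradient), which is one reason such refinement has to be paired with a symbolic search that proposes non-degenerate starting expressions.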

If this is right

  • Predictions can be made directly from composition without manual feature engineering or predefined descriptors.
  • Both regression and classification tasks are handled in one framework with built-in enforcement of physical bounds.
  • Explicit analytical expressions are produced that remain competitive in accuracy with state-of-the-art black-box models.
  • Application to semiconductor alloys yields smooth composition trends and elemental weights exhibiting periodic chemical behavior.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the authors make directly.

  • The explicit formulas could let researchers quickly inspect which elements most influence a target property and use that insight to propose new candidate materials.
  • Because the method stays close to composition alone, it may serve as a fast first filter before more expensive structure-based calculations are run.
  • The observed periodic patterns in learned weights point to possible links with established periodic-table descriptors that could be tested by comparing against known electronegativity or size trends.
  • Extending the same weighting idea to include simple structural features such as crystal system might further improve accuracy for properties sensitive to ordering.

Load-bearing premise

The jointly learned analytical forms and composition-dependent elemental weightings will generalize to new compositions and tasks without overfitting.

What would settle it

A strict out-of-distribution test: evaluate the trained models on compositions held entirely outside the training distribution. The claim would fail if prediction errors exceeded those of black-box models, or if the extracted expressions violated known chemical trends or failed to reproduce smooth alloy behavior.

Figures

Figures reproduced from arXiv: 2605.02267 by Jingrun Chen, Yang Huang.

Figure 1. Hybrid Monte Carlo tree search and genetic …
Figure 2. Learned elemental weights visualized on the …
Figure 3. Scatter plot of …
Figure 4. Band gap comparison for selected III–V semiconductor alloys (adapted from Ref. [33]). Experimental data …
read the original abstract

We introduce a composition-weighted symbolic regression framework for interpretable prediction of materials properties directly from chemical composition. The method jointly learns analytical functional forms and task-dependent elemental weightings without predefined descriptors. By incorporating max/min operators, it naturally enforces constraints such as non-negative band gaps and bounded classification probabilities, unifying regression and classification tasks. Efficient search is achieved through a hybrid Monte Carlo tree search–genetic programming algorithm with gradient-based refinement and parallel computation. Benchmarks on MatBench tasks show competitive accuracy relative to state-of-the-art black-box models while yielding explicit analytical expressions. Applied to III–V semiconductor alloys, the model produces smooth composition-dependent trends and learned elemental weights with chemically meaningful periodic behavior. This framework provides a scalable and interpretable route for materials discovery and property screening.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 2 minor

Summary. The paper introduces a composition-weighted symbolic regression framework that jointly learns analytical functional forms and task-dependent elemental weightings directly from chemical composition for materials property prediction. It employs a hybrid Monte Carlo tree search-genetic programming algorithm with gradient-based refinement, incorporates max/min operators to enforce physical constraints, and unifies regression and classification. Benchmarks on MatBench tasks are reported to achieve competitive accuracy relative to black-box models while producing explicit expressions; the method is further demonstrated on III-V semiconductor alloys yielding smooth composition trends and chemically interpretable weights.

Significance. If the reported MatBench performance and generalization hold, the work offers a meaningful advance in interpretable ML for materials science by delivering explicit, composition-dependent analytical expressions without relying on hand-crafted descriptors. The hybrid search strategy and constraint-enforcing operators address key limitations of standard symbolic regression, and the emphasis on chemically meaningful weights could support physical insight and screening applications.

major comments (2)
  1. [Results (MatBench benchmarks)] MatBench benchmark results (likely in the results section): the manuscript reports competitive accuracy but provides no error bars from repeated runs, no details on the exact train/validation/test splits used, and no ablation studies isolating the contribution of the learned elemental weightings versus the symbolic forms alone. This makes it difficult to evaluate whether the performance is robust or sensitive to post-hoc choices in the joint optimization.
  2. [Application to III-V alloys / Discussion] Generalization discussion (likely in the III-V alloys application or conclusions): the central claim that the jointly learned forms and composition-dependent weights generalize relies on in-distribution MatBench cross-validation, but no explicit out-of-distribution tests on compositions or chemical spaces absent from training are presented. For symbolic methods with per-element parameters, standard CV does not rule out overfitting to training composition statistics.
minor comments (2)
  1. [Methods] Notation for the composition-weighted expressions should be clarified with an explicit equation defining how elemental weights enter the functional form (e.g., as multipliers inside the symbolic tree).
  2. [Figures] Figure captions for the III-V alloy trends should include the specific MatBench task identifiers and the number of compositions used in training versus prediction.
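For the first minor comment, one plausible explicit form (illustrative only; the paper's actual equation is not quoted in this review) would place the learned weights as multipliers on the atomic fractions inside the symbolic tree:

```latex
% Illustrative notation, not the paper's verbatim equation:
% w_e: task-dependent elemental weight, x_e: atomic fraction of element e,
% g: symbolic expression discovered by the MCTS-GP search.
\hat{y} = \max\Big(0,\; g\Big(\textstyle\sum_{e} w_e\, x_e\Big)\Big)
```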

Simulated Authors' Rebuttal

2 responses · 0 unresolved

We thank the referee for their constructive and detailed comments, which have helped clarify several aspects of our evaluation and claims. We address each major comment point by point below, indicating the revisions made to the manuscript.

read point-by-point responses
  1. Referee: [Results (MatBench benchmarks)] MatBench benchmark results (likely in the results section): the manuscript reports competitive accuracy but provides no error bars from repeated runs, no details on the exact train/validation/test splits used, and no ablation studies isolating the contribution of the learned elemental weightings versus the symbolic forms alone. This makes it difficult to evaluate whether the performance is robust or sensitive to post-hoc choices in the joint optimization.

    Authors: We agree that reporting error bars, explicit split details, and ablation studies would improve the transparency and robustness assessment of the MatBench results. In the revised manuscript, we have added error bars from five independent runs using different random seeds for the Monte Carlo tree search and genetic programming components. We have also included a supplementary table with the precise train/validation/test splits employed for each task, following the official MatBench protocol. Additionally, we performed ablation experiments comparing the full model (jointly learned weights and forms) against a baseline with fixed uniform elemental weights; these results are now reported in Section 4.2 and demonstrate that the learned weightings provide measurable gains on multiple tasks. These changes address the concern about sensitivity to optimization choices. revision: yes

  2. Referee: [Application to III-V alloys / Discussion] Generalization discussion (likely in the III-V alloys application or conclusions): the central claim that the jointly learned forms and composition-dependent weights generalize relies on in-distribution MatBench cross-validation, but no explicit out-of-distribution tests on compositions or chemical spaces absent from training are presented. For symbolic methods with per-element parameters, standard CV does not rule out overfitting to training composition statistics.

    Authors: The referee correctly identifies a potential limitation: standard cross-validation on MatBench may not fully exclude overfitting to element co-occurrence statistics when per-element parameters are learned. While the diversity of MatBench tasks provides some protection and the III-V demonstration shows chemically plausible weights, we acknowledge that explicit out-of-distribution tests would offer stronger support. In the revised manuscript we have added a targeted out-of-distribution experiment (detailed in the updated Section 5), training on compositions excluding a subset of elements and evaluating on held-out chemical spaces; performance degrades gracefully but remains competitive, and the learned weights retain periodic trends. We have also expanded the discussion to explicitly note the limitations of in-distribution CV for this class of models and the value of such tests for future work. revision: partial

Circularity Check

0 steps flagged

No circularity; algorithmic framework with external benchmarks

full rationale

The paper introduces a composition-weighted symbolic regression method using hybrid MCTS-GP search and gradient refinement to jointly learn analytical forms and elemental weightings. Performance claims rest on MatBench benchmarks and III-V alloy applications, which are external to the fitted parameters. No equations reduce predictions to inputs by construction, no self-citations bear the central load, and no uniqueness theorems or ansatzes are smuggled in. The derivation chain is self-contained as a new algorithmic procedure rather than a tautological renaming or fit.

Axiom & Free-Parameter Ledger

1 free parameter · 2 axioms · 1 invented entity

The central claim rests on standard assumptions of symbolic regression search plus the novel weighting and operator choices; no new physical axioms or particles are introduced.

free parameters (1)
  • task-dependent elemental weightings
    Learned per prediction task to scale elemental contributions inside the symbolic expressions.
axioms (2)
  • domain assumption Symbolic regression via genetic programming and Monte Carlo tree search can discover useful analytical forms from composition-property data.
    Invoked as the core search mechanism throughout the framework description.
  • domain assumption Max and min operators can be inserted into symbolic expressions to enforce physical bounds such as non-negative band gaps.
    Used to unify regression and classification tasks.
invented entities (1)
  • composition-weighted symbolic expressions (no independent evidence)
    purpose: To enable interpretable property prediction directly from chemical composition without predefined descriptors.
    The core modeling innovation introduced by the paper.

pith-pipeline@v0.9.0 · 5422 in / 1436 out tokens · 76371 ms · 2026-05-08T18:50:35.379072+00:00 · methodology

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith reviews without signing in.

Lean theorems connected to this paper

Citations machine-checked in the Pith Canon. Every link opens the source theorem in the public Lean library.

What do these tags mean?
  • matches: The paper's claim is directly supported by a theorem in the formal canon.
  • supports: The theorem supports part of the paper's argument, but the paper may add assumptions or extra steps.
  • extends: The paper goes beyond the formal theorem; the theorem is a base layer rather than the whole result.
  • uses: The paper appears to rely on the theorem as machinery.
  • contradicts: The paper's claim conflicts with a theorem or certificate in the canon.
  • unclear: Pith found a possible connection, but the passage is too broad, indirect, or ambiguous to say the theorem truly supports the claim.

Reference graph

Works this paper leans on

62 extracted references · 8 canonical work pages

  1. [1] Gradient-Based Optimization Strategy: A gradient-based refinement strategy is adopted for continuous parameter optimization. Although operators such as max(·,·) and min(·,·) introduce non-smoothness, the resulting objective remains piecewise differentiable, with derivatives defined almost everywhere except at switching boundaries. This structure permits th…
  2. [2] Hybrid MCTS and GP: To improve search efficiency, we employ a hybrid Monte Carlo tree search–genetic programming (MCTS–GP) framework, extending recent symbolic-regression search strategies [23–26]. The method combines the directed exploration of MCTS with the "stage-jumping" [23] of GP, enabling efficient traversal of the enlarged symbolic and parametric …
  3. [3] Parallelism: To further improve computational efficiency, we implement parallelism in both the GP and MCTS components of the framework. In the GP stage, we select a batch of expressions from the root expression queue (with twice the target batch size to facilitate crossover operations). These expressions are then subjected to mutation or crossover to gen…
  4. [4] T. Xie and J. C. Grossman, Crystal graph convolutional neural networks for an accurate and interpretable prediction of material properties, Phys. Rev. Lett. 120, 145301 (2018)
  5. [5] K. T. Schütt, H. E. Sauceda, P.-J. Kindermans, A. Tkatchenko, and K.-R. Müller, SchNet: a deep learning architecture for molecules and materials, J. Chem. Phys. 148 (2018)
  6. [6] C. Chen, W. Ye, Y. Zuo, C. Zheng, and S. P. Ong, Graph networks as a universal machine learning framework for molecules and crystals, Chem. Mater. 31, 3564 (2019)
  7. [7] J. Gasteiger, J. Groß, and S. Günnemann, Directional message passing for molecular graphs, arXiv preprint arXiv:2003.03123 (2020)
  8. [8] A. Dunn, Q. Wang, A. Ganose, D. Dopp, and A. Jain, Benchmarking materials property prediction methods: the Matbench test set and Automatminer reference algorithm, Npj Comput. Mater. 6, 138 (2020)
  9. [9] Y.-L. Liao and T. Smidt, Equiformer: Equivariant graph attention transformer for 3D atomistic graphs, arXiv preprint arXiv:2206.11990 (2022)
  10. [10] J. Kim, J. You, Y. Park, Y. Lim, Y. Kang, J. Kim, H. Jeon, S. Ju, D. Hong, S. Y. Lee, et al., Optimizing cross-domain transfer for universal machine learning interatomic potentials, Nat. Commun. (2026)
  11. [11] Y. Park, J. Kim, S. Hwang, and S. Han, Scalable parallel algorithm for graph neural network interatomic potentials in molecular dynamics simulations, J. Chem. Theory Comput. 20, 4857 (2024)
  12. [12] D. Zhang, X. Liu, X. Zhang, C. Zhang, C. Cai, H. Bi, Y. Du, X. Qin, A. Peng, J. Huang, et al., DPA-2: a large atomic model as a multi-task learner, Npj Comput. Mater. 10, 293 (2024)
  13. [13] D. Zhang, H. Bi, F.-Z. Dai, W. Jiang, X. Liu, L. Zhang, and H. Wang, Pretraining of attention-based deep learning potential model for molecular simulation, Npj Comput. Mater. 10, 94 (2024)
  14. [14] P.-P. De Breuck, G. Hautier, and G.-M. Rignanese, Materials property prediction for limited datasets enabled by feature selection and joint learning with MODNet, Npj Comput. Mater. 7, 83 (2021)
  15. [15] P.-P. De Breuck, M. L. Evans, and G.-M. Rignanese, Robust model benchmarking and bias-imbalance in data-driven materials science: a case study on MODNet, J. Phys. Condens. Matter 33, 404002 (2021)
  16. [16] R. Ruff, P. Reiser, J. Stühmer, and P. Friederich, Connectivity optimized nested line graph networks for crystal structures, Digit. Discov. 3, 594 (2024)
  17. [17] A. Ihalage and Y. Hao, Formula Graph Self-Attention Network for Representation-Domain Independent Materials Discovery, Adv. Sci. 9, 2200164 (2022)
  18. [18] A. Y.-T. Wang, S. K. Kauwe, R. J. Murdock, and T. D. Sparks, Compositionally restricted attention-based network for materials property predictions, Npj Comput. Mater. 7, 77 (2021)
  19. [19] A. Y.-T. Wang, M. S. Mahmoud, M. Czasny, and A. Gurlo, CrabNet for Explainable Deep Learning in Materials Science: Bridging the Gap Between Academia and Industry, Integr. Mater. Manuf. Innov. 11, 41 (2022)
  20. [20] B. Meredig, A. Agrawal, S. Kirklin, J. E. Saal, J. W. Doak, A. Thompson, K. Zhang, A. Choudhary, and C. Wolverton, Combinatorial screening for new materials in unconstrained composition space with machine learning, Phys. Rev. B 89, 094104 (2014)
  21. [21] L. Ward, A. Agrawal, A. Choudhary, and C. Wolverton, A general-purpose machine learning framework for predicting properties of inorganic materials, Npj Comput. Mater. 2, 16028 (2016)
  22. [22] R. E. A. Goodall and A. A. Lee, Predicting Materials Properties without Crystal Structure: Deep Representation Learning from Stoichiometry, Nat. Commun. 11, 6280 (2020)
  23. [23] Z. Guo, S. Hu, Z.-K. Han, and R. Ouyang, Improving symbolic regression for predicting materials properties with iterative variable selection, J. Chem. Theory Comput. 18, 4945 (2022)
  24. [24] A. Ma, O. Dugan, and M. Soljačić, Predicting band gap from chemical composition: A simple learned model for a material property with atypical statistics, arXiv preprint arXiv:2501.02932 (2025)
  25. [25] D. R. S. Saputro and P. Widyaningsih, Limited memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) method for the parameter estimation on geographically weighted ordinal logistic regression model (GWOLR), in AIP Conference Proceedings, Vol. 1868 (AIP Publishing LLC, …
  26. [26] Z. Huang, D. Z. Huang, T. Xiao, D. Ma, Z. Ming, H. Shi, and Y. Wen, Improving Monte Carlo Tree Search for Symbolic Regression, arXiv preprint arXiv:2509.15929 (2025)
  27. [27] Y. Xu, Y. Liu, and H. Sun, RSRM: Reinforcement symbolic regression machine, arXiv preprint arXiv:2305.14656 (2023)
  28. [28] M. Landajuela, C. S. Lee, J. Yang, R. Glatt, C. P. Santiago, I. Aravena, T. Mundhenk, G. Mulcahy, and B. K. Petersen, A unified framework for deep symbolic regression, Adv. Neural Inf. Process. Syst. 35, 33985 (2022)
  29. [29] T. N. Mundhenk, M. Landajuela, R. Glatt, C. P. Santiago, D. M. Faissol, and B. K. Petersen, Symbolic regression via neural-guided genetic programming population seeding, arXiv preprint arXiv:2111.00053 (2021)
  30. [30] A. Dunn, Q. Wang, A. Ganose, D. Dopp, and A. Jain, MatBench v0.1 Leaderboard: matbench_expt_gap, https://matbench.materialsproject.org/Leaderboards%20Per-Task/matbench_v0.1_matbench_expt_gap/, accessed 2026-05-04
  31. [31] A. Dunn, Q. Wang, A. Ganose, D. Dopp, and A. Jain, MatBench v0.1 Leaderboard: matbench_expt_is_metal, https://matbench.materialsproject.org/Leaderboards%20Per-Task/matbench_v0.1_matbench_expt_is_metal/, Materials Project MatBench leaderboard, accessed 2026-05-04
  32. [32] A. Dunn, Q. Wang, A. Ganose, D. Dopp, and A. Jain, MatBench v0.1 Leaderboard: matbench_glass, https://matbench.materialsproject.org/Leaderboards%20Per-Task/matbench_v0.1_matbench_glass/, Materials Project MatBench leaderboard, accessed 2026-05-04
  33. [33] T. Xie, Y. Wan, W. Huang, Z. Yin, Y. Liu, S. Wang, Q. Linghu, C. Kit, C. Grazian, W. Zhang, I. Razzak, and B. Hoex, DARWIN Series: Domain Specific Large Language Models for Natural Science (2023), arXiv:2308.13565 [cs.CL]
  34. [34] T. Xie, Y. Wan, Y. Liu, Y. Zeng, S. Wang, W. Zhang, C. Grazian, C. Kit, W. Ouyang, D. Zhou, and B. Hoex, DARWIN 1.5: Large Language Models as Materials Science Adapted Learners (2025), arXiv:2412.11970 [cs.CL]
  35. [35] K. M. Jablonka, P. Schwaller, A. Ortega-Guerrero, and B. Smit, Leveraging large language models for predictive chemistry, Nat. Mach. Intell. 6, 161 (2024)
  36. [36] S. Adachi, Properties of Semiconductor Alloys: Group-IV, III-V and II-VI Semiconductors (John Wiley & Sons, 2009)
  37. [37] J. Aubel, U. Reddy, S. Sundaram, W. Beard, and J. Comas, Interband transitions in molecular-beam-epitaxial AlxGa1-xAs/GaAs, J. Appl. Phys. 58, 495 (1985)
  38. [38] D. Aspnes, S. Kelso, R. Logan, and R. Bhat, Optical properties of AlxGa1-xAs, J. Appl. Phys. 60, 754 (1986)
  39. [39] A. Saxena, Non-Γ Deep Levels and the Conduction Band Structure of Ga1-xAlxAs Alloys, Phys. Status Solidi B 105, 777 (1981)
  40. [40] B. Monemar, K. Shih, and G. Pettit, Some optical properties of the AlxGa1-xAs alloys system, J. Appl. Phys. 47, 2604 (1976)
  41. [41] T. Kim, T. Ghong, Y. Kim, S. Kim, D. Aspnes, T. Mori, T. Yao, and B. Koo, Dielectric functions of InxGa1-xAs alloys, Phys. Rev. B 68, 115323 (2003)
  42. [42] D. Gaskill, N. Bottka, L. Aina, and M. Mattingly, Band-gap determination by photoreflectance of InGaAs and InAlAs lattice matched to InP, Appl. Phys. Lett. 56, 1269 (1990)
  43. [43] J. Woolley and J. Warner, Optical energy-gap variation in InAs–InSb alloys, Can. J. Phys. 42, 1879 (1964)
  44. [44] W. Dobbelaere, J. De Boeck, and G. Borghs, Growth and optical characterization of InAs1-xSbx (0 ≤ x ≤ 1) on GaAs and on GaAs-coated Si by molecular beam epitaxy, Appl. Phys. Lett. 55, 1856 (1989)
  45. [45] S. S. Vishnubhatla, B. Eyglunent, and J. C. Woolley, Electroreflectance measurements in mixed III–V alloys, Can. J. Phys. 47, 1661 (1969)
  46. [46] D. Auvergne, J. Camassel, H. Mathieu, and A. Joullie, Piezoreflectance measurements on GaxIn1-xSb alloys, J. Phys. Chem. Sol. 35, 133 (1974)
  47. [47] A. Roth and E. Fortin, Interband magneto-optical study of the In1-xGaxSb alloy system, Can. J. Phys. 56, 1468 (1978)
  48. [48] C. Alibert, A. Joullie, A. Joullie, and C. Ance, Modulation-spectroscopy study of the Ga1-xAlxSb band structure, Phys. Rev. B 27, 4946 (1983)
  49. [49] A. Bignazzi, E. Grilli, M. Guzzi, C. Bocchi, A. Bosacchi, S. Franchi, and R. Magnanini, Direct- and indirect-energy-gap dependence on Al concentration in AlxGa1-xSb (x ≲ 0.41), Phys. Rev. B 57, 2295 (1998)
  50. [50] V. Bellani, M. Geddo, G. Guizzetti, S. Franchi, and R. Magnanini, Thermoreflectance study of the direct optical gap in epitaxial AlxGa1-xSb (x ≲ 0.5), Phys. Rev. B 59, 12272 (1999)
  51. [51] F. Saadallah, N. Yacoubi, F. Genty, and C. Alibert, Photothermal investigations of thermal and optical properties of GaAlAsSb and AlAsSb thin layers, J. Appl. Phys. 94, 5041 (2003)
  52. [52] J. Rodriguez and G. Armelles, Ellipsometric study of AlInAs and AlGaP alloys, J. Appl. Phys. 69, 965 (1991)
  53. [53] S. Choi, Y. Kim, S. Yoo, D. Aspnes, D. Woo, and S. Kim, Optical properties of AlxGa1-xP (0 ≤ x ≤ 0.52) alloys, J. Appl. Phys. 87, 1287 (2000)
  54. [54] A. Onton, M. Lorenz, and W. Reuter, Electronic Structure and Luminescence Processes in In1-xGaxP Alloys, J. Appl. Phys. 42, 3420 (1971)
  55. [55] C. Alibert, G. Bordure, A. Laugier, and J. Chevallier, Electroreflectance and band structure of GaxIn1-xP alloys, Phys. Rev. B 6, 1301 (1972)
  56. [56] J. Schörmann, D. As, K. Lischka, P. Schley, R. Goldhahn, S. Li, W. Löffler, M. Hetterich, and H. Kalt, Molecular beam epitaxy of phase pure cubic InN, Appl. Phys. Lett. 89 (2006)
  57. [57] J. Müllhäuser, O. Brandt, A. Trampert, B. Jenichen, and K. Ploog, Green photoluminescence from cubic In0.4Ga0.6N grown by radio frequency plasma-assisted molecular beam epitaxy, Appl. Phys. Lett. 73, 1230 (1998)
  58. [58] R. Goldhahn, J. Scheiner, S. Shokhovets, T. Frey, U. Köhler, D. As, and K. Lischka, Refractive index and gap energy of cubic InxGa1-xN, Appl. Phys. Lett. 76, 291 (2000)
  59. [59] T. Suzuki, H. Yaguchi, H. Okumura, Y. Ishida, and S. Yoshida, Optical constants of cubic GaN, AlN, and AlGaN alloys, Jpn. J. Appl. Phys. 39, L497 (2000)
  60. [60] A. Kasic, M. Schubert, T. Frey, U. Köhler, D. As, and C. Herzinger, Optical phonon modes and interband transitions in cubic AlxGa1-xN films, Phys. Rev. B 65, 184302 (2002)
  61. [61] M. Kearns, Y. Mansour, and A. Y. Ng, A sparse sampling algorithm for near-optimal planning in large Markov decision processes, Mach. Learn. 49, 193 (2002)
  62. [62] L. Kocsis and C. Szepesvári, Bandit based Monte-Carlo planning, in European Conference on Machine Learning (Springer, 2006) pp. 282–293