pith. machine review for the scientific record.

arxiv: 2604.25241 · v1 · submitted 2026-04-28 · 💻 cs.LG

Recognition: unknown

Categorical Optimization with Bayesian Anchored Latent Trust Regions for Structural Design under High-Dimensional Uncertainty

Huanhuan Gao, Zhangyong Liang


Pith reviewed 2026-05-07 16:35 UTC · model grok-4.3

classification 💻 cs.LG
keywords categorical optimization · Bayesian optimization · structural design · latent embedding · trust regions · finite element analysis · robust optimization · aleatoric uncertainty

The pith

COBALT anchors latent optimization to valid catalog points to handle high-dimensional categorical structural design under uncertainty.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper introduces the COBALT framework to solve categorical optimization problems in structural design where each variable comes from a finite catalog and evaluations are expensive stochastic finite-element analyses. It embeds the catalog into a low-dimensional latent space but anchors the points as a discrete graph to avoid rounding any continuous optimum back to catalog entries, which can change performance or validity. A random tree decomposition allows additive modeling of high-dimensional interactions, and an additive SAAS-GP surrogate guides a trust-region search that always picks valid designs. This approach is demonstrated on bar structures optimizing weight, strain energy, and buckling resistance under uncertainty. A sympathetic reader would care because it maintains physical admissibility throughout the optimization loop and potentially speeds up finding robust designs compared to methods that relax to continuous spaces.

Core claim

COBALT first embeds the physical catalog into a low-dimensional latent representation and locks the mapped instances as a discrete anchored graph. A data-independent random tree decomposition is then used to provide bounded-complexity additive modeling over high-dimensional categorical variables. On this anchored domain, an additive SAAS-GP surrogate is fitted to heteroscedastic MC-FEA observations, and a trust-region discrete graph acquisition search selects the next admissible catalog configuration without continuous relaxation or rounding-off. The proposed strategy is applied to robust design optimization of complex bar structures, considering structural weight, strain energy, and local buckling performance.
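The abstract names the embedding and anchoring steps without detailing them. As a minimal sketch, assuming a PCA embedding and k-nearest-neighbor connectivity (illustrative stand-ins, not the paper's actual choices), locking a catalog into an anchored latent graph might look like:

```python
import numpy as np

def build_anchored_graph(catalog, latent_dim=2, k=4):
    """Embed catalog attribute rows and lock them as a k-NN graph.

    `catalog` is an (n, d) array of physical attributes (area, Iy, Iz, ...).
    PCA and k-NN connectivity are hypothetical stand-ins for COBALT's actual
    embedding; the point is that only the n anchored rows of Z, never an
    interpolated latent coordinate, are ever candidate designs.
    """
    X = catalog - catalog.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)   # PCA via SVD
    Z = X @ Vt[:latent_dim].T                          # locked latent anchors
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)
    A = np.zeros((len(Z), len(Z)), dtype=bool)         # symmetric k-NN adjacency
    for i, js in enumerate(np.argsort(D, axis=1)[:, :k]):
        A[i, js] = A[js, i] = True
    return Z, A

rng = np.random.default_rng(0)
catalog = rng.uniform(size=(20, 5))   # 20 catalog sections, 5 attributes each
Z, A = build_anchored_graph(catalog)
```

Because search moves only along edges of `A`, every acquisition step lands on a valid catalog member by construction, which is the property the abstract calls preserving physical admissibility.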

What carries the argument

The anchored graph formed by locking catalog instances in the low-dimensional latent space, combined with random tree decomposition for additive modeling and discrete trust-region acquisition on the graph.
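The abstract does not spell out the trust-region geometry. One plausible reading, with a hop-count radius on the anchored graph (an assumption, not the paper's stated construction), is that acquisition is maximized only over graph neighbors of the incumbent:

```python
import numpy as np
from collections import deque

def trust_region_candidates(A, center, radius):
    """Nodes within `radius` graph hops of `center` on adjacency matrix A.

    The hop-count trust region is an assumption; the paper's exact
    trust-region definition on the anchored graph is not given here.
    """
    dist = {center: 0}
    q = deque([center])
    while q:
        u = q.popleft()
        if dist[u] == radius:
            continue  # do not expand past the trust-region boundary
        for v in np.flatnonzero(A[u]):
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return sorted(dist)

def select_next(A, acq, center, radius):
    """Acquisition argmax restricted to valid anchored designs in the region."""
    cand = trust_region_candidates(A, center, radius)
    return max(cand, key=lambda i: acq[i])

# Toy path graph 0-1-2-3-4-5, incumbent at node 2, radius 2:
A = np.zeros((6, 6), dtype=bool)
for i in range(5):
    A[i, i + 1] = A[i + 1, i] = True
acq = np.array([0.1, 0.3, 0.2, 0.9, 0.5, 1.0])
best = select_next(A, acq, center=2, radius=2)  # node 5 lies outside the region
```

Shrinking or growing `radius` on failed or successful steps would recover standard trust-region behavior while keeping every candidate a catalog member.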

If this is right

  • Only valid catalog designs are evaluated through the MC-FEA oracle, preserving physical admissibility throughout the active learning loop.
  • The method improves the efficiency of robust categorical structural optimization for problems with high-dimensional variables.
  • Bounded-complexity additive modeling via tree decomposition enables handling of catalog interactions that would otherwise be intractable.
  • The framework applies directly to multi-objective robust design of bar structures involving weight, strain energy, and buckling.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • The anchored-graph approach could transfer to other discrete-choice engineering tasks such as material catalog selection or assembly configuration under uncertainty.
  • If the random tree decomposition misses key higher-order interactions, adding adaptive decomposition steps might further improve fidelity on new catalogs.
  • Extending the trust-region search to incorporate gradient information from the surrogate could accelerate convergence on larger graphs.

Load-bearing premise

The low-dimensional latent embedding with random tree decomposition and anchored graph captures high-dimensional categorical interactions and physical performance without significant loss of fidelity or introduction of spurious correlations.
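Whether this premise holds depends on how variables end up grouped. A data-independent decomposition can be sketched as a seeded shuffle-and-chunk partition — a simplification of the paper's random tree construction, kept only to show the bounded-complexity additive structure:

```python
import numpy as np

def random_tree_groups(n_vars, max_group, seed):
    """Data-independent random partition of variable indices.

    Stand-in for the paper's random tree decomposition: the decomposition is
    drawn without looking at any MC-FEA data, and each additive GP component
    sees at most `max_group` variables. Here we simply shuffle and chunk,
    which bounds complexity but says nothing about which variables interact.
    """
    rng = np.random.default_rng(seed)
    order = rng.permutation(n_vars)
    return [sorted(order[i:i + max_group].tolist())
            for i in range(0, n_vars, max_group)]

groups = random_tree_groups(n_vars=10, max_group=3, seed=1)
covered = sorted(i for g in groups for i in g)  # each variable in exactly one group
```

The premise fails exactly when two variables with a strong joint effect (say, a cross-section pair governing buckling) land in different groups; the surrogate then models their interaction as additive noise.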

What would settle it

A test catalog on which COBALT's final selected design violates a physical constraint, or shows higher uncertainty than a rounding-based baseline, would indicate that the anchoring and embedding do not preserve admissibility as claimed.

Figures

Figures reproduced from arXiv: 2604.25241 by Huanhuan Gao, Zhangyong Liang.

Figure 1: The instance geometric parameters following a Gaussian distribution.
Figure 2: The active learning loop of the proposed COBALT framework. The physical catalog is embedded and locked as the discrete latent grid ΩD. A random tree decomposition and SAAS-GP surrogate guide trust-region acquisition over admissible anchored designs, while MC-FEA evaluates the selected configuration under aleatoric uncertainty and updates the Bayesian loop.
Figure 3: The load-case illustration of the ten-beam structure.
Figure 4: The convergence history of the 10-beam structure optimization: (a) robust objective convergence with the best feasible trajectory, (b) …
Figure 5: The failure probabilities and constraint violation of di…
Figure 6: Uncertainty sensitivity of deterministic manifold-based optimization for the ten-beam problem. The rows show latent search results, …
Figure 7: The optimization path in the original physical attribute space for the ten-beam optimization problem.
Figure 8: The optimization path in the low-dimensional design space with uncertainties for the ten-beam optimization problem.
Figure 9: The load-case illustration of the dome structure.
Figure 10: The convergence history of the dome optimization: (a) robust objective convergence, (b) mass and buckling-constraint evolution, and …
Figure 11: The optimization path in the original physical attribute space for the dome optimization problem.
Figure 12: The optimization path in the low-dimensional design space with uncertainties for the dome optimization problem.
Figure 13: The load-case illustration of the six-story frame structure.
Figure 14: The optimization path in the original physical attribute space for the six-story frame optimization problem.
Figure 15: The optimization path in the low-dimensional design space with uncertainties for the six-story frame optimization problem.
Figure 16: The load-case illustration of the 105-beam structure.
Figure 17: The optimization path in the original physical attribute space for the 105-beam optimization problem.
Figure 18: The optimization path in the low-dimensional design space with uncertainties for the 105-beam optimization problem.
Figure 19: The load-case illustration of the high-dimensional 1564-beam structure.
Figure 20: The optimization path in the original physical attribute space for the 1564-beam optimization problem.
Figure 21: The optimization path in the low-dimensional design space with uncertainties for the 1564-beam optimization problem.
read the original abstract

Categorical structural optimization under aleatoric uncertainty is challenging because each design variable must be selected from a finite catalog of admissible instances, while each candidate design may require expensive stochastic finite-element evaluations. Existing latent-space optimization strategies can reduce the dimensionality of catalog attributes, but they often treat the reduced space as a continuous search domain. The resulting continuous optimum must then be rounded off to a nearby catalog instance, which may alter the objective value, constraint status, or physical interpretation of the design. To address this issue, this paper proposes the Categorical Optimization with Bayesian Anchored Latent Trust Regions (COBALT) framework for high-dimensional categorical optimization under uncertainty. COBALT first embeds the physical catalog into a low-dimensional latent representation and locks the mapped instances as a discrete anchored graph. A data-independent random tree decomposition is then used to provide bounded-complexity additive modeling over high-dimensional categorical variables. On this anchored domain, an additive SAAS-GP surrogate is fitted to heteroscedastic MC-FEA observations, and a trust-region discrete graph acquisition search selects the next admissible catalog configuration without continuous relaxation or rounding-off. The proposed strategy is applied to robust design optimization of complex bar structures, considering structural weight, strain energy, and local buckling performance. By evaluating only valid catalog designs through the MC-FEA oracle, COBALT preserves physical admissibility throughout the active learning loop and improves the efficiency of robust categorical structural optimization.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, and this is the friction.

Referee Report

2 major / 1 minor

Summary. The manuscript proposes the COBALT framework for high-dimensional categorical optimization under aleatoric uncertainty in structural design. It first embeds a finite catalog of admissible designs into a low-dimensional latent space and locks the points as a discrete anchored graph. A data-independent random tree decomposition is applied to enable bounded-complexity additive modeling, after which an additive SAAS-GP surrogate is trained on heteroscedastic observations from Monte Carlo finite-element analysis (MC-FEA). A trust-region discrete graph acquisition search then selects the next admissible catalog member without continuous relaxation or post-hoc rounding. The approach is demonstrated on robust design of complex bar structures, optimizing structural weight, strain energy, and local buckling performance while strictly preserving physical admissibility throughout the active-learning loop.

Significance. If the empirical validation confirms that the random tree decomposition preserves physically relevant interactions and that the discrete acquisition yields measurable efficiency gains, the work would offer a principled route to Bayesian optimization over high-dimensional categorical spaces in engineering contexts where catalog constraints and physical admissibility cannot be relaxed. The anchored-graph construction and additive surrogate on tree decompositions address a recurring difficulty in structural and materials design under uncertainty.

major comments (2)
  1. [§3.2] Random tree decomposition and additive SAAS-GP: The central modeling assumption is that a data-independent random tree decomposition of the latent-embedded categorical variables yields an additive structure that faithfully captures all physically relevant interactions (e.g., cross-section choices jointly affecting global strain energy and local buckling). Because the tree is chosen without reference to the MC-FEA oracle, it may omit dominant dependencies or impose spurious additive structure; the manuscript must demonstrate that surrogate error remains below the threshold that would change trust-region acquisition decisions, for example via a sensitivity study or comparison against a non-additive baseline.
  2. [§5] Numerical experiments on bar structures: The claims of improved efficiency and preserved admissibility rest on unshown quantitative evidence. The results section should report baseline comparisons (standard BO with rounding, other latent-space methods), convergence curves with error bars, and tables of final objective values and wall-clock savings to substantiate that COBALT outperforms existing strategies on the same MC-FEA budget.
minor comments (1)
  1. [Notation] The notation for the anchored graph, latent embedding dimension, and trust-region radius is introduced without a consolidated table of symbols; adding such a table would improve readability.

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for the constructive and detailed comments. We address each major point below and will revise the manuscript to incorporate the requested clarifications and additional evidence.

read point-by-point responses
  1. Referee: [§3.2] Random tree decomposition and additive SAAS-GP: The central modeling assumption is that a data-independent random tree decomposition of the latent-embedded categorical variables yields an additive structure that faithfully captures all physically relevant interactions (e.g., cross-section choices jointly affecting global strain energy and local buckling). Because the tree is chosen without reference to the MC-FEA oracle, it may omit dominant dependencies or impose spurious additive structure; the manuscript must demonstrate that surrogate error remains below the threshold that would change trust-region acquisition decisions, for example via a sensitivity study or comparison against a non-additive baseline.

    Authors: We agree that the fidelity of the additive approximation must be explicitly validated. The random tree is chosen data-independently to guarantee bounded complexity and to prevent information leakage from the MC-FEA oracle. In the revised §3.2 we will add a sensitivity study that (i) fits both the additive SAAS-GP and a non-additive baseline GP to the same MC-FEA data from the bar-structure examples, (ii) quantifies the difference in predictive mean and variance, and (iii) checks whether the two surrogates produce identical or differing trust-region acquisition selections on a held-out set. This will directly address whether any omitted interactions affect downstream optimization decisions. revision: yes

  2. Referee: [§5] Numerical experiments on bar structures: The claims of improved efficiency and preserved admissibility rest on unshown quantitative evidence. The results section should report baseline comparisons (standard BO with rounding, other latent-space methods), convergence curves with error bars, and tables of final objective values and wall-clock savings to substantiate that COBALT outperforms existing strategies on the same MC-FEA budget.

    Authors: We acknowledge that the current §5 would be strengthened by the requested quantitative comparisons. The revised results section will include: (i) side-by-side performance against standard Bayesian optimization followed by rounding and against other latent-space categorical methods, (ii) convergence curves for structural weight, strain energy, and buckling load with error bars computed over multiple independent random seeds, and (iii) tables reporting final objective values, number of MC-FEA evaluations consumed, and wall-clock time on identical hardware. These additions will provide direct, reproducible evidence of efficiency gains while confirming strict preservation of catalog admissibility. revision: yes
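The decision-level check promised in the rebuttal's sensitivity study can be sketched as an agreement rate between two surrogates' acquisition picks; `mu_add` and `mu_full` here are hypothetical acquisition values, not the paper's data:

```python
import numpy as np

def acquisition_agreement(mu_add, mu_full, candidates):
    """Fraction of trust regions where two surrogates pick the same design.

    `mu_add` / `mu_full` are acquisition values from the additive SAAS-GP
    and a non-additive baseline GP (hypothetical arrays here); `candidates`
    lists the admissible index set for each trust region. If the argmax
    rarely differs, omitted interactions are not changing decisions.
    """
    same = 0
    for cand in candidates:
        cand = list(cand)
        pick_add = cand[int(np.argmax(mu_add[cand]))]
        pick_full = cand[int(np.argmax(mu_full[cand]))]
        same += int(pick_add == pick_full)
    return same / len(candidates)

mu_add = np.array([0.2, 0.9, 0.4, 0.8])
mu_full = np.array([0.1, 0.8, 0.5, 0.9])
regions = [[0, 1, 2], [1, 2, 3], [0, 3]]
rate = acquisition_agreement(mu_add, mu_full, regions)  # agree on 2 of 3 regions
```

An agreement rate near 1 on held-out regions would support the authors' claim that any additive-approximation error stays below the threshold that changes trust-region selections.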

Circularity Check

0 steps flagged

No significant circularity in COBALT derivation chain

full rationale

The provided abstract and description outline a framework that first embeds the catalog into a latent space and anchors it as a discrete graph, applies a data-independent random tree decomposition for additive structure, fits an additive SAAS-GP surrogate directly to MC-FEA oracle observations, and performs trust-region acquisition search over the discrete admissible set. No equations or claims reduce the reported performance metrics (structural weight, strain energy, buckling) to quantities defined by the same fitted parameters or by self-citation. The random tree is explicitly data-independent, the surrogate is fitted to external oracle evaluations, and the loop preserves admissibility by construction without rounding or relaxation steps that would create self-reference. This is a standard extension of Bayesian optimization to categorical domains with independent components; the central claim of improved efficiency for robust categorical optimization therefore rests on empirical validation against the MC-FEA oracle rather than on any definitional or fitted-input reduction.

Axiom & Free-Parameter Ledger

2 free parameters · 2 axioms · 2 invented entities

The central claim rests on several unverified modeling assumptions about latent embeddings and additive decompositions that are introduced without independent evidence or proof in the abstract.

free parameters (2)
  • latent embedding dimension
    Chosen to reduce catalog attributes while allowing anchoring; value and selection procedure unspecified.
  • random tree decomposition parameters
    Controls bounded-complexity additive modeling over categorical variables; specific construction rules and depth not detailed.
axioms (2)
  • domain assumption The physical catalog admits a low-dimensional latent embedding that can be locked into a discrete anchored graph without distorting structural performance metrics.
    Invoked when mapping catalog instances and forming the anchored graph for search.
  • domain assumption Random tree decomposition provides sufficient additive structure to model high-dimensional categorical interactions for surrogate accuracy.
    Used to justify bounded-complexity modeling before fitting the SAAS-GP.
invented entities (2)
  • Anchored latent graph no independent evidence
    purpose: Represents discrete catalog instances in latent space while enforcing physical admissibility during optimization.
    New construct introduced to avoid continuous relaxation and rounding.
  • Discrete graph acquisition search no independent evidence
    purpose: Selects next valid catalog configuration inside trust regions on the anchored graph.
    Core search mechanism replacing continuous acquisition.

pith-pipeline@v0.9.0 · 5590 in / 1646 out tokens · 87768 ms · 2026-05-07T16:35:07.121659+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

45 extracted references · 5 canonical work pages · 1 internal anchor

  1. [1] R. F. Coelho, M. Xiao, A. Guglielmetti, M. Herrera, W. Zhang, Investigation of three genotypes for mixed variable evolutionary optimization (2015) 309–319.
  2. [2] M. Kokkolaras, C. Audet, J. E. Dennis, Mixed variable optimization of the number and composition of heat intercepts in a thermal insulation system, Optimization and Engineering 2 (2001) 5–29.
  3. [3] P. Lindroth, M. Patriksson, Pure Categorical Optimization: A Global Descent Approach, Department of Mathematical Sciences, Chalmers University of Technology, University of Gothenburg, 2011.
  4. [4] D. Sloane, S. P. Morgan, An introduction to categorical data analysis, Annual Review of Sociology (1996) 351–375.
  5. [5] F. Herrera, M. Lozano, J. L. Verdegay, Tackling real-coded genetic algorithms: Operators and tools for behavioural analysis, Artificial Intelligence Review 12 (1998) 265–319.
  6. [6] D. E. Goldberg, Genetic Algorithms, Pearson Education India, 2006.
  7. [7] R. A. Caruana, J. D. Schaffer, Representation and hidden bias: Gray vs. binary coding for genetic algorithms, in: Machine Learning Proceedings 1988, Elsevier, 1988, pp. 153–161.
  8. [8] L. J. Eshelman, J. D. Schaffer, Real-coded genetic algorithms and interval-schemata, in: Foundations of Genetic Algorithms, volume 2, Elsevier, 1993, pp. 187–202.
  9. [9] T. Liao, K. Socha, M. A. M. de Oca, T. Stützle, M. Dorigo, Ant colony optimization for mixed-variable optimization problems, IEEE Transactions on Evolutionary Computation 18 (2014) 503–518.
  10. [10] S. Rajeev, C. Krishnamoorthy, Discrete optimization of structures using genetic algorithms, Journal of Structural Engineering 118 (1992) 1233–1250.
  11. [11] M. Herrera, A. Guglielmetti, M. Xiao, R. F. Coelho, Metamodel-assisted optimization based on multiple kernel regression for mixed variables, Structural and Multidisciplinary Optimization 49 (2014) 979–991.
  12. [12] R. Filomeno Coelho, Extending moving least squares to mixed variables for metamodel-assisted optimization (2012).
  13. [13] B. McCane, M. Albert, Distance functions for categorical and mixed variables, Pattern Recognition Letters 29 (2008) 986–993.
  14. [14] R. Filomeno Coelho, Metamodels for mixed variables based on moving least squares: Application to the structural analysis of a rigid frame, Optimization and Engineering 15 (2014) 311–329.
  15. [15] Y. Fu, S. Yan, T. S. Huang, Classification and feature extraction by simplexization, IEEE Transactions on Information Forensics and Security 3 (2008) 91–100.
  16. [16] I. Jolliffe, Principal Component Analysis, Wiley Online Library, 2002.
  17. [17] I. M. Martin, S. Eroglu, Measuring a multi-dimensional construct: country image, Journal of Business Research 28 (1993) 191–210.
  18. [18] J. B. Tenenbaum, V. De Silva, J. C. Langford, A global geometric framework for nonlinear dimensionality reduction, Science 290 (2000) 2319–2323.
  19. [19] S. T. Roweis, L. K. Saul, Nonlinear dimensionality reduction by locally linear embedding, Science 290 (2000) 2323–2326.
  20. [20] D. De Ridder, O. Kouropteva, O. Okun, M. Pietikäinen, R. P. Duin, Supervised locally linear embedding, in: ICANN, Springer, 2003, pp. 333–341.
  21. [21] B. Schölkopf, A. Smola, K.-R. Müller, Nonlinear component analysis as a kernel eigenvalue problem, Neural Computation 10 (1998) 1299–1319.
  22. [22] L. Cao, K. S. Chua, W. Chong, H. Lee, Q. Gu, A comparison of PCA, KPCA and ICA for dimensionality reduction in support vector machine, Neurocomputing 55 (2003) 321–336.
  23. [23] M. Balasubramanian, E. L. Schwartz, The Isomap algorithm and topological stability, Science 295 (2002) 7.
  24. [24] E. W. Dijkstra, A note on two problems in connexion with graphs, Numerische Mathematik 1 (1959) 269–271.
  25. [25] L. Meng, P. Breitkopf, B. Raghavan, G. Mauvoisin, O. Bartier, X. Hernot, Identification of material properties using indentation test and shape manifold learning approach, Computer Methods in Applied Mechanics and Engineering 297 (2015) 239–257.
  26. [26] B. Raghavan, P. Breitkopf, Y. Tourbier, P. Villon, Towards a space reduction approach for efficient structural shape optimization, Structural and Multidisciplinary Optimization 48 (2013) 987–1000.
  27. [27] N. Patwari, A. O. Hero III, A. Pacholski, Manifold learning visualization of network traffic data, in: Proceedings of the 2005 ACM SIGCOMM Workshop on Mining Network Data, ACM, 2005, pp. 191–196.
  28. [28] K. Svanberg, The method of moving asymptotes—a new method for structural optimization, International Journal for Numerical Methods in Engineering 24 (1987) 359–373.
  29. [29] K. Svanberg, The method of moving asymptotes (MMA) with some extensions, in: Optimization of Large Structural Systems, Springer, 1993, pp. 555–566.
  30. [30] G. Shakhnarovich, T. Darrell, P. Indyk, Nearest-Neighbor Methods in Learning and Vision: Theory and Practice, The MIT Press, 2006.
  31. [31] M. A. Abramson, C. Audet, J. W. Chrissis, J. G. Walston, Mesh adaptive direct search algorithms for mixed variable optimization, Optimization Letters 3 (2009) 35–47.
  32. [32] J. Stegmann, E. Lund, Discrete material optimization of general composite shell structures, International Journal for Numerical Methods in Engineering 62 (2005) 2009–2027.
  33. [33] B. Shahriari, K. Swersky, Z. Wang, R. P. Adams, N. de Freitas, Taking the human out of the loop: A review of Bayesian optimization, Proceedings of the IEEE 104 (2016) 148–175. doi:10.1109/JPROC.2015.2494218.
  34. [34] P. I. Frazier, A tutorial on Bayesian optimization, arXiv:1807.02811 (2018).
  35. [35] C. Williams, C. Rasmussen, Gaussian processes for regression, in: Advances in Neural Information Processing Systems, volume 8, MIT Press, 1995.
  36. [36] C. Rasmussen, C. Williams, Gaussian Processes for Machine Learning, volume 2, MIT Press, 2006.
  37. [37] J. Snoek, H. Larochelle, R. P. Adams, Practical Bayesian optimization of machine learning algorithms, in: Advances in Neural Information Processing Systems, volume 25, Curran Associates, Inc., 2012.
  38. [38] X. Wan, V. Nguyen, H. Ha, B. Ru, C. Lu, M. A. Osborne, Think global and act local: Bayesian optimisation over high-dimensional categorical and mixed search spaces, International Conference on Machine Learning (2021).
  39. [39] A. Deshwal, S. Ament, M. Balandat, E. Bakshy, J. R. Doppa, D. Eriksson, Bayesian optimization over high-dimensional combinatorial spaces via dictionary-based embeddings, arXiv:2303.01774 (2023). doi:10.48550/arXiv.2303.01774.
  40. [40] R. D. Banker, R. C. Morey, The use of categorical variables in data envelopment analysis, Management Science 32 (1986) 1613–1627.
  41. [41] C. K. Williams, C. E. Rasmussen, Gaussian Processes for Machine Learning, volume 2, MIT Press, Cambridge, MA, 2006.
  42. [42] B. Shahriari, K. Swersky, Z. Wang, R. P. Adams, N. De Freitas, Taking the human out of the loop: A review of Bayesian optimization, Proceedings of the IEEE 104 (2015) 148–175.
  43. [43] D. K. Duvenaud, H. Nickisch, C. Rasmussen, Additive Gaussian processes, Advances in Neural Information Processing Systems 24 (2011).
  44. [44] S. Qamar, S. T. Tokdar, Additive Gaussian process regression, arXiv:1411.7009 (2014).
  45. [45] P. Rolland, J. Scarlett, I. Bogunovic, V. Cevher, High-dimensional Bayesian optimization via additive models with overlapping groups, in: International Conference on Artificial Intelligence and Statistics, PMLR, 2018, pp. 298–307.