pith. machine review for the scientific record.


A Tutorial on Bayesian Optimization

22 Pith papers cite this work. Polarity classification is still indexing.

abstract

Bayesian optimization is an approach to optimizing objective functions that take a long time (minutes or hours) to evaluate. It is best-suited for optimization over continuous domains of less than 20 dimensions, and tolerates stochastic noise in function evaluations. It builds a surrogate for the objective and quantifies the uncertainty in that surrogate using a Bayesian machine learning technique, Gaussian process regression, and then uses an acquisition function defined from this surrogate to decide where to sample. In this tutorial, we describe how Bayesian optimization works, including Gaussian process regression and three common acquisition functions: expected improvement, entropy search, and knowledge gradient. We then discuss more advanced techniques, including running multiple function evaluations in parallel, multi-fidelity and multi-information source optimization, expensive-to-evaluate constraints, random environmental conditions, multi-task Bayesian optimization, and the inclusion of derivative information. We conclude with a discussion of Bayesian optimization software and future research directions in the field. Within our tutorial material we provide a generalization of expected improvement to noisy evaluations, beyond the noise-free setting where it is more commonly applied. This generalization is justified by a formal decision-theoretic argument, standing in contrast to previous ad hoc modifications.
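The loop the abstract describes — fit a Gaussian process surrogate to past evaluations, maximize an acquisition function such as expected improvement to pick the next point, evaluate, and repeat — can be sketched as follows. This is a minimal illustration, not the tutorial's implementation: the 1-D toy objective, fixed squared-exponential kernel hyperparameters, noise-free evaluations, and grid search over candidates are all simplifying assumptions.

```python
# Minimal Bayesian optimization: GP surrogate + expected improvement (EI).
# Assumptions: 1-D noise-free objective, fixed kernel hyperparameters.
import numpy as np
from scipy.stats import norm

def kernel(a, b, length=0.3, var=1.0):
    """Squared-exponential covariance between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xs, jitter=1e-8):
    """GP posterior mean and standard deviation at test points Xs."""
    K = kernel(X, X) + jitter * np.eye(len(X))
    L = np.linalg.cholesky(K)
    Ks = kernel(X, Xs)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(np.diag(kernel(Xs, Xs)) - np.sum(v * v, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    """Closed-form EI for minimization, given the incumbent value `best`."""
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x) + 0.5 * x        # toy objective on [0, 2]
X = rng.uniform(0, 2, 3)                      # small initial design
y = f(X)
grid = np.linspace(0, 2, 200)                 # candidate points

for _ in range(15):                           # sequential BO iterations
    mu, sigma = gp_posterior(X, y, grid)
    ei = expected_improvement(mu, sigma, y.min())
    x_next = grid[np.argmax(ei)]              # sample where EI is largest
    X = np.append(X, x_next)
    y = np.append(y, f(x_next))

print("best x:", X[np.argmin(y)], "best f:", y.min())
```

With only a handful of evaluations the loop concentrates samples near the global minimum of the toy objective; in practice one would also fit the kernel hyperparameters (e.g., by marginal likelihood) and optimize the acquisition function continuously rather than over a grid.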

hub tools

citation-role summary: background (1)
citation-polarity summary: still indexing
years: 2026 (21), 2024 (1)


representative citing papers

Elicitation-Augmented Bayesian Optimization

cs.LG · 2026-05-12 · unverdicted · novelty 7.0

A cost-aware value-of-information acquisition function is derived to balance direct observations against noisy pairwise human comparisons in Bayesian optimization; its performance approaches the convex hull of the individual information sources' performance trajectories.

Collaborative Contextual Bayesian Optimization

cs.LG · 2026-04-20 · unverdicted · novelty 7.0

CCBO enables collaborative contextual Bayesian optimization across clients with sublinear regret guarantees, and shows substantial gains over non-collaborative methods in simulations and a hot rolling application, even under client heterogeneity.

OrthoBO: Orthogonal Bayesian Hyperparameter Optimization

cs.LG · 2026-05-07 · unverdicted · novelty 5.0

OrthoBO introduces an orthogonal acquisition estimator that subtracts an optimally weighted score-function control variate, reducing Monte Carlo variance while preserving the acquisition target and improving ranking stability in Bayesian hyperparameter optimization.
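The control-variate idea behind that summary can be illustrated generically. The sketch below is an assumed textbook construction, not OrthoBO's estimator: it reduces the variance of a Monte Carlo estimate of E[h(Z)], Z ~ N(0, 1), by subtracting an optimally weighted zero-mean control — here the Gaussian score g(z) = −z, whose expectation is zero — with weight β* = Cov(h, g)/Var(g). The improvement-style integrand h is a hypothetical stand-in for a Monte Carlo acquisition value.

```python
# Control-variate variance reduction for a Monte Carlo estimate of E[h(Z)].
# The control g(z) = -z has zero mean under N(0,1), so subtracting
# beta * (g - mean(g)) leaves the estimate unbiased but shrinks its variance.
import numpy as np

rng = np.random.default_rng(1)
z = rng.standard_normal(100_000)

h = np.maximum(z - 0.5, 0.0)              # improvement-style integrand
g = -z                                     # zero-mean score-function control

beta = np.cov(h, g)[0, 1] / np.var(g)      # (near-)optimal weight beta*
plain = h
adjusted = h - beta * (g - g.mean())       # same mean, lower variance

print("means:    ", plain.mean(), adjusted.mean())
print("variances:", plain.var(), adjusted.var())
```

The adjusted estimator has exactly the same sample mean (the subtracted term is centered) but strictly smaller sample variance whenever h and g are correlated, which is what makes per-sample acquisition rankings more stable at a fixed Monte Carlo budget.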
