Optimization with SpotOptim
Pith reviewed 2026-05-10 13:18 UTC · model grok-4.3
The pith
The spotoptim package implements a Kriging-based optimization loop with expected improvement and OCBA noise handling for expensive black-box functions in Python.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
The spotoptim package implements surrogate-model-based optimization of expensive black-box functions in Python. Building on sequential parameter optimization methodology, it provides a Kriging-based optimization loop with Expected Improvement, support for continuous, integer, and categorical variables, noise-aware evaluation via Optimal Computing Budget Allocation, and multi-objective extensions. A steady-state parallelization strategy overlaps surrogate search with objective evaluation on multi-core hardware, and a success-rate-based restart mechanism detects stagnation while preserving the best solution found. The package returns scipy-compatible OptimizeResult objects and accepts any scikit-learn-compatible surrogate model.
What carries the argument
Kriging-based optimization loop using Expected Improvement acquisition together with OCBA for noise-aware evaluation allocation.
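The Expected Improvement criterion that carries this loop has a standard closed form under a Gaussian surrogate posterior. A minimal sketch for minimization, with an illustrative function name and signature that are not spotoptim's actual API:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, y_best, xi=0.0):
    """Closed-form Expected Improvement for minimization.

    mu, sigma : surrogate posterior mean and std at candidate points.
    y_best    : best (lowest) observed objective value so far.
    xi        : optional exploration offset.
    """
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    improve = y_best - mu - xi
    with np.errstate(divide="ignore", invalid="ignore"):
        z = improve / sigma
        ei = improve * norm.cdf(z) + sigma * norm.pdf(z)
    # Where the surrogate is certain (sigma == 0), EI reduces to max(improve, 0).
    return np.where(sigma > 0, ei, np.maximum(improve, 0.0))

# A candidate predicted below the incumbent with some uncertainty scores
# far higher than one predicted well above it:
print(expected_improvement(mu=[0.5, 2.0], sigma=[0.3, 0.3], y_best=1.0))
```

The acquisition balances exploitation (low predicted mean) against exploration (high predictive uncertainty), which is what lets a Kriging/EI loop spend an expensive evaluation budget efficiently.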
If this is right
- Users gain a single framework that optimizes black-box functions with mixed continuous, integer, and categorical variables.
- Noisy evaluations receive more efficient sampling through dynamic budget allocation to promising candidates.
- Multi-core hardware is utilized by overlapping surrogate model updates with parallel objective evaluations.
- Stagnation during search triggers automatic restarts that retain the best solution encountered so far.
- Any scikit-learn compatible model can serve as the surrogate and results integrate directly with existing scipy workflows.
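The surrogate-swapping claim in the last point is concrete: any scikit-learn regressor whose `predict` supports `return_std=True` can supply the posterior mean and uncertainty the acquisition needs. A hedged sketch of one surrogate update step, using a toy objective and variable names that are illustrative rather than spotoptim's internals:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Evaluations collected so far -- here a cheap toy stand-in for an
# expensive black-box objective.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(8, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.normal(size=8)

# Fit the Kriging surrogate; any scikit-learn-compatible regressor with
# std-dev predictions could be dropped in here instead.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X, y)

# Posterior mean and std over a candidate grid feed the acquisition function,
# whose maximizer becomes the next expensive evaluation point.
candidates = np.linspace(-2.0, 2.0, 200).reshape(-1, 1)
mu, sigma = gp.predict(candidates, return_std=True)
print(mu.shape, float(sigma.max()))
```

Refitting on the growing evaluation set and re-ranking candidates is the step the steady-state parallelization overlaps with objective evaluations.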
Where Pith is reading between the lines
- The architecture could reduce setup time for practitioners performing repeated hyperparameter searches on stochastic models where noise handling matters.
- TensorBoard integration provides an immediate way to inspect surrogate quality and convergence without additional coding.
- The open-source release and explicit comparisons position the tool as an alternative for users already working inside the Python scientific stack.
- The same loop structure could be tested on other expensive simulation-based problems beyond the neural-network examples shown.
Load-bearing premise
The described features including the Kriging loop, OCBA strategy, parallelization, and restart mechanism function correctly and deliver the stated practical advantages in use.
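For context on the OCBA piece of this premise: the textbook allocation rule (Chen 2010) gives non-best candidates additional evaluations in proportion to (sigma_i / delta_i)^2, where delta_i is the mean gap to the current best. A sketch of that standard rule, which is not necessarily spotoptim's exact implementation:

```python
import numpy as np

def ocba_allocation(means, stds, total_budget):
    """Approximate OCBA allocation (Chen 2010) for a minimization problem.

    means, stds : sample mean and sample std of each candidate's noisy objective.
    Returns integer counts of additional evaluations per candidate.
    """
    means = np.asarray(means, dtype=float)
    stds = np.asarray(stds, dtype=float)
    b = int(np.argmin(means))            # current best candidate
    delta = means - means[b]             # mean gaps to the best
    nonbest = np.arange(len(means)) != b
    ratios = np.ones_like(means)
    # Non-best designs: N_i proportional to (sigma_i / delta_i)^2.
    ratios[nonbest] = (stds[nonbest] / delta[nonbest]) ** 2
    # Best design: N_b = sigma_b * sqrt(sum over non-best of (N_i / sigma_i)^2).
    ratios[b] = stds[b] * np.sqrt(np.sum((ratios[nonbest] / stds[nonbest]) ** 2))
    alloc = total_budget * ratios / ratios.sum()
    return np.round(alloc).astype(int)

# The near-best, equally noisy candidate soaks up the budget; the clearly
# inferior one is starved:
print(ocba_allocation(means=[1.0, 1.1, 3.0], stds=[0.5, 0.5, 0.5], total_budget=100))
```

This is what "dynamic budget allocation to promising candidates" amounts to: repeat evaluations concentrate where the signal-to-gap ratio is worst.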
What would settle it
Running the paper's neural network hyperparameter tuning example and confirming that it returns a valid scipy OptimizeResult containing the best parameters and objective value; or rerunning the reported comparisons to measure whether the package shows measurable differences in solution quality or runtime against the listed alternatives on the same test cases.
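The first check can be mechanized. Per the abstract, spotoptim returns scipy-compatible OptimizeResult objects, so the validation is the same as for any scipy optimizer. A sketch using scipy.optimize.minimize on a cheap stand-in objective (spotoptim's own call signature is not assumed here):

```python
import numpy as np
from scipy.optimize import minimize, OptimizeResult

# Stand-in objective with known minimum at (1, -2); any optimizer that
# returns a scipy OptimizeResult can be validated the same way.
res = minimize(lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2, x0=[0.0, 0.0])

assert isinstance(res, OptimizeResult)
assert res.success
assert np.allclose(res.x, [1.0, -2.0], atol=1e-4)   # best parameters
assert res.fun < 1e-6                               # best objective value
print(res.x, res.fun)
```

Swapping the stand-in call for spotoptim's entry point, with the paper's search-space bounds and budget, would settle the reproducibility half of the question.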
Original abstract
The `spotoptim` package implements surrogate-model-based optimization of expensive black-box functions in Python. Building on two decades of Sequential Parameter Optimization (SPO) methodology, it provides a Kriging-based optimization loop with Expected Improvement, support for continuous, integer, and categorical variables, noise-aware evaluation via Optimal Computing Budget Allocation (OCBA), and multi-objective extensions. A steady-state parallelization strategy overlaps surrogate search with objective evaluation on multi-core hardware, and a success-rate-based restart mechanism detects stagnation while preserving the best solution found. The package returns scipy-compatible `OptimizeResult` objects and accepts any scikit-learn-compatible surrogate model. Built-in TensorBoard logging provides real-time monitoring of convergence and surrogate quality. This report describes the architecture and module structure of spotoptim, provides worked examples including neural network hyperparameter tuning, and compares the framework with BoTorch, Optuna, Ray Tune, BOHB, SMAC, and Hyperopt. The package is open-source.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The manuscript describes the spotoptim Python package for surrogate-model-based optimization of expensive black-box functions. Building on Sequential Parameter Optimization (SPO), it implements a Kriging/EI loop with support for continuous, integer, and categorical variables, OCBA noise handling, multi-objective extensions, steady-state parallelization, success-rate-based restarts, TensorBoard logging, and scipy-compatible output. The paper details the architecture and modules, provides worked examples including neural network hyperparameter tuning, and compares the package against BoTorch, Optuna, Ray Tune, BOHB, SMAC, and Hyperopt. The package is open-source.
Significance. If the implementation is faithful to the description, the package offers a practical, extensible tool that combines several established techniques (Kriging/EI, OCBA, mixed-variable support) with usability features such as steady-state parallelism and restart logic in a single open-source framework. The direct comparisons and examples position it relative to existing libraries, and the scikit-learn compatibility plus scipy output lower the barrier to adoption. The open-source release enables immediate verification and community extension, which is a concrete strength for a software contribution.
Minor comments (3)
- [Abstract] The abstract states 'This report describes the architecture...' while the manuscript is submitted as a journal paper; rephrasing to 'This paper describes...' would align with standard academic style.
- [Section 5] Section 5 (Comparisons) presents direct comparisons but lacks a concise feature-comparison table; adding one would make the positioning against BoTorch, Optuna, etc., easier to scan and would highlight the claimed unique combination of OCBA, steady-state parallelism, and restart mechanism.
- [Section 4.2] The worked example in Section 4.2 (NN hyperparameter tuning) would benefit from explicit listing of the search-space bounds and the number of evaluations used, to allow readers to reproduce the exact setup.
Simulated Author's Rebuttal
We thank the referee for the positive and accurate summary of the spotoptim manuscript, the recognition of its practical contributions, and the recommendation for minor revision. No specific major comments were raised in the report.
Circularity Check
No significant circularity; software description with no derivation chain
full rationale
The paper is a software report describing the spotoptim package architecture, Kriging/EI optimization loop, OCBA noise handling, mixed-variable support, steady-state parallelization, success-rate restart, TensorBoard logging, and scipy-compatible outputs. It supplies worked examples (e.g., NN hyperparameter tuning) and direct comparisons to BoTorch, Optuna, etc. No equations, theorems, fitted predictions, or load-bearing self-citations appear; all claims are testable by inspecting and running the stated open-source code rather than reducing to internal definitions or prior author results by construction. This is the most common honest finding for descriptive software papers.
Axiom & Free-Parameter Ledger
Forward citations
Cited by 1 Pith paper
- Multi-Task Optimization over Networks of Tasks: MONET represents tasks as graph nodes and uses neighbor-based crossover plus per-task mutation to transfer knowledge, matching or exceeding MAP-Elites performance on four large-scale simulation domains.
Reference graph
Works this paper leans on
- [1] Akiba, Takuya, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama. 2019. "Optuna: A Next-generation Hyperparameter Optimization Framework." Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2623-31. https://doi.org/10.1145/3292500.3330701
- [2] Balandat, Maximilian, Brian Karrer, Daniel R. Jiang, et al. 2020. "BoTorch: A Framework for Efficient Monte-Carlo Bayesian Optimization." Advances in Neural Information Processing Systems 33. https://arxiv.org/abs/1910.06403
- [3] Bartz, Eva, Thomas Bartz-Beielstein, Martin Zaefferer, and Olaf Mersmann, eds. 2022. Hyperparameter Tuning for Machine and Deep Learning with R - A Practical Guide. Springer.
- [4] Bartz-Beielstein, Thomas. 2023a. "Hyperparameter Tuning Cookbook: A guide for scikit-learn, PyTorch, river, and spotpython." arXiv e-Prints, ahead of print, July. https://doi.org/10.48550/arXiv.2307.10262
- [5] Bartz-Beielstein, Thomas. 2023b. PyTorch Hyperparameter Tuning with SPOT: Comparison with Ray Tuner and Default Hyperparameters on CIFAR10. https://github.com/sequential-parameter-optimization/spotpython/blob/main/notebooks/14_spot_ray_hpt_torch_cifar10.ipynb
- [6] Bartz-Beielstein, Thomas. 2025a. "Multi-Objective Optimization and Hyperparameter Tuning With Desirability Functions." arXiv e-Prints, March, arXiv:2503.23595. https://doi.org/10.48550/arXiv.2503.23595
- [7] Bartz-Beielstein, Thomas. 2025b. "Surrogate Model-Based Multi-Objective Optimization Using Desirability Functions." Proceedings of the Genetic and Evolutionary Computation Conference Companion (New York, NY, USA), GECCO '25 Companion, 2458-65. https://doi.org/10.1145/3712255.3734331
- [8] Bartz-Beielstein, Thomas, and Martina Friese. 2011. Sequential Parameter Optimization and Optimal Computational Budget Allocation for Noisy Optimization Problems. Cologne University of Applied Science, Faculty of Computer Science and Engineering Science.
- [9] Bartz-Beielstein, Thomas, Martina Friese, Martin Zaefferer, et al. 2011. "Noisy optimization with sequential parameter optimization and optimal computational budget allocation." Proceedings of the 13th Annual Conference Companion on Genetic and Evolutionary Computation (New York, NY, USA), 119-20.
- [10] Bartz-Beielstein, Thomas, Christian Lasarczyk, and Mike Preuss. 2005. "Sequential Parameter Optimization." In Proceedings 2005 Congress on Evolutionary Computation (CEC'05), Edinburgh, Scotland, edited by B. McKay et al. IEEE Press. https://doi.org/10.1109/CEC.2005.1554761
- [11] Bartz-Beielstein, Thomas, and Martin Zaefferer. 2017. "Model-Based Methods for Continuous and Discrete Global Optimization." Applied Soft Computing 55: 154-67. https://doi.org/10.1016/j.asoc.2017.01.039
- [12] Bartz-Beielstein, Thomas, and Martin Zaefferer. 2022. "Hyperparameter Tuning Approaches." Chap. 4 in Hyperparameter Tuning for Machine and Deep Learning with R - A Practical Guide, edited by Eva Bartz, Thomas Bartz-Beielstein, Martin Zaefferer, and Olaf Mersmann. Springer.
- [13]
- [14] Bergstra, James, Rémi Bardenet, Yoshua Bengio, and Balázs Kégl. 2011. "Algorithms for Hyper-Parameter Optimization." Advances in Neural Information Processing Systems 24.
- [15] Chen, Chun Hung. 2010. Stochastic Simulation Optimization: An Optimal Computing Budget Allocation. World Scientific.
- [16] Falkner, Stefan, Aaron Klein, and Frank Hutter. 2018. "BOHB: Robust and Efficient Hyperparameter Optimization at Scale." Proceedings of the 35th International Conference on Machine Learning, 1437-46.
- [17] Forrester, Alexander, András Sóbester, and Andy Keane. 2008. Engineering Design via Surrogate Modelling. Wiley.
- [18] Gentile, Lorenzo, Thomas Bartz-Beielstein, and Martin Zaefferer. 2021. "Sequential Parameter Optimization for Mixed-Discrete Problems." In Optimization Under Uncertainty with Applications to Aerospace Engineering, edited by Massimiliano Vasile. Springer International Publishing. https://doi.org/10.1007/978-3-030-60166-9_10
- [19] Gentile, Lorenzo, Martin Zaefferer, Dario Giugliano, Haofeng Chen, and Thomas Bartz-Beielstein. 2018. "Surrogate Assisted Optimization of Particle Reinforced Metal Matrix Composites." Proceedings of the Genetic and Evolutionary Computation Conference (New York, NY, USA), GECCO '18, 1238-45. https://doi.org/10.1145/3205455.3205574
- [20] Gramacy, Robert B. 2020. Surrogates. CRC Press.
- [21]
- [22] Hutter, Frank, Thomas Bartz-Beielstein, Holger Hoos, Kevin Leyton-Brown, and Kevin P. Murphy. 2010. "Sequential Model-Based Parameter Optimisation: An Experimental Investigation of Automated and Interactive Approaches." In Experimental Methods for the Analysis of Optimization Algorithms, edited by Thomas Bartz-Beielstein, Marco Chiarandini, Luis Paquet...
- [23] Hutter, Frank, Holger H. Hoos, and Kevin Leyton-Brown. 2011. "Sequential Model-based Algorithm Configuration." Learning and Intelligent Optimization (LION 5), 507-23. https://doi.org/10.1007/978-3-642-25566-3_40
- [24] Jones, D. R., M. Schonlau, and W. J. Welch. 1998. "Efficient Global Optimization of Expensive Black-Box Functions." Journal of Global Optimization 13: 455-92.
- [25] Jones, Donald R., Matthias Schonlau, and William J. Welch. 1998. "Efficient Global Optimization of Expensive Black-Box Functions." Journal of Global Optimization 13 (4): 455-92. https://doi.org/10.1023/A:1008306431147
- [26] Liaw, Richard, Eric Liang, Robert Nishihara, Philipp Moritz, Roy Fox, and Ken Goldberg. 2018. "Tune: A Research Platform for Distributed Model Selection and Training." ICML AutoML Workshop. https://arxiv.org/abs/1807.05118
- [27] Storn, R. 1996. "On the usage of differential evolution for function optimization." Proceedings of the 1996 Biennial Conference of the North American Fuzzy Information Processing Society (NAFIPS), 519-23. https://doi.org/10.1109/NAFIPS.1996.534789
- [28] Zaefferer, Martin, and Thomas Bartz-Beielstein. 2016. "Efficient Global Optimization with Indefinite Kernels." In Parallel Problem Solving from Nature -- PPSN XIV: 14th International Conference, Edinburgh, UK, September 17-21, 2016, Proceedings, edited by Julia Handl, Emma Hart, Peter R. Lewis, Manuel López-Ibáñez, Gabriela Ochoa, and Ben Paechter. Spri...
- [29] Zaefferer, Martin, Jörg Stork, and Thomas Bartz-Beielstein. 2014. "Distance Measures for Permutations in Combinatorial Efficient Global Optimization." In Parallel Problem Solving from Nature -- PPSN XIII, edited by Thomas Bartz-Beielstein, Jürgen Branke, Bogdan Filipic, and Jim Smith. Springer.
- [30] Zaefferer, Martin, Jörg Stork, Martina Friese, Andreas Fischbach, Boris Naujoks, and Thomas Bartz-Beielstein. 2014. "Efficient Global Optimization for Combinatorial Problems." In Genetic and Evolutionary Computation Conference (GECCO'14), Proceedings, edited by Dirk V. Arnold. ACM. https://doi.org/10.1145/2576768.2598282