Recognition: unknown
Novelty-Based Generation of Continuous Landscapes with Diverse Local Optima Networks
Pith reviewed 2026-05-08 13:10 UTC · model grok-4.3
The pith
An alternative definition of basins of attraction for Max-Set of Gaussians landscapes allows low-cost construction of local optima networks, which novelty search then exploits to generate diverse instances whose network features predict evolutionary algorithm success rates.
A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.
Core claim
By defining basins of attraction for Max-Set of Gaussians landscapes directly from the Gaussian parameters rather than through search, Local Optima Networks can be constructed at low cost. Novelty Search is then applied to the landscape generator parameters to produce instances with varied search difficulty and connectivity patterns among optima. Experiments confirm alignment with gradient-based basins and demonstrate that LON features predict success rates of evolutionary algorithms.
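As a concrete reading of this claim, the sketch below builds a toy MSG landscape (fitness as the maximum over Gaussian components) and labels each point's basin by the component that attains that maximum. The isotropic Gaussian form, the parameter ranges, and the basin_proxy rule are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

# Toy Max-Set of Gaussians (MSG) landscape in the spirit of a tunable Gaussian
# generator: fitness is the maximum over a set of Gaussian components.
# ASSUMPTION: the parameter-based basin proxy below (assign x to the component
# attaining the max) is an illustrative reading, not the paper's exact rule.

rng = np.random.default_rng(0)
dim, n_components = 2, 5
centers = rng.uniform(-5, 5, size=(n_components, dim))  # component means
heights = rng.uniform(0.5, 1.0, size=n_components)      # peak heights
widths = rng.uniform(0.5, 2.0, size=n_components)       # isotropic scales

def component_values(x):
    """Value of every Gaussian component at point x (isotropic covariances)."""
    sq_dist = np.sum((centers - x) ** 2, axis=1)
    return heights * np.exp(-0.5 * sq_dist / widths ** 2)

def msg_fitness(x):
    """MSG fitness: the maximum component value at x."""
    return component_values(x).max()

def basin_proxy(x):
    """Parameter-based basin label: index of the component attaining the max."""
    return int(np.argmax(component_values(x)))
```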
What carries the argument
The alternative definition of basins of attraction for Max-Set of Gaussians landscapes, computed directly from the Gaussian parameters to serve as a proxy for true basins without search.
If this is right
- The proposed basins of attraction closely align with those identified by gradient-based methods.
- Novelty search successfully generates instances with varied search difficulty and connectivity patterns among optima (a minimal novelty-loop sketch follows this list).
- Local optima network features can predict the success rates of two well-established evolutionary algorithms on the generated instances.
- The overall framework supplies a dataset of continuous landscapes with known network properties for landscape-aware optimization research.
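As a rough illustration of the novelty-driven generation step referenced above, the following minimal loop mutates hypothetical generator parameters and keeps the candidates whose behavior descriptors lie farthest from an archive of past descriptors. The descriptor, mutation scale, population size, and archive policy are all placeholder assumptions; in the paper the descriptor would be built from LON features of the generated instance.

```python
import numpy as np

# Minimal novelty-search loop over hypothetical generator parameters.
# ASSUMPTIONS: the stand-in behavior descriptor, the Gaussian mutation, the
# population size, and the archive policy are illustrative only.

rng = np.random.default_rng(2)

def describe(params):
    """Stand-in behavior descriptor for a parameter vector."""
    return np.array([params.mean(), params.std()])

def novelty(desc, archive, k=5):
    """Mean distance to the k nearest descriptors collected so far."""
    if not archive:
        return float("inf")
    dists = np.sort([np.linalg.norm(desc - a) for a in archive])
    return float(dists[:k].mean())

population = [rng.uniform(0.0, 1.0, size=10) for _ in range(20)]
archive = []
for generation in range(50):
    offspring = [p + rng.normal(0.0, 0.05, size=p.shape) for p in population]
    scored = [(novelty(describe(c), archive), c) for c in population + offspring]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    population = [c for _, c in scored[:20]]             # keep the most novel
    archive.extend(describe(c) for _, c in scored[:3])   # grow the archive
```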
Where Pith is reading between the lines
- Similar parameter-based proxies for basins could be developed for other families of continuous functions to enable network analysis at scale.
- The generated instances could serve as training data for machine-learning models that select or configure algorithms based on quick network descriptors.
- Systematic variation of LON features across the dataset might isolate which connectivity statistics most strongly influence algorithm performance.
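For readers unfamiliar with LON descriptors, a small sketch of how such network features could be extracted with networkx is given below. The toy graph and the particular features (node count, mean out-degree, maximum PageRank) are illustrative; the paper's exact feature set is not stated in this summary.

```python
import networkx as nx

# Hypothetical LON as a small weighted directed graph: nodes are local optima
# (one per basin), edges are basin transitions. The features below are common
# LON descriptors, not necessarily the ones used in the paper.

lon = nx.DiGraph()
lon.add_weighted_edges_from([
    (0, 1, 0.6), (1, 0, 0.2), (1, 2, 0.5), (2, 1, 0.3), (2, 0, 0.1),
])

features = {
    "n_optima": lon.number_of_nodes(),
    "n_edges": lon.number_of_edges(),
    "mean_out_degree": sum(d for _, d in lon.out_degree()) / lon.number_of_nodes(),
    "max_pagerank": max(nx.pagerank(lon, weight="weight").values()),
}
print(features)
```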
Load-bearing premise
The alternative definition of basins of attraction for MSG landscapes accurately captures true basins and serves as a valid proxy without requiring search-based identification.
What would settle it
Running gradient-based basin identification on a fresh collection of MSG landscapes and finding substantial disagreement with the proposed definition, or observing no reliable correlation between LON features and evolutionary algorithm success on additional generated instances.
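One way to operationalize such a check, assuming the msg_fitness, basin_proxy, and centers names from the earlier sketch are in scope, is to run finite-difference gradient ascent from random samples and measure how often the resulting basin label agrees with the parameter-based proxy. The step size, iteration budget, and nearest-center labeling below are assumptions for illustration.

```python
import numpy as np

# Agreement check between the parameter-based proxy and gradient-based basins,
# assuming msg_fitness, basin_proxy, and centers from the earlier sketch are in
# scope. Gradient ascent uses finite differences; the hyperparameters and the
# nearest-center labeling are illustrative choices.

def gradient_basin(x, step=0.05, iters=500, eps=1e-4):
    """Follow the numerical gradient uphill and label the basin by the
    Gaussian center nearest to the point of convergence."""
    x = np.asarray(x, dtype=float).copy()
    for _ in range(iters):
        grad = np.array([
            (msg_fitness(x + eps * e) - msg_fitness(x - eps * e)) / (2 * eps)
            for e in np.eye(len(x))
        ])
        if np.linalg.norm(grad) < 1e-6:
            break
        x += step * grad
    return int(np.argmin(np.sum((centers - x) ** 2, axis=1)))

rng = np.random.default_rng(1)
samples = rng.uniform(-5, 5, size=(200, centers.shape[1]))
agreement = np.mean([basin_proxy(s) == gradient_basin(s) for s in samples])
print(f"proxy vs. gradient-basin agreement: {agreement:.2%}")
```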
Original abstract
Local Optima Networks (LONs) represent the global structure of search spaces as graphs, but their construction requires iterative execution of a search algorithm to find local optima and approximate transitions between Basins of Attraction (BoAs). In continuous optimization, this high computational cost prevents systematic investigation of the relationship between LON features and evolutionary algorithm performance. To address this issue, we propose an alternative definition of BoAs for Max-Set of Gaussians (MSG) landscapes with explicitly tunable multimodality. This bypasses search-based BoA identification, enabling low-cost LON construction. Moreover, we leverage Novelty Search (NS) to explore the parameter space of the MSG landscape generator, producing instances with diverse graph topologies. Our experiments show that the proposed BoAs closely align with gradient-based BoAs, and that NS successfully generates instances with varied search difficulty and connectivity patterns among optima. Finally, over the instances generated by NS, we predict the success rate of two well-established evolutionary algorithms from LON features. While our LON construction is specific to MSG landscapes, the proposed framework provides a dataset that serves as a foundation for landscape-aware optimization.
Editorial analysis
A structured set of objections, weighed in public.
Referee Report
Summary. The paper proposes an alternative definition of basins of attraction (BoAs) for Max-Set of Gaussians (MSG) landscapes that enables direct, search-free construction of Local Optima Networks (LONs). It applies Novelty Search (NS) to explore the MSG parameter space and generate instances with diverse LON topologies. The authors claim that the proposed BoAs align closely with gradient-based BoAs, that NS produces varied search difficulties and connectivity patterns, and that LON features extracted from these instances can predict the success rates of two evolutionary algorithms.
Significance. If the alignment and predictive results hold under proper validation, the work supplies a computationally tractable framework and dataset for studying the relationship between continuous landscape structure and EA performance. The NS-driven generation of diverse instances with explicit LONs is a concrete strength that could support landscape-aware algorithm design and benchmarking.
Major comments (3)
- The central prediction that LON features forecast EA success rates rests on the proposed BoA definition serving as a valid proxy. Alignment is reported only with gradient-based BoAs (Abstract), yet the target EAs are population-based and derivative-free; no direct comparison to the actual search trajectories or basin transitions of those EAs is described, leaving the proxy assumption unverified for the claimed prediction task.
- Prediction of success rates is performed on the same NS-generated instances used to extract the LON features (Abstract). This raises a circularity risk if the novelty objective influences the feature distributions; the manuscript must specify whether held-out validation, cross-validation across independent generations, or external test landscapes were employed.
- The abstract asserts that 'the proposed BoAs closely align with gradient-based BoAs' and that prediction 'successfully' occurs, yet supplies no quantitative metrics, error bars, sample sizes, or statistical tests. These details are load-bearing for assessing whether the alignment and predictive performance support the central claims.
Minor comments (1)
- The abstract refers to 'two well-established evolutionary algorithms' without naming them; explicit identification (e.g., CMA-ES and differential evolution) would improve clarity.
Simulated Author's Rebuttal
We thank the referee for the constructive report and for recognizing the potential of our NS-driven LON framework for landscape-aware optimization. We address each major comment below, indicating planned revisions to strengthen the manuscript.
Point-by-point responses
-
Referee: The central prediction that LON features forecast EA success rates rests on the proposed BoA definition serving as a valid proxy. Alignment is reported only with gradient-based BoAs (Abstract), yet the target EAs are population-based and derivative-free; no direct comparison to the actual search trajectories or basin transitions of those EAs is described, leaving the proxy assumption unverified for the claimed prediction task.
Authors: We acknowledge that direct validation against the trajectories of the specific population-based EAs would provide stronger evidence. Gradient-based BoAs were chosen as the reference because they offer an established, search-intensive definition of basins in continuous spaces; the derivative-free EAs still operate on the same underlying fitness landscape. In the revision we will expand the discussion to justify this proxy choice and add a limited empirical comparison (on a subset of instances) between LON transitions and sampled paths from the two EAs, where computationally feasible. revision: partial
-
Referee: Prediction of success rates is performed on the same NS-generated instances used to extract the LON features (Abstract). This raises a circularity risk if the novelty objective influences the feature distributions; the manuscript must specify whether held-out validation, cross-validation across independent generations, or external test landscapes were employed.
Authors: The prediction experiments used 10-fold cross-validation within the NS-generated set to separate training and test instances. We will make this explicit in the methods and results sections. To further address potential distribution shift, we will add a supplementary experiment reporting prediction performance on an independent collection of MSG landscapes generated by uniform random sampling of parameters (i.e., without NS). revision: yes
-
Referee: The abstract asserts that 'the proposed BoAs closely align with gradient-based BoAs' and that prediction 'successfully' occurs, yet supplies no quantitative metrics, error bars, sample sizes, or statistical tests. These details are load-bearing for assessing whether the alignment and predictive performance support the central claims.
Authors: The body of the manuscript already contains the quantitative results (overlap statistics, sample size N=500, cross-validation error bars, and significance tests). We agree the abstract should be more informative and will revise it to include the key numerical findings while remaining within length limits. revision: yes
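The second response above describes a 10-fold cross-validation protocol for predicting success rates from LON features. A minimal sketch of such a setup in scikit-learn might look as follows; the synthetic feature matrix, target values, and regressor choice are placeholders rather than the paper's data or model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Placeholder illustration of a 10-fold cross-validation protocol: X would hold
# per-instance LON features and y the measured EA success rates. The synthetic
# data and the regressor choice are assumptions, not the paper's setup.

rng = np.random.default_rng(3)
X = rng.uniform(size=(500, 8))   # placeholder LON feature matrix
y = rng.uniform(size=500)        # placeholder success rates in [0, 1]

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=10, scoring="r2")
print(f"10-fold R^2: {scores.mean():.3f} +/- {scores.std():.3f}")
```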
Circularity Check
No significant circularity; empirical pipeline is self-contained
Full rationale
The paper defines an alternative BoA for MSG landscapes to bypass search, uses NS to generate instances with varied topologies, reports alignment between proposed and gradient-based BoAs, and performs feature-based prediction of EA success rates on the generated set. No step reduces a claimed prediction or result to its own inputs by construction, self-citation, or fitted parameter. The LON construction and regression are presented as independent empirical steps with external validation against gradient BoAs; the framework is dataset-generating rather than self-referential.