Pith · machine review for the scientific record

arxiv: 2604.21468 · v1 · submitted 2026-04-23 · 💻 cs.NE

Recognition: unknown

Novelty-Based Generation of Continuous Landscapes with Diverse Local Optima Networks


Pith reviewed 2026-05-08 13:10 UTC · model grok-4.3

classification 💻 cs.NE
keywords local optima networks · basins of attraction · max-set of gaussians · novelty search · continuous optimization · evolutionary algorithms · landscape analysis

The pith

An alternative definition of basins of attraction for Max-Set of Gaussians landscapes allows low-cost construction of local optima networks, which novelty search then uses to generate diverse instances whose network features predict evolutionary algorithm success rates.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper proposes an alternative definition of basins of attraction for Max-Set of Gaussians landscapes that is computed directly from the Gaussian parameters instead of by running a search algorithm. This change removes the main computational barrier to building local optima networks in continuous spaces, where networks normally require repeated local searches to locate optima and approximate transitions between basins. The authors apply novelty search to the parameters of the landscape generator to produce many instances that differ in search difficulty and in the connectivity patterns among their optima. Experiments confirm that the new basins match those found by gradient methods and that features of the resulting networks can forecast success rates for two established evolutionary algorithms. The output is a dataset of continuous problems with known network structure that supports further study of landscape-aware optimization.
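The mechanism can be illustrated with a small sketch. In an MSG landscape the fitness is the max over a set of Gaussian components, so one natural parameter-based basin definition assigns each point to the component attaining that max; this is a hedged reconstruction of the idea, not the paper's exact formula, and the component parameters below are invented for illustration.

```python
import math

# A 1-D Max-Set of Gaussians landscape: f(x) = max_i h_i * exp(-(x - mu_i)^2 / (2 s_i^2)).
# Component parameters (means, widths, heights) are illustrative, not from the paper.
COMPONENTS = [
    {"mu": -2.0, "s": 0.6, "h": 0.7},
    {"mu": 0.5, "s": 0.8, "h": 1.0},  # largest height: the global optimum
    {"mu": 3.0, "s": 0.5, "h": 0.4},
]

def component_value(c, x):
    return c["h"] * math.exp(-((x - c["mu"]) ** 2) / (2 * c["s"] ** 2))

def msg(x):
    """Landscape value: the max over all Gaussian components."""
    return max(component_value(c, x) for c in COMPONENTS)

def basin_index(x):
    """Parameter-based basin assignment: the index of the component that
    attains the max at x. No local search or gradient steps are needed."""
    return max(range(len(COMPONENTS)), key=lambda i: component_value(COMPONENTS[i], x))

# Points near each component's mean fall in that component's basin.
assert basin_index(-2.1) == 0
assert basin_index(0.4) == 1
assert basin_index(3.2) == 2
```

Because membership is a closed-form comparison of component values, basin (and hence LON) construction costs one component evaluation per sample point, which is the cost saving the review describes.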

Core claim

By defining basins of attraction for Max-Set of Gaussians landscapes directly from the Gaussian parameters rather than through search, Local Optima Networks can be constructed at low cost. Novelty Search is then applied to the landscape generator parameters to produce instances with varied search difficulty and connectivity patterns among optima. Experiments confirm alignment with gradient-based basins and demonstrate that LON features predict success rates of evolutionary algorithms.

What carries the argument

The alternative definition of basins of attraction for Max-Set of Gaussians landscapes, computed directly from the Gaussian parameters to serve as a proxy for true basins without search.

If this is right

  • The proposed basins of attraction closely align with those identified by gradient-based methods.
  • Novelty search successfully generates instances with varied search difficulty and connectivity patterns among optima.
  • Local optima network features can predict the success rates of two well-established evolutionary algorithms on the generated instances.
  • The overall framework supplies a dataset of continuous landscapes with known network properties for landscape-aware optimization research.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the authors make directly.

  • Similar parameter-based proxies for basins could be developed for other families of continuous functions to enable network analysis at scale.
  • The generated instances could serve as training data for machine-learning models that select or configure algorithms based on quick network descriptors.
  • Systematic variation of LON features across the dataset might isolate which connectivity statistics most strongly influence algorithm performance.

Load-bearing premise

The alternative definition of basins of attraction for MSG landscapes accurately captures true basins and serves as a valid proxy without requiring search-based identification.

What would settle it

Running gradient-based basin identification on a fresh collection of MSG landscapes and finding substantial disagreement with the proposed definition, or observing no reliable correlation between LON features and evolutionary algorithm success on additional generated instances.

Figures

Figures reproduced from arXiv: 2604.21468 by Kippei Mizuta, Shoichiro Tanaka, Shuhei Tanaka, Toshiharu Hatanaka.

Figure 1: An MSG landscape and its basins of attraction (d = 1, m = 3).

Figure 2: Comparison of BoA_ours and BoA_GD for the instance with the highest difference_rate (6.49%) at d = 2. The experimental results show that the median difference_rate is 2.04% for d = 2, 0.46% for d = 5, and 0.07% for d = 10.

Figure 3: MSG landscapes included in the initial population of NS+ and their corresponding LONs. The novelty score is calculated as the average distance to the k = 15 nearest neighbors in the feature space; the novelty threshold ρmin = 0.05 is dynamically adjusted based on the number of archive additions per generation, following existing studies [20, 21].

Figure 4: Box plots showing the final coverage of each instance group for d = 2, 5, and 10, with ten trials per dimensionality. The heatmaps on the right show feature-space coverage from a single representative trial: black cells contain the initial population; colored cells contain instances generated during the search (blue: Random, green: NS, orange: NS+).

Figure 5: success_rate of CMA-ES and DE in each cell (the median value is shown when a cell contains multiple instances). Prediction accuracy is assessed via 10-fold cross-validation over all 10,020 MSG landscapes generated by NS+ in RQ2, using CMA-ES [12] and DE [13] from the pymoo implementation [23], each benchmarked over 31 independent trials.

Figure 6: Spearman's rank correlation coefficient ρ between LON features and the performance of CMA-ES and DE, indicating that optimization becomes harder when problems have many local optima and a small global funnel. The success_rate distribution is more bimodal for DE than for CMA-ES: DE shows a sharp separation between regions of near-zero and near-one success rate with few intermediate values.
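The novelty-search machinery described in the Figure 3 caption can be sketched minimally: a candidate's novelty is its mean distance to the k = 15 nearest neighbors in feature space, and candidates above a threshold ρmin = 0.05 enter an archive. This is a generic reconstruction of that mechanism, not the authors' code; the feature vectors and distance metric are placeholders, and the dynamic threshold adjustment is omitted.

```python
import math

def novelty(candidate, population, k=15):
    """Novelty score: mean Euclidean distance from `candidate` to its k
    nearest neighbors among `population` (mirrors the k = 15 setting
    reported in the Figure 3 caption). If the population has fewer than
    k members, all of them are used."""
    dists = sorted(math.dist(candidate, other) for other in population)
    nearest = dists[:k]
    return sum(nearest) / len(nearest)

def update_archive(archive, candidate, rho_min=0.05):
    """Add `candidate` to the archive if its novelty with respect to the
    archive exceeds rho_min (0.05 in the paper's setup). Returns True
    when the candidate was added."""
    if not archive or novelty(candidate, archive) > rho_min:
        archive.append(candidate)
        return True
    return False

# Toy 2-D feature vectors: a near-duplicate is rejected, a distant one added.
archive = [(0.0, 0.0), (0.01, 0.0), (0.0, 0.01)]
assert not update_archive(archive, (0.005, 0.005))  # too close: rejected
assert update_archive(archive, (1.0, 1.0))          # novel: archived
```

Applied to the MSG generator, the feature vectors would be LON descriptors of each generated landscape, so the archive accumulates instances with mutually distinct network topologies.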
Original abstract

Local Optima Networks (LONs) represent the global structure of search spaces as graphs, but their construction requires iterative execution of a search algorithm to find local optima and approximate transitions between Basins of Attraction (BoAs). In continuous optimization, this high computational cost prevents systematic investigation of the relationship between LON features and evolutionary algorithm performance. To address this issue, we propose an alternative definition of BoAs for Max-Set of Gaussians (MSG) landscapes with explicitly tunable multimodality. This bypasses search-based BoA identification, enabling low-cost LON construction. Moreover, we leverage Novelty Search (NS) to explore the parameter space of the MSG landscape generator, producing instances with diverse graph topologies. Our experiments show that the proposed BoAs closely align with gradient-based BoAs, and that NS successfully generates instances with varied search difficulty and connectivity patterns among optima. Finally, over the instances generated by NS, we predict the success rate of two well-established evolutionary algorithms from LON features. While our LON construction is specific to MSG landscapes, the proposed framework provides a dataset that serves as a foundation for landscape-aware optimization.
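As a concrete illustration of what "LON features" could look like downstream of this pipeline, here is a minimal sketch. The graph, transition weights, basin sizes, and feature names are illustrative placeholders, not the paper's actual feature set.

```python
# A LON as an adjacency dict: local optimum -> {neighbor: transition weight}.
# Three optima with placeholder weights.
lon = {
    "opt_global": {"opt_a": 0.2},
    "opt_a": {"opt_global": 0.6, "opt_b": 0.1},
    "opt_b": {"opt_a": 0.5},
}
# Basin sizes as fractions of the domain (placeholders).
basin_size = {"opt_global": 0.55, "opt_a": 0.3, "opt_b": 0.15}

def n_optima(graph):
    """Number of nodes: the landscape's count of local optima."""
    return len(graph)

def mean_out_degree(graph):
    """Average number of outgoing transitions per optimum."""
    return sum(len(nbrs) for nbrs in graph.values()) / len(graph)

def global_basin_share(sizes, global_opt="opt_global"):
    """Relative size of the global optimum's basin -- a 'funnel size' style
    descriptor; the Figure 6 discussion reports that problems with many
    local optima and a small global funnel are harder."""
    return sizes[global_opt] / sum(sizes.values())

features = {
    "n_optima": n_optima(lon),
    "mean_out_degree": mean_out_degree(lon),
    "global_basin_share": global_basin_share(basin_size),
}
```

Scalar descriptors of this kind are what a regression model would consume when forecasting CMA-ES or DE success rates across generated instances.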

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, and this is the friction.

Referee Report

3 major / 1 minor

Summary. The paper proposes an alternative definition of basins of attraction (BoAs) for Max-Set of Gaussians (MSG) landscapes that enables direct, search-free construction of Local Optima Networks (LONs). It applies Novelty Search (NS) to explore the MSG parameter space and generate instances with diverse LON topologies. The authors claim that the proposed BoAs align closely with gradient-based BoAs, that NS produces varied search difficulties and connectivity patterns, and that LON features extracted from these instances can predict the success rates of two evolutionary algorithms.

Significance. If the alignment and predictive results hold under proper validation, the work supplies a computationally tractable framework and dataset for studying the relationship between continuous landscape structure and EA performance. The NS-driven generation of diverse instances with explicit LONs is a concrete strength that could support landscape-aware algorithm design and benchmarking.

major comments (3)
  1. The central prediction that LON features forecast EA success rates rests on the proposed BoA definition serving as a valid proxy. Alignment is reported only with gradient-based BoAs (Abstract), yet the target EAs are population-based and derivative-free; no direct comparison to the actual search trajectories or basin transitions of those EAs is described, leaving the proxy assumption unverified for the claimed prediction task.
  2. Prediction of success rates is performed on the same NS-generated instances used to extract the LON features (Abstract). This raises a circularity risk if the novelty objective influences the feature distributions; the manuscript must specify whether held-out validation, cross-validation across independent generations, or external test landscapes were employed.
  3. The abstract asserts that 'the proposed BoAs closely align with gradient-based BoAs' and that prediction 'successfully' occurs, yet supplies no quantitative metrics, error bars, sample sizes, or statistical tests. These details are load-bearing for assessing whether the alignment and predictive performance support the central claims.
minor comments (1)
  1. The abstract refers to 'two well-established evolutionary algorithms' without naming them; explicit identification (e.g., CMA-ES and differential evolution) would improve clarity.

Simulated Author's Rebuttal

3 responses · 0 unresolved

We thank the referee for the constructive report and for recognizing the potential of our NS-driven LON framework for landscape-aware optimization. We address each major comment below, indicating planned revisions to strengthen the manuscript.

Point-by-point responses
  1. Referee: The central prediction that LON features forecast EA success rates rests on the proposed BoA definition serving as a valid proxy. Alignment is reported only with gradient-based BoAs (Abstract), yet the target EAs are population-based and derivative-free; no direct comparison to the actual search trajectories or basin transitions of those EAs is described, leaving the proxy assumption unverified for the claimed prediction task.

    Authors: We acknowledge that direct validation against the trajectories of the specific population-based EAs would provide stronger evidence. Gradient-based BoAs were chosen as the reference because they offer an established, search-intensive definition of basins in continuous spaces; the derivative-free EAs still operate on the same underlying fitness landscape. In the revision we will expand the discussion to justify this proxy choice and add a limited empirical comparison (on a subset of instances) between LON transitions and sampled paths from the two EAs, where computationally feasible. revision: partial

  2. Referee: Prediction of success rates is performed on the same NS-generated instances used to extract the LON features (Abstract). This raises a circularity risk if the novelty objective influences the feature distributions; the manuscript must specify whether held-out validation, cross-validation across independent generations, or external test landscapes were employed.

    Authors: The prediction experiments used 10-fold cross-validation within the NS-generated set to separate training and test instances. We will make this explicit in the methods and results sections. To further address potential distribution shift, we will add a supplementary experiment reporting prediction performance on an independent collection of MSG landscapes generated by uniform random sampling of parameters (i.e., without NS). revision: yes

  3. Referee: The abstract asserts that 'the proposed BoAs closely align with gradient-based BoAs' and that prediction 'successfully' occurs, yet supplies no quantitative metrics, error bars, sample sizes, or statistical tests. These details are load-bearing for assessing whether the alignment and predictive performance support the central claims.

    Authors: The body of the manuscript already contains the quantitative results (overlap statistics, sample size N=500, cross-validation error bars, and significance tests). We agree the abstract should be more informative and will revise it to include the key numerical findings while remaining within length limits. revision: yes
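The 10-fold protocol invoked in response 2 can be sketched generically; the fold construction below is a standard recipe, not the authors' code, and the seed and sizes are arbitrary.

```python
import random

def kfold_indices(n, k=10, seed=0):
    """Split indices 0..n-1 into k disjoint folds. Each fold serves once
    as the held-out test set while the remaining folds train, so every
    instance is predicted exactly once by a model that never saw it --
    the separation the rebuttal relies on to rule out train/test leakage."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

folds = kfold_indices(100, k=10)
assert len(folds) == 10
# Folds are disjoint and jointly cover every instance exactly once.
assert sorted(i for fold in folds for i in fold) == list(range(100))
```

Note that this splitting does not address distribution shift from the novelty objective itself, which is why the rebuttal's separate experiment on randomly sampled (non-NS) landscapes is the stronger check.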

Circularity Check

0 steps flagged

No significant circularity; empirical pipeline is self-contained

Full rationale

The paper defines an alternative BoA for MSG landscapes to bypass search, uses NS to generate instances with varied topologies, reports alignment between proposed and gradient-based BoAs, and performs feature-based prediction of EA success rates on the generated set. No step reduces a claimed prediction or result to its own inputs by construction, self-citation, or fitted parameter. The LON construction and regression are presented as independent empirical steps with external validation against gradient BoAs; the framework is dataset-generating rather than self-referential.

Axiom & Free-Parameter Ledger

0 free parameters · 0 axioms · 0 invented entities

Abstract-only review provides no explicit equations or parameter lists, so no free parameters, axioms, or invented entities can be identified with certainty.

pith-pipeline@v0.9.0 · 10166 in / 1169 out tokens · 104976 ms · 2026-05-08T13:10:51.537513+00:00 · methodology


Reference graph

Works this paper leans on

25 extracted references · 1 canonical work page

  1. Gabriela Ochoa, Sébastien Verel, Fabio Daolio, and Marco Tomassini. Local optima networks: A new model of combinatorial fitness landscapes. In Recent Advances in the Theory and Application of Fitness Landscapes, pages 233–262. Springer, 2014.
  2. Sebastian Herrmann, Gabriela Ochoa, and Franz Rothlauf. PageRank centrality for performance prediction: The impact of the local optima network model. Journal of Heuristics, 24(3):243–264, 2018.
  3. Arnaud Liefooghe, Fabio Daolio, Sébastien Verel, Bilel Derbel, Hernan Aguirre, and Kiyoshi Tanaka. Landscape-aware performance prediction for evolutionary multiobjective optimization. IEEE Transactions on Evolutionary Computation, 24(6):1063–1077, 2019.
  4. Arnaud Liefooghe, Gabriela Ochoa, Sébastien Verel, and Bilel Derbel. Pareto local optimal solutions networks with compression, enhanced visualization and expressiveness. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO), pages 713–721. Association for Computing Machinery, 2023.
  5. Jason Adair, Gabriela Ochoa, and Katherine M. Malan. Local optima networks for continuous fitness landscapes. In Proceedings of the Genetic and Evolutionary Computation Conference Companion (GECCO Companion), pages 1407–1414. Association for Computing Machinery, 2019.
  6. Marco Tomassini. A local optima network view of real function fitness landscapes. Entropy, 24(5):703, 2022.
  7. Paul Mitchell, Gabriela Ochoa, and Romain Chassagne. Local optima networks of the black box optimisation benchmark functions. In Proceedings of the Companion Conference on Genetic and Evolutionary Computation (GECCO Companion), pages 2072–2080. Association for Computing Machinery, 2023.
  8. Jonathan E. Fieldsend. Scalable local optima networks for continuous search spaces. In International Conference on Artificial Evolution (Evolution Artificielle), pages 77–90. Springer, 2024.
  9. David J. Wales and Jonathan P. K. Doye. Global optimization by basin-hopping and the lowest energy structures of Lennard-Jones clusters containing up to 110 atoms. The Journal of Physical Chemistry A, 101(28):5111–5116, 1997.
  10. Hans-Georg Beyer and Hans-Paul Schwefel. Evolution strategies – a comprehensive introduction. Natural Computing, 1(1):3–52, 2002.
  11. Joel Lehman and Kenneth O. Stanley. Abandoning objectives: Evolution through the search for novelty alone. Evolutionary Computation, 19(2):189–223, 2011.
  12. Nikolaus Hansen and Andreas Ostermeier. Completely derandomized self-adaptation in evolution strategies. Evolutionary Computation, 9(2):159–195, 2001.
  13. Rainer Storn and Kenneth Price. Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, 11(4):341–359, 1997.
  14. Marcus Gallagher and Bo Yuan. A general-purpose tunable landscape generator. IEEE Transactions on Evolutionary Computation, 10(5):590–603, 2006.
  15. Sébastien Verel, Fabio Daolio, Gabriela Ochoa, and Marco Tomassini. Local optima networks with escape edges. In International Conference on Artificial Evolution (Evolution Artificielle), pages 49–60. Springer, 2011.
  16. Shuhei Tanaka, Shoichiro Tanaka, Kippei Mizuta, and Toshiharu Hatanaka. Designing landscape-aware benchmarks with explicit local optima for single- and multi-objective optimization. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO). Association for Computing Machinery, 2026. Forthcoming. Preprint: https://doi.org/10.5281/zenod...
  17. Ilya M. Sobol´. Distribution of points in a cube and approximate evaluation of integrals. USSR Computational Mathematics and Mathematical Physics, 7:86–112, 1967.
  18. Nikolaus Hansen, Steffen Finck, Raymond Ros, and Anne Auger. Real-parameter black-box optimization benchmarking 2009: Noiseless functions definitions. Technical report, INRIA, 2009.
  19. Sarah L. Thomson, Fabio Daolio, and Gabriela Ochoa. Comparing communities of optima with funnels in combinatorial fitness landscapes. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO), pages 377–384. Association for Computing Machinery, 2017.
  20. Joel Lehman and Kenneth O. Stanley. Exploiting open-endedness to solve problems through the search for novelty. In Proceedings of the Eleventh International Conference on the Synthesis and Simulation of Living Systems (ALIFE 2008), pages 329–336. MIT Press, 2008.
  21. Jorge Gomes, Pedro Mariano, and Anders Lyhne Christensen. Devising effective novelty search algorithms: A comprehensive empirical study. In Proceedings of the Genetic and Evolutionary Computation Conference (GECCO), pages 943–950. Association for Computing Machinery, 2015.
  22. F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, M. Blondel, P. Prettenhofer, R. Weiss, V. Dubourg, J. Vanderplas, A. Passos, D. Cournapeau, M. Brucher, M. Perrot, and E. Duchesnay. Scikit-learn: Machine learning in Python. Journal of Machine Learning Research, 12:2825–2830, 2011.
  23. Julian Blank and Kalyanmoy Deb. Pymoo: Multi-objective optimization in Python. IEEE Access, 8:89497–89509, 2020.
  24. James Montgomery and Stephen Chen. An analysis of the operation of differential evolution at high and low crossover rates. In 2010 IEEE Congress on Evolutionary Computation (CEC), pages 1–8. IEEE, 2010.
  25. Tae Jong Choi, Chang Wook Ahn, and Jinung An. An adaptive Cauchy differential evolution algorithm for global numerical optimization. The Scientific World Journal, 2013(1):969734, 2013.