pith. machine review for the scientific record.

arxiv: 2605.04087 · v1 · submitted 2026-04-23 · 🧮 math.OC · cs.LG · stat.CO · stat.ML

Recognition: unknown

BOOOM: Loss-Function-Agnostic Black-Box Optimization over Orthonormal Manifolds for Machine Learning and Statistical Inference

Authors on Pith: no claims yet

Pith reviewed 2026-05-09 20:36 UTC · model grok-4.3

classification 🧮 math.OC · cs.LG · stat.CO · stat.ML
keywords black-box optimization · Stiefel manifold · Givens rotations · derivative-free optimization · manifold optimization · pattern search · orthonormal matrices · statistical inference

The pith

A global Givens rotation parametrization reduces black-box optimization over column-orthonormal matrices to unconstrained Euclidean search.

A machine-rendered reading of the paper's core claim, the machinery that carries it, and where it could break.

The paper develops a framework for optimizing arbitrary loss functions over the Stiefel manifold of column-orthonormal matrices without gradients, convexity, or smoothness assumptions. It does so by mapping the manifold exactly to an unconstrained space of rotation angles via global Givens rotations, then applying a structured recursive pattern search that explores through plane-wise rotations. A reader would care because tasks such as independent component analysis, matrix decompositions, and supervised PCA often require orthonormal constraints yet encounter non-smooth or multimodal objectives where standard methods fail. If the mapping preserves feasibility exactly and the search transfers stationarity while converging globally in probability, the approach supplies a practical derivative-free alternative for these settings.
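The mapping is easy to sketch: compose plane rotations and read off the first d columns. A minimal Python illustration (an over-parametrized toy using all p(p−1)/2 planes, not the paper's exact global construction; the manifold itself has dimension pd − d(d+1)/2):

```python
import numpy as np

def givens(p, i, j, theta):
    """p x p Givens rotation acting in the (i, j) coordinate plane."""
    G = np.eye(p)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = G[j, j] = c
    G[i, j], G[j, i] = -s, s
    return G

def stiefel_point(thetas, p, d):
    """Map an angle vector to a point on St(p, d): multiply the
    Givens rotations for every plane, then keep the first d columns."""
    Q = np.eye(p)
    k = 0
    for i in range(p - 1):
        for j in range(i + 1, p):
            Q = Q @ givens(p, i, j, thetas[k])
            k += 1
    return Q[:, :d]

p, d = 4, 2
m = p * (p - 1) // 2                      # number of rotation planes
rng = np.random.default_rng(0)
X = stiefel_point(rng.uniform(0, 2 * np.pi, m), p, d)
print(np.allclose(X.T @ X, np.eye(d)))    # True: feasibility holds exactly
```

Feasibility holds by construction: each Givens factor is orthogonal, so any product is, and its leading d columns are orthonormal for every angle vector.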

Core claim

BOOOM employs a global Givens rotation-based parametrization that maps the Stiefel manifold St(p,d) to an unconstrained Euclidean angle space while preserving feasibility exactly. Building on this representation, it applies a structured, parallelizable, derivative-free search based on Recursive Modified Pattern Search, enabling systematic exploration through plane-wise rotations without requiring gradient information. The framework establishes equivalence between angle-space and manifold optimization, transfer of stationarity, and global convergence in probability under mild conditions.

What carries the argument

The global Givens rotation-based parametrization that maps the manifold of column-orthonormal matrices to an unconstrained Euclidean angle space while preserving feasibility exactly, allowing derivative-free plane-wise search to operate without constraints.
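Once the constraint is absorbed into the angles, the search reduces to ordinary coordinate-wise pattern search in Euclidean space. A simplified stand-in for the paper's Recursive Modified Pattern Search (the toy objective, step schedule, and stopping rule are illustrative assumptions, not the paper's algorithm):

```python
import numpy as np

def pattern_search(f, theta0, step=0.5, shrink=0.5, tol=1e-6, max_iter=2000):
    """Fermi-style pattern search: try +/- step along each coordinate,
    keep any improving move, and shrink the step when nothing improves."""
    theta = np.array(theta0, dtype=float)
    best = f(theta)
    it = 0
    while step > tol and it < max_iter:
        improved = False
        for k in range(theta.size):
            for s in (step, -step):
                cand = theta.copy()
                cand[k] += s
                val = f(cand)
                if val < best:
                    theta, best = cand, val
                    improved = True
        if not improved:
            step *= shrink
        it += 1
    return theta, best

# toy multimodal objective in one angle coordinate
f = lambda t: np.sin(3 * t[0]) + t[0] ** 2 / 10
theta_star, val = pattern_search(f, [2.0])
print(theta_star, val)
```

In BOOOM the coordinates would be rotation angles, so every trial point maps back to a feasible orthonormal matrix; derivative-free moves of this kind are what make non-smooth losses tractable.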

If this is right

  • The method applies to heterogeneous quadratic optimization, low-rank and sparse matrix decomposition, independent component analysis, and orthogonal joint diagonalization.
  • It achieves strong performance relative to existing methods in non-smooth and highly multimodal regimes.
  • Stationarity transfers between the angle space and the manifold, supporting reliable optimization.
  • A supervised PCA formulation based on the framework applies to metabolomics data in colorectal cancer.

Where Pith is reading between the lines

These are editorial extensions of the paper, not claims the author makes directly.

  • Similar exact parametrizations could extend the approach to other compact manifolds encountered in statistical models.
  • The derivative-free nature may reduce sensitivity to poor local optima when initializing high-dimensional inference procedures.
  • Practitioners could benchmark the search against Riemannian methods on additional multimodal objectives arising in neural network layers with orthogonality constraints.
  • Parallel plane-wise rotations suggest scalability improvements for larger matrix sizes in scientific computing applications.

Load-bearing premise

The global Givens rotation-based parametrization maps the manifold to an unconstrained Euclidean angle space while preserving feasibility exactly, and the recursive modified pattern search converges globally in probability under mild conditions.

What would settle it

A concrete counterexample in which the angle parametrization fails to reach every point on the Stiefel manifold, or in which the pattern search fails to locate a known global optimum with probability approaching one on a multimodal test problem over orthonormal matrices.

Figures

Figures reproduced from arXiv: 2605.04087 by Beomchang Kim, Priyam Das, Subhrajyoty Roy.

Figure 1: Black-box optimization over orthogonal transformations in a pretrained image classifier.
Figure 2: Fermi's principle: possible 2N coordinate-wise movements from a current point in R^N with fixed step size s. Fermi's principle forms the basis of Pattern Search (PS) methods and related direct search algorithms (Torczon, 1997; Kolda et al., 2003). These methods iteratively explore structured sets of candidate points and adapt the step size based on improvement, providing a robust framework for optimizing …
Figure 3: The proposed extension of Fermi's principle to the Stiefel manifold. In contrast to Euclidean coordinate perturbations, which act independently along axes, performing these structured rotations via the Givens rotation operator as in (3) ensures that the iterates remain entirely within the feasible set. This establishes a direct analogue of coordinate-wise exploration in a curved, constrained s…
Figure 4: BOOOM flowchart.
Figure 5: Comparison of BOOOM, BOOOM-parallel, and four Riemannian optimization baselines.
Figure 6: Maximization of heterogeneous quadratic forms: maximum value comparison between …
Figure 7: Low-rank and sparse matrix decomposition: mean absolute error (MAE) comparison on …
Figure 8: Independent component analysis: performance comparison across four problem dimensions.
Figure 9: Varimax factor rotation: comparison of optimization performance.
Figure 10: Orthogonal joint diagonalization: comparison of optimization performance.
Figure 11: Reduced Kohn–Sham Rayleigh–Ritz optimization: each column corresponds to a reduced …
Figure 12: Pareto curves showing the relationship between sparsity (proportion of non-zero columns) …
Figure 13: Top 20 metabolites ranked by their importance scores, defined by the row norms of …
Original abstract

Optimization over the Stiefel manifold $\mathrm{St}(p,d)$, the set of $p \times d$ column-orthonormal matrices, is fundamental in statistics, machine learning, and scientific computing, yet remains challenging in the presence of non-convex, non-smooth, or black-box objectives. Existing methods largely rely on either convex relaxations or gradient-based Riemannian optimization, limiting applicability in derivative-free and highly multimodal settings. We propose \textsc{BOOOM} (Black-box Optimization Over Orthonormal Manifolds), a general-purpose framework for loss-function-agnostic optimization on $\mathrm{St}(p,d)$. The key idea is a global Givens rotation-based parametrization that maps the manifold to an unconstrained Euclidean angle space while preserving feasibility exactly. Building on this representation, BOOOM employs a structured, parallelizable, derivative-free search based on Recursive Modified Pattern Search, enabling systematic exploration through plane-wise rotations without requiring gradient information and facilitating escape from poor local optima. We establish a unified theoretical framework showing equivalence between angle-space and manifold optimization, transfer of stationarity, and global convergence in probability under mild conditions. Empirical results across diverse problems, including heterogeneous quadratic optimization, low-rank and sparse matrix decomposition, independent component analysis, and orthogonal joint diagonalization, among other widely studied settings, demonstrate strong performance relative to state-of-the-art methods, particularly in non-smooth and highly multimodal regimes. We further illustrate its practical utility through a novel supervised PCA formulation applied to metabolomics data in colorectal cancer.

Editorial analysis

A structured set of objections, weighed in public.

Desk editor's note, referee report, simulated authors' rebuttal, and a circularity audit. Tearing a paper down is the easy half of reading it; the pith above is the substance, this is the friction.

Referee Report

2 major / 2 minor

Summary. The paper proposes BOOOM, a black-box optimization framework for the Stiefel manifold St(p,d) based on a global Givens rotation parametrization that maps the manifold to unconstrained Euclidean angle space while exactly preserving feasibility. It combines this with a recursive modified pattern search for derivative-free exploration and provides a unified theory establishing equivalence to manifold optimization, stationarity transfer, and global convergence in probability under mild conditions. Empirical results are reported on heterogeneous quadratic optimization, low-rank/sparse decomposition, ICA, orthogonal joint diagonalization, and a novel supervised PCA application to metabolomics data, claiming superiority in non-smooth and multimodal regimes.

Significance. If the parametrization and convergence results hold, the work would supply a practical derivative-free alternative for non-convex, non-smooth optimization over orthonormal constraints, with potential impact on ICA, PCA variants, and matrix factorization tasks where gradients are unavailable or unreliable. The explicit handling of feasibility via angles is a constructive strength.

major comments (2)
  1. [§4] §4 (Theoretical Framework): the global convergence in probability for the recursive modified pattern search on unbounded R^m is asserted under 'mild conditions,' but the 2π-periodicity of each Givens rotation renders the angle-space map many-to-one and the objective periodic; without explicit compactification, modulo identification, or coercivity assumptions in the conditions, the global-in-probability claim does not follow from the local stationarity transfer.
  2. [§3.1] §3.1 (Parametrization): the claim that the global Givens rotation-based map 'preserves feasibility exactly' and enables full equivalence of optimization problems requires a precise statement of the angle-space objective and its relation to the original manifold loss; the many-to-one nature must be shown not to introduce spurious stationary points that violate stationarity transfer.
minor comments (2)
  1. [Abstract] The abstract lists applications but omits quantitative metrics or baseline names; a short table summarizing win rates or objective values would improve clarity.
  2. [§3] Notation for the angle dimension m and the recursive pattern-search parameters should be introduced with explicit definitions before their use in the convergence theorem.

Simulated Author's Rebuttal

2 responses · 0 unresolved

We thank the referee for the careful and constructive review of our manuscript. The comments on the theoretical sections are well-taken and will help us improve the rigor and clarity of the presentation. We respond to each major comment below.

Point-by-point responses
  1. Referee: [§4] §4 (Theoretical Framework): the global convergence in probability for the recursive modified pattern search on unbounded R^m is asserted under 'mild conditions,' but the 2π-periodicity of each Givens rotation renders the angle-space map many-to-one and the objective periodic; without explicit compactification, modulo identification, or coercivity assumptions in the conditions, the global-in-probability claim does not follow from the local stationarity transfer.

    Authors: We appreciate the referee's observation on the periodicity induced by the Givens parametrization. The angle-space objective is indeed 2π-periodic in each coordinate, so the map from R^m to St(p,d) is many-to-one. In the current analysis the mild conditions are intended to ensure that the recursive modified pattern search explores the space sufficiently to reach global optimality in probability. To make this rigorous, we will revise §4 to explicitly address the periodicity: we will add a remark noting that, without loss of generality, the search may be restricted to a single fundamental domain [0,2π)^m (or equivalently performed on the compact torus), and we will verify that the stationarity-transfer and convergence arguments carry over directly to this compact setting. This clarification removes the need for additional coercivity assumptions while preserving the stated global convergence result. revision: yes
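The proposed fix is easy to check numerically: shifting any single angle by 2π returns the same manifold point, so restricting the search to the fundamental domain [0, 2π)^m loses nothing. A toy verification (illustrative over-parametrized map, not the paper's exact construction):

```python
import numpy as np

def givens(p, i, j, theta):
    """p x p Givens rotation in the (i, j) plane."""
    G = np.eye(p)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = G[j, j] = c
    G[i, j], G[j, i] = -s, s
    return G

def Q(thetas, p=3, d=2):
    """Angle vector -> point on St(p, d) via a product of Givens rotations."""
    M = np.eye(p)
    k = 0
    for i in range(p - 1):
        for j in range(i + 1, p):
            M = M @ givens(p, i, j, thetas[k])
            k += 1
    return M[:, :d]

theta = np.array([0.3, 1.1, 2.7])
shifted = theta + 2 * np.pi * np.eye(3)[1]   # shift one coordinate by 2*pi
print(np.allclose(Q(theta), Q(shifted)))      # True: same manifold point
```

Because every trigonometric factor is 2π-periodic, the composed objective f(θ) = L(Q(θ)) inherits the same periodicity coordinate-wise, which is what makes the torus restriction lossless.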

  2. Referee: [§3.1] §3.1 (Parametrization): the claim that the global Givens rotation-based map 'preserves feasibility exactly' and enables full equivalence of optimization problems requires a precise statement of the angle-space objective and its relation to the original manifold loss; the many-to-one nature must be shown not to introduce spurious stationary points that violate stationarity transfer.

    Authors: We agree that a more formal statement of the objective and the stationarity equivalence would strengthen §3.1. The parametrization is constructed so that for every θ ∈ R^m the matrix Q(θ) obtained by the product of Givens rotations lies exactly in St(p,d), thereby preserving feasibility by design. The angle-space objective is defined by composition: f(θ) := L(Q(θ)), where L denotes the original loss on the manifold. We will augment the section with an explicit proposition that (i) states the precise relationship f(θ) = L(Q(θ)), (ii) shows that the differential of the parametrization map transfers stationarity (i.e., ∇f(θ*) = 0 implies that the Riemannian gradient of L at Q(θ*) vanishes), and (iii) confirms that the many-to-one character does not create extraneous stationary points because any two angles mapping to the same Q yield equivalent critical-point conditions. The revised text will include this proposition together with a brief proof outline based on the chain rule. revision: yes
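In symbols, the proposition the authors sketch is the chain rule plus a rank condition (the notation below is our paraphrase, not the paper's):

```latex
f(\theta) := L(Q(\theta)), \qquad
Df(\theta)[v] = \big\langle \operatorname{grad} L(Q(\theta)),\; DQ(\theta)[v] \big\rangle
\quad \text{for all } v \in \mathbb{R}^m .
```

If $DQ(\theta^{*})$ maps $\mathbb{R}^m$ onto the tangent space $T_{Q(\theta^{*})}\mathrm{St}(p,d)$, then $\nabla f(\theta^{*}) = 0$ forces $\operatorname{grad} L(Q(\theta^{*})) = 0$; that surjectivity condition is exactly what rules out spurious angle-space stationary points.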

Circularity Check

0 steps flagged

No significant circularity detected

full rationale

The paper introduces a novel global Givens rotation-based parametrization that maps St(p,d) to unconstrained Euclidean angles while preserving feasibility, then builds a recursive modified pattern search on this representation and derives equivalence, stationarity transfer, and global convergence in probability under stated mild conditions. These theoretical results are obtained by direct analysis of the proposed map and algorithm rather than by reducing to fitted parameters, self-definitional loops, or load-bearing self-citations. The derivation chain remains self-contained against external benchmarks, with the central claims resting on the new constructions themselves.

Axiom & Free-Parameter Ledger

0 free parameters · 2 axioms · 0 invented entities

The framework rests on the exact equivalence of the Givens angle parametrization with the original manifold constraint and on convergence guarantees that hold only under unspecified mild conditions. No free parameters or new entities are introduced in the abstract.

axioms (2)
  • domain assumption A global Givens rotation-based parametrization maps the Stiefel manifold to unconstrained Euclidean angle space while preserving feasibility exactly.
    This is stated as the key idea enabling the entire approach.
  • ad hoc to paper The recursive modified pattern search achieves global convergence in probability under mild conditions.
    Central theoretical claim whose precise conditions are not detailed.

pith-pipeline@v0.9.0 · 5599 in / 1586 out tokens · 70960 ms · 2026-05-09T20:36:24.188457+00:00 · methodology

discussion (0)


Reference graph

Works this paper leans on

300 extracted references · 8 canonical work pages · 1 internal anchor

  1. Diversifying Counterattacks: Orthogonal Exploration for Robust CLIP Inference. Proceedings of the AAAI Conference on Artificial Intelligence.
  2. Evading adversarial example detection defenses with orthogonal projected gradient descent. arXiv preprint arXiv:2106.15023.
  3. Synthesizing robust adversarial examples. International Conference on Machine Learning, 2018.
  4. Orthogonal deep neural networks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2019.
  5. O-ViT: Orthogonal vision transformer. arXiv preprint arXiv:2201.12133.
  6. Deep residual learning for image recognition. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition.
  7. An overview of gradient descent optimization algorithms. arXiv preprint arXiv:1609.04747.
  8. Quantum Computation and Quantum Information. 2010.
  9. A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization. Mathematical Programming.
  10. On orthogonality and learning recurrent networks with long term dependencies. International Conference on Machine Learning (ICML).
  11. Orthogonal Weight Normalization: Solution to Optimization over Multiple Dependent Stiefel Manifolds in Deep Neural Networks. AAAI Technical Track: Machine Learning.
  12. Numerical Methods for Large Eigenvalue Problems.
  13. Non-convex robust PCA. Advances in Neural Information Processing Systems (NeurIPS).
  14. A singular value thresholding algorithm for matrix completion. SIAM Journal on Optimization.
  15. Edelman, A., Arias, T., and Smith, S. SIAM Journal on Matrix Analysis and Applications.
  16. Multiobjective Optimization: Principles and Case Studies. 2003. doi:10.1007/978-3-662-08883-8.
  17. Self-consistent equations including exchange and correlation effects. Physical Review.
  18. ACM Transactions on Mathematical Software.
  19. Computer Physics Communications.
  20. Adaptive local basis set for Kohn–Sham density functional theory in a discontinuous Galerkin framework: Total energy calculation. Journal of Computational Physics.
  21. Linear scaling electronic structure methods. Reviews of Modern Physics.
  22. Jacobi angles for simultaneous diagonalization. SIAM Journal on Matrix Analysis and Applications.
  23. Joint approximate diagonalization of positive definite Hermitian matrices. SIAM Journal on Matrix Analysis and Applications.
  24. A simple general procedure for orthogonal rotation. Psychometrika.
  25. The varimax criterion for analytic rotation in factor analysis. Psychometrika.
  26. Principal Component Analysis.
  27. A new learning algorithm for blind signal separation. Advances in Neural Information Processing Systems.
  28. Independent Component Analysis. 2001.
  29. Fast and robust fixed-point algorithms for independent component analysis. IEEE Transactions on Neural Networks.
  30. An information-maximization approach to blind separation and blind deconvolution. Neural Computation.
  31. Faster ICA under orthogonal constraint. IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP).
  32. Blind signal separation: statistical principles. Proceedings of the IEEE.
  33. Metagenomic and metabolomic analyses reveal distinct stage-specific phenotypes of the gut microbiota in colorectal cancer. 2019.
  34. Robust principal component analysis? Journal of the ACM.
  35. R1-PCA: rotational invariant L1-norm principal component analysis for robust subspace factorization. Proceedings of the 23rd International Conference on Machine Learning.
  36. Robust Principal Component Analysis using Density Power Divergence. Journal of Machine Learning Research.
  37. Robust regularized singular value decomposition with application to mortality data. The Annals of Applied Statistics, 2013.
  38. Efficient dictionary learning with gradient descent. International Conference on Machine Learning, 2019.
  39. Direct minimization on the complex Stiefel manifold in Kohn-Sham density functional theory for finite and extended systems. Computer Physics Communications, 2025.
  40. A comparison of efficient approximations for a weighted sum of chi-squared random variables. Statistics and Computing, 2016.
  41. Gilman, K., Burer, S., and Balzano, L. A semidefinite relaxation for sums of heterogeneous quadratic forms on the …. 2025.
  42. A framework of constraint preserving update schemes for optimization on Stiefel manifold. Mathematical Programming, 2015.
  43. Weakly convex optimization over Stiefel manifold using Riemannian subgradient-type methods. SIAM Journal on Optimization, 2021.
  44. Nonsmooth optimization over the Stiefel manifold and beyond: Proximal gradient method and recent variants. SIAM Review, 2024.
  45. IEEE Transactions on Neural Networks and Learning Systems, 2017.
  46. Accelerated alternating projections for robust principal component analysis. Journal of Machine Learning Research.
  47. Bi-cross-validation of the SVD and the nonnegative matrix factorization. 2009.
  48. Low-rank and sparse matrix decomposition via the truncated nuclear norm and a sparse regularizer. The Visual Computer, 2019.
  49. Givens Coordinate Descent Methods for Rotation Matrix Learning in Trainable Embedding Indexes. Proceedings of the International Conference on Learning Representations (ICLR).
  50. Absil, P., Mahony, R., and Sepulchre, R. 2008.
  51. Lee, J. 2018.
  52. Boumal, N. 2023.
  53. Nesterov, Y. 2004.
  54. Beck, A. 2017.
  55. Billingsley, P. 1995.
  56. Hurwitz, A. Mathematische Werke.
  57. Balusha, A. and Morrow, S. 2024.
  58. Estimation of sparse covariance matrix via non-convex regularization. Journal of Multivariate Analysis, 2024.
  59. Robust estimation of high-dimensional covariance and precision matrices. Biometrika, 2018.
  60. Quick Start Parallel Computing in MATLAB.
  61. Kim, B., Xia, Z., and Das, P.
  62. Tan, Q. and Ghosal, S. Nonparametric Statistics. ISNPS 2018, 2020.
  63. Audet, C. and Dennis, J. E. SIAM Journal on Optimization.
  64. Manopt. Journal of Machine Learning Research.
  65. Blumenson, L. E. American Mathematical Monthly.
  66. Introductory Lectures on Convex Optimization: A Basic Course. 2004.
  67. Sphere Packings, Lattices and Groups. 1988. doi:10.1007/978-1-4757-2016-7.
  68. Gelfand, S. B. and Mitter, S. K. Journal of Optimization Theory and Applications.
  69. Hajek, B. Mathematics of Operations Research, 1988.
  70. Kolda, G., Lewis, R., and Torczon, V. SIAM Review, 2003.
  71. Methods to balance the exploration and exploitation in Differential Evolution from different scales: A survey. Neurocomputing.
  72. The Exploration-Exploitation Dilemma: A Multidisciplinary Framework.
  73. Efficient parallel genetic algorithms: theory and practice. Computer Methods in Applied Mechanics and Engineering, 2000.
  74. Kennedy, J. and Eberhart, R. Proceedings of ICNN'95 – International Conference on Neural Networks.
  75. Convex Optimization. 2004.
  76. Bayesian quantile regression using random B-spline series prior. Computational Statistics & Data Analysis, 2017.
  77. Zhu, L. and Xue, L. Journal of the Royal Statistical Society: Series B, 2006.
  78. Carroll, R., Fan, J., Gijbels, I., et al. Journal of the American Statistical Association, 1997.
  79. A Multinomial Extension of the Linear Logit Model. International Economic Review.
  80. Tibshirani, R. Journal of the Royal Statistical Society: Series B (Statistical Methodology).

Showing first 80 references.