On penalty methods for nonconvex bilevel optimization and first-order stochastic approximation
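For context: the penalty approach named in the title replaces the nested bilevel problem with a single-level surrogate. A minimal sketch of one standard reformulation (notation mine; the paper's precise penalty scaling and stationarity guarantees differ in detail):

```latex
% Bilevel problem: upper-level objective f, lower-level objective g.
\min_{x} \; f\bigl(x, y^{*}(x)\bigr)
\quad \text{s.t.} \quad
y^{*}(x) \in \operatorname*{arg\,min}_{y} g(x, y)

% Single-level penalty surrogate: penalize the lower-level
% suboptimality gap with weight \sigma > 0; as \sigma grows,
% stationary points of the surrogate approximate bilevel solutions.
\min_{x,\, y} \; f(x, y) + \sigma \Bigl( g(x, y) - \min_{z} g(x, z) \Bigr)
```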
3 Pith papers cite this work.

[Chart: citing papers per year]

3 representative citing papers
- On the Nature of Regularity Assumptions in Bilevel Optimization with Constrained Lower-level Problem
  Requiring LICQ, SCS, and SOSC to hold everywhere in bilevel optimization is non-prevalent and rigid, whereas requiring them to hold almost everywhere is prevalent; the distinction, however, introduces fundamental algorithmic difficulties (the three conditions are stated below the list).
- Second-Order Bilevel Optimization with Accelerated Convergence Rates
  Second-order bilevel methods reach second-order stationary points in Õ(ε^{-1.5}) iterations, faster than first-order approaches, and a lazy variant improves computational efficiency by a factor of √d (see the reuse sketch below).
- BROS: Bias-Corrected Randomized Subspaces for Memory-Efficient Single-Loop Bilevel Optimization
  BROS achieves memory-efficient, single-loop stochastic bilevel optimization with O(ε^{-2}) sample complexity by performing updates in randomized subspaces and applying a Rademacher bi-probe correction to keep the resulting estimates unbiased (an unbiasedness check follows the list).
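For reference on the first item: the three regularity conditions, stated in standard form for a lower-level problem min_y g(x, y) subject to h_i(x, y) ≤ 0 with multipliers λ_i (notation mine):

```latex
% Active set at a feasible point y:  A(x, y) = \{\, i : h_i(x, y) = 0 \,\}.

% LICQ (linear independence constraint qualification):
\{\, \nabla_y h_i(x, y) : i \in A(x, y) \,\} \ \text{is linearly independent.}

% SCS (strict complementary slackness): active constraints carry
% strictly positive multipliers.
\lambda_i > 0 \quad \text{for all } i \in A(x, y).

% SOSC (second-order sufficient condition): the Lagrangian
% L(x, y, \lambda) = g(x, y) + \sum_i \lambda_i h_i(x, y) has a
% positive definite Hessian on the critical cone C(x, y):
v^{\top} \nabla^2_{yy} L(x, y, \lambda)\, v > 0
\quad \text{for all } v \in C(x, y) \setminus \{0\}.
```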
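On the second item: assuming the "lazy variant" refers to lazy Hessian reuse (re-evaluating and re-factoring the Hessian only every few iterations, the mechanism behind √d-type savings in Hessian work for lazy Newton methods), here is a toy loop showing the reuse pattern. This is an illustrative sketch under that assumption, not the cited paper's bilevel algorithm; `lazy_newton` and its refresh schedule are my own:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def lazy_newton(grad, hess, x0, steps=50, reuse=None, damping=1e-6):
    """Damped Newton iteration with lazy Hessian reuse: the Hessian is
    re-evaluated and re-factored only every `reuse` steps, and the
    cached Cholesky factorization is reused in between (hypothetical
    toy, not the cited bilevel method)."""
    x = np.asarray(x0, dtype=float).copy()
    d = x.size
    reuse = reuse or max(1, int(np.sqrt(d)))   # illustrative schedule
    factor = None
    for t in range(steps):
        if t % reuse == 0:
            # The expensive step: one Hessian evaluation + factorization,
            # amortized over the next `reuse` iterations.
            factor = cho_factor(hess(x) + damping * np.eye(d))
        x = x - cho_solve(factor, grad(x))     # cheap reuse of the factor
    return x

# Quick check on the convex quadratic 0.5 * x^T A x - b^T x.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6)); A = A @ A.T + np.eye(6)
b = rng.standard_normal(6)
x = lazy_newton(lambda x: A @ x - b, lambda x: A, np.zeros(6))
print(np.allclose(x, np.linalg.solve(A, b), atol=1e-6))  # True
```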
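On the third item: the basic unbiasedness of Rademacher subspace estimates is easy to verify. For z uniform on {-1, +1}^d one has E[z zᵀ] = I, so averaging k probe directions z_i weighted by the directional components z_iᵀ g recovers g in expectation. A minimal sketch (names mine; BROS's bi-probe correction and its stochastic oracles are more involved, and in practice z_iᵀ g would come from a probe rather than a full gradient, which is where the memory saving enters):

```python
import numpy as np

def rademacher_subspace_estimate(g, k, rng):
    """Unbiased estimate of a vector g from k Rademacher probes:
    E[(1/k) * sum_i z_i (z_i^T g)] = g because E[z z^T] = I."""
    Z = rng.choice([-1.0, 1.0], size=(k, g.size))  # k probe directions
    return Z.T @ (Z @ g) / k                       # lift projections back

# Empirical check of unbiasedness: the Monte Carlo mean approaches g.
rng = np.random.default_rng(0)
g = rng.standard_normal(8)
avg = np.mean([rademacher_subspace_estimate(g, k=2, rng=rng)
               for _ in range(100_000)], axis=0)
print(np.max(np.abs(avg - g)))   # small (~1e-2): estimator is unbiased
```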