Recognition: unknown
On Symplectic Optimization
Abstract
Accelerated gradient methods have had significant impact in machine learning -- in particular the theoretical side of machine learning -- due to their ability to achieve oracle lower bounds. But their heuristic construction has hindered their full integration into the practical machine-learning algorithmic toolbox, and has limited their scope. In this paper we build on recent work which casts acceleration as a phenomenon best explained in continuous time, and we augment that picture by providing a systematic methodology for converting continuous-time dynamics into discrete-time algorithms while retaining oracle rates. Our framework is based on ideas from Hamiltonian dynamical systems and symplectic integration. These ideas have had major impact in many areas in applied mathematics, but have not yet been seen to have a relationship with optimization.
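The core idea — discretizing Hamiltonian dynamics with a symplectic (semi-implicit) integrator so that the discrete algorithm inherits the continuous-time behavior — can be sketched in a few lines. The snippet below is a minimal illustration on a toy quadratic objective with added damping so the trajectory settles at the minimizer; the function names and parameter values are illustrative assumptions, not the paper's actual construction.

```python
def grad_f(q):
    # Gradient of the toy objective f(q) = 0.5 * q**2 (chosen for illustration).
    return q

def damped_symplectic_euler(q, p, step, n_steps, damping=0.05):
    """Semi-implicit (symplectic) Euler on the Hamiltonian system
    q' = p, p' = -grad f(q), with a damping factor on the momentum
    so the discrete trajectory converges to the minimizer of f.
    """
    for _ in range(n_steps):
        p = (1.0 - damping) * p - step * grad_f(q)  # momentum update uses current q
        q = q + step * p                            # position update uses NEW p
    return q, p

q, p = damped_symplectic_euler(q=1.0, p=0.0, step=0.1, n_steps=500)
print(abs(q) < 1e-3)  # → True: the iterate has settled near the minimizer q = 0
```

The key structural point is the update order: the momentum step uses the current position and the position step uses the freshly updated momentum, which is what makes the map symplectic (for zero damping) rather than a plain explicit Euler discretization.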
This paper has not been read by Pith yet.
Forward citations
Cited by 3 Pith papers
- Distributed Pose Graph Optimization via Continuous Riemannian Dynamics: Pose graph optimization is recast as damped Riemannian dynamics on Lie groups, enabling a fully distributed algorithm with a semi-implicit integrator that converges under both synchronous and asynchronous communication.
- When Descent Is Too Stable: Event-Triggered Hamiltonian Learning to Optimize: SHAPE lifts gradient descent to an augmented phase space with a learned Hamiltonian vector field and event-triggered port updates to balance descent, exploitation, and exploration, improving best-so-far performance ov...
- Foundations of Riemannian Geometry for Riemannian Optimization: A Monograph with Detailed Derivations: The monograph organizes and derives classical Riemannian geometry structures explicitly in coordinate and matrix form for direct use in optimization algorithms on nonlinear manifolds.
discussion (0)