Adam-HNAG: A Convergent Reformulation of Adam with Accelerated Rate
Popov, Arash Sarshar, and Adrian Sandu

Adam-HNAG is a splitting-based reformulation of Adam that yields the first convergence proof, including accelerated rates, for Adam-type methods in smooth convex optimization.
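The summary above names a "splitting-based reformulation" of Adam. As a minimal sketch, and not the authors' actual Adam-HNAG scheme, the NumPy snippet below writes one plain Adam step as two explicit sub-steps (a momentum recursion, then a preconditioned descent step), which is the generic flavor of such a splitting; all names and hyperparameters here are illustrative assumptions.

```python
# A minimal sketch, NOT the authors' Adam-HNAG: plain Adam written as two
# sub-steps (momentum update, then preconditioned descent) to illustrate the
# generic idea of splitting one fused update. All names are illustrative.
import numpy as np

def adam_split_step(x, m, v, grad, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # Sub-step 1: first-moment (heavy-ball style momentum) recursion.
    m = beta1 * m + (1.0 - beta1) * grad
    # Sub-step 2: second-moment recursion and preconditioned gradient step.
    v = beta2 * v + (1.0 - beta2) * grad**2
    m_hat = m / (1.0 - beta1**t)            # standard Adam bias corrections
    v_hat = v / (1.0 - beta2**t)
    x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x, m, v

# Toy smooth convex problem: f(x) = 0.5 * ||A @ x - b||^2.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
x, m, v = np.zeros(5), np.zeros(5), np.zeros(5)
for t in range(1, 2001):
    x, m, v = adam_split_step(x, m, v, A.T @ (A @ x - b), t, lr=0.05)
print("final objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)
```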
2 Pith papers cite this work. Polarity classification is still indexing.

Fields: math.OC (2) · Years: 2026 (2) · Verdicts: UNVERDICTED (2) · Representative citing papers: 2
Citing papers explorer
- Adam-HNAG: A Convergent Reformulation of Adam with Accelerated Rate
  Adam-HNAG is a splitting-based reformulation of Adam that yields the first convergence proof, including accelerated rates, for Adam-type methods in smooth convex optimization.
- Adam-SHANG: A Convergent Adam-Type Method for Stochastic Smooth Convex Optimization
  Adam-SHANG is a convergent Adam variant for stochastic smooth convex optimization that uses a stable lagged-preconditioner update and a computable trace-ratio stepsize rule; a hedged sketch of these two ingredients follows below.
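As with the earlier sketch, the snippet below is not the paper's Adam-SHANG. It only illustrates, under stated assumptions, the two named ingredients in generic form: a diagonal preconditioner that is frozen between periodic refreshes (the "lagged" update), and a stepsize damped by a ratio of preconditioner traces. The refresh period `lag` and the specific rule `lr = lr0 * min(1, tr(P_old)/tr(P_new))` are hypothetical placeholders, since the paper's actual stepsize rule is not given here; a deterministic gradient stands in for the stochastic one.

```python
# A minimal sketch, NOT the paper's Adam-SHANG: an Adam-style loop with
# (a) a lagged diagonal preconditioner, frozen between periodic refreshes, and
# (b) a stepsize damped by a ratio of preconditioner traces. The refresh
# period `lag` and the exact trace ratio used are illustrative placeholders.
import numpy as np

def adam_lagged_sketch(grad_fn, x0, steps=2000, lr0=0.05,
                       beta1=0.9, beta2=0.999, eps=1e-8, lag=10):
    x, m, v = x0.copy(), np.zeros_like(x0), np.zeros_like(x0)
    P, lr = None, lr0
    for t in range(1, steps + 1):
        g = grad_fn(x)
        m = beta1 * m + (1.0 - beta1) * g
        v = beta2 * v + (1.0 - beta2) * g**2
        if P is None or t % lag == 0:   # refresh preconditioner every `lag` steps
            P_new = np.sqrt(v / (1.0 - beta2**t)) + eps
            if P is not None:           # placeholder trace-ratio stepsize damping
                lr = lr0 * min(1.0, P.sum() / P_new.sum())
            P = P_new
        x = x - lr * (m / (1.0 - beta1**t)) / P
    return x

# Toy usage on the smooth convex problem f(x) = 0.5 * ||A @ x - b||^2.
rng = np.random.default_rng(1)
A, b = rng.standard_normal((30, 5)), rng.standard_normal(30)
x = adam_lagged_sketch(lambda x: A.T @ (A @ x - b), np.zeros(5))
print("final objective:", 0.5 * np.linalg.norm(A @ x - b) ** 2)
```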