pith. machine review for the scientific record.

arXiv: 2603.03099 · v7 · submitted 2026-03-03 · cs.LG · cs.AI

Recognition: unknown

Why Adam Can Beat SGD: Second-Moment Normalization Yields Sharper Tails

Authors on Pith: no claims yet
classification: cs.LG · cs.AI
keywords: adam, delta, convergence, dependence, empirical, high-probability, normalization, second-moment
Original abstract

Although Adam demonstrates faster empirical convergence than SGD in many applications, much of the existing theory yields guarantees essentially comparable to those of SGD, leaving the empirical performance gap insufficiently explained. In this paper, we identify the key role of second-moment normalization in Adam and develop a stopping-time/martingale analysis that provably distinguishes Adam from SGD under the classical bounded-variance model (a second-moment assumption). In particular, we establish the first theoretical separation between the high-probability convergence behaviors of the two methods: Adam achieves a $\delta^{-1/2}$ dependence on the confidence parameter $\delta$, whereas the corresponding high-probability guarantee for SGD necessarily incurs at least a $\delta^{-1}$ dependence.
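For context, here is a minimal sketch of the two update rules the abstract contrasts, assuming the standard Adam recursion of Kingma and Ba (the function and variable names are illustrative, not from the paper). SGD applies the raw stochastic gradient directly, while Adam divides by the square root of an exponential moving average of squared gradients; that division is the second-moment normalization the analysis credits with the sharper $\delta^{-1/2}$ tail dependence.

```python
import numpy as np

def sgd_step(x, g, lr):
    # Plain SGD: the step scales directly with the raw stochastic gradient,
    # so a heavy-tailed gradient spike translates one-for-one into a large step.
    return x - lr * g

def adam_step(x, g, state, lr, beta1=0.9, beta2=0.999, eps=1e-8):
    # Standard Adam step. The division by sqrt(v_hat) is the second-moment
    # normalization the abstract singles out: a gradient spike inflates v
    # along with m, so the effective per-coordinate step stays bounded.
    m, v, t = state
    t += 1
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g ** 2
    m_hat = m / (1 - beta1 ** t)   # bias correction
    v_hat = v / (1 - beta2 ** t)
    x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x, (m, v, t)

# One noisy quadratic step with each method, under heavy-tailed gradient noise.
x0 = np.array([1.0, -2.0])
g = 2 * x0 + 0.1 * np.random.standard_cauchy(size=2)
x_sgd = sgd_step(x0, g, lr=0.01)
x_adam, state = adam_step(x0, g, (np.zeros(2), np.zeros(2), 0), lr=0.01)
```

Both steppers see the same gradient and learning rate; the only difference is Adam's per-coordinate rescaling by sqrt(v_hat), which is where the two tail behaviors in the abstract diverge.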

This paper has not been read by Pith yet.

discussion (0)


Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. PowerStep: Memory-Efficient Adaptive Optimization via $\ell_p$-Norm Steepest Descent

    cs.LG · 2026-05 · unverdicted · novelty 6.0

    PowerStep delivers coordinate-wise adaptive optimization by nonlinearly transforming a momentum buffer under an $\ell_p$-norm steepest-descent geometry, matching Adam's convergence with half the memory and supporting aggressi...
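As an aside on the excerpt above: the generic $\ell_p$-norm steepest-descent direction it alludes to can be written in a few lines. This is a sketch of the standard dual-norm formula, not the paper's actual PowerStep update, and all names are illustrative. With dual exponent $q = p/(p-1)$, maximizing $\langle m, d \rangle$ over $\|d\|_p \le 1$ gives $d_i = \operatorname{sign}(m_i)\,|m_i|^{q-1}/\|m\|_q^{q-1}$, which recovers sign descent as $p \to \infty$ and $\ell_2$-normalized momentum at $p = 2$.

```python
import numpy as np

def lp_momentum_step(x, g, m, lr, p=3.0, beta=0.9, eps=1e-12):
    # Generic l_p-norm steepest descent applied to a momentum buffer
    # (illustrative sketch, not the PowerStep algorithm itself).
    # With dual exponent q = p / (p - 1), the direction maximizing <m, d>
    # subject to ||d||_p <= 1 is  d_i = sign(m_i) * |m_i|**(q-1) / ||m||_q**(q-1).
    # Only the single buffer m is stored, roughly half of Adam's optimizer state.
    assert p > 1.0
    q = p / (p - 1.0)
    m = beta * m + (1.0 - beta) * g                      # first-moment (momentum) buffer
    norm_q = np.sum(np.abs(m) ** q) ** (1.0 / q) + eps   # ||m||_q, guarded against zero
    d = np.sign(m) * (np.abs(m) / norm_q) ** (q - 1.0)
    return x - lr * d, m
```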