pith. machine review for the scientific record.

arxiv: 2601.22334 · v2 · submitted 2026-01-29 · 💻 cs.LG

Recognition: unknown

DP-λCGD: Efficient Noise Correlation for Differentially Private Model Training

Nikita P. Kalinin, Ryan McKenna, Rasmus Pagh, Christoph H. Lampert

Authors on Pith: no claims yet
classification 💻 cs.LG
keywords: noise, DP-SGD, training, accuracy, across, correlation, differentially, iterations
0 comments
read the original abstract

Differentially private stochastic gradient descent (DP-SGD) is the gold standard for training machine learning models with formal differential privacy guarantees. Several recent extensions improve its accuracy by introducing correlated noise across training iterations. Matrix factorization mechanisms are a prominent example, but they correlate noise across many iterations and require storing previously added noise vectors, leading to substantial memory overhead in some settings. In this work, we propose a new noise correlation strategy that correlates noise only with the immediately preceding iteration and cancels a controlled portion of it. Our method relies on noise regeneration using a pseudorandom noise generator, eliminating the need to store past noise. As a result, it requires no additional memory beyond standard DP-SGD. We show that the computational overhead is minimal and empirically demonstrate improved accuracy over DP-SGD.
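The abstract describes an update rule that injects fresh Gaussian noise at each step and cancels a controlled fraction of the previous step's noise, regenerating that noise from a PRNG seed instead of storing it. A minimal sketch of this idea follows; the helper names (`regenerate`, `dp_lambda_cgd_step`), the cancellation weight `lam`, and the clipping details are assumptions for illustration, not the paper's exact algorithm or privacy accounting.

```python
import numpy as np

def regenerate(seed, step, dim):
    """Deterministically regenerate the noise vector for a given step
    from a base PRNG seed, so past noise never needs to be stored
    (hypothetical helper)."""
    rng = np.random.default_rng([seed, step])
    return rng.standard_normal(dim)

def dp_lambda_cgd_step(w, grad_fn, lr, clip_norm, sigma, lam, seed, t):
    """One update of the correlated-noise scheme sketched in the abstract:
    add fresh noise z_t and cancel a lam-fraction of the previous step's
    noise z_{t-1}, regenerated from the seed rather than kept in memory."""
    g = grad_fn(w)
    # Clip the gradient to bound sensitivity, as in standard DP-SGD.
    g = g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
    z_t = sigma * regenerate(seed, t, w.shape[0])
    z_prev = sigma * regenerate(seed, t - 1, w.shape[0]) if t > 0 else 0.0
    return w - lr * (g + z_t - lam * z_prev)
```

Because `regenerate` is deterministic in `(seed, step)`, the memory footprint matches plain DP-SGD: no past noise vectors are retained, only the seed, at the cost of one extra PRNG draw per step.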

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read and Pith papers without signing in.

Forward citations

Cited by 1 Pith paper

Reviewed papers in the Pith corpus that reference this work. Sorted by Pith novelty score.

  1. Population Risk Bounds for Kolmogorov-Arnold Networks Trained by DP-SGD with Correlated Noise

    cs.LG 2026-05 unverdicted novelty 8.0

    First population risk bounds for KANs under mini-batch DP-SGD with correlated noise, using a new non-convex optimization analysis combined with stability-based generalization.