S. Mahloujifar, A. Sablayrolles, G. Cormode, and S. Jha. Optimal group privacy for DP-SGD. IEEE Data Engineering Bulletin, 2025.

A challenging problem in differentially private machine learning is privacy accounting. After years of research, the community has established tight privacy accounting methods for differentially private stochastic gradient descent (DP-SGD). Despite these advances, tight bounds for group privacy remain elusive. Group privacy is an essential aspect of differential privacy that enables many applications. In this work, we develop tight bounds on group privacy for DP-SGD. Our analysis uses a novel technique to exhibit “dominating pairs of distributions” explicitly tailored to the case of group privacy. Our experiments show that our bounds are significantly better than previously known bounds in certain regimes. Surprisingly, we find that group privacy is significantly affected by sub-sampling: two sets of hyper-parameters (sampling rate and noise) with exactly the same (ε, δ) parameters can have significantly different group privacy curves.
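As background for why tight group-privacy accounting matters, a minimal sketch of the classic generic conversion from individual to group privacy may help: an (ε, δ)-DP mechanism is (kε, k·e^((k−1)ε)·δ)-DP for groups of size k. This generic bound, not the paper's tailored analysis, is the kind of loose baseline the work above improves upon; the function name below is illustrative.

```python
import math

def generic_group_privacy_bound(eps: float, delta: float, k: int) -> tuple[float, float]:
    """Classic generic group-privacy conversion (illustrative, not the
    paper's method): an (eps, delta)-DP mechanism satisfies
    (k*eps, k*exp((k-1)*eps)*delta)-DP for groups of size k.

    Note how delta blows up exponentially in k, which is why tighter,
    mechanism-specific bounds (as in the paper above) are valuable.
    """
    group_eps = k * eps
    group_delta = k * math.exp((k - 1) * eps) * delta
    return group_eps, group_delta
```

For example, starting from (ε, δ) = (1.0, 1e−5), a group of size 3 already inherits ε = 3.0 and a δ inflated by a factor of 3·e², illustrating how quickly the generic guarantee degrades.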



This file was generated by bibtex2html 1.92.