Source: Arxiv

DC-SGD: Differentially Private SGD with Dynamic Clipping through Gradient Norm Distribution Estimation

  • Differentially Private Stochastic Gradient Descent (DP-SGD) is widely used for privacy-preserving deep learning.
  • Choosing the clipping threshold C in DP-SGD is difficult, and tuning it incurs additional privacy and computational overhead.
  • A new framework, Dynamic Clipping DP-SGD (DC-SGD), is proposed: it uses differentially private histograms to estimate the gradient norm distribution and adjusts the clipping threshold C dynamically during training (see the sketch after this list).
  • Experiments show that DC-SGD achieves up to a 9x speedup in hyperparameter tuning compared to DP-SGD, with improved accuracy under the same privacy guarantees.
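
The paper's exact algorithm and privacy accounting are not reproduced in this summary. As a rough illustration of the idea, the NumPy sketch below builds a differentially private histogram of per-sample gradient norms, picks a clipping threshold C from an approximate quantile of that histogram, and runs a clipped, noised gradient step. All function names, the bin count, the quantile choice, and the noise scales are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)


def dp_histogram(grad_norms, bins, eps):
    """Noisy histogram of per-sample gradient norms.

    Each sample falls in exactly one bin, so adding Laplace noise with
    scale 1/eps to every bin count gives an eps-DP histogram (illustrative
    accounting; the paper's budget split may differ).
    """
    counts, edges = np.histogram(grad_norms, bins=bins)
    noisy = counts + rng.laplace(scale=1.0 / eps, size=counts.shape)
    return np.clip(noisy, 0, None), edges


def estimate_clip_threshold(noisy_counts, edges, quantile=0.5):
    """Choose C as an approximate quantile of the noisy norm distribution."""
    total = noisy_counts.sum()
    if total <= 0:
        return float(edges[-1])
    cum = np.cumsum(noisy_counts) / total
    idx = min(int(np.searchsorted(cum, quantile)), len(edges) - 2)
    return float(edges[idx + 1])  # right edge of the quantile bin


def dp_sgd_step(per_sample_grads, C, noise_multiplier, lr, params):
    """One DP-SGD step: clip each per-sample gradient to norm C,
    average, and add Gaussian noise scaled by noise_multiplier * C."""
    norms = np.linalg.norm(per_sample_grads, axis=1)
    scale = np.minimum(1.0, C / np.maximum(norms, 1e-12))
    clipped = per_sample_grads * scale[:, None]
    noisy_mean = clipped.mean(axis=0) + rng.normal(
        scale=noise_multiplier * C / len(per_sample_grads),
        size=clipped.shape[1],
    )
    return params - lr * noisy_mean, norms


# Toy usage: periodically re-estimate C from a DP histogram of recent norms.
d, batch = 10, 64
params = np.zeros(d)
C = 1.0
for step in range(20):
    # Stand-in for the per-sample gradients of a real model.
    per_sample_grads = rng.normal(size=(batch, d)) * (1.0 + 0.1 * step)
    params, norms = dp_sgd_step(per_sample_grads, C, noise_multiplier=1.0,
                                lr=0.1, params=params)
    if step % 5 == 0:
        counts, edges = dp_histogram(norms, bins=20, eps=0.1)
        C = estimate_clip_threshold(counts, edges, quantile=0.5)
```

Because C tracks the evolving gradient norm distribution, the threshold does not have to be fixed by an expensive hyperparameter sweep up front, which is the source of the tuning speedup the summary reports.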
