Image Credit: Arxiv

PCDP-SGD: Improving the Convergence of Differentially Private SGD via Projection in Advance

  • The paper proposes PCDP-SGD, a framework for improving the convergence of Differentially Private Stochastic Gradient Descent (DP-SGD).
  • PCDP-SGD applies a projection operation before gradient clipping, compressing redundant gradient norm while preserving the crucial top gradient components (a rough sketch of this step appears below).
  • The framework is also extended to differentially private federated learning (DPFL) to tackle data heterogeneity and achieve communication efficiency.
  • Experiments show that PCDP-SGD achieves higher accuracy than other DP-SGD variants and outperforms current federated learning frameworks on computer vision tasks.
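The summary does not spell out how the projection is computed, so the sketch below is illustrative only: it assumes the subspace is taken from the top-k singular directions of the current batch of per-sample gradients, with the rest of the step being standard DP-SGD (clip, add Gaussian noise, average). The function name project_then_clip and all parameter choices are hypothetical, not taken from the paper.

```python
# Sketch of a single DP-SGD aggregation step with projection applied *before*
# clipping, in the spirit of PCDP-SGD. The paper's actual projection may be
# constructed differently (e.g., from auxiliary gradients); this version uses
# the top-k right singular vectors of the batch's own per-sample gradients.
import numpy as np

def project_then_clip(per_sample_grads, k, clip_norm, noise_multiplier, rng):
    """per_sample_grads: (batch, dim) array, one flattened gradient per example.
    k:                dimensionality of the retained subspace.
    clip_norm:        per-sample L2 clipping threshold C.
    noise_multiplier: Gaussian noise scale sigma (noise std = sigma * C)."""
    # 1) Build a rank-k projection basis (assumption: derived from this batch).
    _, _, vt = np.linalg.svd(per_sample_grads, full_matrices=False)
    basis = vt[:k]                                   # (k, dim) top-k directions

    # 2) Project each per-sample gradient: keep dominant components,
    #    compress redundant gradient norm before clipping.
    projected = per_sample_grads @ basis.T @ basis   # (batch, dim)

    # 3) Standard per-sample L2 clipping, now on the projected gradients.
    norms = np.linalg.norm(projected, axis=1, keepdims=True)
    clipped = projected * np.minimum(1.0, clip_norm / (norms + 1e-12))

    # 4) Sum, add calibrated Gaussian noise, and average over the batch.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=clipped.shape[1])
    return (clipped.sum(axis=0) + noise) / per_sample_grads.shape[0]

# Toy usage with random gradients.
rng = np.random.default_rng(0)
grads = rng.normal(size=(32, 1000))
update = project_then_clip(grads, k=20, clip_norm=1.0, noise_multiplier=1.0, rng=rng)
```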
