Source: Arxiv
Second-Order Convergence in Private Stochastic Non-Convex Optimization

  • Researchers propose a new method for finding second-order stationary points in differentially private stochastic non-convex optimization.
  • Existing methods are limited by suboptimal convergence error rates and by reliance on auxiliary private model selection procedures.
  • The proposed perturbed stochastic gradient descent framework addresses these issues through Gaussian noise injection and gradient oracles (a generic sketch of this style of update follows the list).
  • Numerical experiments on real-world datasets validate the effectiveness of the new approach.
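The sketch below illustrates the general idea of perturbed SGD with gradient clipping and Gaussian noise injection, the style of update described above. It is not the paper's exact algorithm: the function name `dp_perturbed_sgd`, the toy objective, and the values of the step size, clipping threshold, and noise scale are all assumptions chosen for illustration.

```python
# Minimal sketch of a differentially-private-style perturbed SGD loop on a
# non-convex objective. Generic illustration only, not the paper's method:
# loss, clip_norm, noise_std, and lr are assumed values.
import numpy as np

def dp_perturbed_sgd(grad_fn, x0, steps=500, lr=0.05,
                     clip_norm=1.0, noise_std=0.5, seed=0):
    """SGD with per-step gradient clipping and Gaussian noise injection.

    grad_fn(x) returns a stochastic gradient estimate at x.
    noise_std trades utility for privacy (larger noise = stronger privacy);
    the same noise also helps the iterate escape saddle points, which is
    what second-order stationarity is about.
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        g = grad_fn(x)
        # Clip the stochastic gradient to bound its sensitivity.
        norm = np.linalg.norm(g)
        if norm > clip_norm:
            g = g * (clip_norm / norm)
        # Inject isotropic Gaussian noise into the clipped gradient.
        g = g + rng.normal(scale=noise_std, size=g.shape)
        x = x - lr * g
    return x

if __name__ == "__main__":
    # Toy non-convex objective f(x) = (x1^2 - 1)^2 + x2^2:
    # two minima at x1 = +/-1 and a saddle direction at x1 = 0.
    def grad(x):
        g = np.array([4 * x[0] * (x[0] ** 2 - 1), 2 * x[1]])
        return g + np.random.normal(scale=0.1, size=2)  # stochastic gradient noise

    x_final = dp_perturbed_sgd(grad, x0=[0.0, 1.0])
    print("final iterate:", x_final)
```

Started at the saddle point x1 = 0, the injected Gaussian noise pushes the iterate toward one of the two minima, which is the intuition behind using noise both for privacy and for reaching second-order stationary points.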
