Researchers propose a new method for finding second-order stationary points in differentially private stochastic non-convex optimization. Existing methods suffer from suboptimal convergence error rates and rely on auxiliary private model selection procedures. The proposed perturbed stochastic gradient descent framework addresses these issues by combining Gaussian noise injection with stochastic gradient oracles. Numerical experiments on real-world datasets validate the effectiveness of the new approach.
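To make the general mechanics concrete, below is a minimal sketch of a perturbed SGD loop with gradient clipping and Gaussian noise injection. This is an illustrative assumption of how such a framework typically looks, not the paper's algorithm: the names (dp_perturbed_sgd, grad_oracle) and the clipping threshold and noise scale are hypothetical placeholders rather than the privacy-calibrated parameters a formal analysis would prescribe.

```python
import numpy as np

def dp_perturbed_sgd(grad_oracle, w0, n_steps, lr=0.1, clip=1.0, noise_std=0.5, seed=0):
    """Illustrative perturbed SGD with Gaussian noise injection.

    grad_oracle(w) is assumed to return a stochastic gradient at w.
    The clipping threshold and noise scale are placeholders for whatever
    the privacy analysis would actually dictate.
    """
    rng = np.random.default_rng(seed)
    w = np.array(w0, dtype=float)
    for _ in range(n_steps):
        g = grad_oracle(w)
        # Clip the stochastic gradient to bound its sensitivity.
        norm = np.linalg.norm(g)
        if norm > clip:
            g = g * (clip / norm)
        # Inject isotropic Gaussian noise: this privatizes the update and
        # helps the iterate escape strict saddle points.
        g_noisy = g + rng.normal(scale=noise_std, size=g.shape)
        w = w - lr * g_noisy
    return w

if __name__ == "__main__":
    # Toy usage on a simple non-convex objective f(w) = sum(w_i^2 - cos(w_i)).
    rng = np.random.default_rng(1)

    def grad_oracle(w):
        # Stochastic gradient: true gradient plus sampling noise.
        return 2 * w + np.sin(w) + 0.1 * rng.normal(size=w.shape)

    w_final = dp_perturbed_sgd(grad_oracle, w0=np.ones(5), n_steps=200)
    print(w_final)
```

The noise here plays a double role: it provides the Gaussian-mechanism style privacy protection and, as in classical perturbed gradient descent, supplies the random perturbation needed to move away from saddle points toward second-order stationary points.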