Motivated by robust and quantile regression problems, this study investigates the stochastic gradient descent (SGD) algorithm for minimizing an objective function with a sub-quadratic tail.
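To fix notation (the precise recursion and assumptions are those of the paper; the form below is the standard one, stated here only for illustration), the SGD iteration is
\[
x_{k+1} = x_k - \alpha_k \, \nabla f(x_k, \xi_{k+1}),
\]
where $\alpha_k$ is the stepsize and $\xi_{k+1}$ is the random sample at iteration $k$. A sub-quadratic tail means the objective grows slower than quadratically at infinity, so its gradient is sublinear in the tails rather than linear.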
The study introduces a novel piecewise Lyapunov function that can handle objectives that are only first-order differentiable, including popular loss functions such as the Huber loss.
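For concreteness, under a standard parameterization (the paper's exact convention may differ), the Huber loss with threshold $\delta > 0$ is
\[
h_\delta(u) =
\begin{cases}
\tfrac{1}{2} u^2, & |u| \le \delta,\\[2pt]
\delta\bigl(|u| - \tfrac{1}{2}\delta\bigr), & |u| > \delta,
\end{cases}
\]
which is continuously differentiable but not twice differentiable at $|u| = \delta$, and grows only linearly in the tails, i.e. sub-quadratically.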
Finite-time moment bounds are derived under both general diminishing stepsizes and constant stepsizes; for constant stepsizes, weak convergence, a central limit theorem, and a characterization of the bias are further established.
The results apply broadly, in particular to online robust regression and online quantile regression.
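A minimal sketch of how such online estimators might look, assuming a linear model and the standard Huber and pinball (quantile) loss gradients; the function names, stepsize, and synthetic data below are illustrative and not the paper's specification.

```python
import numpy as np

def huber_grad(r, delta=1.0):
    # Gradient of the Huber loss with respect to the residual r = y - x @ theta.
    return np.clip(r, -delta, delta)

def pinball_grad(r, tau=0.5):
    # (Sub)gradient of the pinball (quantile) loss at level tau.
    return np.where(r >= 0, tau, tau - 1.0)

def online_sgd(stream, dim, loss_grad, stepsize=0.05):
    """Constant-stepsize SGD for a linear model y ~ x @ theta over a data stream."""
    theta = np.zeros(dim)
    for x, y in stream:
        r = y - x @ theta
        theta = theta + stepsize * loss_grad(r) * x  # stochastic (sub)gradient step
    return theta

# Illustrative usage on synthetic data with heavy-tailed noise.
rng = np.random.default_rng(0)
theta_true = np.array([1.0, -2.0, 0.5])

def make_stream(n):
    for _ in range(n):
        x = rng.normal(size=3)
        yield x, x @ theta_true + rng.standard_t(df=2)

theta_robust = online_sgd(make_stream(20_000), dim=3, loss_grad=huber_grad)   # robust regression
theta_median = online_sgd(make_stream(20_000), dim=3, loss_grad=pinball_grad)  # median (tau = 0.5) regression
print(theta_robust, theta_median)
```

Both losses have sub-quadratic tails and only first-order smoothness (the Huber loss at $|r| = \delta$, the pinball loss at $r = 0$), which is the regime the paper's piecewise Lyapunov analysis is designed to cover.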