The paper proposes a nested stochastic gradient descent (SGD) algorithm for solving regularized nonconvex distributionally robust optimization (DRO) problems.
The algorithm targets DRO formulations based on a generalized Sinkhorn distance and handles nonconvex, potentially unbounded loss functions.
The proposed algorithm achieves polynomial iteration and sample complexities that are independent of both the data size and the parameter dimension.
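To make the nested structure concrete, the following is a schematic sketch (not the paper's exact algorithm): an outer SGD loop updates the model parameter, while an inner Monte Carlo loop estimates the gradient of an entropic-regularized robust loss by softmax-reweighting samples drawn around each data point. The loss function, Gaussian sampling distribution, regularization weight `lam`, and all step sizes are illustrative placeholders, not quantities taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def loss(theta, z):
    # Placeholder nonconvex loss (not from the paper): log(1 + (theta - z)^2)
    return np.log(1.0 + (theta - z) ** 2)

def grad_loss(theta, z):
    # Gradient of the placeholder loss w.r.t. theta
    d = theta - z
    return 2.0 * d / (1.0 + d ** 2)

def robust_grad(theta, x_batch, lam=1.0, sigma=0.5, m=64):
    """Inner loop: estimate the gradient of the entropic-regularized
    robust loss  lam * E_x[ log E_{z ~ N(x, sigma^2)}[ exp(loss/lam) ] ]
    via Monte Carlo samples z drawn around each data point x."""
    g = 0.0
    for x in x_batch:
        z = x + sigma * rng.standard_normal(m)    # inner samples
        l = loss(theta, z) / lam
        w = np.exp(l - l.max())                   # stabilized exponentials
        w /= w.sum()                              # softmax weights
        g += np.sum(w * grad_loss(theta, z))      # reweighted gradient
    return g / len(x_batch)

def nested_sgd(data, theta0=0.0, steps=300, lr=0.1):
    # Outer loop: plain SGD on the robust objective estimate
    theta = theta0
    for _ in range(steps):
        x_batch = rng.choice(data, size=8)        # outer minibatch
        theta -= lr * robust_grad(theta, x_batch)
    return theta

data = rng.standard_normal(200) + 2.0             # synthetic 1-D data near 2
theta_hat = nested_sgd(data)
```

On this symmetric synthetic dataset the iterate settles near the data center, since the softmax reweighting upweights high-loss samples on both sides equally; the two-level sampling (outer minibatch, inner perturbations) is what gives the "nested" structure its name.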
Numerical experiments demonstrate the efficiency and robustness of the proposed algorithm.