# Stochastic model-based minimization of weakly convex functions

Abstract: We consider an algorithm that successively samples and minimizes stochastic models of the objective function. We show that under weak-convexity and Lipschitz conditions, the algorithm drives the expected norm of the gradient of the Moreau envelope to zero at the rate $O(k^{-1/4})$. Our result yields new complexity guarantees for the stochastic proximal point algorithm on weakly convex problems and for the stochastic prox-linear algorithm for minimizing compositions of convex functions with smooth maps. Moreover, our result recovers the recently obtained complexity estimate for the stochastic proximal subgradient method on weakly convex problems.
Comments: 9 pages
Subjects: Optimization and Control (math.OC); Learning (cs.LG)
MSC classes: 65K05, 65K10, 90C15, 90C30
Cite as: arXiv:1803.06523 [math.OC] (or arXiv:1803.06523v1 [math.OC] for this version)
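To make the model-based scheme concrete, here is a minimal sketch (not the paper's code) of one of its instances, the stochastic prox-linear method, applied to a toy phase-retrieval problem $f(x) = \mathbb{E}\,|(a^\top x)^2 - b|$, a standard weakly convex composition of a convex function with a smooth map. The problem sizes, step schedule, and iteration count below are hypothetical choices for illustration only.

```python
import numpy as np

# Hypothetical toy instance: noiseless quadratic measurements b_i = (a_i^T x*)^2.
rng = np.random.default_rng(0)
d, m = 10, 200
x_true = rng.standard_normal(d)
A = rng.standard_normal((m, d))
b = (A @ x_true) ** 2

def full_objective(x):
    # f(x) = (1/m) sum_i |(a_i^T x)^2 - b_i|, a weakly convex function
    return np.mean(np.abs((A @ x) ** 2 - b))

x = rng.standard_normal(d)
best = full_objective(x)                  # track best objective value seen
for k in range(20000):
    i = rng.integers(m)                   # sample one measurement
    a = A[i]
    r = (a @ x) ** 2 - b[i]               # residual c_i(x)
    g = 2.0 * (a @ x) * a                 # gradient of the smooth map c_i
    G = g @ g
    if G < 1e-12:
        continue                          # degenerate linearization; skip
    beta = 0.05 / np.sqrt(k + 1)          # diminishing proximal parameter (hypothetical)
    # Prox-linear step: minimize |r + g.(y - x)| + ||y - x||^2 / (2*beta)
    # over y; the minimizer soft-thresholds the linearized residual.
    u = np.sign(r) * max(abs(r) - beta * G, 0.0)
    x = x + ((u - r) / G) * g
    if (k + 1) % 1000 == 0:
        best = min(best, full_objective(x))

print(best)
```

Each step minimizes a stochastic convex model of $f$ (the convex outer function composed with the linearized inner map) plus a quadratic proximal penalty, rather than following a raw subgradient; the paper's guarantee concerns the gradient of the Moreau envelope at the iterates, not the objective value itself.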

## Submission history

From: Damek Davis
[v1] Sat, 17 Mar 2018 15:38:20 GMT (11kb)