
arjevani20a@v125@PMLR

Total: 1

#1 Second-Order Information in Non-Convex Stochastic Optimization: Power and Limitations

Authors: Yossi Arjevani, Yair Carmon, John C. Duchi, Dylan J. Foster, Ayush Sekhari, Karthik Sridharan

We design an algorithm which finds an $\epsilon$-approximate stationary point (with $\|\nabla F(x)\| \le \epsilon$) using $O(\epsilon^{-3})$ stochastic gradient and Hessian-vector products, matching guarantees that were previously available only under the stronger assumption of access to multiple queries with the same random seed. We prove a lower bound which establishes that this rate is optimal and that, surprisingly, it cannot be improved using stochastic $p$th-order methods for any $p \ge 2$, even when the first $p$ derivatives of the objective are Lipschitz. Together, these results characterize the complexity of non-convex stochastic optimization with second-order methods and beyond. Expanding our scope to the oracle complexity of finding $(\epsilon,\gamma)$-approximate second-order stationary points, we establish nearly matching upper and lower bounds for stochastic second-order methods. Our lower bounds here are novel even in the noiseless case.
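The complexity bounds above count calls to two oracles: stochastic gradients and Hessian-vector products. The following JAX sketch is only a minimal illustration of what those oracle calls and the $\epsilon$-stationarity criterion look like in code, not the authors' algorithm; the toy objective `F` and the helper names are hypothetical stand-ins.

```python
import jax
import jax.numpy as jnp

# Toy smooth non-convex objective standing in for the loss F; the paper's
# results are oracle-complexity bounds, not tied to any particular F.
def F(x):
    return jnp.sum(jnp.cos(x)) + 0.05 * jnp.sum(x ** 2)

grad_F = jax.grad(F)

def hessian_vector_product(x, v):
    # Forward-over-reverse autodiff: differentiating grad_F in the
    # direction v yields (nabla^2 F(x)) v without forming the Hessian.
    _, hv = jax.jvp(grad_F, (x,), (v,))
    return hv

def is_eps_stationary(x, eps):
    # x is an eps-approximate stationary point when ||nabla F(x)|| <= eps.
    return jnp.linalg.norm(grad_F(x)) <= eps

x = jnp.ones(5)
v = jnp.linspace(0.0, 1.0, 5)
print(hessian_vector_product(x, v))    # one Hessian-vector oracle call
print(is_eps_stationary(x, eps=1e-3))  # stationarity check at x
```

Computing Hessian-vector products this way costs only a constant factor more than a gradient evaluation, which is why such products, rather than full Hessians, are the natural unit in the $O(\epsilon^{-3})$ bound.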

Subject: COLT.2020 - Accept