Authors
Yi Xu, Zhuoning Yuan, Sen Yang, Rong Jin, Tianbao Yang
Publication date
2019/1/30
Journal
International Joint Conference on Artificial Intelligence 28 (IJCAI2019)
Description
Extrapolation is a well-known technique for solving convex optimization and variational inequalities, and it has recently attracted attention for nonconvex optimization. Several recent works have empirically shown its success in some machine learning tasks. However, it has not been analyzed for nonconvex minimization, and a gap remains between theory and practice. In this paper, we analyze gradient descent and stochastic gradient descent methods with extrapolation for finding an approximate first-order stationary point of smooth nonconvex optimization problems. Our convergence upper bounds show that the algorithms with extrapolation could be potentially faster than without extrapolation.
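To make the idea concrete, below is a minimal sketch of gradient descent with an extrapolation step on a smooth nonconvex function. The update form y_k = x_k + beta * (x_k - x_{k-1}) followed by a gradient step at y_k is a common way to write extrapolated gradient descent; the test function, step size eta, and extrapolation weight beta here are illustrative choices, not the paper's exact algorithm or constants.

```python
import numpy as np

def f(x):
    # Smooth nonconvex test function (illustrative, not from the paper).
    return np.sum(x**4 - x**2)

def grad_f(x):
    return 4 * x**3 - 2 * x

def gd_extrapolation(x0, eta=0.05, beta=0.3, iters=200):
    """Gradient descent with an extrapolation step (hypothetical sketch)."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        y = x + beta * (x - x_prev)         # extrapolate from the last two iterates
        x_prev, x = x, y - eta * grad_f(y)  # gradient step taken at the extrapolated point
    return x

x_star = gd_extrapolation(np.array([1.5, -0.8]))
# The gradient norm at the returned point is a natural measure of how close
# we are to a first-order stationary point, matching the paper's criterion.
print(np.linalg.norm(grad_f(x_star)))
```

The stopping quantity printed here, the gradient norm at the final iterate, corresponds to the approximate first-order stationarity criterion the paper's bounds are stated in.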