C. He, Z. Lu, T. K. Pong, SIAM Journal on Optimization, 2023
In this paper we consider finding a second-order stationary point (SOSP) of nonconvex equality-constrained optimization problems when a nearly feasible point is known. In particular, we first …
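For orientation, the unconstrained analogue of an approximate SOSP is standard and reads as follows; the constrained notion studied in this paper is analogous but stated through the Lagrangian and the tangent space of the constraints.

```latex
% Unconstrained analogue: x is an (\epsilon_g, \epsilon_H)-approximate SOSP of
% \min_x f(x) if the gradient is small and the Hessian is nearly PSD:
\[
  \|\nabla f(x)\| \le \epsilon_g
  \qquad \text{and} \qquad
  \lambda_{\min}\bigl(\nabla^2 f(x)\bigr) \ge -\epsilon_H .
\]
```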
Y. Liu, F. Roosta, arXiv preprint arXiv:2208.07095, 2022
In this paper, we consider variants of the Newton-MR algorithm for solving unconstrained, smooth, but nonconvex optimization problems. Unlike the overwhelming majority of Newton …
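As a rough illustration of the underlying idea (not of this paper's specific variants), a bare-bones Newton-MR iteration replaces CG with MINRES, which tolerates symmetric indefinite systems. The callables `f`, `grad`, and `hess_vec` below are placeholder names for the objective, its gradient, and a Hessian-vector product.

```python
# Minimal sketch of a basic Newton-MR loop; illustrative only, with none of
# the safeguards or nonconvexity handling that the paper's variants add.
import numpy as np
from scipy.sparse.linalg import LinearOperator, minres

def newton_mr(f, grad, hess_vec, x, tol=1e-6, max_iter=100):
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        # The Hessian is touched only through matrix-vector products.
        H = LinearOperator((x.size, x.size),
                           matvec=lambda v: hess_vec(x, v), dtype=g.dtype)
        # MINRES handles symmetric *indefinite* systems in a least-squares
        # sense -- the key departure from CG-based Newton methods.
        p, _ = minres(H, -g)
        if g.dot(p) >= 0:
            p = -g  # safeguard: fall back to steepest descent
        # Armijo backtracking line search.
        t, fx = 1.0, f(x)
        while f(x + t * p) > fx + 1e-4 * t * g.dot(p):
            t *= 0.5
        x = x + t * p
    return x
```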
C. Cartis, W. Zhu, arXiv preprint arXiv:2308.15336, 2023
There has been growing interest in high-order tensor methods with adaptive regularization for nonconvex optimization, as they possess better or optimal worst-case evaluation …
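For reference, the adaptively regularized pth-order Taylor model that this family of methods minimizes at each iteration takes the standard form below; the case p = 2 recovers cubic regularization of Newton's method.

```latex
% Standard AR_p model at iterate x_k, with adaptive weight \sigma_k > 0:
\[
  m_k(s) \;=\; f(x_k)
  \;+\; \sum_{j=1}^{p} \frac{1}{j!}\,\nabla^j f(x_k)[s]^j
  \;+\; \frac{\sigma_k}{p+1}\,\|s\|^{p+1}.
\]
```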
Despite their popularity in the field of continuous optimisation, second-order quasi-Newton methods are challenging to apply in machine learning, as the Hessian matrix is intractably …
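A common workaround for the intractable Hessian, sketched below and not specific to this paper, is limited-memory quasi-Newton: the L-BFGS two-loop recursion applies an approximate inverse Hessian to the gradient using only a few stored iterate and gradient differences.

```python
# Standard L-BFGS two-loop recursion: computes a quasi-Newton search
# direction from the m most recent curvature pairs (s_i, y_i), where
# s_i = x_{i+1} - x_i and y_i = grad_{i+1} - grad_i (pairs with
# y.s > 0 kept, per the curvature condition). The Hessian is never formed.
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    q = g.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):  # newest to oldest
        a = s.dot(q) / y.dot(s)
        alphas.append(a)
        q -= a * y
    # Initial scaling H0 = gamma * I, a standard Barzilai-Borwein-style choice.
    gamma = (s_list[-1].dot(y_list[-1]) / y_list[-1].dot(y_list[-1])
             if s_list else 1.0)
    r = gamma * q
    for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):  # oldest first
        b = y.dot(r) / y.dot(s)
        r += (a - b) * s
    return -r  # quasi-Newton descent direction
```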
Nonlinear conjugate gradient methods are among the most popular techniques for solving continuous optimization problems. Although these schemes have long been studied from a …
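For concreteness, a minimal nonlinear CG loop with the Polak-Ribière+ update, one of several standard variants and not necessarily the one analyzed here, might look like this:

```python
# Minimal nonlinear conjugate gradient (Polak-Ribiere+ variant) with Armijo
# backtracking; practical codes use stronger (Wolfe) line searches.
import numpy as np

def nlcg(f, grad, x, tol=1e-6, max_iter=1000):
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g.dot(d) >= 0:
            d = -g  # reset if d is not a descent direction
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * g.dot(d):
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        # PR+ update: the max(., 0) clamp acts as an automatic restart.
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        g = g_new
    return x
```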
A class of second-order algorithms is proposed for minimizing smooth nonconvex functions; it alternates between regularized Newton and negative curvature steps in an iteration …
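A toy sketch of one such alternation, using a dense eigendecomposition purely for clarity (practical methods of this kind estimate the smallest eigenpair iteratively and add line searches):

```python
# One outer iteration in the spirit of the entry: take a regularized Newton
# step when the Hessian is (nearly) positive semidefinite, otherwise move
# along a direction of sufficient negative curvature.
import numpy as np

def rn_nc_step(grad, hess, x, eps_H=1e-4):
    g, H = grad(x), hess(x)
    w, V = np.linalg.eigh(H)      # dense eigendecomposition: small n only
    if w[0] >= -eps_H:
        # Shift makes H + 2*eps_H*I positive definite: regularized Newton.
        p = np.linalg.solve(H + 2 * eps_H * np.eye(x.size), -g)
    else:
        # Negative curvature step along the leftmost eigenvector, oriented
        # downhill and scaled by the curvature magnitude.
        v = V[:, 0]
        s = -1.0 if g.dot(v) > 0 else 1.0
        p = s * abs(w[0]) * v
    return x + p
```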
The difficulty of minimizing a nonconvex function is in part explained by the presence of saddle points. This slows down optimization algorithms and impacts worst-case complexity …
Y. Liu, F. Roosta, SIAM Journal on Optimization, 2022
The conjugate gradient method (CG) has long been the workhorse for the inner iterations of second-order algorithms for large-scale nonconvex optimization. Prominent examples …
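The mechanism these inner iterations rely on can be sketched as plain CG on the Newton system that stops early when it uncovers (near-)negative curvature; the tolerance `eps` and the termination handling below are illustrative, not the exact safeguards of any particular paper.

```python
# CG on H p = -g that doubles as a negative curvature detector: if an
# inner direction d satisfies d^T H d <= eps * ||d||^2, d certifies
# (near-)nonconvexity and is returned for use as a descent direction.
import numpy as np

def cg_with_nc_check(hess_vec, g, eps=1e-6, max_iter=200):
    p, r = np.zeros(g.size), -g.copy()   # solve H p = -g from p = 0
    d = r.copy()
    for _ in range(max_iter):
        Hd = hess_vec(d)
        dHd = d.dot(Hd)
        if dHd <= eps * d.dot(d):
            return d, "negative_curvature"
        alpha = r.dot(r) / dHd
        p += alpha * d
        r_new = r - alpha * Hd
        if np.linalg.norm(r_new) <= 1e-8:
            return p, "newton_step"
        beta = r_new.dot(r_new) / r.dot(r)
        d = r_new + beta * d
        r = r_new
    return p, "max_iter"
```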
We consider variants of a recently developed Newton-CG algorithm for nonconvex problems (C. W. Royer and S. J. Wright (2018), Complexity analysis of second-order line-search algorithms …