[BOOK][B] Evaluation Complexity of Algorithms for Nonconvex Optimization: Theory, Computation and Perspectives

C Cartis, NIM Gould, PL Toint - 2022 - SIAM
Do you know the difference between an optimist and a pessimist? The former believes we
live in the best possible world, and the latter is afraid that the former might be right.… In that …

A Newton-CG based augmented Lagrangian method for finding a second-order stationary point of nonconvex equality constrained optimization with complexity …

C He, Z Lu, TK Pong - SIAM Journal on Optimization, 2023 - SIAM
In this paper we consider finding a second-order stationary point (SOSP) of nonconvex
equality constrained optimization when a nearly feasible point is known. In particular, we first …

A Newton-MR algorithm with complexity guarantees for nonconvex smooth unconstrained optimization

Y Liu, F Roosta - arXiv preprint arXiv:2208.07095, 2022 - arxiv.org
In this paper, we consider variants of Newton-MR algorithm for solving unconstrained,
smooth, but non-convex optimization problems. Unlike the overwhelming majority of Newton …

Second-order methods for quartically-regularised cubic polynomials, with applications to high-order tensor methods

C Cartis, W Zhu - arXiv preprint arXiv:2308.15336, 2023 - arxiv.org
There has been growing interest in high-order tensor methods for nonconvex optimization,
with adaptive regularization, as they possess better/optimal worst-case evaluation …

Series of Hessian-vector products for tractable saddle-free Newton optimisation of neural networks

ET Oldewage, RM Clarke… - arXiv preprint arXiv …, 2023 - arxiv.org
Despite their popularity in the field of continuous optimisation, second-order quasi-Newton
methods are challenging to apply in machine learning, as the Hessian matrix is intractably …
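As background to the snippet above: the key primitive in such methods is a Hessian-vector product computed without ever forming the Hessian. A minimal, hypothetical sketch (not the paper's method, and using an illustrative test function rather than a neural network) approximates `H(x) v` by central finite differences of an analytic gradient:

```python
# Hypothetical illustration: approximate a Hessian-vector product H(x) @ v
# by central finite differences of the gradient, so the (possibly huge)
# Hessian matrix is never formed explicitly.

def grad_rosenbrock(x):
    """Analytic gradient of the 2-D Rosenbrock function f(a,b) = (1-a)^2 + 100(b-a^2)^2."""
    a, b = x
    return [-2.0 * (1.0 - a) - 400.0 * a * (b - a * a),
            200.0 * (b - a * a)]

def hessian_vector_product(grad, x, v, eps=1e-5):
    """Approximate H(x) v ~= (grad(x + eps*v) - grad(x - eps*v)) / (2*eps)."""
    xp = [xi + eps * vi for xi, vi in zip(x, v)]
    xm = [xi - eps * vi for xi, vi in zip(x, v)]
    gp, gm = grad(xp), grad(xm)
    return [(p - m) / (2.0 * eps) for p, m in zip(gp, gm)]
```

At the minimizer `x = (1, 1)` the Rosenbrock Hessian is `[[802, -400], [-400, 200]]`, so `hessian_vector_product(grad_rosenbrock, [1.0, 1.0], [1.0, 0.0])` returns approximately `[802, -400]`. In practice, automatic differentiation (one reverse-over-forward pass) gives the same product exactly at comparable cost.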

A nonlinear conjugate gradient method with complexity guarantees and its application to nonconvex regression

R Chan–Renous-Legoubin, CW Royer - EURO Journal on Computational …, 2022 - Elsevier
Nonlinear conjugate gradients are among the most popular techniques for solving
continuous optimization problems. Although these schemes have long been studied from a …

Yet another fast variant of Newton's method for nonconvex optimization

S Gratton, S Jerad, PL Toint - IMA Journal of Numerical Analysis, 2024 - academic.oup.com
A class of second-order algorithms is proposed for minimizing smooth nonconvex functions
that alternates between regularized Newton and negative curvature steps in an iteration …

Riemannian trust-region methods for strict saddle functions with complexity guarantees

F Goyens, CW Royer - Mathematical Programming, 2024 - Springer
The difficulty of minimizing a nonconvex function is in part explained by the presence of
saddle points. This slows down optimization algorithms and impacts worst-case complexity …

MINRES: from negative curvature detection to monotonicity properties

Y Liu, F Roosta - SIAM Journal on Optimization, 2022 - SIAM
The conjugate gradient method (CG) has long been the workhorse for inner-iterations of
second-order algorithms for large-scale nonconvex optimization. Prominent examples …
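For context on the snippet above: when CG is applied to an indefinite Newton system `H p = -g`, it can break down unless each search direction `d` is tested for negative curvature, `d^T H d <= 0`. A minimal, hypothetical sketch (not the exact algorithm of either paper) of a CG loop with this test:

```python
# Hypothetical sketch: conjugate gradients on H p = -g, accessing H only
# through matrix-vector products, stopping early if a direction of
# non-positive curvature d^T H d <= 0 is encountered.

def cg_with_curvature_test(hess_mv, g, tol=1e-8, max_iter=50):
    """Return (p, d): p is the CG iterate; d is a negative-curvature
    direction if one was detected, else None."""
    n = len(g)
    p = [0.0] * n
    r = [-gi for gi in g]              # residual of H p = -g at p = 0
    d = list(r)
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        if rs <= tol * tol:
            break
        hd = hess_mv(d)
        curv = sum(di * hdi for di, hdi in zip(d, hd))
        if curv <= 0.0:                # non-positive curvature: stop and report d
            return p, d
        alpha = rs / curv
        p = [pi + alpha * di for pi, di in zip(p, d)]
        r = [ri - alpha * hdi for ri, hdi in zip(r, hd)]
        rs_new = sum(ri * ri for ri in r)
        d = [ri + (rs_new / rs) * di for ri, di in zip(r, d)]
        rs = rs_new
    return p, None
```

On an indefinite Hessian such as `diag(2, -1)` the loop returns a direction `d` with `d^T H d < 0`, which second-order methods then exploit as a descent step; on a positive-definite system it reduces to plain CG and returns the Newton step. The MINRES-based variants discussed above detect the same phenomenon through the residual norms of the Lanczos process rather than an explicit curvature test.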

Inexact Newton-CG algorithms with complexity guarantees

Z Yao, P Xu, F Roosta, SJ Wright… - IMA Journal of …, 2023 - academic.oup.com
We consider variants of a recently developed Newton-CG algorithm for nonconvex problems
(Royer, CW & Wright, SJ (2018) Complexity analysis of second-order line-search algorithms …