[BOOK] Nonlinear conjugate gradient methods for unconstrained optimization

N Andrei - 2020 - Springer
This book is on conjugate gradient methods for unconstrained optimization. The concept of
conjugacy was introduced by Magnus Hestenes and Garrett Birkhoff in 1936 in the context of …
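
For orientation, a minimal sketch of one nonlinear conjugate gradient scheme of the kind this book surveys, using the Fletcher-Reeves update. The Wolfe line search, restart fallback, and Rosenbrock test function are illustrative choices, not taken from the book.

```python
import numpy as np
from scipy.optimize import line_search

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=200):
    """Nonlinear conjugate gradient with the Fletcher-Reeves beta.

    Sketch only: one of many update formulas (PRP, HS, DY, ...) that
    the book compares; sophisticated restarts and safeguards omitted.
    """
    x = x0
    g = grad(x)
    d = -g                                     # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d)[0]  # Wolfe line search
        if alpha is None:                      # line search failed: restart along -g
            d, alpha = -g, 1e-4
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)       # Fletcher-Reeves formula
        d = -g_new + beta * d
        g = g_new
    return x

# Illustrative use on the Rosenbrock function.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
print(nonlinear_cg(f, grad, np.array([-1.2, 1.0])))
```

Only the `beta` line changes across the classical variants (Polak-Ribiere-Polyak, Hestenes-Stiefel, Dai-Yuan), which is what makes the family so amenable to the systematic comparison the book undertakes.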

Sub-sampled cubic regularization for non-convex optimization

JM Kohler, A Lucchi - International Conference on Machine …, 2017 - proceedings.mlr.press
We consider the minimization of non-convex functions that typically arise in machine
learning. Specifically, we focus our attention on a variant of trust region methods known as …
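
A hedged sketch of the core step the snippet alludes to: a cubic-regularized Newton step in which the Hessian of a finite-sum objective is averaged over a random subsample. The finite-sum interface, the sampling fraction, and the use of a generic solver for the cubic subproblem are assumptions for illustration; the paper's sample-size schedule and subproblem solver differ.

```python
import numpy as np
from scipy.optimize import minimize

def subsampled_cubic_step(x, grads, hessians, sigma, sample_frac=0.1, rng=None):
    """One cubic-regularization step with a sub-sampled Hessian.

    Assumed finite-sum structure f = (1/n) * sum_i f_i, with `grads`
    and `hessians` lists of per-sample derivative callables. The full
    gradient is kept exact while the Hessian is averaged over a random
    subsample; the cubic model is minimized by a generic solver.
    """
    rng = rng or np.random.default_rng(0)
    n = len(hessians)
    idx = rng.choice(n, size=max(1, int(sample_frac * n)), replace=False)
    g = np.mean([gi(x) for gi in grads], axis=0)        # exact gradient
    H = np.mean([hessians[i](x) for i in idx], axis=0)  # sub-sampled Hessian

    model = lambda s: g @ s + 0.5 * s @ H @ s + sigma / 3 * np.linalg.norm(s)**3
    s = minimize(model, np.zeros_like(x)).x             # cubic model minimizer
    return x + s
```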

Optimal and adaptive Monteiro-Svaiter acceleration

Y Carmon, D Hausler, A Jambulapati… - Advances in Neural …, 2022 - proceedings.neurips.cc
We develop a variant of the Monteiro-Svaiter (MS) acceleration framework that removes the
need to solve an expensive implicit equation at every iteration. Consequently, for any $ p\ge …

[BOOK] Evaluation Complexity of Algorithms for Nonconvex Optimization: Theory, Computation and Perspectives

C Cartis, NIM Gould, PL Toint - 2022 - SIAM
Do you know the difference between an optimist and a pessimist? The former believes we
live in the best possible world, and the latter is afraid that the former might be right.… In that …

Cubic-regularization counterpart of a variable-norm trust-region method for unconstrained minimization

JM Martínez, M Raydan - Journal of Global Optimization, 2017 - Springer
In a recent paper, we introduced a trust-region method with variable norms for
unconstrained minimization, we proved standard asymptotic convergence results, and we …
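
Both the variable-norm trust-region method and its cubic-regularization counterpart hinge on the same acceptance logic: compare the actual decrease to the model-predicted decrease and adapt the regularization accordingly. A minimal sketch of that shared ratio test, with illustrative thresholds rather than the paper's:

```python
def update_after_step(f_x, f_trial, model_decrease, sigma,
                      eta=0.1, gamma=2.0):
    """Accept or reject a trial step and adapt the cubic weight sigma.

    Sketch of the ratio test common to trust-region and cubic
    regularization frameworks; eta and gamma are placeholder values.
    """
    rho = (f_x - f_trial) / max(model_decrease, 1e-16)
    if rho >= eta:                    # successful step: accept, relax sigma
        return True, sigma / gamma
    return False, sigma * gamma      # unsuccessful: reject, tighten sigma
```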

Adaptive regularization with cubics on manifolds

N Agarwal, N Boumal, B Bullins, C Cartis - Mathematical Programming, 2021 - Springer
Adaptive regularization with cubics (ARC) is an algorithm for unconstrained, non-convex
optimization. Akin to the trust-region method, its iterations can be thought of as approximate …
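
On a manifold, ARC replaces Euclidean steps with models in the tangent space followed by a retraction back onto the manifold. A minimal sketch of those two ingredients on the unit sphere, shown with a plain gradient step rather than the ARC algorithm itself:

```python
import numpy as np

def sphere_gradient_step(euclid_grad, x, step):
    """One retraction-based step on the unit sphere S^{n-1}.

    Sketch of the manifold machinery ARC reuses: project the Euclidean
    gradient onto the tangent space at x, move in that direction, then
    retract onto the sphere by normalization.
    """
    g = euclid_grad(x)
    rgrad = g - (x @ g) * x           # Riemannian gradient (tangent projection)
    y = x - step * rgrad              # step in the tangent direction
    return y / np.linalg.norm(y)      # metric-projection retraction
```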

A derivative-free Gauss–Newton method

C Cartis, L Roberts - Mathematical Programming Computation, 2019 - Springer
We present DFO-GN, a derivative-free version of the Gauss–Newton method for solving
nonlinear least-squares problems. DFO-GN uses linear interpolation of residual values to …
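
A sketch of the Gauss-Newton mechanics under a simplifying assumption: if the linear interpolation set is taken to be x plus small coordinate perturbations, the interpolated Jacobian reduces to a forward-difference estimate. DFO-GN itself maintains a general, well-poised interpolation set inside a trust region, so this is the idea rather than the algorithm.

```python
import numpy as np

def dfo_gn_step(residual, x, h=1e-6):
    """One derivative-free Gauss-Newton step for min ||r(x)||^2.

    Interpolates the residual vector linearly on the n+1 points
    {x, x + h*e_1, ..., x + h*e_n}; with this coordinate set the
    model Jacobian equals a forward-difference estimate.
    """
    r0 = residual(x)
    n = x.size
    J = np.empty((r0.size, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (residual(x + e) - r0) / h     # linear model slope
    s, *_ = np.linalg.lstsq(J, -r0, rcond=None)  # Gauss-Newton step
    return x + s
```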

Worst-case evaluation complexity and optimality of second-order methods for nonconvex smooth optimization

C Cartis, NIM Gould, PL Toint - Proceedings of the International …, 2018 - World Scientific
We establish or refute the optimality of inexact second-order methods for unconstrained
nonconvex optimization from the point of view of worst-case evaluation complexity …
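
For context, the standard worst-case bounds this line of work established, stated informally with constants suppressed:

```latex
% Worst-case evaluation counts to reach \|\nabla f(x_k)\| \le \epsilon
% for smooth nonconvex f:
\[
  \text{steepest descent: } O(\epsilon^{-2}), \qquad
  \text{cubic regularization (ARC): } O(\epsilon^{-3/2}),
\]
% and both orders are sharp; the second is optimal for second-order methods.
```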

On the use of iterative methods in cubic regularization for unconstrained optimization

T Bianconcini, G Liuzzi, B Morini… - Computational …, 2015 - Springer
In this paper we consider the problem of minimizing a smooth function by using the adaptive
cubic regularized (ARC) framework. We focus on the computation of the trial step as a …
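
The trial step in ARC is the global minimizer of the cubic model $m(s) = f(x) + g^\top s + \tfrac{1}{2} s^\top H s + \tfrac{\sigma}{3}\|s\|^3$, characterized by $(H + \lambda I)s = -g$ with $\lambda = \sigma\|s\|$ and $H + \lambda I$ positive semidefinite. A dense-algebra sketch of that characterization via scalar root-finding (assumes $g \neq 0$ and ignores the hard case); the paper's focus is matrix-free iterative solvers for this same subproblem.

```python
import numpy as np
from scipy.optimize import brentq

def cubic_model_minimizer(g, H, sigma):
    """Global minimizer of g.s + 0.5 s'Hs + (sigma/3)||s||^3.

    Solves (H + lam*I) s = -g with lam = sigma*||s|| in the eigenbasis
    of H. Dense-eigendecomposition sketch; hard case not handled.
    """
    w, Q = np.linalg.eigh(H)          # w ascending, so w[0] is the smallest
    gq = Q.T @ g

    def norm_s(lam):                  # ||s(lam)|| computed in the eigenbasis
        return np.linalg.norm(gq / (w + lam))

    # phi(lam) = sigma*||s(lam)|| - lam is strictly decreasing on the
    # admissible interval; bracket its root and solve by bisection.
    lo = max(0.0, -w[0]) + 1e-12
    hi = lo + 1.0
    while sigma * norm_s(hi) - hi > 0:
        hi *= 2.0
    lam = brentq(lambda t: sigma * norm_s(t) - t, lo, hi)
    return Q @ (-gq / (w + lam))
```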

Parameter-free accelerated gradient descent for nonconvex minimization

N Marumo, A Takeda - SIAM Journal on Optimization, 2024 - SIAM
We propose a new first-order method for minimizing nonconvex functions with a Lipschitz
continuous gradient and Hessian. The proposed method is an accelerated gradient descent …
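
A sketch of the backbone such methods build on: Nesterov-style momentum with a function-value restart. The fixed step size and the simple restart rule here are placeholders for the on-the-fly Lipschitz estimation that makes the paper's method parameter-free.

```python
import numpy as np

def agd_with_restart(f, grad, x0, step, max_iter=1000, tol=1e-6):
    """Accelerated gradient descent with function-value restarts.

    Sketch: Nesterov momentum for nonconvex f, with the momentum reset
    whenever the objective increases. Step size is assumed given.
    """
    x = y = x0
    t = 1.0
    fx = f(x)
    for _ in range(max_iter):
        g = grad(y)
        if np.linalg.norm(g) < tol:
            break
        x_new = y - step * g                          # gradient step from y
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + (t - 1.0) / t_new * (x_new - x)   # momentum extrapolation
        f_new = f(x_new)
        if f_new > fx:                                # restart: kill momentum
            y, t_new = x_new, 1.0
        x, t, fx = x_new, t_new, f_new
    return x
```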