JM Kohler, A Lucchi - International Conference on Machine …, 2017 - proceedings.mlr.press
We consider the minimization of non-convex functions that typically arise in machine learning. Specifically, we focus our attention on a variant of trust region methods known as …
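The basic trust-region iteration this entry builds on can be sketched as follows: minimize a quadratic model of the objective inside a ball whose radius adapts to how well the model predicted the actual decrease. This is a generic illustration using the classical Cauchy-point step, not the specific variant the paper studies; all function and parameter names are illustrative.

```python
import numpy as np

def cauchy_point(g, H, radius):
    """Minimize the model m(p) = g@p + 0.5*p@H@p along -g subject to
    ||p|| <= radius (the classical Cauchy-point step)."""
    gnorm = np.linalg.norm(g)
    if gnorm == 0.0:
        return np.zeros_like(g)
    gHg = g @ H @ g
    if gHg > 0:
        # Interior minimizer along -g, clipped to the boundary.
        tau = min(1.0, gnorm**3 / (radius * gHg))
    else:
        tau = 1.0  # negative curvature: go to the boundary
    return -tau * (radius / gnorm) * g

def trust_region_minimize(f, grad, hess, x0, radius=1.0, tol=1e-8, max_iter=200):
    """Basic trust-region loop: accept or reject steps by the ratio of
    actual to predicted reduction, shrinking or growing the radius."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) < tol:
            break
        p = cauchy_point(g, H, radius)
        pred = -(g @ p + 0.5 * p @ H @ p)      # predicted model reduction
        actual = f(x) - f(x + p)
        rho = actual / pred if pred > 0 else 0.0
        if rho > 0.1:                          # sufficient agreement: accept
            x = x + p
        if rho < 0.25:                         # poor model fit: shrink region
            radius *= 0.5
        elif rho > 0.75:                       # good fit: allow longer steps
            radius = min(2.0 * radius, 100.0)
    return x
```

On a convex quadratic the model is exact, so every step is accepted and the loop reduces to steepest descent with an exact line search once the radius stops binding.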
Y Carmon, D Hausler, A Jambulapati… - Advances in Neural …, 2022 - proceedings.neurips.cc
We develop a variant of the Monteiro-Svaiter (MS) acceleration framework that removes the need to solve an expensive implicit equation at every iteration. Consequently, for any $p \ge …
Do you know the difference between an optimist and a pessimist? The former believes we live in the best possible world, and the latter is afraid that the former might be right.… In that …
In a recent paper, we introduced a trust-region method with variable norms for unconstrained minimization, we proved standard asymptotic convergence results, and we …
Adaptive regularization with cubics (ARC) is an algorithm for unconstrained, non-convex optimization. Akin to the trust-region method, its iterations can be thought of as approximate …
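The cubic-regularized model at the heart of ARC-type methods can be minimized globally via a one-dimensional secular equation: the minimizer satisfies $(H + \lambda I)s = -g$ with $\lambda = \sigma\|s\|$. The sketch below solves that scalar equation by bisection; it is a minimal illustration (the so-called hard case is ignored, and production solvers use Krylov or factorization-based approaches), with all names being assumptions.

```python
import numpy as np

def arc_subproblem(g, H, sigma, tol=1e-10):
    """Approximately minimize the cubic model
        m(s) = g@s + 0.5*s@H@s + (sigma/3)*||s||^3
    via the secular equation: the global minimizer satisfies
    (H + lam*I) s = -g with lam = sigma*||s||, lam >= max(0, -eigmin(H)).
    Solved by bisection on lam; the hard case is not handled."""
    eigmin = np.linalg.eigvalsh(H)[0]
    lo = max(0.0, -eigmin) + 1e-12
    hi = lo + 1.0
    def phi(lam):
        # phi decreases in lam; its root gives the optimal multiplier.
        s = np.linalg.solve(H + lam * np.eye(len(g)), -g)
        return np.linalg.norm(s) - lam / sigma
    while phi(hi) > 0:          # expand until the root is bracketed
        hi *= 2.0
    for _ in range(200):        # plain bisection on [lo, hi]
        mid = 0.5 * (lo + hi)
        if phi(mid) > 0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    lam = 0.5 * (lo + hi)
    return np.linalg.solve(H + lam * np.eye(len(g)), -g)
```

The returned step can be verified directly against the first-order condition $(H + \sigma\|s\| I)s = -g$.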
C Cartis, L Roberts - Mathematical Programming Computation, 2019 - Springer
We present DFO-GN, a derivative-free version of the Gauss–Newton method for solving nonlinear least-squares problems. DFO-GN uses linear interpolation of residual values to …
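The core derivative-free idea the entry describes, building a Jacobian from residual values by linear interpolation and then taking a Gauss–Newton step, can be sketched as below. This is a bare-bones illustration of the general technique under the assumption of n+1 affinely independent interpolation points, not the DFO-GN algorithm itself (which also manages the interpolation set and a trust region); the function names are invented.

```python
import numpy as np

def interp_jacobian(points, residuals):
    """Fit a linear model r(x) ~ r(y0) + J (x - y0) to residual values at
    n+1 interpolation points (rows of `points`); y0 is the first point.
    J is recovered purely from function values, without derivatives."""
    y0, r0 = points[0], residuals[0]
    D = points[1:] - y0           # (n, n) displacements from the base point
    R = residuals[1:] - r0        # (n, m) residual differences
    J = np.linalg.solve(D, R).T   # interpolation conditions: D @ J.T = R
    return r0, J

def gauss_newton_step(r0, J):
    """Gauss-Newton step: minimize ||r0 + J p|| over p in least squares."""
    p, *_ = np.linalg.lstsq(J, -r0, rcond=None)
    return p
```

For an affine residual map the interpolated Jacobian is exact, so a single step lands on the least-squares solution.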
We establish or refute the optimality of inexact second-order methods for unconstrained nonconvex optimization from the point of view of worst-case evaluation complexity …
In this paper we consider the problem of minimizing a smooth function by using the adaptive cubic regularized (ARC) framework. We focus on the computation of the trial step as a …
N Marumo, A Takeda - SIAM Journal on Optimization, 2024 - SIAM
We propose a new first-order method for minimizing nonconvex functions with a Lipschitz continuous gradient and Hessian. The proposed method is an accelerated gradient descent …
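The family of methods this entry belongs to, accelerated gradient descent adapted to nonconvex problems, can be illustrated with a Nesterov-style loop plus a simple function-value restart that kills the momentum whenever the objective rises. This is a generic sketch in the spirit of such methods, not the cited algorithm, whose momentum and restart rules are more refined; all names and constants are illustrative.

```python
import numpy as np

def agd_restart(f, grad, x0, L, max_iter=5000, tol=1e-8):
    """Nesterov-style accelerated gradient descent with a function-value
    restart: momentum is reset whenever the objective increases, which
    keeps the iteration well-behaved outside the convex regime."""
    x = np.asarray(x0, dtype=float)
    y, t, fx = x.copy(), 1.0, f(x)
    for _ in range(max_iter):
        x_new = y - grad(y) / L                # gradient step at extrapolated point
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        if f(x_new) > fx:                      # objective rose: kill momentum
            t_new, y = 1.0, x_new.copy()
        x, t, fx = x_new, t_new, f(x_new)
        if np.linalg.norm(grad(x)) < tol:      # approximate stationarity
            break
    return x
```

On a strongly convex quadratic the restart heuristic recovers fast linear convergence; on nonconvex objectives it only targets approximate stationary points.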