Derivative-free optimization methods

J Larson, M Menickelly, SM Wild - Acta Numerica, 2019 - cambridge.org
In many optimization problems arising from scientific, engineering and artificial intelligence
applications, objective and constraint functions are available only as the output of a black …

Painless stochastic gradient: Interpolation, line-search, and convergence rates

S Vaswani, A Mishkin, I Laradji… - Advances in neural …, 2019 - proceedings.neurips.cc
Recent works have shown that stochastic gradient descent (SGD) achieves the fast
convergence rates of full-batch gradient descent for over-parameterized models satisfying …

A stochastic line search method with expected complexity analysis

C Paquette, K Scheinberg - SIAM Journal on Optimization, 2020 - SIAM
For deterministic optimization, line search methods augment algorithms by providing stability
and improved efficiency. Here we adapt a classical backtracking Armijo line search to the …

Global convergence rate analysis of a generic line search algorithm with noise

AS Berahas, L Cao, K Scheinberg - SIAM Journal on Optimization, 2021 - SIAM
In this paper, we develop convergence analysis of a modified line search method for
objective functions whose value is computed with noise and whose gradient estimates are …

[BOOK][B] Evaluation Complexity of Algorithms for Nonconvex Optimization: Theory, Computation and Perspectives

C Cartis, NIM Gould, PL Toint - 2022 - SIAM
Do you know the difference between an optimist and a pessimist? The former believes we
live in the best possible world, and the latter is afraid that the former might be right.… In that …

First- and second-order high probability complexity bounds for trust-region methods with noisy oracles

L Cao, AS Berahas, K Scheinberg - Mathematical Programming, 2024 - Springer
In this paper, we present convergence guarantees for a modified trust-region method
designed for minimizing objective functions whose value and gradient and Hessian …

Inequality constrained stochastic nonlinear optimization via active-set sequential quadratic programming

S Na, M Anitescu, M Kolar - Mathematical Programming, 2023 - Springer
We study nonlinear optimization problems with a stochastic objective and deterministic
equality and inequality constraints, which emerge in numerous applications including …

An adaptive stochastic sequential quadratic programming with differentiable exact augmented Lagrangians

S Na, M Anitescu, M Kolar - Mathematical Programming, 2023 - Springer
We consider solving nonlinear optimization problems with a stochastic objective and
deterministic equality constraints. We assume for the objective that its evaluation, gradient …

A trust region method for noisy unconstrained optimization

S Sun, J Nocedal - Mathematical Programming, 2023 - Springer
Classical trust region methods were designed to solve problems in which function and
gradient information are exact. This paper considers the case when there are errors (or …

Iteration complexity and finite-time efficiency of adaptive sampling trust-region methods for stochastic derivative-free optimization

Y Ha, S Shashaani - IISE Transactions, 2024 - Taylor & Francis
ASTRO-DF is a prominent trust-region method using adaptive sampling for stochastic
derivative-free optimization of nonconvex problems. Its salient feature is an easy-to …