Scalable subspace methods for derivative-free nonlinear least-squares optimization

C Cartis, L Roberts - Mathematical Programming, 2023 - Springer
We introduce a general framework for large-scale model-based derivative-free optimization
based on iterative minimization within random subspaces. We present a probabilistic worst …
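
The core loop behind such methods is compact enough to sketch. Below is a minimal Python illustration, not the authors' implementation, of model-based DFO for nonlinear least-squares in random subspaces: each iteration draws a random basis, estimates the Jacobian of the residuals along the subspace directions by finite differences, and takes a regularized Gauss-Newton step. All names and constants (`subspace_gn_dfo`, `p`, `h`, `delta`) are illustrative assumptions.

```python
import numpy as np

def subspace_gn_dfo(residual, x0, p=2, h=1e-6, delta=1.0, n_iters=100, seed=0):
    """Sketch of model-based DFO in random subspaces for nonlinear
    least squares, min_x 0.5*||r(x)||^2. Hypothetical names/constants.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    rx = residual(x)
    for _ in range(n_iters):
        # Random subspace basis (scaled Gaussian sketch).
        P = rng.standard_normal((n, p)) / np.sqrt(p)
        # Derivative-free Jacobian estimate restricted to the subspace:
        # J_hat[:, j] ~ (r(x + h*P[:, j]) - r(x)) / h.
        J_hat = np.column_stack(
            [(residual(x + h * P[:, j]) - rx) / h for j in range(p)]
        )
        # Regularized Gauss-Newton step in subspace coordinates,
        # with 1/delta playing a trust-region-like role.
        s_hat = np.linalg.solve(
            J_hat.T @ J_hat + (1.0 / delta) * np.eye(p), -J_hat.T @ rx
        )
        x_trial = x + P @ s_hat
        r_trial = residual(x_trial)
        # Accept on decrease; grow or shrink the radius accordingly.
        if r_trial @ r_trial < rx @ rx:
            x, rx, delta = x_trial, r_trial, min(2.0 * delta, 1e3)
        else:
            delta = max(0.5 * delta, 1e-8)
    return x

# Toy usage: a 2D Rosenbrock residual.
# r = lambda x: np.array([10.0 * (x[1] - x[0]**2), 1.0 - x[0]])
# x_star = subspace_gn_dfo(r, np.array([-1.2, 1.0]))
```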

Direct search based on probabilistic descent in reduced spaces

L Roberts, CW Royer - SIAM Journal on Optimization, 2023 - SIAM
Derivative-free algorithms seek the minimum value of a given objective function without
using any derivative information. The performance of these methods often worsens as the …
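
A minimal sketch of the idea follows, assuming standard direct-search conventions: poll along plus/minus random subspace directions and accept only under a sufficient-decrease forcing function `c*alpha^2`. The function and parameter names are hypothetical, not the authors' code.

```python
import numpy as np

def ds_reduced_subspace(f, x0, p=2, alpha=1.0, c=1e-4, n_iters=200, seed=0):
    """Sketch of direct search with probabilistic descent in a reduced
    space: each iteration draws a random subspace and polls f along
    +/- its basis directions.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    n = x.size
    for _ in range(n_iters):
        P = rng.standard_normal((n, p)) / np.sqrt(n)   # random sketch
        polls = [s * alpha * P[:, j] for j in range(p) for s in (+1.0, -1.0)]
        improved = False
        for d in polls:
            fd = f(x + d)
            if fd < fx - c * alpha**2:   # sufficient decrease test
                x, fx = x + d, fd
                alpha *= 2.0             # expand the step on success
                improved = True
                break
        if not improved:
            alpha *= 0.5                 # contract on an unsuccessful poll
    return x, fx
```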

Randomised subspace methods for non-convex optimization, with applications to nonlinear least-squares

C Cartis, J Fowkes, Z Shao - arXiv preprint arXiv:2211.09873, 2022 - arxiv.org
We propose a general random subspace framework for unconstrained nonconvex
optimization problems that requires a weak probabilistic assumption on the subspace …
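
The reduced subproblem at the heart of such frameworks, and the probabilistic assumption on the sketch, can be illustrated as follows. This is an assumed setup for exposition, not the paper's algorithm: the full problem is replaced by a low-dimensional model over the sketched subspace, and the sketch only needs to preserve the gradient norm in expectation (here checked empirically for a scaled Gaussian sketch).

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 1000, 5

def make_reduced(f, x, S):
    """Reduced objective over s in R^p for the subspace x + range(S^T)."""
    return lambda s: f(x + S.T @ s)

# Empirical check of the embedding property on a fixed vector g:
# for a scaled Gaussian sketch, E ||S g||^2 = ||g||^2, so the sketched
# gradient is informative with positive probability.
g = rng.standard_normal(n)
norms = []
for _ in range(2000):
    S = rng.standard_normal((p, n)) / np.sqrt(p)
    norms.append((S @ g) @ (S @ g))
print("E||Sg||^2 / ||g||^2 ~", np.mean(norms) / (g @ g))  # ~ 1.0
```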

Stochastic trust-region algorithm in random subspaces with convergence and expected complexity analyses

KJ Dzahini, SM Wild - SIAM Journal on Optimization, 2024 - SIAM
This work proposes a framework for large-scale stochastic derivative-free optimization (DFO)
by introducing STARS, a trust-region method based on iterative minimization in random …
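
One iteration of this kind might look like the sketch below, with stochastic estimates obtained by averaging noisy evaluations. The estimator, step rule, and constants are assumptions for illustration rather than the paper's specification.

```python
import numpy as np

def stochastic_tr_subspace(f_noisy, x0, p=3, delta=1.0, m=10,
                           eta=0.1, n_iters=100, seed=0):
    """Sketch of a stochastic trust-region iteration in a random
    subspace: estimate the subspace gradient from averaged noisy
    evaluations, take the Cauchy step, and accept via a ratio test.
    """
    rng = np.random.default_rng(seed)
    est = lambda z: np.mean([f_noisy(z) for _ in range(m)])  # averaged value
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(n_iters):
        P = rng.standard_normal((n, p)) / np.sqrt(p)
        fx = est(x)
        h = 1e-3 * delta
        # Subspace gradient estimate via forward differences.
        g_hat = np.array([(est(x + h * P[:, j]) - fx) / h for j in range(p)])
        gnorm = np.linalg.norm(g_hat)
        if gnorm < 1e-12:
            continue
        s_hat = -delta * g_hat / gnorm          # Cauchy-like step
        pred = delta * gnorm                    # predicted decrease
        rho = (fx - est(x + P @ s_hat)) / pred  # stochastic ratio test
        if rho >= eta:
            x, delta = x + P @ s_hat, 2.0 * delta
        else:
            delta *= 0.5
    return x
```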

Yet another fast variant of Newton's method for nonconvex optimization

S Gratton, S Jerad, PL Toint - IMA Journal of Numerical Analysis, 2024 - academic.oup.com
A class of second-order algorithms is proposed for minimizing smooth nonconvex functions
that alternates between regularized Newton and negative curvature steps in an iteration …
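
The alternation can be sketched as follows, under assumptions made here purely for illustration: the regularization scales like the square root of the gradient norm, and a backtracking safeguard enforces decrease. This is not the authors' algorithm verbatim.

```python
import numpy as np

def newton_negcurv(f, grad, hess, x0, theta=1.0, tol=1e-8, n_iters=100):
    """Sketch of alternating regularized Newton and negative curvature
    steps (assumed constants and safeguards).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g, H = grad(x), hess(x)
        if np.linalg.norm(g) < tol:
            break
        evals, evecs = np.linalg.eigh(H)
        if evals[0] < -tol:
            # Negative curvature step along the leftmost eigenvector,
            # signed to descend and scaled by |lambda_min|.
            v = evecs[:, 0].copy()
            if g @ v > 0.0:
                v = -v
            d = abs(evals[0]) * v
        else:
            # Regularized Newton step: (H + lam*I) d = -g,
            # with lam ~ theta * sqrt(||g||) (assumed rule).
            lam = theta * np.sqrt(np.linalg.norm(g))
            d = np.linalg.solve(H + lam * np.eye(x.size), -g)
        # Backtracking line search as a simple globalization safeguard.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x
```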

Randomized subspace regularized Newton method for unconstrained non-convex optimization

T Fuji, PL Poirion, A Takeda - arXiv preprint arXiv:2209.04170, 2022 - arxiv.org
While there already exist randomized subspace Newton methods that restrict the search
direction to a random subspace for a convex function, we propose a randomized subspace …
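
A single step of this kind reduces an n-dimensional Newton system to a p-by-p one. The sketch below is a minimal illustration in which a fixed regularization parameter `eta` stands in for whatever adaptive rule the paper actually uses.

```python
import numpy as np

def rs_reg_newton_step(grad, hess, x, p=5, eta=1e-2, seed=0):
    """Sketch of one randomized subspace regularized Newton step:
    sketch the gradient and Hessian with a random p x n matrix P,
    solve the small regularized system, and lift the step back.
    """
    rng = np.random.default_rng(seed)
    g, H = grad(x), hess(x)
    n = x.size
    P = rng.standard_normal((p, n)) / np.sqrt(p)
    H_s = P @ H @ P.T                          # p x p sketched Hessian
    s = np.linalg.solve(H_s + eta * np.eye(p), -P @ g)
    return x + P.T @ s                         # step in range(P^T)
```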

Expected decrease for derivative-free algorithms using random subspaces

W Hare, L Roberts, C Royer - Mathematics of Computation, 2025 - ams.org
Derivative-free algorithms seek the minimum of a given function based only on function
values queried at appropriate points. Although these methods are widely used in practice …
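
The quantity such analyses control, the gradient component captured by a random direction, is easy to probe numerically. The Monte Carlo snippet below is an illustrative setup, not the paper's bounds: for a uniformly random unit direction d in R^n, E|g.d| is approximately ||g|| * sqrt(2/(pi*n)) for large n, so the expected decrease per evaluation shrinks with the ambient dimension.

```python
import numpy as np

rng = np.random.default_rng(0)
for n in (10, 100, 1000):
    g = np.zeros(n); g[0] = 1.0                  # WLOG a unit gradient
    d = rng.standard_normal((5000, n))
    d /= np.linalg.norm(d, axis=1, keepdims=True)  # random unit directions
    # Empirical mean of |g.d| versus the large-n approximation.
    print(n, np.mean(np.abs(d @ g)), np.sqrt(2.0 / (np.pi * n)))
```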

Learning the subspace of variation for global optimization of functions with low effective dimension

C Cartis, X Liang, E Massart, A Otemissov - arXiv preprint arXiv …, 2024 - arxiv.org
We propose an algorithmic framework, which employs active subspace techniques, for
scalable global optimization of functions with low effective dimension (also referred to as low …
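
Active subspace estimation itself is simple to sketch: average outer products of sampled gradients, C = E[grad f grad f^T], and keep the leading eigenvectors, which span the directions along which f actually varies. The sampling scheme and names below are assumptions for illustration.

```python
import numpy as np

def estimate_active_subspace(grad, n, d_sub, n_samples=200, seed=0):
    """Sketch of active subspace estimation from sampled gradients."""
    rng = np.random.default_rng(seed)
    C = np.zeros((n, n))
    for _ in range(n_samples):
        x = rng.uniform(-1.0, 1.0, size=n)   # sample the domain
        g = grad(x)
        C += np.outer(g, g) / n_samples
    evals, evecs = np.linalg.eigh(C)
    return evecs[:, -d_sub:]                 # top-d_sub eigenvectors

# Example: f(x) = (u^T x)^2 has effective dimension 1, and the learned
# subspace recovers u (up to sign).
# u = np.ones(20) / np.sqrt(20)
# grad = lambda x: 2.0 * (u @ x) * u
# U = estimate_active_subspace(grad, n=20, d_sub=1)
# Global optimization can then run over y in R^1 via x = U @ y.
```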

Direct search for stochastic optimization in random subspaces with zeroth-, first-, and second-order convergence and expected complexity

KJ Dzahini, SM Wild - arXiv preprint arXiv:2403.13320, 2024 - arxiv.org
The work presented here is motivated by the development of StoDARS, a framework for
large-scale stochastic blackbox optimization that not only is both an algorithmic and …
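
Relative to the deterministic reduced-space direct search sketched earlier, the stochastic setting replaces exact function values with averaged estimates in the poll step, and the sufficient-decrease margin must dominate the estimation error. A hypothetical helper illustrating that change:

```python
import numpy as np

def stochastic_subspace_poll(f_noisy, x, fx_est, alpha, P, m=30, c=1e-4):
    """Sketch of one stochastic direct-search poll in a random subspace
    (assumed estimator and constants): accept a poll point only if its
    averaged estimate beats the incumbent by c*alpha^2.
    """
    est = lambda z: np.mean([f_noisy(z) for _ in range(m)])
    for j in range(P.shape[1]):
        for s in (+1.0, -1.0):
            d = s * alpha * P[:, j]
            fd = est(x + d)
            if fd < fx_est - c * alpha**2:
                return x + d, fd, 2.0 * alpha    # success: expand
    return x, fx_est, 0.5 * alpha                # failure: contract

# Usage: draw P fresh each iteration, e.g.
# P = np.random.default_rng(0).standard_normal((n, p)) / np.sqrt(n)
```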

Randomized Subspace Derivative-Free Optimization with Quadratic Models and Second-Order Convergence

C Cartis, L Roberts - arXiv preprint arXiv:2412.14431, 2024 - arxiv.org
We consider model-based derivative-free optimization (DFO) for large-scale problems,
based on iterative minimization in random subspaces. We provide the first worst-case …
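
A full quadratic model in the subspace needs only (p+1)(p+2)/2 coefficients regardless of the ambient dimension n, which is what makes second-order information affordable in this setting. The sketch below uses assumed sampling and fitting choices, not the authors' method: it fits a quadratic in the p subspace coordinates by least squares and approximately minimizes it over the trust region by candidate sampling.

```python
import numpy as np

def quadratic_subspace_model_step(f, x, p=2, delta=0.5, seed=0):
    """Sketch of one subspace-DFO iteration with a quadratic model."""
    rng = np.random.default_rng(seed)
    n = x.size
    P = rng.standard_normal((n, p)) / np.sqrt(p)

    def features(s):
        # Monomial basis of a quadratic in R^p: 1, s_i, s_i*s_j (i<=j).
        quad = [s[i] * s[j] for i in range(p) for j in range(i, p)]
        return np.concatenate(([1.0], s, quad))

    q = (p + 1) * (p + 2) // 2                       # model coefficients
    S = delta * rng.uniform(-1.0, 1.0, size=(2 * q, p))  # sample points
    A = np.vstack([features(s) for s in S])
    y = np.array([f(x + P @ s) for s in S])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # least-squares fit

    # Approximate trust-region minimization of the model by sampling
    # candidates uniformly in the ball of radius delta.
    cands = rng.standard_normal((5000, p))
    radii = delta * rng.uniform(0.0, 1.0, size=(5000, 1)) ** (1.0 / p)
    cands = radii * cands / np.linalg.norm(cands, axis=1, keepdims=True)
    model_vals = np.array([features(s) @ coef for s in cands])
    s_best = cands[np.argmin(model_vals)]
    return x + P @ s_best                            # trial point
```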