[BOOK] Nonlinear conjugate gradient methods for unconstrained optimization

N Andrei - 2020 - Springer
This book is on conjugate gradient methods for unconstrained optimization. The concept of
conjugacy was introduced by Magnus Hestenes and Garrett Birkhoff in 1936 in the context of …
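
The snippet only names the idea of conjugacy, so a minimal sketch may help: the Fletcher-Reeves nonlinear conjugate gradient loop below, applied to a small convex quadratic, illustrates the general method class and is not code from Andrei's book; the test problem, the Armijo backtracking rule, and the restart safeguard are my own choices.

# Minimal Fletcher-Reeves nonlinear conjugate gradient sketch.
# The quadratic test problem, the backtracking line search, and the
# restart safeguard are illustrative choices, not taken from the book.
import numpy as np

def fr_cg(f, grad, x0, max_iter=200, tol=1e-8):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = 1.0                              # simple backtracking Armijo search
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)     # Fletcher-Reeves coefficient
        d = -g_new + beta * d                # conjugate direction update
        if g_new @ d >= 0:                   # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
print(fr_cg(f, grad, np.zeros(2)))           # approaches np.linalg.solve(A, b)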

On amortizing convex conjugates for optimal transport

B Amos - arXiv preprint arXiv:2210.12153, 2022 - arxiv.org
This paper focuses on computing the convex conjugate operation that arises when solving
Euclidean Wasserstein-2 optimal transport problems. This conjugation, which is also …
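
For context on the conjugation mentioned here, the sketch below evaluates the convex conjugate f*(y) = sup_x <x, y> - f(x) by brute force, running an inner solver for each y; this is exactly the expensive operation one would want to amortize. The quadratic example and the use of SciPy's L-BFGS-B solver are illustrative assumptions, not the amortization scheme of the paper.

# Brute-force evaluation of the convex conjugate f*(y) = sup_x <x, y> - f(x)
# by running an inner numerical maximization for each y.  The quadratic f
# and the use of SciPy's L-BFGS-B solver are illustrative choices only.
import numpy as np
from scipy.optimize import minimize

def conjugate(f, grad_f, y, x0):
    # maximize <x, y> - f(x)  <=>  minimize f(x) - <x, y>
    obj = lambda x: f(x) - x @ y
    jac = lambda x: grad_f(x) - y
    res = minimize(obj, x0, jac=jac, method="L-BFGS-B")
    return -res.fun, res.x                # value of f*(y) and the maximizer

# Example: f(x) = 0.5 ||x||^2, whose conjugate is f*(y) = 0.5 ||y||^2
f = lambda x: 0.5 * x @ x
grad_f = lambda x: x
y = np.array([1.0, -2.0])
val, xstar = conjugate(f, grad_f, y, np.zeros(2))
print(val, 0.5 * y @ y)                   # both should be ~2.5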

A three-term conjugate gradient algorithm for large-scale unconstrained optimization problems

S Deng, Z Wan - Applied Numerical Mathematics, 2015 - Elsevier
In this paper, a three-term conjugate gradient algorithm is developed for solving large-scale
unconstrained optimization problems. The search direction at each iteration of the algorithm …
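
The snippet does not show the paper's direction formula; the sketch below uses one standard three-term construction, d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k with y_k = g_{k+1} - g_k, chosen so that the sufficient-descent identity g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 holds. Treat it as a generic illustration, not Deng and Wan's specific algorithm.

# One standard three-term direction: d+ = -g+ + beta*d - theta*y with
# y = g+ - g.  By construction it satisfies g+ . d+ = -||g+||^2
# (sufficient descent).  This is an illustrative choice, not necessarily
# the specific formula of the paper.
import numpy as np

def three_term_direction(g_new, g_old, d_old):
    y = g_new - g_old
    denom = g_old @ g_old
    beta = (g_new @ y) / denom
    theta = (g_new @ d_old) / denom
    return -g_new + beta * d_old - theta * y

# Quick check of the sufficient-descent property on random data:
rng = np.random.default_rng(0)
g_old, g_new, d_old = rng.normal(size=(3, 5))
d_new = three_term_direction(g_new, g_old, d_old)
print(g_new @ d_new, -(g_new @ g_new))    # the two numbers coincide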

The global convergence of the BFGS method with a modified WWP line search for nonconvex functions

G Yuan, P Li, J Lu - Numerical Algorithms, 2022 - Springer
The BFGS method, which has great numerical stability, is one of the quasi-Newton line
search methods. However, the global convergence of the BFGS method with a Wolfe line …
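
As background for the quantities the abstract mentions, the sketch below runs a plain BFGS iteration with SciPy's generic Wolfe line search; it is not the modified WWP rule analyzed in the paper, and the fallback step size is an arbitrary choice.

# Plain BFGS iteration with a Wolfe line search, to illustrate the
# quantities (s_k, y_k, the curvature condition) the abstract refers to.
# This uses SciPy's generic line_search, not the paper's modified WWP rule.
import numpy as np
from scipy.optimize import line_search

def bfgs(f, grad, x0, max_iter=100, tol=1e-8):
    n = len(x0)
    H = np.eye(n)                        # inverse Hessian approximation
    x, g = np.asarray(x0, dtype=float), grad(x0)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g
        alpha = line_search(f, grad, x, p, gfk=g)[0]
        if alpha is None:                # line search failed: small fallback step
            alpha = 1e-3
        x_new = x + alpha * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                   # curvature condition => update is safe
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example: Rosenbrock function
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
print(bfgs(f, grad, np.array([-1.2, 1.0])))   # approaches [1, 1]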

[HTML] An extended nonmonotone line search technique for large-scale unconstrained optimization

S Huang, Z Wan, J Zhang - Journal of Computational and Applied …, 2018 - Elsevier
In this paper, an extended nonmonotone line search is proposed to improve the efficiency of
the existing line searches. This line search is first proved to be an extension of the classical …
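
The classical nonmonotone rule this line search extends compares the trial point against the maximum of the last few function values rather than against f(x_k); a minimal backtracking version in that style (Grippo-Lampariello-Lucidi type, with my own parameter choices) is sketched below.

# Minimal nonmonotone backtracking line search in the Grippo-Lampariello-
# Lucidi style: the Armijo test is taken against the maximum of the last M
# function values instead of f(x_k).  Parameters are illustrative choices.
from collections import deque

def nonmonotone_step(f, x, d, g_dot_d, recent_f, delta=1e-4, alpha=1.0,
                     shrink=0.5, max_backtracks=50):
    """recent_f: deque of the last M objective values (most recent last)."""
    f_ref = max(recent_f)                 # nonmonotone reference value
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + delta * alpha * g_dot_d:
            return alpha
        alpha *= shrink
    return alpha                          # give up with the last trial step

# Usage inside a descent loop (memory M = 5):
#   recent_f = deque([f(x0)], maxlen=5)
#   ...
#   alpha = nonmonotone_step(f, x, d, g @ d, recent_f)
#   x = x + alpha * d
#   recent_f.append(f(x))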

A new nonmonotone line search technique for unconstrained optimization

S Huang, Z Wan, X Chen - Numerical Algorithms, 2015 - Springer
In this paper, a new nonmonotone line search rule is proposed, which is verified to be an
improved version of the nonmonotone line search technique proposed by Zhang and Hager …
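
The Zhang and Hager technique referred to here replaces the maximum of recent function values with a weighted average C_k; the recurrence below follows my reading of that scheme, and the constants are illustrative assumptions rather than the paper's improved version.

# Sketch of the Zhang-Hager nonmonotone reference value: instead of the max
# of recent f-values, the Armijo test is taken against a weighted average
# C_k updated by
#   Q_{k+1} = eta*Q_k + 1,   C_{k+1} = (eta*Q_k*C_k + f(x_{k+1})) / Q_{k+1}.
# The constant eta and the Armijo constant delta are illustrative choices.

def zhang_hager_update(C, Q, f_new, eta=0.85):
    Q_new = eta * Q + 1.0
    C_new = (eta * Q * C + f_new) / Q_new
    return C_new, Q_new

def accept(f_trial, C, alpha, g_dot_d, delta=1e-4):
    # Armijo-type test against the averaged reference value C_k
    return f_trial <= C + delta * alpha * g_dot_d

# Usage: start with C = f(x0), Q = 1.0; after accepting a step to x_new,
# call  C, Q = zhang_hager_update(C, Q, f(x_new)).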

A globalization of L-BFGS for nonconvex unconstrained optimization

F Mannel - arXiv preprint arXiv:2401.03805, 2024 - arxiv.org
We present a modification of the limited memory BFGS (L-BFGS) method that ensures global
and linear convergence on nonconvex objective functions. Importantly, the modified method …
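
For reference, the sketch below is the standard L-BFGS two-loop recursion that computes the search direction from the stored curvature pairs; the globalization safeguards that are the point of the paper are deliberately not reproduced.

# Standard L-BFGS two-loop recursion: computes the search direction
# -H_k @ g from the m most recent curvature pairs (s_i, y_i) without
# forming H_k.  This is the unmodified textbook recursion; the paper's
# globalization safeguards are not reproduced here.
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    q = g.copy()
    alphas = []
    for s, y in zip(reversed(s_list), reversed(y_list)):   # newest pair first
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        q -= a * y
        alphas.append((a, rho, s, y))
    if s_list:                             # initial scaling H_0 = gamma * I
        s, y = s_list[-1], y_list[-1]
        q *= (s @ y) / (y @ y)
    for a, rho, s, y in reversed(alphas):  # oldest pair first
        b = rho * (y @ q)
        q += (a - b) * s
    return -q                              # search direction -H_k g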

An advanced active set L-BFGS algorithm for training weight-constrained neural networks

IE Livieris - Neural Computing and Applications, 2020 - Springer
In this work, a new advanced active set limited memory BFGS (Broyden–Fletcher–Goldfarb–
Shanno) algorithm is proposed for efficiently training weight-constrained neural networks …
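
The abstract does not describe the mechanics; the sketch below shows one generic ingredient of active-set methods for bound-constrained problems, namely projection onto the box and identification of active coordinates, under the assumption that the weight constraints are simple bounds. It is not the paper's specific algorithm.

# Generic ingredients of active-set methods for bound constraints: project
# the iterate onto the box [lo, hi] and freeze the coordinates identified
# as active so the quasi-Newton step only moves the free variables.
# This is a generic sketch, not the paper's active-set strategy.
import numpy as np

def project_box(x, lo, hi):
    return np.minimum(np.maximum(x, lo), hi)

def active_set(x, g, lo, hi, eps=1e-10):
    # active if the variable sits on a bound and the gradient pushes outward
    at_lower = (x <= lo + eps) & (g > 0)
    at_upper = (x >= hi - eps) & (g < 0)
    return at_lower | at_upper

# Usage: zero out the step on active coordinates before taking it.
#   d[active_set(x, g, lo, hi)] = 0.0
#   x = project_box(x + alpha * d, lo, hi)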

A modified Broyden family algorithm with global convergence under a weak Wolfe-Powell line search for unconstrained nonconvex problems

G Yuan, Z Wang, P Li - Calcolo, 2020 - Springer
The quasi-Newton method is one of the most effective first-derivative methods for
solving unconstrained optimization problems. The Broyden family method …
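
The Broyden family interpolates between the BFGS and DFP updates through a scalar parameter; the sketch below implements the standard parameterized update (phi = 0 gives BFGS, phi = 1 gives DFP) and checks the secant equation, independently of the modification studied in the paper.

# Standard Broyden-family update of the Hessian approximation B:
#   B+ = B - (B s s^T B)/(s^T B s) + (y y^T)/(y^T s) + phi * (s^T B s) * v v^T,
#   v  = y/(y^T s) - (B s)/(s^T B s),
# where phi = 0 gives BFGS and phi = 1 gives DFP.  This is the textbook
# family, written independently of the paper's modification.
import numpy as np

def broyden_family_update(B, s, y, phi=0.0):
    Bs = B @ s
    sBs = s @ Bs
    ys = y @ s
    v = y / ys - Bs / sBs
    B_bfgs = B - np.outer(Bs, Bs) / sBs + np.outer(y, y) / ys
    return B_bfgs + phi * sBs * np.outer(v, v)

# Sanity check: the secant equation B+ s = y holds for every phi.
rng = np.random.default_rng(1)
n = 4
A = rng.normal(size=(n, n))
B = A @ A.T + n * np.eye(n)               # symmetric positive definite start
s = rng.normal(size=n)
y = B @ s + 0.1 * rng.normal(size=n)
for phi in (0.0, 0.5, 1.0):
    Bp = broyden_family_update(B, s, y, phi)
    print(np.allclose(Bp @ s, y))         # True for every phi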

A novel hybrid algorithm for solving multiobjective optimization problems with engineering applications

L Fan, T Yoshino, T Xu, Y Lin… - Mathematical Problems in …, 2018 - Wiley Online Library
An effective hybrid algorithm is proposed for solving multiobjective engineering
optimization problems with inequality constraints. The weighted sum technique and BFGS quasi …
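
The snippet mentions combining the weighted sum technique with BFGS; the sketch below shows that combination in its barest form, minimizing a convex combination of two toy objectives with SciPy's BFGS for several weight choices. The objectives are my own, and the paper's inequality constraints are omitted.

# Bare-bones weighted-sum scalarization: a convex combination of the
# objectives is minimized with a quasi-Newton (BFGS) solver for several
# weight vectors, tracing out points on the Pareto front.  The toy
# objectives and the use of SciPy are illustrative; the paper's hybrid
# algorithm also handles inequality constraints, which are omitted here.
import numpy as np
from scipy.optimize import minimize

f1 = lambda x: (x[0] - 1.0)**2 + x[1]**2      # toy objective 1
f2 = lambda x: x[0]**2 + (x[1] - 1.0)**2      # toy objective 2

for w in np.linspace(0.0, 1.0, 5):
    scalarized = lambda x, w=w: w * f1(x) + (1.0 - w) * f2(x)
    res = minimize(scalarized, x0=np.zeros(2), method="BFGS")
    print(f"w={w:.2f}  x*={res.x.round(3)}  f1={f1(res.x):.3f}  f2={f2(res.x):.3f}")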