Convergence properties of the BFGS algorithm

YH Dai - SIAM Journal on Optimization, 2002 - SIAM
The BFGS method is one of the most famous quasi-Newton algorithms for unconstrained
optimization. In 1984, Powell presented an example of a function of two variables that shows …
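
As background for the convergence results these entries study, the standard BFGS inverse-Hessian update can be sketched as follows (a minimal illustration only; the curvature safeguard and its tolerance are common practical choices, not taken from any paper listed here):

```python
import numpy as np

def bfgs_update(H, s, y):
    """One BFGS update of the inverse-Hessian approximation H.

    s = x_{k+1} - x_k (step), y = g_{k+1} - g_k (gradient change).
    Skips the update when the curvature condition s^T y > 0 fails,
    a common safeguard for nonconvex problems (tolerance is illustrative).
    """
    sy = s @ y
    if sy <= 1e-10:  # curvature condition violated; keep H unchanged
        return H
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # H_{k+1} = V H V^T + rho s s^T satisfies the secant equation H_{k+1} y = s
    return V @ H @ V.T + rho * np.outer(s, s)
```

The update always satisfies the secant equation `H_new @ y == s`, which is the property the convergence analyses build on.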

The global convergence of a modified BFGS method for nonconvex functions

G Yuan, Z Sheng, B Wang, W Hu, C Li - Journal of Computational and …, 2018 - Elsevier
The standard BFGS method plays an important role among the quasi-Newton algorithms for
constrained/unconstrained optimization problems. However, Dai (2003) constructed a …

A simple three-term conjugate gradient algorithm for unconstrained optimization

N Andrei - Journal of Computational and Applied Mathematics, 2013 - Elsevier
A simple three-term conjugate gradient algorithm which satisfies both the descent condition
and the conjugacy condition is presented. This algorithm is a modification of the Hestenes …

A perfect example for the BFGS method

YH Dai - Mathematical Programming, 2013 - Springer
Consider the BFGS quasi-Newton method applied to a general non-convex function that has
continuous second derivatives. This paper aims to construct a four-dimensional example …

Global convergence of BFGS and PRP methods under a modified weak Wolfe–Powell line search

G Yuan, Z Wei, X Lu - Applied Mathematical Modelling, 2017 - Elsevier
The BFGS method is one of the most effective quasi-Newton algorithms for optimization
problems. However, its global convergence for general functions is still open. In this paper …

Another hybrid conjugate gradient algorithm for unconstrained optimization

N Andrei - Numerical Algorithms, 2008 - Springer
Another hybrid conjugate gradient algorithm is subject to analysis. The parameter β_k is
computed as a convex combination of β_k^HS (Hestenes–Stiefel) and β_k^DY (Dai …
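
The two β formulas this entry combines are the standard Hestenes–Stiefel and Dai–Yuan rules; a short sketch (the weight θ is left as an explicit argument here, since Andrei's actual rule for choosing it is beyond this snippet):

```python
import numpy as np

def beta_hs(g_new, g_old, d_old):
    """Hestenes-Stiefel: beta = g_{k+1}^T y_k / (d_k^T y_k), with y_k = g_{k+1} - g_k."""
    y = g_new - g_old
    return (g_new @ y) / (d_old @ y)

def beta_dy(g_new, g_old, d_old):
    """Dai-Yuan: beta = ||g_{k+1}||^2 / (d_k^T y_k)."""
    y = g_new - g_old
    return (g_new @ g_new) / (d_old @ y)

def beta_hybrid(g_new, g_old, d_old, theta):
    """Convex combination (1 - theta) * beta_HS + theta * beta_DY, theta in [0, 1].

    The selection rule for theta is the substance of the hybrid method
    and is intentionally omitted here.
    """
    return (1 - theta) * beta_hs(g_new, g_old, d_old) + theta * beta_dy(g_new, g_old, d_old)
```

At θ = 0 the hybrid reduces to Hestenes–Stiefel and at θ = 1 to Dai–Yuan.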

A combined conjugate-gradient quasi-Newton minimization algorithm

AG Buckley - Mathematical Programming, 1978 - Springer
Although quasi-Newton algorithms generally converge in fewer iterations than conjugate
gradient algorithms, they have the disadvantage of requiring substantially more storage. An …

On the behavior of Broyden's class of quasi-Newton methods

RH Byrd, DC Liu, J Nocedal - SIAM Journal on Optimization, 1992 - SIAM
This paper analyzes algorithms from the Broyden class of quasi-Newton methods for
nonlinear unconstrained optimization. This class depends on a parameter φ_k, for which …
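
The parameter in this abstract enters through the standard Broyden-class Hessian update, which recovers BFGS at φ = 0 and DFP at φ = 1; a minimal sketch under those usual conventions (assumes s^T y > 0):

```python
import numpy as np

def broyden_update(B, s, y, phi):
    """Broyden-class update of the Hessian approximation B.

    phi = 0 gives the BFGS update, phi = 1 gives DFP; intermediate
    values interpolate between them. For every phi the result
    satisfies the secant equation B_{k+1} s = y.
    """
    Bs = B @ s
    sBs = s @ Bs
    ys = y @ s
    v = y / ys - Bs / sBs          # note v^T s = 0
    return (B - np.outer(Bs, Bs) / sBs
              + np.outer(y, y) / ys
              + phi * sBs * np.outer(v, v))
```

Because v is orthogonal to s, the φ-dependent term vanishes when applied to s, which is why the whole family shares the secant equation.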

A relationship between the BFGS and conjugate gradient algorithms and its implications for new algorithms

L Nazareth - SIAM Journal on Numerical Analysis, 1979 - SIAM
Based upon analysis and numerical experience, the BFGS (Broyden–Fletcher–Goldfarb–Shanno)
algorithm is currently considered to be one of the most effective algorithms for …

On the global convergence of the BFGS method for nonconvex unconstrained optimization problems

DH Li, M Fukushima - SIAM Journal on Optimization, 2001 - SIAM
This paper is concerned with the open problem of whether the BFGS method with inexact
line search converges globally when applied to nonconvex unconstrained optimization …