A new family of conjugate gradient methods

ZJ Shi, J Guo - Journal of Computational and Applied Mathematics, 2009 - Elsevier
In this paper we develop a new class of conjugate gradient methods for unconstrained optimization problems. A new nonmonotone line search technique is proposed to guarantee the global convergence of these conjugate gradient methods under some mild conditions. In particular, Polak–Ribière–Polyak and Liu–Storey conjugate gradient methods are special cases of the new class of conjugate gradient methods. By estimating the local Lipschitz constant of the derivative of objective functions, we can find an adequate step size and substantially decrease the function evaluations at each iteration. Numerical results show that these new conjugate gradient methods are effective in minimizing large-scale non-convex non-quadratic functions.
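To make the ingredients of the abstract concrete, the sketch below pairs the standard Polak–Ribière–Polyak (PRP) update with a simple max-of-recent-values nonmonotone Armijo backtracking rule. It is not the paper's algorithm: the function name prp_cg_nonmonotone, the backtracking parameters, and the Rosenbrock test problem are illustrative assumptions, and the paper's step-size rule based on an estimated local Lipschitz constant is not reproduced here.

```python
# Minimal sketch (NOT the authors' method): PRP conjugate gradient with a
# nonmonotone Armijo-type line search that compares against the maximum of
# the last few function values instead of f(x_k) alone.

import numpy as np


def prp_cg_nonmonotone(f, grad, x0, max_iter=500, memory=5, tol=1e-6,
                       c1=1e-4, backtrack=0.5):
    """PRP conjugate gradient with a max-of-recent-values nonmonotone Armijo test."""
    x = x0.astype(float)
    g = grad(x)
    d = -g                         # initial direction: steepest descent
    recent_f = [f(x)]              # memory of recent function values

    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break

        # Nonmonotone Armijo backtracking: occasional increases of f are allowed
        # because the reference value is the max over the last `memory` iterates.
        f_ref = max(recent_f)
        alpha = 1.0
        while f(x + alpha * d) > f_ref + c1 * alpha * g.dot(d):
            alpha *= backtrack
            if alpha < 1e-12:
                break

        x_new = x + alpha * d
        g_new = grad(x_new)

        # PRP coefficient; the Liu-Storey variant divides by -d.dot(g) instead.
        beta = g_new.dot(g_new - g) / g.dot(g)
        d = -g_new + beta * d

        # Restart with steepest descent if the new direction is not a descent direction.
        if g_new.dot(d) >= 0:
            d = -g_new

        x, g = x_new, g_new
        recent_f.append(f(x))
        if len(recent_f) > memory:
            recent_f.pop(0)

    return x


if __name__ == "__main__":
    # Illustrative non-quadratic test problem (Rosenbrock), not from the paper.
    f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
        200 * (x[1] - x[0] ** 2),
    ])
    print(prp_cg_nonmonotone(f, grad, np.array([-1.2, 1.0])))
```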
PDF: example.edu/paper.pdf