Two families of scaled three-term conjugate gradient methods with sufficient descent property for nonconvex optimization

S Bojari, MR Eslahchi - Numerical Algorithms, 2020 - Springer
Abstract
In this paper, we present two families of modified three-term conjugate gradient methods for solving unconstrained large-scale smooth optimization problems. We show that our new families satisfy the Dai-Liao conjugacy condition and the sufficient descent condition under any line search technique that guarantees the positiveness of $y_k^T s_k$. For uniformly convex functions, we show that our families are globally convergent under the weak Wolfe-Powell line search technique and standard conditions on the objective function. We also establish a weaker global convergence theorem for general smooth functions under similar assumptions. Our numerical experiments on 260 standard test problems, comparing against seven recently developed conjugate gradient methods, illustrate that the members of our families are numerically efficient and effective.
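The structure described above can be illustrated with a generic three-term conjugate gradient iteration. This is a hedged sketch, not the specific families proposed in the paper: the update is $d_{k+1} = -g_{k+1} + \beta_k s_k + \theta_k y_k$ with $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$, and the particular choices of $\beta_k$ (a Dai-Liao-style coefficient with $t=1$) and $\theta_k$ below are illustrative assumptions. With these choices one can check that $g_{k+1}^T d_{k+1} \le -\|g_{k+1}\|^2$ whenever $y_k^T s_k > 0$, mirroring the sufficient descent property the abstract refers to.

```python
import numpy as np

def weak_wolfe(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    """Bisection-style line search for the weak Wolfe conditions,
    which guarantee y_k^T s_k > 0 along a descent direction."""
    lo, hi = 0.0, np.inf
    fx, gxd = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gxd:   # Armijo condition fails
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * gxd:       # curvature condition fails
            lo = alpha
            alpha = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def three_term_cg(f, grad, x0, tol=1e-8, max_iter=500, t=1.0):
    """Generic three-term CG sketch: d = -g + beta*s + theta*y.
    beta is a Dai-Liao-style coefficient; theta is an assumed simple form."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = weak_wolfe(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        ys = y @ s                        # positive under the Wolfe conditions
        beta = g_new @ (y - t * s) / ys   # Dai-Liao-style coefficient (t = 1)
        theta = -(g_new @ s) / ys         # third-term coefficient (assumption)
        d = -g_new + beta * s + theta * y # gives g_new@d <= -||g_new||^2
        x, g = x_new, g_new
    return x

# Usage on a small strictly convex quadratic f(x) = 0.5 x'Ax - b'x:
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = three_term_cg(f, grad, np.zeros(2))
```

On this quadratic the minimizer solves $Ax = b$, so the returned `x_star` should satisfy the first-order condition to high accuracy; the quadratic is only a stand-in for the large-scale smooth problems the paper targets.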