Authors
George D Magoulas, Michael N Vrahatis, George S Androulakis
Publication date
1997/1/31
Journal
Neural Networks
Volume
10
Issue
1
Pages
69-82
Publisher
Pergamon
Description
The issue of variable stepsize in the backpropagation training algorithm has been widely investigated, and several techniques employing heuristic factors have been suggested to improve training time and reduce convergence to local minima. In this contribution, backpropagation training is based on a modified steepest descent method which allows a variable stepsize. It is computationally efficient and possesses interesting convergence properties, utilizing estimates of the Lipschitz constant without any additional computational cost. The algorithm has been implemented and tested on several problems, and the results have been very satisfactory. Numerical evidence shows that the method is robust, with good average performance on many classes of problems. Copyright © 1996 Elsevier Science Ltd.
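The abstract does not reproduce the update rule itself. The following is a minimal Python sketch of steepest descent with a stepsize driven by a local Lipschitz estimate, consistent with the description above: Λ_k is estimated from the two most recent iterates as ‖∇E(w_k) − ∇E(w_{k−1})‖ / ‖w_k − w_{k−1}‖, so it costs no extra gradient evaluations, and the stepsize is taken as η_k = 1/(2Λ_k). The exact stepsize formula, the function names (`train_variable_stepsize`, `grad_fn`), and the quadratic test problem are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def train_variable_stepsize(w, grad_fn, max_iters=1000, tol=1e-6, eta0=0.01):
    """Steepest descent with a variable stepsize (illustrative sketch).

    Assumed rule: eta_k = 1 / (2 * Lambda_k), where Lambda_k is a local
    Lipschitz estimate ||g_k - g_{k-1}|| / ||w_k - w_{k-1}|| computed from
    quantities already available, i.e. with no extra gradient evaluations.
    """
    g = grad_fn(w)
    w_prev, g_prev = w.copy(), g.copy()
    w = w - eta0 * g  # first step uses a fixed initial stepsize
    for _ in range(max_iters):
        g = grad_fn(w)
        if np.linalg.norm(g) < tol:
            break
        dw = np.linalg.norm(w - w_prev)
        dg = np.linalg.norm(g - g_prev)
        lam = dg / dw if dw > 0 else None          # local Lipschitz estimate
        eta = 1.0 / (2.0 * lam) if lam else eta0   # fall back if undefined
        w_prev, g_prev = w.copy(), g.copy()
        w = w - eta * g
    return w

# Hypothetical usage: minimize the quadratic E(w) = 0.5 * w.T @ A @ w,
# whose gradient is A @ w; the minimizer is the zero vector.
A = np.diag([1.0, 10.0])
w_min = train_variable_stepsize(np.array([1.0, 1.0]), lambda w: A @ w)
```

On a quadratic like this, Λ_k tracks the curvature along the recent trajectory, so the stepsize adapts automatically to the problem's conditioning rather than relying on hand-tuned heuristic factors.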
Total citations
[Citations-per-year bar chart, 1996–2024; per-year counts not recoverable from extraction]