An adaptive fractional-order BP neural network based on extremal optimization for handwritten digits recognition

MR Chen, BP Chen, GQ Zeng, KD Lu, P Chu - Neurocomputing, 2020 - Elsevier
Abstract
The optimal generation of initial connection weight parameters and the dynamic updating strategies of connection weights are critical to the performance of back-propagation (BP) neural networks. This paper presents an adaptive fractional-order BP neural network, abbreviated as PEO-FOBP, for handwritten digit recognition problems by combining a competitive evolutionary algorithm called population extremal optimization with a fractional-order gradient descent learning mechanism. Population extremal optimization is introduced to optimize a large number of initial connection weight parameters, and the fractional-order gradient descent learning mechanism is designed to update these connection weight parameters adaptively during the evolutionary process of the fractional-order BP neural network. Extensive experimental results on the well-known MNIST handwritten digits dataset demonstrate that the proposed PEO-FOBP outperforms both the original fractional-order BP neural network and the traditional integer-order BP neural network in terms of training and testing accuracy.
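The abstract does not give the paper's exact update rule, but a common formulation of fractional-order gradient descent (based on a Caputo-derivative approximation) scales the ordinary gradient by |w - w_ref|^(1-alpha) / Gamma(2-alpha), recovering standard integer-order descent at alpha = 1. The sketch below is an illustrative simplification on a toy quadratic, not the authors' implementation; the function names, step size, and the choice of the previous iterate as the reference point w_ref are all assumptions.

```python
import math

def frac_gd_step(w, grad, w_prev, alpha=0.9, lr=0.1, eps=1e-8):
    """One simplified fractional-order gradient descent update (sketch).

    The ordinary gradient is scaled by |w - w_prev|^(1 - alpha) / Gamma(2 - alpha),
    a common Caputo-derivative-based approximation. alpha = 1 gives the usual
    integer-order step. eps keeps the update from stalling when w == w_prev.
    """
    scale = abs(w - w_prev) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
    return w - lr * grad * max(scale, eps)

# Toy example: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w_prev, w = 0.0, 1.0
for _ in range(200):
    g = 2.0 * (w - 3.0)
    w_prev, w = w, frac_gd_step(w, g, w_prev)
print(w)  # converges toward the minimizer w = 3
```

At alpha close to 1 the behaviour matches ordinary gradient descent; smaller alpha damps the step size as the iterates converge, which is the adaptive effect the paper exploits alongside extremal optimization of the initial weights.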