Improved GWO and its application in parameter optimization of Elman neural network

W Liu, J Sun, G Liu, S Fu, M Liu, Y Zhu, Q Gao - PLOS ONE, 2023 - journals.plos.org
Traditional neural networks use gradient-descent methods to train the network structure, which cannot handle complex optimization problems. We propose an improved grey wolf optimizer (SGWO) to explore a better network structure. GWO is improved with circle-map population initialization, an information-interaction mechanism, and an adaptive position update, which together enhance the search performance of the algorithm. SGWO is then applied to optimize the Elman network structure, yielding a new prediction method (SGWO-Elman). The convergence of SGWO is analyzed by mathematical theory, and the optimization ability of SGWO and the prediction performance of SGWO-Elman are examined in comparative experiments. The results show: (1) the global convergence probability of SGWO is 1, and its search process is a finite homogeneous Markov chain with an absorbing state; (2) SGWO not only achieves better optimization performance on complex functions of different dimensions, but also, when applied to parameter optimization of the Elman network, significantly improves the network structure, and SGWO-Elman delivers accurate predictions.
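The circle-map chaotic initialization mentioned in the abstract can be sketched as follows. This is a minimal illustration only: the map parameters (a = 0.5, b = 0.2) and the seed value are common choices from the chaotic-optimization literature, not necessarily those used in the SGWO paper.

```python
import numpy as np

def circle_map_init(pop_size, dim, lb, ub, x0=0.7):
    """Generate an initial population via the chaotic circle map.

    The circle map x_{n+1} = (x_n + b - (a / 2*pi) * sin(2*pi * x_n)) mod 1
    produces a sequence in [0, 1) that covers the interval more evenly
    than uniform random sampling, which is the usual motivation for
    chaotic initialization in metaheuristics such as GWO.
    """
    a, b = 0.5, 0.2          # common circle-map parameters (assumption)
    seq = np.empty((pop_size, dim))
    x = x0
    for i in range(pop_size):
        for j in range(dim):
            x = (x + b - (a / (2.0 * np.pi)) * np.sin(2.0 * np.pi * x)) % 1.0
            seq[i, j] = x
    # Scale the chaotic values from [0, 1) onto the search bounds [lb, ub].
    return lb + seq * (ub - lb)

# Example: 30 wolves in a 10-dimensional search space on [-100, 100].
pop = circle_map_init(30, 10, -100.0, 100.0)
```

The resulting array can replace the uniform-random initial population in a standard GWO loop; the rest of the algorithm (alpha/beta/delta hierarchy and position updates) is unchanged.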