AdaSwarm: Augmenting gradient-based optimizers in deep learning with swarm intelligence

R Mohapatra, S Saha, CAC Coello… - … on Emerging Topics …, 2021 - ieeexplore.ieee.org
This paper introduces AdaSwarm, a novel gradient-free optimizer which has similar or even
better performance than the Adam optimizer adopted in neural networks. To support
our proposed AdaSwarm, a novel Exponentially weighted Momentum Particle Swarm
Optimizer (EMPSO) is proposed. The ability of AdaSwarm to tackle optimization problems is
attributed to its capability to perform good gradient approximations. We show that the
gradient of any function, differentiable or not, can be approximated by using the parameters …
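The abstract's central claim is that a particle swarm can make progress on functions with no usable gradient. As a minimal sketch of that idea, the snippet below runs plain (vanilla) PSO on a non-differentiable objective; this is not the paper's EMPSO update (which adds exponentially weighted momentum), and the hyperparameter names `w`, `c1`, `c2` are generic assumptions, not values from the paper:

```python
import random

def pso_minimize(f, dim=1, n_particles=20, iters=200,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Plain PSO on a possibly non-differentiable f.

    Illustrative sketch only -- not the paper's EMPSO variant.
    """
    lo, hi = bounds
    # Random initial positions, zero initial velocities.
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # per-particle best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + cognitive pull + social pull.
                # Note: only f-values are used; no derivative of f appears.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(0)  # reproducibility for this demo
# |x - 2| has no derivative at its minimum, yet PSO locates it.
best, best_val = pso_minimize(lambda x: abs(x[0] - 2.0))
```

AdaSwarm's contribution, per the abstract, is to go one step further: read an approximate gradient off the swarm's own parameters (the position-difference terms above) and feed it to an Adam-style update, rather than using the swarm's best position directly.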

[PDF][PDF] Adaswarm: Augmenting gradient-based optimizers in deep learning with swarm intelligence

SS Dhavala, S Saha - researchgate.net