Mathematical analysis and performance evaluation of the gelu activation function in deep learning

M Lee - Journal of Mathematics, 2023 - Wiley Online Library
Selecting the most suitable activation function is a critical factor in the effectiveness of deep
learning models, as it influences their learning capacity, stability, and computational …

Gelu activation function in deep learning: a comprehensive mathematical analysis and performance

M Lee - arXiv preprint arXiv:2305.12073, 2023 - arxiv.org
Selecting the most suitable activation function is a critical factor in the effectiveness of deep
learning models, as it influences their learning capacity, stability, and computational …
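The two entries above concern the GELU activation. As a point of reference for the function they analyze, here is a minimal sketch of GELU in plain Python: the exact form x·Φ(x) (with Φ the standard normal CDF, via `math.erf`) alongside the widely used tanh approximation. Function names and the comparison tolerance are illustrative, not taken from the papers.

```python
import math

def gelu(x: float) -> float:
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x: float) -> float:
    # Common tanh-based approximation of GELU
    return 0.5 * x * (1.0 + math.tanh(
        math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))
```

For moderate inputs the two forms agree to roughly three decimal places, which is why the cheaper tanh variant is common in practice.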

Natural-logarithm-rectified activation function in convolutional neural networks

Y Liu, J Zhang, C Gao, J Qu, L Ji - 2019 IEEE 5th International …, 2019 - ieeexplore.ieee.org
Activation functions play a key role in providing remarkable performance in deep neural
networks, and the rectified linear unit (ReLU) is one of the most widely used activation …
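The entry above proposes a natural-logarithm-rectified variant of ReLU. A minimal sketch of both, assuming the NLReLU form ln(β·max(0, x) + 1) with a scale parameter β (the parameter name and default are assumptions for illustration):

```python
import math

def relu(x: float) -> float:
    # Standard rectified linear unit
    return max(0.0, x)

def nlrelu(x: float, beta: float = 1.0) -> float:
    # Natural-logarithm-rectified unit: zero for x <= 0, and a
    # logarithmic compression of large positive activations otherwise
    return math.log(beta * max(0.0, x) + 1.0)
```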

HcLSH: a novel non-linear monotonic activation function for deep learning methods

H Abdel-Nabi, G Al-Naymat, MZ Ali, A Awajan - IEEE Access, 2023 - ieeexplore.ieee.org
Activation functions are essential components in any neural network model; they play a
crucial role in determining the network's expressive power through their introduced non …

Activation functions in artificial neural networks: A systematic overview

J Lederer - arXiv preprint arXiv:2101.09957, 2021 - arxiv.org
Activation functions shape the outputs of artificial neurons and, therefore, are integral parts
of neural networks in general and deep learning in particular. Some activation functions …

PolyLU: A simple and robust polynomial-based linear unit activation function for deep learning

HS Feng, CH Yang - IEEE Access, 2023 - ieeexplore.ieee.org
The activation function has a critical influence on whether a convolutional neural network in
deep learning can converge or not; a proper activation function not only makes the …

Revise saturated activation functions

B Xu, R Huang, M Li - arXiv preprint arXiv:1602.05980, 2016 - arxiv.org
In this paper, we revise two commonly used saturated functions, the logistic sigmoid and the
hyperbolic tangent (tanh). We point out that, besides the well-known non-zero centered …
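The saturation this entry revisits is easy to see numerically: the derivatives of the logistic sigmoid and tanh vanish for large |x|, which is the root of the vanishing-gradient issue the snippet alludes to. A small sketch (helper names are illustrative):

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def d_sigmoid(x: float) -> float:
    # Derivative s(x)(1 - s(x)); peaks at 0.25 when x = 0
    s = sigmoid(x)
    return s * (1.0 - s)

def d_tanh(x: float) -> float:
    # Derivative 1 - tanh(x)^2; peaks at 1.0 when x = 0
    return 1.0 - math.tanh(x) ** 2
```

Both derivatives decay exponentially away from the origin, so deeply stacked sigmoid or tanh layers pass back vanishingly small gradients; sigmoid's non-zero-centered output is the second issue the paper points out.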

Review and comparison of commonly used activation functions for deep neural networks

T Szandała - Bio-inspired neurocomputing, 2021 - Springer
The primary decision-making units of neural networks are activation functions. Moreover, they
evaluate the output of a network's neural nodes; thus, they are essential for the performance of …

SMU: smooth activation function for deep networks using smoothing maximum technique

K Biswas, S Kumar, S Banerjee, AK Pandey - arXiv preprint arXiv …, 2021 - arxiv.org
Deep learning researchers have a keen interest in proposing novel activation
functions which can boost network performance. A good choice of activation function can …
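The smoothing-maximum idea named in the title above replaces the non-smooth max(x₁, x₂) with an erf-based approximation and then applies it to a Leaky-ReLU-style pair (x, αx). A hedged sketch of that construction; the parameter names α and μ and their defaults are assumptions for illustration, not the paper's trained values:

```python
import math

def smooth_max(x1: float, x2: float, mu: float = 1e6) -> float:
    # Smooth approximation of max(x1, x2) via the error function;
    # as mu grows, this approaches the exact maximum
    d = x1 - x2
    return 0.5 * ((x1 + x2) + d * math.erf(mu * d))

def smu(x: float, alpha: float = 0.25, mu: float = 2.5) -> float:
    # Smoothed Leaky-ReLU-style unit: smooth_max(x, alpha * x)
    return smooth_max(x, alpha * x, mu)
```

With a large μ the unit behaves like Leaky ReLU (identity for positive inputs, a small linear slope for negative ones), while remaining differentiable everywhere.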

RSigELU: A nonlinear activation function for deep neural networks

S Kiliçarslan, M Celik - Expert Systems with Applications, 2021 - Elsevier
In deep learning models, the inputs to the network are processed using activation functions
to generate the output corresponding to these inputs. Deep learning models are of particular …