Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning

HH Chieng, N Wahid, P Ong, SRK Perla - arXiv preprint arXiv:1812.06247, 2018 - arxiv.org
Activation functions are essential for deep learning methods to learn and perform complex
tasks such as image classification. Rectified Linear Unit (ReLU) has been widely used and …
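
The Flatten-T Swish named in the title combines a Swish-like positive branch with a constant threshold on the negative side. A minimal NumPy sketch, assuming the commonly cited form with threshold T (reported default around T = -0.20); treat the exact parameterization as an assumption here:

import numpy as np

def flatten_t_swish(x, T=-0.20):
    # Swish-like term x * sigmoid(x) shifted by T for non-negative inputs;
    # negative inputs are flattened to the constant threshold T.
    sig = 1.0 / (1.0 + np.exp(-x))
    return np.where(x >= 0, x * sig + T, T)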

Deep learning with s-shaped rectified linear activation units

X Jin, C Xu, J Feng, Y Wei, J Xiong, S Yan - Proceedings of the AAAI …, 2016 - ojs.aaai.org
Rectified linear activation units are important components for state-of-the-art deep
convolutional networks. In this paper, we propose a novel S-shaped rectified linear activation …
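
As usually described, SReLU is piecewise linear with learnable thresholds and slopes on both sides of an identity segment; a sketch under that assumption:

import numpy as np

def srelu(x, t_l=-1.0, a_l=0.1, t_r=1.0, a_r=2.0):
    # Identity between the two thresholds t_l and t_r, linear with slopes
    # a_l / a_r beyond them; in the paper all four parameters are learned
    # per channel, so the defaults here are placeholders for illustration.
    return np.where(x >= t_r, t_r + a_r * (x - t_r),
                    np.where(x <= t_l, t_l + a_l * (x - t_l), x))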

Natural-logarithm-rectified activation function in convolutional neural networks

Y Liu, J Zhang, C Gao, J Qu, L Ji - 2019 IEEE 5th International …, 2019 - ieeexplore.ieee.org
Activation functions play a key role in providing remarkable performance in deep neural
networks, and the rectified linear unit (ReLU) is one of the most widely used activation …
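
The natural-logarithm-rectified unit referred to in the title compresses the positive part of ReLU with a logarithm. A minimal sketch, assuming the commonly cited form f(x) = ln(beta * max(0, x) + 1):

import numpy as np

def nl_relu(x, beta=1.0):
    # Log-compressed ReLU response; beta scales the input before the logarithm.
    # Treat this parameterization as an assumption, not the paper's exact definition.
    return np.log(beta * np.maximum(0.0, x) + 1.0)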

Nipuna: A novel optimizer activation function for deep neural networks

G Madhu, S Kautish, KA Alnowibet, HM Zawbaa… - Axioms, 2023 - mdpi.com
In recent years, various deep neural networks with different learning paradigms have been
widely employed in various applications, including medical diagnosis, image analysis, self …

Is it time to swish? Comparing deep learning activation functions across NLP tasks

S Eger, P Youssef, I Gurevych - arXiv preprint arXiv:1901.02671, 2019 - arxiv.org
Activation functions play a crucial role in neural networks because they are the
nonlinearities which have been attributed to the success story of deep learning. One of the …
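
For reference, Swish gates the input with its own sigmoid, Swish(x) = x * sigmoid(beta * x) (beta = 1 gives SiLU), whereas ReLU is max(0, x); a minimal NumPy sketch:

import numpy as np

def relu(x):
    # Standard rectified linear unit.
    return np.maximum(0.0, x)

def swish(x, beta=1.0):
    # Smooth and non-monotonic near zero; beta = 1 is the common default.
    return x / (1.0 + np.exp(-beta * x))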

Gish: a novel activation function for image classification

M Kaytan, IB Aydilek, C Yeroğlu - Neural Computing and Applications, 2023 - Springer
In Convolutional Neural Networks (CNNs), the selection and use of appropriate
activation functions is of critical importance. It has been seen that the Rectified Linear Unit …

HcLSH: a novel non-linear monotonic activation function for deep learning methods

H Abdel-Nabi, G Al-Naymat, MZ Ali, A Awajan - IEEE Access, 2023 - ieeexplore.ieee.org
Activation functions are essential components in any neural network model; they play a
crucial role in determining the network's expressive power through their introduced non …

PFLU and FPFLU: Two novel non-monotonic activation functions in convolutional neural networks

M Zhu, W Min, Q Wang, S Zou, X Chen - Neurocomputing, 2021 - Elsevier
The choice of activation functions in Convolutional Neural Networks (CNNs) is very
important. Rectified Linear Unit (ReLU) has been widely used in most CNNs. Recently, a …

Learning specialized activation functions with the piecewise linear unit

Y Zhou, Z Zhu, Z Zhong - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com
The choice of activation functions is crucial for modern deep neural networks. Popular hand-
designed activation functions like Rectified Linear Unit (ReLU) and its variants show …
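
A learnable piecewise linear unit can be expressed as a base slope plus a sum of ReLU "kinks" at learnable breakpoints. The sketch below is a simplified, hypothetical parameterization for illustration, not necessarily the paper's exact formulation:

import numpy as np

def piecewise_linear_unit(x, breakpoints, slopes):
    # Slope is slopes[0] left of breakpoints[0], slopes[k + 1] between
    # breakpoints[k] and breakpoints[k + 1], and slopes[-1] to the right.
    # Requires len(slopes) == len(breakpoints) + 1; in a learned variant both
    # breakpoints and slopes would be trained jointly with the network.
    y = slopes[0] * x
    for k, b in enumerate(breakpoints):
        y = y + (slopes[k + 1] - slopes[k]) * np.maximum(0.0, x - b)
    return y

# Example: ReLU is recovered with one breakpoint at 0 and slopes (0, 1):
# piecewise_linear_unit(x, breakpoints=[0.0], slopes=[0.0, 1.0])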

An empirical study on generalizations of the ReLU activation function

C Banerjee, T Mukherjee, E Pasiliao Jr - Proceedings of the 2019 ACM …, 2019 - dl.acm.org
Deep Neural Networks have become the tool of choice for Machine Learning practitioners
today. They have been successfully applied for solving a large class of learning problems …
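
Two widely used generalizations of ReLU that such empirical studies typically cover are Leaky ReLU and ELU; minimal sketches of their standard forms:

import numpy as np

def leaky_relu(x, alpha=0.01):
    # Keeps a small slope alpha for negative inputs instead of zeroing them.
    return np.where(x >= 0, x, alpha * x)

def elu(x, alpha=1.0):
    # Saturates smoothly to -alpha for large negative inputs; np.minimum
    # avoids overflow in the exp of the branch that is not selected.
    return np.where(x >= 0, x, alpha * (np.exp(np.minimum(x, 0.0)) - 1.0))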