PFLU and FPFLU: Two novel non-monotonic activation functions in convolutional neural networks

M Zhu, W Min, Q Wang, S Zou, X Chen - Neurocomputing, 2021 - Elsevier
The choice of activation functions in Convolutional Neural Networks (CNNs) is very
important. Rectified Linear Unit (ReLU) has been widely used in most CNNs. Recently, a …
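
A minimal NumPy sketch of the PFLU shape, assuming the commonly quoted form PFLU(x) = x·(1 + x/sqrt(1 + x^2))/2; this exact formula, and the FPFLU variant, should be verified against the paper:

import numpy as np

def pflu(x):
    # assumed form: smooth and non-monotonic, near 0 for large negative x, near x for large positive x
    return x * (1.0 + x / np.sqrt(1.0 + x * x)) / 2.0

print(pflu(np.linspace(-4.0, 4.0, 9)))  # small negative dip for x < 0, near-identity for x > 0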

Natural-logarithm-rectified activation function in convolutional neural networks

Y Liu, J Zhang, C Gao, J Qu, L Ji - 2019 IEEE 5th International …, 2019 - ieeexplore.ieee.org
Activation functions play a key role in providing remarkable performance in deep neural
networks, and the rectified linear unit (ReLU) is one of the most widely used activation …
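
A short NumPy sketch, assuming the natural-logarithm-rectified unit takes the widely cited form NLReLU(x) = ln(beta·max(0, x) + 1) with a scale parameter beta; treat this as an assumption to check against the paper:

import numpy as np

def nlrelu(x, beta=1.0):
    # log-compress the positive part; zero for negative inputs, as in ReLU
    return np.log(beta * np.maximum(0.0, x) + 1.0)

print(nlrelu(np.array([-2.0, 0.0, 1.0, 10.0])))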

Improving convolutional neural network using pseudo derivative ReLU

Z Hu, Y Li, Z Yang - 2018 5th international conference on …, 2018 - ieeexplore.ieee.org
Rectified linear unit (ReLU) is a widely used activation function in artificial neural networks; it
is considered an efficient activation function, benefiting from its simplicity and nonlinearity …
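
A sketch of the general idea in plain NumPy: keep the ordinary ReLU in the forward pass but substitute a nonzero pseudo-derivative in the backward pass, so units with negative pre-activations still receive gradient. The sigmoid-shaped surrogate below is an illustrative stand-in, not necessarily the pseudo-derivative used in the paper:

import numpy as np

def relu_forward(x):
    # forward pass is the ordinary ReLU
    return np.maximum(0.0, x)

def relu_pseudo_backward(x, grad_out):
    # the true ReLU derivative is 0 for x < 0, so those units get no gradient;
    # a pseudo-derivative replaces it with a nonzero surrogate (sigmoid here is illustrative)
    pseudo_grad = 1.0 / (1.0 + np.exp(-x))
    return grad_out * pseudo_grad

x = np.array([-2.0, -0.5, 0.5, 2.0])
print(relu_forward(x), relu_pseudo_backward(x, np.ones_like(x)))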

Parametric rectified nonlinear unit (PRenu) for convolution neural networks

I El Jaafari, A Ellahyani, S Charfi - Signal, Image and Video Processing, 2021 - Springer
The activation function unit is an extremely important part of convolutional neural networks; it is the
nonlinear transformation applied to the input data. Using hidden layers incorporating …

RELU-function and derived function review

Y Bai - SHS Web of Conferences, 2022 - shs-conferences.org
The activation function plays an important role in training and improving performance in
deep neural networks (DNNs). The rectified linear unit (ReLU) function provides the necessary …
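
For reference, the ReLU discussed throughout these papers and its (sub)derivative, as a minimal NumPy sketch:

import numpy as np

def relu(x):
    return np.maximum(0.0, x)            # max(0, x)

def relu_grad(x):
    return (x > 0).astype(x.dtype)       # 1 for x > 0, 0 otherwise (the "dead" region)

x = np.array([-1.0, 0.0, 2.0])
print(relu(x), relu_grad(x))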

FReLU: Flexible rectified linear units for improving convolutional neural networks

S Qiu, X Xu, B Cai - 2018 24th international conference on …, 2018 - ieeexplore.ieee.org
Rectified linear unit (ReLU) is a widely used activation function for deep convolutional
neural networks. However, because of the zero-hard rectification, ReLU networks lose the …
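
A minimal sketch, assuming FReLU here is ReLU shifted by a learnable bias, FReLU(x) = ReLU(x) + b, typically with one b per channel; the exact parameterization is an assumption to check against the paper:

import numpy as np

def frelu(x, b=-0.3):
    # b stands in for a learned per-channel bias that lets the flat region sit below zero
    return np.maximum(0.0, x) + b

print(frelu(np.array([-1.0, 0.0, 2.0])))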

Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning

HH Chieng, N Wahid, P Ong, SRK Perla - arXiv preprint arXiv:1812.06247, 2018 - arxiv.org
Activation functions are essential for deep learning methods to learn and perform complex
tasks such as image classification. Rectified Linear Unit (ReLU) has been widely used and …
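
A short NumPy sketch, assuming the commonly quoted Flatten-T Swish definition FTS(x) = x·sigmoid(x) + T for x >= 0 and the constant T otherwise; the threshold value used below is an assumption:

import numpy as np

def fts(x, T=-0.20):
    # Swish-like positive branch (x * sigmoid(x)) shifted by T; flat constant T on the negative side
    return np.where(x >= 0.0, x / (1.0 + np.exp(-x)) + T, T)

print(fts(np.array([-3.0, 0.0, 1.0, 4.0])))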

PolyLU: A simple and robust polynomial-based linear unit activation function for deep learning

HS Feng, CH Yang - IEEE Access, 2023 - ieeexplore.ieee.org
The activation function has a critical influence on whether a convolutional neural network in
deep learning can converge or not; a proper activation function not only makes the …
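
A sketch under the assumption that PolyLU keeps the identity for x >= 0 and uses the bounded rational branch 1/(1 - x) - 1 for x < 0; the negative-branch formula is an assumption and should be checked against the paper:

import numpy as np

def polylu(x):
    # identity for x >= 0; smooth negative branch approaching -1 as x -> -infinity
    return np.where(x >= 0.0, x, 1.0 / (1.0 - x) - 1.0)

print(polylu(np.array([-10.0, -1.0, 0.0, 2.0])))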

Rectified exponential units for convolutional neural networks

Y Ying, J Su, P Shan, L Miao, X Wang, S Peng - IEEE Access, 2019 - ieeexplore.ieee.org
Rectified linear unit (ReLU) plays an important role in today's convolutional neural networks
(CNNs). In this paper, we propose a novel activation function called Rectified Exponential …
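
A minimal sketch, assuming the Rectified Exponential Unit is the identity for positive inputs and x·e^x for non-positive inputs; treat this exact form as an assumption to verify against the paper:

import numpy as np

def reu(x):
    # identity for x > 0; exponentially damped response x * e^x for x <= 0
    return np.where(x > 0.0, x, x * np.exp(x))

print(reu(np.array([-5.0, -1.0, 0.0, 3.0])))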

Deep learning with s-shaped rectified linear activation units

X Jin, C Xu, J Feng, Y Wei, J Xiong, S Yan - Proceedings of the AAAI …, 2016 - ojs.aaai.org
Rectified linear activation units are important components for state-of-the-art deep
convolutional networks. In this paper, we propose a novel S-shaped rectified linear activation …
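
A sketch of the S-shaped rectified linear unit (SReLU) as described in the AAAI 2016 paper: three linear pieces joined at two learnable thresholds t_l and t_r, with learnable slopes a_l and a_r outside them and the identity in between. The parameter values below are illustrative only:

import numpy as np

def srelu(x, t_l=-1.0, a_l=0.1, t_r=1.0, a_r=0.5):
    # identity between the two learnable thresholds; learnable linear pieces outside them
    return np.where(x >= t_r, t_r + a_r * (x - t_r),
                    np.where(x <= t_l, t_l + a_l * (x - t_l), x))

print(srelu(np.array([-3.0, -0.5, 0.5, 3.0])))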