LR Sütfeld, F Brieger, H Finger, S Füllhase… - arXiv preprint arXiv …, 2018 - arxiv.org
The most widely used activation functions in current deep feed-forward neural networks are rectified linear units (ReLUs), and many alternatives have been applied successfully as well …
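For reference, a minimal sketch of the rectified linear unit mentioned in the snippet, alongside one commonly used alternative (the ELU, chosen here purely as an illustrative example and not drawn from the paper itself), using NumPy:

```python
import numpy as np

def relu(x):
    # Rectified linear unit: passes positive inputs unchanged, zeroes out negatives.
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # Exponential linear unit, one of several alternatives to ReLU (illustrative choice).
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))  # [0.  0.  0.  0.5 2. ]
print(elu(x))   # approximately [-0.865 -0.393  0.  0.5  2. ]
```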