In the neural network literature, there is strong interest in identifying and defining activation functions that can improve neural network performance. In recent years there has been a …
Inspired by biological neurons, activation functions play an essential part in the learning process of any artificial neural network (ANN) commonly used in many real-world problems …
Neural networks are generally built by interleaving (adaptable) linear layers with (fixed) nonlinear activation functions. To increase their flexibility, several authors have proposed …
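A minimal sketch of the interleaving described in the snippet above, assuming PyTorch (the library and the layer sizes are illustrative, not taken from the cited work): with fixed ReLU activations only the linear layers adapt, while swapping in PReLU makes the nonlinearity itself learnable.

    import torch
    import torch.nn as nn

    # Fixed activations: only the linear layers adapt during training.
    fixed_mlp = nn.Sequential(
        nn.Linear(16, 32), nn.ReLU(),
        nn.Linear(32, 32), nn.ReLU(),
        nn.Linear(32, 1),
    )

    # Learnable activations: PReLU adds one trainable slope per hidden unit,
    # so the nonlinearity is adapted along with the weights.
    learnable_mlp = nn.Sequential(
        nn.Linear(16, 32), nn.PReLU(num_parameters=32),
        nn.Linear(32, 32), nn.PReLU(num_parameters=32),
        nn.Linear(32, 1),
    )

    x = torch.randn(8, 16)
    print(fixed_mlp(x).shape, learnable_mlp(x).shape)  # torch.Size([8, 1]) twice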
Neural networks have proven to be a highly effective tool for solving complex problems in many areas of life. Recently, their importance and practical usability have further been …
For neural networks we propose stochastic, non-parametric activation functions that are fully learnable and individual to each neuron. Overfitting is prevented by placing a Gaussian …
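The snippet above is cut off before the prior is fully specified, so the following is only an illustrative sketch and not the authors' construction: it gives each neuron its own activation parameters and realizes a zero-mean Gaussian prior on them as an L2 penalty added to the training loss. The functional form a_i * tanh(b_i * x_i) and the penalty weight are assumptions.

    import torch
    import torch.nn as nn

    class PerNeuronActivation(nn.Module):
        # Hypothetical form: y_i = a_i * tanh(b_i * x_i), with a_i, b_i learned per neuron.
        def __init__(self, num_neurons):
            super().__init__()
            self.a = nn.Parameter(torch.ones(num_neurons))
            self.b = nn.Parameter(torch.ones(num_neurons))

        def forward(self, x):
            return self.a * torch.tanh(self.b * x)

        def prior_penalty(self):
            # Negative log of a zero-mean Gaussian prior, up to constants (an L2 term).
            return (self.a ** 2).sum() + (self.b ** 2).sum()

    layer, act = nn.Linear(16, 32), PerNeuronActivation(32)
    x = torch.randn(8, 16)
    out = act(layer(x))
    loss = out.pow(2).mean() + 1e-3 * act.prior_penalty()  # dummy task loss + prior term
    loss.backward()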
CJ Vercellino, WY Wang - … on Meta-learning. Long Beach, USA, 2017 - meta-learn.github.io
Typically, when designing neural network architectures, a fixed activation function is chosen to introduce nonlinearity between layers. Various architecture-agnostic activation functions …
We propose stochastic, non-parametric activation functions that are fully learnable and individual to each neuron. Complexity and the risk of overfitting are controlled by placing a …
The design of activation functions is a growing research area in the field of neural networks. In particular, instead of using fixed point-wise functions (e.g., the rectified linear unit), several …
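One common way to move beyond fixed point-wise functions, sketched here with assumed hyperparameters rather than the cited paper's exact formulation: model each neuron's activation as a learnable mixture of Gaussian kernels placed on a fixed grid, so the shape of the nonlinearity is estimated non-parametrically during training.

    import torch
    import torch.nn as nn

    class KernelActivation(nn.Module):
        def __init__(self, num_neurons, dict_size=20, limit=3.0, gamma=1.0):
            super().__init__()
            # Fixed dictionary of kernel centres on the input axis, shared by all neurons.
            self.register_buffer("centres", torch.linspace(-limit, limit, dict_size))
            self.gamma = gamma
            # Per-neuron mixing coefficients: each neuron learns its own activation shape.
            self.alpha = nn.Parameter(torch.randn(num_neurons, dict_size) * 0.1)

        def forward(self, x):                      # x: (batch, num_neurons)
            d = x.unsqueeze(-1) - self.centres     # (batch, num_neurons, dict_size)
            k = torch.exp(-self.gamma * d ** 2)    # Gaussian kernel evaluations
            return (k * self.alpha).sum(dim=-1)    # mix the kernels per neuron

    net = nn.Sequential(nn.Linear(16, 32), KernelActivation(32), nn.Linear(32, 1))
    print(net(torch.randn(8, 16)).shape)           # torch.Size([8, 1])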
Deep learning, the study of multi-layered artificial neural networks, has received tremendous attention over the course of the last few years. Neural networks are now able to outperform …