A continuum among logarithmic, linear, and exponential functions, and its potential to improve generalization in neural networks

LB Godfrey, MS Gashler - 2015 7th International Joint Conference on Knowledge Discovery …, 2015 - ieeexplore.ieee.org
We present the soft exponential activation function for artificial neural networks that continuously interpolates between logarithmic, linear, and exponential functions. This activation function is simple, differentiable, and parameterized so that it can be trained as the rest of the network is trained. We hypothesize that soft exponential has the potential to improve neural network learning, as it can exactly calculate many natural operations that typical neural networks can only approximate, including addition, multiplication, inner product, distance, and sinusoids.
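For reference, the continuum the abstract describes can be sketched in a few lines of NumPy. The piecewise form below follows the paper's published definition of soft exponential; the helper name soft_exponential and the scalar, fixed alpha are illustrative only (in the paper, alpha is a trainable parameter learned along with the network weights):

```python
import numpy as np

def soft_exponential(x, alpha):
    """Soft exponential activation (Godfrey & Gashler, 2015).

    alpha < 0  -> logarithmic regime: -ln(1 - alpha*(x + alpha)) / alpha
    alpha == 0 -> identity:           x
    alpha > 0  -> exponential regime: (exp(alpha*x) - 1) / alpha + alpha
    """
    if alpha < 0.0:
        return -np.log(1.0 - alpha * (x + alpha)) / alpha
    if alpha == 0.0:
        return x
    return (np.exp(alpha * x) - 1.0) / alpha + alpha

# Sanity checks at the endpoints of the continuum:
x = np.linspace(0.5, 2.0, 4)
assert np.allclose(soft_exponential(x, 1.0), np.exp(x))   # alpha = 1 gives e^x
assert np.allclose(soft_exponential(x, -1.0), np.log(x))  # alpha = -1 gives ln(x)
assert np.allclose(soft_exponential(x, 0.0), x)           # alpha = 0 gives identity
```

Because the function is continuous and differentiable in both x and alpha, alpha can be updated by gradient descent alongside the weights, which is what allows each unit to learn where on the logarithmic-linear-exponential continuum it should operate.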