Sigmoid and Beyond: Algebraic Activation Functions for Artificial Neural Networks Based on Solutions of a Riccati Equation

NE Protonotarios, AS Fokas, GA Kastis… - IT Professional, 2022 - ieeexplore.ieee.org
Activation functions play a key role in neural networks, as they significantly affect the training process and the network’s performance. Based on the solution of a certain ordinary differential equation of the Riccati type, this work proposes an alternative generalized adaptive solution to the fixed sigmoid, which is called “generalized Riccati activation” (GRA). The proposed GRA function was employed on the output layer of an artificial neural network with a single hidden layer that consisted of eight neurons. The performance of the neural network was evaluated on a binary and a multiclass classification problem using different combinations of activation functions in the input/output layers. The results demonstrated that the swish/GRA combination yields higher accuracy than any other combination of activation functions. This benefit in terms of accuracy could be critical for certain domains, such as healthcare and smart grids, where AI-assisted decisions are becoming essential.
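The abstract does not give the GRA formula itself, only that it generalizes the sigmoid via a Riccati-type ODE. As background: the logistic sigmoid solves the Riccati-type equation y' = y(1 − y), and a constant-coefficient Riccati equation y' = q₀ + q₁y + q₂y² with two distinct real equilibria r₁ < r₂ has sigmoid-shaped solutions running between those equilibria. The sketch below illustrates only this general idea; the parameters r1, r2, and k are hypothetical stand-ins for the paper's adaptive parameters, not its actual definition.

```python
import numpy as np

def sigmoid(x):
    # The fixed logistic sigmoid; it solves the Riccati-type ODE y' = y(1 - y).
    return 1.0 / (1.0 + np.exp(-x))

def riccati_activation(x, r1=0.0, r2=1.0, k=1.0):
    """Illustrative generalized-sigmoid activation (NOT the paper's GRA).

    A shifted/scaled sigmoid running from r1 (as x -> -inf) to r2
    (as x -> +inf). One can check it satisfies the Riccati-type ODE
    y' = k * (y - r1) * (r2 - y) / (r2 - r1), i.e. a quadratic
    right-hand side in y, with r1, r2 as the two equilibria.
    """
    return r1 + (r2 - r1) * sigmoid(k * x)

# The fixed sigmoid is recovered as the special case r1=0, r2=1, k=1:
assert np.isclose(riccati_activation(0.0), sigmoid(0.0))
```

Framing the activation this way makes the sigmoid's saturation levels and slope explicit trainable quantities rather than fixed constants, which is the kind of adaptivity the abstract attributes to GRA.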