Learning Specialized Activation Functions with the Piecewise Linear Unit

Y Zhou, Z Zhu, Z Zhong - Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2021 - openaccess.thecvf.com (also arXiv:2104.03693)
The choice of activation functions is crucial for modern deep neural networks. Popular hand-designed activation functions like Rectified Linear Unit (ReLU) and its variants show …
