B Sattelberg, R Cavalieri, M Kirby… - Frontiers in Artificial …, 2023 - frontiersin.org
A ReLU neural network functions as a continuous piecewise linear map from an input space to an output space. The weights in the neural network determine a partitioning of the input …
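To make the piecewise linear structure concrete: once a point's hidden activation pattern is fixed, the network agrees with an explicit affine map on the whole region sharing that pattern. A minimal sketch, assuming a one-hidden-layer network with arbitrary illustrative weights (none of the names or sizes below come from the paper):

```python
import numpy as np

# Illustrative weights for a tiny 2 -> 4 -> 1 ReLU network (assumed, not from the paper).
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((4, 2)), rng.standard_normal(4)   # hidden layer
W2, b2 = rng.standard_normal((1, 4)), rng.standard_normal(1)   # output layer

def forward(x):
    """Evaluate the network; the sign pattern of the pre-activations labels the region."""
    z = W1 @ x + b1
    pattern = z > 0                        # which ReLUs are "on"
    return W2 @ (z * pattern) + b2, pattern

def local_affine_map(pattern):
    """On a fixed activation region, the network equals the affine map x -> A x + c."""
    D = np.diag(pattern.astype(float))     # zero out inactive hidden units
    return W2 @ D @ W1, W2 @ D @ b1 + b2

x = np.array([0.3, -0.7])
y, pattern = forward(x)
A, c = local_affine_map(pattern)
assert np.allclose(y, A @ x + c)           # network output matches its local affine map
```

The assertion holds for every input in the same region: the weights and the activation pattern together determine the affine piece, which is the partitioning the snippet alludes to.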
This paper investigates the interplay of the multiple geometries present within a ReLU-based neural network. A ReLU neural network determines a continuous, piecewise affine linear …
Current theoretical and empirical research on neural networks suggests that complex datasets require large network architectures for accurate classification, yet the precise …
Every ReLU Neural Network (NN) tessellates its input space into activation regions. Studying this tessellation provides insights into some of the architecture's properties. Recent …
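A hedged sketch of how one could probe this tessellation empirically: sample a grid in a 2D input space and count distinct hidden sign patterns, which lower-bounds the number of activation regions the grid visits. The one-layer architecture and weights are assumptions for illustration, not the cited paper's setup:

```python
import numpy as np

# Assumed toy setup: one hidden layer of 8 ReLUs over a 2D input space.
rng = np.random.default_rng(1)
W, b = rng.standard_normal((8, 2)), rng.standard_normal(8)

xs = np.linspace(-2.0, 2.0, 200)
grid = np.array([[x, y] for x in xs for y in xs])   # 200 x 200 sample points
patterns = (grid @ W.T + b) > 0                     # one boolean sign pattern per point
n_regions = len({tuple(p) for p in patterns})       # distinct patterns = regions visited
print(f"activation regions visited on the grid: {n_regions}")
```

For a single hidden layer of n neurons in d dimensions, the classical hyperplane-arrangement bound caps the count at the sum of C(n, k) over k ≤ d (here 37), so the grid estimate can be sanity-checked against it.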
Neural networks can be thought of as applying a transformation to an input dataset. The way in which they change the topology of such a dataset often holds practical significance for …
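A concrete, hedged illustration of such a topology change (our toy example, not the paper's experiment): a circle lying entirely in the half-plane x < 0 is flattened by componentwise ReLU onto a segment of the y-axis, so its loop, a 1-dimensional hole, is destroyed:

```python
import numpy as np

# Circle of radius 1 centred at (-2, 0); every point has x <= -1 < 0.
theta = np.linspace(0.0, 2.0 * np.pi, 1000, endpoint=False)
circle = np.stack([-2.0 + np.cos(theta), np.sin(theta)], axis=1)

image = np.maximum(circle, 0.0)               # componentwise ReLU

print(np.unique(image[:, 0]))                 # [0.]  -- the x coordinate collapses
print(image[:, 1].min(), image[:, 1].max())   # 0.0 1.0 -- a segment on the y-axis
```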
K Beshkov - arXiv preprint arXiv:2502.01360, 2025 - arxiv.org
Previous research has proven that the set of maps implemented by neural networks with a ReLU activation function is identical to the set of piecewise linear continuous maps …
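The snippet cuts off before the argument, but the standard ingredients are easy to state; the sketch below follows the usual route (max-min representations plus a ReLU-expressible max), not necessarily this paper's proof:

```latex
% One direction is immediate: compositions of affine maps with the piecewise
% linear ReLU are continuous and piecewise linear. For the converse, max
% (hence min) is ReLU-expressible,
\[
  \max(x, y) \;=\; y + \mathrm{ReLU}(x - y),
  \qquad \mathrm{ReLU}(t) = \max(t, 0),
\]
% and every continuous piecewise linear $f \colon \mathbb{R}^n \to \mathbb{R}$
% admits a max--min representation over its affine pieces $\ell_1,\dots,\ell_k$,
\[
  f(x) \;=\; \max_{i \in I} \, \min_{j \in S_i} \, \ell_j(x),
\]
% which a ReLU network of depth growing only with the input dimension can
% then implement by nesting the identity above.
```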
Tame functions are a class of nonsmooth, nonconvex functions that feature in a wide range of applications: functions encountered in the training of deep neural networks with all …
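For readers meeting the term here, a common formal definition (our gloss of the standard o-minimal one, not quoted from the truncated snippet): a function is tame when its graph is definable in an o-minimal structure on the reals:

```latex
% Standard definition (assumed to match the paper's usage).
\[
  f \colon \mathbb{R}^n \to \mathbb{R} \ \text{is tame}
  \iff
  \operatorname{graph}(f) = \{\, (x, f(x)) : x \in \mathbb{R}^n \,\}
  \ \text{is definable in an o-minimal structure on } \mathbb{R}.
\]
% ReLU networks are piecewise linear, hence semialgebraic, hence definable,
% which is why losses arising in deep-network training are typically tame.
```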
The geometric perspective on deep ReLU networks is important for understanding their learning behavior and generalization capability. As such, here we investigate …