Topological Data Analysis for Neural Network Analysis: A Comprehensive Survey

R Ballester, C Casacuberta, S Escalera - arXiv preprint arXiv:2312.05840, 2023 - arxiv.org
This survey provides a comprehensive exploration of applications of Topological Data
Analysis (TDA) within neural network analysis. Using TDA tools such as persistent homology …

Locally linear attributes of ReLU neural networks

B Sattelberg, R Cavalieri, M Kirby… - Frontiers in Artificial …, 2023 - frontiersin.org
A ReLU neural network functions as a continuous piecewise linear map from an input space
to an output space. The weights in the neural network determine a partitioning of the input …

Integrating geometries of ReLU feedforward neural networks

Y Liu, T Caglar, C Peterson, M Kirby - Frontiers in Big Data, 2023 - frontiersin.org
This paper investigates the integration of multiple geometries present within a ReLU-based
neural network. A ReLU neural network determines a piecewise affine linear continuous …

Defining Neural Network Architecture through Polytope Structures of Dataset

S Lee, A Mammadov, JC Ye - arXiv preprint arXiv:2402.02407, 2024 - arxiv.org
Current theoretical and empirical research in neural networks suggests that complex
datasets require large network architectures for thorough classification, yet the precise …

SkelEx and BoundEx: Geometrical Framework for Interpretable ReLU Neural Networks

P Pukowski, J Spoerhase, H Lu - 2024 International Joint …, 2024 - ieeexplore.ieee.org
Every ReLU Neural Network (NN) tessellates its input space into activation regions.
Studying this tessellation provides insights into some of the architecture's properties. Recent …

A rank decomposition for the topological classification of neural representations

K Beshkov, GT Einevoll - arXiv preprint arXiv:2404.19710, 2024 - arxiv.org
Neural networks can be thought of as applying a transformation to an input dataset. The way
in which they change the topology of such a dataset often holds practical significance for …

A Relative Homology Theory of Representation in Neural Networks

K Beshkov - arXiv preprint arXiv:2502.01360, 2025 - arxiv.org
Previous research has proven that the set of maps implemented by neural networks with a
ReLU activation function is identical to the set of piecewise linear continuous maps …

Piecewise polynomial regression of tame functions via integer programming

G Bareilles, J Aspman, J Nemecek… - arXiv preprint arXiv …, 2023 - arxiv.org
Tame functions are a class of nonsmooth, nonconvex functions, which feature in a wide
range of applications: functions encountered in the training of deep neural networks with all …

Data geometry and topology dependent bounds on network widths in deep ReLU networks

S Lee, A Mammadov, JC Ye - openreview.net
The geometrical perspective of deep ReLU networks is important for understanding the learning
behavior and generalization capability of neural networks. As such, here we investigate …