deep neural networks and topological characteristics of datasets. Our primary contribution is
to introduce data topology-dependent upper bounds on the network width. Specifically, we
first show that a three-layer neural network with ReLU activation and max pooling can be
designed to approximate the indicator function of a compact set enclosed by a tight convex
polytope. This construction is then extended to a simplicial complex …