governed by ReLU, Sigmoid and Tanh activation functions. We show that the method is
complete for ReLU networks and sound for other activation functions. The technique extends
symbolic interval propagation by using gradient descent to locate counterexamples from
spurious solutions generated by the associated LP problems. The approach includes a
novel adaptive splitting strategy intended to refine the nodes with the greatest impact on the …
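To illustrate the gradient-descent step, the following is a minimal PyTorch sketch of how a spurious LP solution, projected into the input box, might seed a local search for a genuine counterexample. The names `net`, `violation`, `lo`, and `hi` are illustrative assumptions, as is the convention that a violation value of at most zero falsifies the property; this is a sketch, not the paper's implementation.

```python
# Hedged sketch: gradient descent from a candidate input (e.g. a spurious
# LP solution) towards a genuine counterexample. `net` and `violation` are
# assumed, illustrative names, not the paper's API.
import torch

def search_counterexample(net, violation, x0, lo, hi, steps=200, lr=1e-2):
    """Minimise violation(net(x)) over the input box [lo, hi]; a value
    <= 0 means the property is falsified, so x is a true counterexample."""
    x = x0.clone().detach().requires_grad_(True)
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = violation(net(x))
        if loss.item() <= 0:          # the real network violates the property
            return x.detach()
        loss.backward()
        opt.step()
        with torch.no_grad():         # project back into the input domain
            x.copy_(torch.max(torch.min(x, hi), lo))
    return None                       # no counterexample found here
```

If the search fails, the candidate is treated as spurious and verification continues, for instance by splitting the input region further.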
Neural networks have in recent years become an essential technique for solving
regression and classification problems on complex data. While these networks often
achieve impressive results, the empirical methods commonly used to measure their
performance have their limitations. We propose an efficient algorithm based on symbolic
interval propagation for formal verification of large neural networks with high-dimensional
input data. Our approach extends current state-of-the-art algorithms with three significant …
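As a rough illustration of what symbolic interval propagation involves, the sketch below pushes one symbolic lower and one symbolic upper linear equation per node through an affine layer followed by a ReLU, over a box input domain. The representation and all names are simplifying assumptions made here for illustration; the paper's actual algorithm and data structures may differ.

```python
# Hedged sketch of symbolic interval propagation for one affine + ReLU layer.
# Each bound is a row [A | c] encoding the linear function A @ x + c of the
# network input x, which ranges over the box [lo, hi].
import numpy as np

def concretize(eq, lo, hi):
    """Tightest concrete bounds of A @ x + c over x in [lo, hi]."""
    A, c = eq[:, :-1], eq[:, -1]
    pos, neg = np.maximum(A, 0.0), np.minimum(A, 0.0)
    return pos @ lo + neg @ hi + c, pos @ hi + neg @ lo + c

def propagate_relu_layer(low_eq, up_eq, W, b, lo, hi):
    """Push symbolic bounds through y = ReLU(W @ x + b)."""
    Wp, Wn = np.maximum(W, 0.0), np.minimum(W, 0.0)
    new_low = Wp @ low_eq + Wn @ up_eq   # positive weights keep bound sides,
    new_up = Wp @ up_eq + Wn @ low_eq    # negative weights swap them
    new_low[:, -1] += b
    new_up[:, -1] += b
    l, _ = concretize(new_low, lo, hi)
    _, u = concretize(new_up, lo, hi)
    for i in range(W.shape[0]):
        if u[i] <= 0:                    # provably inactive: output is 0
            new_low[i] = new_up[i] = 0.0
        elif l[i] < 0:                   # unstable node: relax the ReLU
            s = u[i] / (u[i] - l[i])     # slope of the upper chord
            new_up[i] *= s
            new_up[i, -1] -= s * l[i]
            new_low[i] = 0.0             # simple sound lower bound
        # provably active (l[i] >= 0): equations pass through unchanged
    return new_low, new_up

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W, b = rng.normal(size=(3, 2)), rng.normal(size=3)
    lo, hi = np.full(2, -1.0), np.full(2, 1.0)
    eq0 = np.hstack([np.eye(2), np.zeros((2, 1))])  # each x_i bounds itself
    low, up = propagate_relu_layer(eq0, eq0.copy(), W, b, lo, hi)
    print(concretize(low, lo, hi)[0], concretize(up, lo, hi)[1])
```

The unstable nodes identified in this sketch are also natural split candidates: refining a node whose relaxation is loosest tightens the bounds the most, which is the intuition behind an adaptive splitting strategy.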