Tensorfuzz: Debugging neural networks with coverage-guided fuzzing

A Odena, C Olsson, D Andersen… - International Conference on Machine Learning, 2019 - proceedings.mlr.press
Abstract
Neural networks are difficult to interpret and debug. We introduce testing techniques for neural networks that can discover errors occurring only for rare inputs. Specifically, we develop coverage-guided fuzzing (CGF) methods for neural networks. In CGF, random mutations of inputs are guided by a coverage metric toward the goal of satisfying user-specified constraints. We describe how approximate nearest neighbor (ANN) algorithms can provide this coverage metric for neural networks. We then combine these methods with techniques for property-based testing (PBT). In PBT, one asserts properties that a function should satisfy and the system automatically generates tests exercising those properties. We then apply this system to practical goals including (but not limited to) surfacing broken loss functions in popular GitHub repositories and making performance improvements to TensorFlow. Finally, we release an open source library called TensorFuzz that implements the described techniques.
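To make the abstract's description concrete, below is a minimal, self-contained sketch of a coverage-guided fuzzing loop in which new coverage is detected by checking whether a mutant's activation vector is far from all previously seen activations (a brute-force stand-in for the approximate nearest neighbor lookup the paper describes). All function names, the toy model, the property check, and the distance threshold are illustrative assumptions, not the TensorFuzz API.

```python
# Hedged sketch of coverage-guided fuzzing (CGF) for a neural network.
# Assumptions: a toy "model", a toy property check, and a brute-force
# nearest-neighbor coverage test instead of a real ANN index.
import numpy as np

def model_activations(x):
    # Stand-in for a neural network: returns an activation vector for input x.
    # In practice this would be a chosen coverage layer of a trained model.
    rng = np.random.default_rng(abs(hash(x.tobytes())) % (2**32))
    return np.tanh(rng.normal(size=8) + x.mean())

def violates_property(x):
    # User-specified property (property-based testing style), e.g.
    # "the model never produces non-finite activations". Toy check here.
    return not np.all(np.isfinite(model_activations(x)))

def mutate(x, scale=0.05):
    # Random mutation of an input: additive noise, clipped to a valid range.
    return np.clip(x + np.random.normal(scale=scale, size=x.shape), 0.0, 1.0)

def fuzz(seed_inputs, iterations=1000, new_coverage_threshold=0.5):
    corpus = list(seed_inputs)
    coverage = [model_activations(x) for x in corpus]  # stored activation vectors
    for _ in range(iterations):
        parent = corpus[np.random.randint(len(corpus))]
        child = mutate(parent)
        if violates_property(child):
            return child  # found an error-inducing input
        act = model_activations(child)
        # Coverage check: keep the mutant only if its activations are far from
        # every activation vector seen so far (new "coverage").
        nearest = min(np.linalg.norm(act - c) for c in coverage)
        if nearest > new_coverage_threshold:
            corpus.append(child)
            coverage.append(act)
    return None  # no property violation found within the budget

if __name__ == "__main__":
    seeds = [np.random.rand(4) for _ in range(3)]
    print("error-inducing input:", fuzz(seeds))
```

The key design point, as the abstract states, is that the coverage metric (here, distance to the nearest stored activation vector) decides which mutants are worth keeping in the corpus, so the search is steered toward behaviorally novel inputs rather than wandering at random.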