Authors
Gagandeep Singh, Rupanshu Ganvir, Markus Püschel, Martin Vechev
Publication date
2019
Journal
Advances in Neural Information Processing Systems
Volume
32
Description
We propose a new parametric framework, called k-ReLU, for computing precise and scalable convex relaxations used to certify neural networks. The key idea is to approximate the output of multiple ReLUs in a layer jointly instead of separately. This joint relaxation captures dependencies between the inputs to different ReLUs in a layer and thus overcomes the convex barrier imposed by the single-neuron triangle relaxation and its approximations. The framework is parametric in the number k of ReLUs it considers jointly and can be combined with existing verifiers in order to improve their precision. Our experimental results show that k-ReLU enables significantly more precise certification than existing state-of-the-art verifiers while maintaining scalability.
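The "convex barrier" mentioned above refers to the standard single-neuron triangle relaxation of a ReLU, which is the tightest convex relaxation possible when each neuron is handled in isolation; k-ReLU's contribution is to relax k neurons jointly. As background, here is a minimal sketch (not the paper's method) of that single-neuron triangle relaxation for an unstable neuron with pre-activation bounds l < 0 < u; the helper names are illustrative:

```python
def triangle_relaxation(l, u):
    """Linear constraints of the triangle relaxation of y = ReLU(x)
    for an unstable neuron (l < 0 < u), each as (a, b, c) meaning
    a*x + b*y <= c:
      y >= 0,  y >= x,  y <= u*(x - l)/(u - l)."""
    assert l < 0 < u, "relaxation is only needed for unstable neurons"
    s = u / (u - l)  # slope of the upper face
    return [(0.0, -1.0, 0.0),    # y >= 0
            (1.0, -1.0, 0.0),    # y >= x
            (-s, 1.0, -s * l)]   # y <= s*(x - l)

def satisfies(x, y, constraints, tol=1e-9):
    """Check whether a point (x, y) lies inside the relaxation."""
    return all(a * x + b * y <= c + tol for (a, b, c) in constraints)

cons = triangle_relaxation(-1.0, 2.0)
# Points on the true ReLU graph lie inside the relaxation:
assert satisfies(0.5, 0.5, cons)    # on y = x (x >= 0)
assert satisfies(-0.5, 0.0, cons)   # on y = 0 (x <= 0)
# A point above the upper face is excluded:
assert not satisfies(-1.0, 1.0, cons)
```

Relaxing each neuron this way ignores dependencies between the inputs of different ReLUs in the same layer; k-ReLU adds joint constraints over groups of k neurons to capture exactly those dependencies.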
Total citations
Citations per year, 2019–2024 (per-year counts lost in extraction)
Scholar articles
G Singh, R Ganvir, M Püschel, M Vechev - Advances in Neural Information Processing Systems, 2019