In Autonomous Driving (AD) systems, perception is both security- and safety-critical. Despite various prior studies on its security issues, all of them consider only attacks on camera- or …
B Zhang, D Jiang, D He… - Advances in neural …, 2022 - proceedings.neurips.cc
Designing neural networks with bounded Lipschitz constant is a promising way to obtain certifiably robust classifiers against adversarial examples. However, the relevant progress …
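To make the idea concrete, here is a minimal PyTorch sketch (not from the paper) of the standard margin-based certificate for Lipschitz-bounded networks: upper-bound the network's global ℓ2 Lipschitz constant by the product of layer spectral norms, then convert the logit margin into a certified radius via margin/(√2·L). The model, helper names, and parameters are generic illustrations, not this paper's construction.

```python
import torch
import torch.nn as nn

# Sketch: certify an l2 robustness radius for an MLP whose global Lipschitz
# constant is upper-bounded by the product of layer spectral norms (valid
# for 1-Lipschitz activations such as ReLU). Names are illustrative.

def lipschitz_upper_bound(model: nn.Sequential) -> float:
    """Product of spectral norms of the linear layers (an upper bound)."""
    L = 1.0
    for layer in model:
        if isinstance(layer, nn.Linear):
            L *= torch.linalg.matrix_norm(layer.weight, ord=2).item()
    return L

def certified_l2_radius(logits: torch.Tensor, L: float) -> float:
    """Margin-based certificate: the predicted class cannot change within
    this l2 radius, since each logit difference is sqrt(2)*L-Lipschitz."""
    top2 = torch.topk(logits, k=2).values
    margin = (top2[0] - top2[1]).item()
    return margin / (2 ** 0.5 * L)

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
x = torch.randn(784)
radius = certified_l2_radius(model(x), lipschitz_upper_bound(model))
print(f"certified l2 radius: {radius:.4f}")
```

The spectral-norm product is a loose bound for deep networks, which is exactly why work like the paper above designs architectures whose Lipschitz constant is tight by construction.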
L Li, T Xie, B Li - 2023 IEEE symposium on security and privacy …, 2023 - ieeexplore.ieee.org
Great advances in deep neural networks (DNNs) have led to state-of-the-art performance on a wide range of tasks. However, recent studies have shown that DNNs are vulnerable to …
To obtain deterministic guarantees of adversarial robustness, specialized training methods are used. We propose SABR, a novel certified training method based on the key …
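For context, the bound-propagation primitive underlying such certified training is interval bound propagation (IBP); SABR's key idea is to propagate a small, adversarially placed sub-box of the full perturbation region rather than the whole ε-ball. Below is a minimal sketch under that reading: `ibp_forward` is illustrative, and the random-sign center placement stands in for the paper's adversarial center selection.

```python
import torch
import torch.nn as nn

# Sketch of interval bound propagation (IBP) over a small sub-box of the
# eps-ball, in the spirit of certified training methods like SABR. Helper
# names and the center-selection heuristic are assumptions, not the
# paper's code.

def ibp_forward(model, lb, ub):
    """Propagate elementwise lower/upper bounds through Linear and ReLU."""
    for layer in model:
        if isinstance(layer, nn.Linear):
            mid, rad = (ub + lb) / 2, (ub - lb) / 2
            mid = layer(mid)
            rad = rad @ layer.weight.abs().T      # |W| spreads the radius
            lb, ub = mid - rad, mid + rad
        elif isinstance(layer, nn.ReLU):
            lb, ub = lb.clamp(min=0), ub.clamp(min=0)
    return lb, ub

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
x, eps, tau = torch.randn(1, 784), 0.1, 0.02     # illustrative; tau < eps
# Random-sign placement keeps the tau-box inside the eps-ball; SABR instead
# centers it adversarially.
center = x + (eps - tau) * torch.randn_like(x).sign()
lb, ub = ibp_forward(model, center - tau, center + tau)
```

Training on the loss induced by these tighter small-box bounds, rather than full-ball IBP bounds, is what reduces over-regularization.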
A Levine, S Feizi - arXiv preprint arXiv:2006.14768, 2020 - arxiv.org
Adversarial poisoning attacks distort training data in order to corrupt the test-time behavior of a classifier. A provable defense provides a certificate for each test sample, which is a lower …
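As a rough illustration of partition-based certified defenses of this kind (in the spirit of Deep Partition Aggregation): each training sample lands in exactly one partition, so one poisoned sample can change at most one base model's vote, and the vote gap yields a poisoning certificate. The sketch below is a simplification; the hashing scheme, `train_fn`, and the `gap // 2` bound (which ignores tie-breaking refinements) are assumptions, not the paper's exact procedure.

```python
import hashlib
from collections import Counter

# Sketch of partition-and-vote certification against data poisoning.
# Base models (one per partition) are assumed trained elsewhere and
# callable as label predictors.

def partition(dataset, k):
    """Deterministically hash each sample into one of k partitions."""
    parts = [[] for _ in range(k)]
    for sample in dataset:
        idx = int(hashlib.sha256(repr(sample).encode()).hexdigest(), 16) % k
        parts[idx].append(sample)
    return parts

def certified_prediction(models, x):
    """Majority vote plus a conservative lower bound on how many poisoned
    training samples are needed to flip it."""
    votes = Counter(m(x) for m in models)
    (top, n1), (_, n2) = (votes.most_common(2) + [(None, 0)])[:2]
    return top, (n1 - n2) // 2

# Usage (illustrative): models = [train_fn(p) for p in partition(data, 50)]
```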
Recent studies have shown that deep neural networks (DNNs) are vulnerable to adversarial attacks, including evasion and backdoor (poisoning) attacks. On the defense …
H Salman, S Jain, E Wong… - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
Certified patch defenses can guarantee robustness of an image classifier to arbitrary changes within a bounded contiguous region. But, currently, this robustness comes at a cost …
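A common construction behind certified patch defenses is column ablation (derandomized smoothing): classify many copies of the image with all but one narrow vertical band masked, and certify when the vote gap exceeds what a single contiguous patch could possibly flip. The sketch below is a simplified variant; `classify`, the band and patch sizes, and the exact gap condition (real certificates also handle tie-breaking) are assumptions.

```python
import torch

# Sketch of column-ablation certification for patch attacks. classify() is
# an assumed black box returning an integer label per ablated image; the
# image is assumed to have shape (C, H, W).

def certify_patch(image, classify, num_classes, band=19, patch=32):
    """Vote over all vertical bands; a patch of width `patch` intersects at
    most patch + band - 1 bands, so a large vote gap certifies the label."""
    _, _, W = image.shape
    votes = torch.zeros(num_classes)
    for start in range(W):
        ablated = torch.zeros_like(image)
        cols = [(start + j) % W for j in range(band)]
        ablated[:, :, cols] = image[:, :, cols]   # keep one band, mask rest
        votes[classify(ablated)] += 1
    n1, n2 = torch.topk(votes, 2).values
    affected = patch + band - 1                   # bands a patch can touch
    certified = (n1 - n2) > 2 * affected
    return int(torch.argmax(votes)), bool(certified)
```

The cost referenced in the snippet above is that heavy masking degrades the base classifier's accuracy, which is the gap vision transformers help close.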
Randomized smoothing is currently a state-of-the-art method to construct a certifiably robust classifier from neural networks against $\ell_2$-adversarial perturbations. Under the …
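A minimal sketch of that construction, following the Cohen et al. (2019) recipe but simplified to a single sampling pass (the published procedure draws separate selection and estimation samples): estimate a lower confidence bound on the top-class probability under Gaussian noise, then convert it into a certified $\ell_2$ radius $R = \sigma\,\Phi^{-1}(\underline{p_A})$, abstaining when the bound falls at or below 1/2. `base_classify` and all parameter values are placeholders.

```python
import numpy as np
from scipy.stats import norm, beta

# Sketch of randomized-smoothing certification. base_classify is an assumed
# black box mapping an input to an integer class label.

def certify(x, base_classify, sigma=0.25, n=1000, alpha=0.001):
    noisy = x[None] + sigma * np.random.randn(n, *x.shape)
    labels = np.array([base_classify(z) for z in noisy])
    top = np.bincount(labels).argmax()
    k = int((labels == top).sum())
    # One-sided Clopper-Pearson lower bound on pA at confidence 1 - alpha.
    pa_lower = beta.ppf(alpha, k, n - k + 1)
    if pa_lower <= 0.5:
        return None, 0.0                  # abstain: cannot certify
    return int(top), sigma * norm.ppf(pa_lower)
```

The certificate holds for the smoothed classifier, not the base network, which is why the method scales to arbitrary architectures.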
Z Zhang, Y Zhou, X Zhao, T Che… - Advances in Neural …, 2022 - proceedings.neurips.cc
The right to be forgotten calls for efficient machine unlearning techniques that make trained machine learning models forget a cohort of data. The combination of training and unlearning …