Adversarial training with stochastic weight average

J Hwang, Y Lee, S Oh, Y Bae - 2021 IEEE International Conference on Image Processing (ICIP), 2021 - ieeexplore.ieee.org
Although adversarial training is so far the most reliable method for training robust deep neural networks, adversarially trained networks still show a large gap between their accuracy on clean images and their accuracy on adversarial images. In conventional classification problems, one can obtain higher accuracy by ensembling multiple networks. However, in adversarial training there are obstacles to adopting such an ensemble method. First, because the inner maximization is expensive, training multiple networks adversarially becomes a heavy burden. Moreover, a naive ensemble faces a dilemma in choosing the target model with which to generate adversarial examples: training on adversarial examples of the individual members causes covariate shift, while training on those of the ensemble diminishes the benefit of ensembling. With these insights, we adopt stochastic weight averaging and improve it by taking the overfitting nature of adversarial training into account. Our method retains the benefit of ensembling while avoiding the problems described above. Experiments on CIFAR10 and CIFAR100 show that our method improves robustness effectively.
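The abstract describes combining adversarial training (PGD-style inner maximization) with stochastic weight averaging so that the averaged weights act like an implicit ensemble. Below is a minimal sketch of that combination, not the authors' code: it uses PyTorch's swa_utils, and the hyperparameters (eps, alpha, pgd_steps, swa_start) and the helper names pgd_attack / train_adv_swa are illustrative assumptions; the paper's specific modification for the overfitting behaviour of adversarial training is not reproduced here.

```python
# Sketch: L_inf PGD adversarial training with stochastic weight averaging.
# Not the authors' implementation; hyperparameters are illustrative.
import torch
import torch.nn.functional as F
from torch.optim.swa_utils import AveragedModel, update_bn

def pgd_attack(model, x, y, eps=8/255, alpha=2/255, steps=10):
    """Inner maximization: PGD from a random start inside the eps-ball."""
    delta = torch.empty_like(x).uniform_(-eps, eps).requires_grad_(True)
    for _ in range(steps):
        loss = F.cross_entropy(model(x + delta), y)
        grad, = torch.autograd.grad(loss, delta)
        delta = (delta + alpha * grad.sign()).clamp(-eps, eps)
        delta = ((x + delta).clamp(0, 1) - x).detach().requires_grad_(True)
    return (x + delta).detach()

def train_adv_swa(model, train_loader, epochs=200, swa_start=100, lr=0.1, device="cuda"):
    model.to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=lr,
                                momentum=0.9, weight_decay=5e-4)
    swa_model = AveragedModel(model)          # running average of the weights
    for epoch in range(epochs):
        model.train()
        for x, y in train_loader:
            x, y = x.to(device), y.to(device)
            x_adv = pgd_attack(model, x, y)   # adversarial examples from the current model
            optimizer.zero_grad()
            F.cross_entropy(model(x_adv), y).backward()
            optimizer.step()
        if epoch >= swa_start:                # only average weights in the later epochs
            swa_model.update_parameters(model)
    # Recompute batch-norm statistics for the averaged weights before evaluation.
    update_bn(train_loader, swa_model, device=device)
    return swa_model
```

The averaged model is the one evaluated for robustness; starting the averaging only after swa_start epochs is a common SWA heuristic, and one natural place to account for adversarial overfitting is in choosing when and how often these averaging updates are taken.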