LocalNorm: robust image classification through dynamically regularized normalization

B Yin, HS Scholte, S Bohté - Artificial Neural Networks and Machine Learning – ICANN 2021: 30th International …, 2021 - Springer
Abstract
While modern convolutional neural networks achieve outstanding accuracy on many image classification tasks, they are, once trained, much more sensitive to image degradation than humans. Much of this sensitivity is caused by the shift in data distribution that such degradation induces. As we show, dynamically recalculating summary statistics for normalization over batches at test-time improves network robustness, but at the expense of accuracy. Here, we describe a variant of Batch Normalization, LocalNorm, that regularizes the normalization layer in the spirit of Dropout during training, while dynamically adapting to the local image intensity and contrast at test-time. We show that the resulting deep neural networks are much more resistant to noise-induced image degradation, while achieving the same or slightly better accuracy on non-degraded classical benchmarks; at test-time, calculating summary statistics over a single image suffices. In computational terms, LocalNorm adds negligible training cost and little or no cost at inference time, and can be applied to pre-trained networks in a straightforward manner.
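To make the test-time behaviour concrete, the following is a minimal sketch of normalization with per-image summary statistics, as the abstract describes for inference. It assumes only the key idea stated above (each image is normalized by its own per-channel mean and variance rather than frozen batch statistics); the function name, array shapes, and parameters are illustrative, and the training-time Dropout-style group regularization of LocalNorm is not reproduced here.

```python
import numpy as np

def local_norm_inference(x, gamma, beta, eps=1e-5):
    """Sketch of test-time normalization with single-image statistics.

    x:     batch of images, shape (N, C, H, W)
    gamma: learned per-channel scale, shape (C,)
    beta:  learned per-channel shift, shape (C,)

    Unlike standard Batch Normalization, which applies running statistics
    frozen at training time, each image is normalized by its own
    per-channel mean and variance, so the layer adapts to the intensity
    and contrast of every individual test image.
    """
    mean = x.mean(axis=(2, 3), keepdims=True)  # per-image, per-channel mean
    var = x.var(axis=(2, 3), keepdims=True)    # per-image, per-channel variance
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma.reshape(1, -1, 1, 1) * x_hat + beta.reshape(1, -1, 1, 1)
```

Because the statistics are recomputed per image, a global brightness or contrast shift (e.g. from added noise or gain changes) is largely normalized away before the convolutional features are scaled by `gamma` and `beta`, which is consistent with the robustness argument made above.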