Review of image classification algorithms based on convolutional neural networks

L Chen, S Li, Q Bai, J Yang, S Jiang, Y Miao - Remote Sensing, 2021 - mdpi.com
Image classification has long been an active research direction worldwide, and the
emergence of deep learning has accelerated progress in this field. Convolutional …

A review of uncertainty quantification in deep learning: Techniques, applications and challenges

M Abdar, F Pourpanah, S Hussain, D Rezazadegan… - Information fusion, 2021 - Elsevier
Uncertainty quantification (UQ) methods play a pivotal role in reducing the impact of
uncertainties during both optimization and decision-making processes. They have been …
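
One technique covered by UQ surveys of this kind is Monte Carlo dropout, which keeps dropout active at prediction time and treats the spread of repeated stochastic forward passes as an uncertainty estimate. A minimal PyTorch sketch (the toy model and the sample count are illustrative, not taken from the paper):

    import torch
    import torch.nn as nn

    # Illustrative classifier with dropout; any dropout-equipped model works.
    model = nn.Sequential(
        nn.Linear(784, 256), nn.ReLU(), nn.Dropout(p=0.5),
        nn.Linear(256, 10),
    )

    def mc_dropout_predict(model, x, n_samples=30):
        """Monte Carlo dropout: average softmax outputs over stochastic passes
        and report the per-class variance as a simple uncertainty estimate."""
        model.train()  # keep dropout active at inference time
        with torch.no_grad():
            probs = torch.stack(
                [torch.softmax(model(x), dim=-1) for _ in range(n_samples)]
            )
        return probs.mean(dim=0), probs.var(dim=0)

    mean_probs, uncertainty = mc_dropout_predict(model, torch.randn(8, 784))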

Simam: A simple, parameter-free attention module for convolutional neural networks

L Yang, RY Zhang, L Li, X Xie - International conference on …, 2021 - proceedings.mlr.press
In this paper, we propose a conceptually simple but very effective attention module for
Convolutional Neural Networks (ConvNets). In contrast to existing channel-wise and spatial …
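
The module weights every activation by an energy-derived term computed from its deviation from the channel mean, with no learnable parameters. A short PyTorch sketch following the paper's published formulation (λ is the default regularizer; treat this as an illustration rather than the reference implementation):

    import torch

    def simam(x, e_lambda=1e-4):
        """Parameter-free attention: scale each activation by a sigmoid of its
        inverse 'energy', computed from its squared deviation from the channel mean."""
        b, c, h, w = x.shape
        n = h * w - 1
        d = (x - x.mean(dim=(2, 3), keepdim=True)).pow(2)   # (x - mu)^2
        v = d.sum(dim=(2, 3), keepdim=True) / n             # per-channel variance
        e_inv = d / (4 * (v + e_lambda)) + 0.5               # inverse energy
        return x * torch.sigmoid(e_inv)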

Crossvit: Cross-attention multi-scale vision transformer for image classification

CFR Chen, Q Fan, R Panda - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com
The recently developed vision transformer (ViT) has achieved promising results on image
classification compared to convolutional neural networks. Inspired by this, in this paper, we …
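
The fusion mechanism lets the class token of one patch-size branch attend to the patch tokens of the other branch. A simplified PyTorch sketch of such a cross-attention fusion step (dimensions, head count, and the residual update are illustrative assumptions):

    import torch
    import torch.nn as nn

    class CrossAttention(nn.Module):
        """Simplified CrossViT-style token fusion: the CLS token of one branch
        queries the patch tokens of the other branch."""
        def __init__(self, dim, num_heads=8):
            super().__init__()
            self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

        def forward(self, cls_token, other_tokens):
            # cls_token: (B, 1, dim), other_tokens: (B, N, dim)
            kv = torch.cat([cls_token, other_tokens], dim=1)
            fused, _ = self.attn(query=cls_token, key=kv, value=kv)
            return cls_token + fused  # residual update of the CLS token

    # Usage: fuse a large-patch branch CLS token with small-patch branch tokens.
    fuser = CrossAttention(dim=192)
    out = fuser(torch.randn(2, 1, 192), torch.randn(2, 196, 192))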

Localvit: Bringing locality to vision transformers

Y Li, K Zhang, J Cao, R Timofte, L Van Gool - arXiv preprint arXiv …, 2021 - arxiv.org
We study how to introduce locality mechanisms into vision transformers. The transformer
network originates from machine translation and is particularly good at modelling long-range …
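
Locality is added by inserting a depthwise convolution into the transformer's feed-forward network, so tokens are also mixed with their spatial neighbours. A sketch of such a locality-enhanced feed-forward block, assuming the class token is handled outside the block (hidden width and kernel size are illustrative):

    import torch
    import torch.nn as nn

    class LocalityFeedForward(nn.Module):
        """Feed-forward block with a depthwise 3x3 convolution between the two
        pointwise projections; the token sequence is reshaped to a 2-D grid."""
        def __init__(self, dim, hidden_dim):
            super().__init__()
            self.conv1 = nn.Conv2d(dim, hidden_dim, kernel_size=1)
            self.dwconv = nn.Conv2d(hidden_dim, hidden_dim, kernel_size=3,
                                    padding=1, groups=hidden_dim)  # depthwise
            self.conv2 = nn.Conv2d(hidden_dim, dim, kernel_size=1)
            self.act = nn.GELU()

        def forward(self, tokens, h, w):
            b, n, d = tokens.shape                  # n == h * w (no CLS token here)
            x = tokens.transpose(1, 2).reshape(b, d, h, w)
            x = self.act(self.conv1(x))
            x = self.act(self.dwconv(x))
            x = self.conv2(x)
            return x.flatten(2).transpose(1, 2)     # back to (B, h*w, dim)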

Dynamic neural networks: A survey

Y Han, G Huang, S Song, L Yang… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Dynamic neural networks are an emerging research topic in deep learning. Compared to static
models, which have fixed computational graphs and parameters at the inference stage …
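
One family such surveys cover is dynamic depth, e.g. early-exit networks that stop computation once an intermediate classifier is confident enough. A toy PyTorch sketch (architecture, threshold, and the single-sample assumption are all illustrative):

    import torch
    import torch.nn as nn

    class EarlyExitNet(nn.Module):
        """Toy dynamic-depth network: intermediate classifiers let easy inputs
        exit early, so inference cost adapts per sample."""
        def __init__(self, dim=64, num_classes=10, threshold=0.9):
            super().__init__()
            self.blocks = nn.ModuleList(
                [nn.Sequential(nn.Linear(dim, dim), nn.ReLU()) for _ in range(4)]
            )
            self.exits = nn.ModuleList(
                [nn.Linear(dim, num_classes) for _ in range(4)]
            )
            self.threshold = threshold

        @torch.no_grad()
        def forward(self, x):
            # Assumes a single sample (batch size 1) for the confidence test.
            for block, exit_head in zip(self.blocks, self.exits):
                x = block(x)
                probs = torch.softmax(exit_head(x), dim=-1)
                if probs.max() >= self.threshold:   # confident enough: stop here
                    return probs
            return probs                            # fall back to the last exit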

Sa-net: Shuffle attention for deep convolutional neural networks

QL Zhang, YB Yang - ICASSP 2021-2021 IEEE International …, 2021 - ieeexplore.ieee.org
Attention mechanisms, which enable a neural network to accurately focus on all the relevant
elements of the input, have become an essential component for improving the performance of …
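
The module splits channels into groups, applies a lightweight channel-attention branch and a spatial-attention branch to each group, and then shuffles channels to mix information across groups. A sketch along those lines (group count and parameter shapes follow my reading of the published design and should be treated as approximate):

    import torch
    import torch.nn as nn

    class ShuffleAttention(nn.Module):
        """Grouped channel + spatial attention followed by channel shuffle.
        channels must be divisible by 2 * groups."""
        def __init__(self, channels, groups=8):
            super().__init__()
            self.groups = groups
            half = channels // (2 * groups)
            self.avg_pool = nn.AdaptiveAvgPool2d(1)
            self.cweight = nn.Parameter(torch.zeros(1, half, 1, 1))
            self.cbias = nn.Parameter(torch.ones(1, half, 1, 1))
            self.sweight = nn.Parameter(torch.zeros(1, half, 1, 1))
            self.sbias = nn.Parameter(torch.ones(1, half, 1, 1))
            self.gn = nn.GroupNorm(half, half)      # per-channel normalization

        @staticmethod
        def channel_shuffle(x, groups):
            b, c, h, w = x.shape
            x = x.reshape(b, groups, c // groups, h, w)
            return x.permute(0, 2, 1, 3, 4).reshape(b, c, h, w)

        def forward(self, x):
            b, c, h, w = x.shape
            x = x.reshape(b * self.groups, -1, h, w)
            x0, x1 = x.chunk(2, dim=1)
            # channel attention on one half of each group
            xn = self.cweight * self.avg_pool(x0) + self.cbias
            x0 = x0 * torch.sigmoid(xn)
            # spatial attention on the other half
            xs = self.sweight * self.gn(x1) + self.sbias
            x1 = x1 * torch.sigmoid(xs)
            out = torch.cat([x0, x1], dim=1).reshape(b, c, h, w)
            return self.channel_shuffle(out, 2)     # mix the two halves across groups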

Fcanet: Frequency channel attention networks

Z Qin, P Zhang, F Wu, X Li - Proceedings of the IEEE/CVF …, 2021 - openaccess.thecvf.com
Attention mechanisms, especially channel attention, have achieved great success in the
computer vision field. Many works focus on how to design efficient channel attention …
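
The idea is to replace the global average pooling used in channel attention with projections onto several 2-D DCT frequency components, one per channel group. A simplified sketch (the frequency set, reduction ratio, and SE-style bottleneck are illustrative assumptions):

    import math
    import torch
    import torch.nn as nn

    def dct_basis(h, w, u, v):
        """2-D DCT basis; frequency (0, 0) is constant, i.e. global (sum) pooling."""
        ys = torch.cos(math.pi * u * (torch.arange(h) + 0.5) / h)
        xs = torch.cos(math.pi * v * (torch.arange(w) + 0.5) / w)
        return ys[:, None] * xs[None, :]                    # (h, w)

    class FrequencyChannelAttention(nn.Module):
        """Simplified multi-spectral channel attention: channel groups are pooled
        with different DCT frequencies, then passed through an SE-style bottleneck."""
        def __init__(self, channels, h, w,
                     freqs=((0, 0), (0, 1), (1, 0), (1, 1)), reduction=16):
            super().__init__()
            assert channels % len(freqs) == 0
            basis = torch.stack([dct_basis(h, w, u, v) for u, v in freqs])  # (F, h, w)
            self.register_buffer("basis", basis)
            self.fc = nn.Sequential(
                nn.Linear(channels, channels // reduction), nn.ReLU(),
                nn.Linear(channels // reduction, channels), nn.Sigmoid(),
            )

        def forward(self, x):
            b, c, h, w = x.shape
            f = self.basis.shape[0]
            xg = x.reshape(b, f, c // f, h, w)              # contiguous channel groups
            pooled = (xg * self.basis[None, :, None]).sum(dim=(3, 4))  # (b, f, c//f)
            weights = self.fc(pooled.reshape(b, c))
            return x * weights.reshape(b, c, 1, 1)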

Attentional feature fusion

Y Dai, F Gieseke, S Oehmcke, Y Wu… - Proceedings of the …, 2021 - openaccess.thecvf.com
Feature fusion, the combination of features from different layers or branches, is an
omnipresent part of modern network architectures. It is often implemented via simple …
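
Instead of adding or concatenating two feature maps directly, an attention map computed from their sum can decide, per element, how much of each input to keep. A sketch of such an attentional fusion step, combining a global and a local channel-context branch (reduction ratio and branch layout are illustrative):

    import torch
    import torch.nn as nn

    class AttentionalFeatureFusion(nn.Module):
        """Fuse two feature maps with a learned sigmoid gate computed from their sum:
        out = m * x + (1 - m) * y."""
        def __init__(self, channels, reduction=4):
            super().__init__()
            mid = channels // reduction
            # local (per-pixel) channel context
            self.local_att = nn.Sequential(
                nn.Conv2d(channels, mid, 1), nn.BatchNorm2d(mid), nn.ReLU(),
                nn.Conv2d(mid, channels, 1), nn.BatchNorm2d(channels),
            )
            # global channel context via global average pooling
            self.global_att = nn.Sequential(
                nn.AdaptiveAvgPool2d(1),
                nn.Conv2d(channels, mid, 1), nn.BatchNorm2d(mid), nn.ReLU(),
                nn.Conv2d(mid, channels, 1), nn.BatchNorm2d(channels),
            )

        def forward(self, x, y):
            s = x + y
            m = torch.sigmoid(self.local_att(s) + self.global_att(s))
            return m * x + (1.0 - m) * y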

Diverse branch block: Building a convolution as an inception-like unit

X Ding, X Zhang, J Han, G Ding - Proceedings of the IEEE …, 2021 - openaccess.thecvf.com
We propose a universal building block for Convolutional Neural Networks (ConvNets) that
improves performance without any inference-time cost. The block is named Diverse …
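
The underlying trick is re-parameterization: because convolution is linear, parallel training-time branches can be folded into a single convolution before deployment. The snippet below demonstrates only the simplest case, merging a parallel 1x1 branch into a 3x3 convolution; it illustrates the principle, not the full block:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    conv3 = nn.Conv2d(8, 16, kernel_size=3, padding=1)   # 3x3 branch
    conv1 = nn.Conv2d(8, 16, kernel_size=1)              # parallel 1x1 branch

    x = torch.randn(2, 8, 32, 32)
    y_multi = conv3(x) + conv1(x)                        # training-time two-branch output

    # Fold: zero-pad the 1x1 kernel to 3x3, then add kernels and biases.
    fused_weight = conv3.weight + F.pad(conv1.weight, [1, 1, 1, 1])
    fused_bias = conv3.bias + conv1.bias
    y_single = F.conv2d(x, fused_weight, fused_bias, padding=1)

    print(torch.allclose(y_multi, y_single, atol=1e-5))  # True: same function, one conv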