Normalization techniques in training DNNs: Methodology, analysis and application

L Huang, J Qin, Y Zhou, F Zhu, L Liu… - IEEE transactions on …, 2023 - ieeexplore.ieee.org
Normalization techniques are essential for accelerating the training and improving the
generalization of deep neural networks (DNNs), and have successfully been used in various …

Channel-Equalization-HAR: a light-weight convolutional neural network for wearable sensor based human activity recognition

W Huang, L Zhang, H Wu, F Min… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Recently, human activity recognition (HAR) that uses wearable sensors has become a
research hotspot due to its wide applications in a large variety of real-world scenarios …

CrossNorm and SelfNorm for generalization under distribution shifts

Z Tang, Y Gao, Y Zhu, Z Zhang, M Li… - Proceedings of the …, 2021 - openaccess.thecvf.com
Traditional normalization techniques (e.g., Batch Normalization and Instance Normalization)
generally and simplistically assume that training and test data follow the same distribution …

Delving into the estimation shift of batch normalization in a network

L Huang, Y Zhou, T Wang, J Luo… - Proceedings of the …, 2022 - openaccess.thecvf.com
Batch normalization (BN) is a milestone technique in deep learning. It normalizes the
activation using mini-batch statistics during training but the estimated population statistics …
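The snippet above describes the train/inference asymmetry behind the estimation shift: BN normalizes with mini-batch statistics during training but with estimated population statistics at inference. A minimal NumPy sketch of that mechanism (function names and the momentum value are illustrative, not from the paper):

```python
import numpy as np

def batchnorm_train(x, running_mean, running_var, gamma, beta,
                    momentum=0.9, eps=1e-5):
    """Training mode: normalize with mini-batch statistics and update
    the running (EMA) estimates of the population statistics."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    # Exponential moving averages serve as population-statistic estimates.
    running_mean = momentum * running_mean + (1 - momentum) * mu
    running_var = momentum * running_var + (1 - momentum) * var
    return gamma * x_hat + beta, running_mean, running_var

def batchnorm_eval(x, running_mean, running_var, gamma, beta, eps=1e-5):
    """Inference mode: normalize with the estimated population statistics.
    Any gap between these estimates and the true statistics is the
    'estimation shift' the paper analyzes."""
    return gamma * (x - running_mean) / np.sqrt(running_var + eps) + beta
```

In the training branch each batch is exactly standardized, while the inference branch reuses accumulated estimates, so the two outputs diverge whenever the estimates drift from the data's true statistics.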

Group whitening: Balancing learning efficiency and representational capacity

L Huang, Y Zhou, L Liu, F Zhu… - Proceedings of the IEEE …, 2021 - openaccess.thecvf.com
Batch normalization (BN) is an important technique commonly incorporated into deep
learning models to perform standardization within mini-batches. The merits of BN in …

An efficient sharing grouped convolution via bayesian learning

T Chen, B Duan, Q Sun, M Zhang, G Li… - … on Neural Networks …, 2021 - ieeexplore.ieee.org
Compared with traditional convolutions, grouped convolutional neural networks are
promising for both model performance and network parameters. However, existing models …

Channel equilibrium networks for learning deep representation

W Shao, S Tang, X Pan, P Tan… - … on machine learning, 2020 - proceedings.mlr.press
Convolutional Neural Networks (CNNs) are typically constructed by stacking
multiple building blocks, each of which contains a normalization layer such as batch …

SelfNorm and CrossNorm for out-of-distribution robustness

Z Tang, Y Gao, Y Zhu, Z Zhang, M Li, DN Metaxas - 2021 - openreview.net
Normalization techniques are crucial in stabilizing and accelerating the training of deep
neural networks. However, they are mainly designed for the independent and identically …

EAN: An Efficient Attention Module Guided by Normalization for Deep Neural Networks

J Li, Z Li, Y Wen - Proceedings of the AAAI Conference on Artificial …, 2024 - ojs.aaai.org
Deep neural networks (DNNs) have achieved remarkable success in various fields, and two
powerful techniques, feature normalization and attention mechanisms, have been widely …

SIRe-Networks: Convolutional neural networks architectural extension for information preservation via skip/residual connections and interlaced auto-encoders

D Avola, L Cinque, A Fagioli, GL Foresti - Neural Networks, 2022 - Elsevier
Improving existing neural network architectures can involve several design choices such as
manipulating the loss functions, employing a diverse learning strategy, exploiting gradient …