Normalization techniques in training DNNs: Methodology, analysis and application

L Huang, J Qin, Y Zhou, F Zhu, L Liu… - IEEE transactions on …, 2023 - ieeexplore.ieee.org
Normalization techniques are essential for accelerating the training and improving the
generalization of deep neural networks (DNNs), and have successfully been used in various …

Decentralized riemannian algorithm for nonconvex minimax problems

X Wu, Z Hu, H Huang - Proceedings of the AAAI Conference on Artificial …, 2023 - ojs.aaai.org
Minimax optimization over Riemannian manifolds (with possibly nonconvex constraints) has
been actively applied to solve many problems, such as robust dimensionality reduction and …

Riemannian Hamiltonian methods for min-max optimization on manifolds

A Han, B Mishra, P Jawanpuria, P Kumar, J Gao - SIAM Journal on …, 2023 - SIAM
In this paper, we study min-max optimization problems on Riemannian manifolds. We
introduce a Riemannian Hamiltonian function, minimization of which serves as a proxy for …

On the application of generative adversarial networks for nonlinear modal analysis

G Tsialiamanis, MD Champneys, N Dervilis… - … Systems and Signal …, 2022 - Elsevier
Linear modal analysis is a useful and effective tool for the design and analysis of structures.
However, a comprehensive basis for nonlinear modal analysis remains to be developed. In …

Nonconvex-nonconcave min-max optimization on Riemannian manifolds

A Han, B Mishra, P Jawanpuria, J Gao - Transactions on Machine …, 2023 - openreview.net
This work studies nonconvex-nonconcave min-max problems on Riemannian manifolds. We
first characterize the local optimality of nonconvex-nonconcave problems on manifolds with …

Orthogonal regularizers in deep learning: how to handle rectangular matrices?

E Massart - 2022 26th International Conference on Pattern …, 2022 - ieeexplore.ieee.org
Orthogonal regularizers typically promote column orthonormality of some matrix W ∈ ℝ^{n×p},
by measuring the discrepancy between W^⊤W and the identity according to some matrix …

An Adaptive Orthogonal Convolution Scheme for Efficient and Flexible CNN Architectures

T Boissin, F Mamalet, T Fel, AM Picard… - arXiv preprint arXiv …, 2025 - arxiv.org
Orthogonal convolutional layers are the workhorse of multiple areas in machine learning,
such as adversarial robustness, normalizing flows, GANs, and Lipschitz-constrained models …

Local convergence of min-max algorithms to differentiable equilibrium on Riemannian manifold

S Zhang - arXiv preprint arXiv:2405.13392, 2024 - arxiv.org
We study min-max algorithms to solve zero-sum differentiable games on Riemannian
manifolds. The notions of differentiable Stackelberg equilibrium and differentiable Nash …

Improving weight clipping in Wasserstein GANs

E Massart - 2022 26th International Conference on Pattern …, 2022 - ieeexplore.ieee.org
Weight clipping is a well-known strategy for keeping the Lipschitz constant of the critic
under control in Wasserstein GAN training. After each training iteration, all parameters of the critic …

Normalization in Task-Specific Applications

L Huang - Normalization Techniques in Deep Learning, 2022 - Springer
As previously stated, normalization methods can be wrapped as general modules, which
have been extensively integrated into various DNNs to stabilize and accelerate training …