Faster single-loop algorithms for minimax optimization without strong concavity

J Yang, A Orvieto, A Lucchi… - … Conference on Artificial …, 2022 - proceedings.mlr.press
Gradient descent ascent (GDA), the simplest single-loop algorithm for nonconvex minimax
optimization, is widely used in practical applications such as generative adversarial …
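For readers unfamiliar with the method named in this entry, here is a minimal sketch of simultaneous gradient descent ascent (GDA) on a toy saddle problem; the objective, step sizes, and function names are illustrative assumptions, not taken from the cited paper.

```python
# Simultaneous GDA for min_x max_y f(x, y): the min player descends
# its gradient while the max player ascends. Toy objective (assumed):
# f(x, y) = x*y + 0.1*x**2 - 0.1*y**2, with unique saddle point (0, 0).

def gda(grad_x, grad_y, x, y, eta=0.05, steps=1000):
    for _ in range(steps):
        gx = grad_x(x, y)
        gy = grad_y(x, y)
        x -= eta * gx  # descent step for the min player
        y += eta * gy  # ascent step for the max player
    return x, y

x, y = gda(lambda x, y: y + 0.2 * x,   # df/dx
           lambda x, y: x - 0.2 * y,   # df/dy
           1.0, 1.0)
```

On this strongly-convex-strongly-concave example the iterates contract toward (0, 0); without the regularizing quadratic terms, plain GDA can cycle, which motivates the faster single-loop variants studied in the paper.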

Stable nonconvex-nonconcave training via linear interpolation

T Pethick, W Xie, V Cevher - Advances in Neural …, 2024 - proceedings.neurips.cc
This paper presents a theoretical analysis of linear interpolation as a principled method for
stabilizing (large-scale) neural network training. We argue that instabilities in the …

Solving nonconvex-nonconcave min-max problems exhibiting weak minty solutions

A Böhm - arXiv preprint arXiv:2201.12247, 2022 - arxiv.org
We investigate a structured class of nonconvex-nonconcave min-max problems exhibiting
so-called weak Minty solutions, a notion which was only recently introduced, but is …

Universal gradient descent ascent method for nonconvex-nonconcave minimax optimization

T Zheng, L Zhu, AMC So… - Advances in Neural …, 2023 - proceedings.neurips.cc
Nonconvex-nonconcave minimax optimization has received intense attention over the last
decade due to its broad applications in machine learning. Most existing algorithms rely on …

What is a good metric to study generalization of minimax learners?

A Ozdaglar, S Pattathil, J Zhang… - Advances in Neural …, 2022 - proceedings.neurips.cc
Minimax optimization has served as the backbone of many machine learning problems.
Although the convergence behavior of optimization algorithms has been extensively studied …

Extragradient-Type Methods with Last-Iterate Convergence Rates for Co-Hypomonotone Inclusions

Q Tran-Dinh - arXiv preprint arXiv:2302.04099, 2023 - arxiv.org
We develop two "Nesterov's accelerated" variants of the well-known extragradient method to
approximate a solution of a co-hypomonotone inclusion constituted by the sum of two …

Convergence of the preconditioned proximal point method and Douglas-Rachford splitting in the absence of monotonicity

B Evens, P Pas, P Latafat, P Patrinos - arXiv preprint arXiv:2305.03605, 2023 - arxiv.org
The proximal point algorithm (PPA) is the most widely recognized method for solving
inclusion problems and serves as the foundation for many numerical algorithms. Despite this …

Decentralized gradient descent maximization method for composite nonconvex strongly-concave minimax problems

Y Xu - SIAM Journal on Optimization, 2024 - SIAM
Minimax problems have recently attracted a lot of research interest. A few efforts have been
made to solve decentralized nonconvex strongly-concave (NCSC) minimax-structured …

Near-optimal algorithms for making the gradient small in stochastic minimax optimization

L Chen, L Luo - arXiv preprint arXiv:2208.05925, 2022 - arxiv.org
We study the problem of finding a near-stationary point for smooth minimax optimization.
The recently proposed extra anchored gradient (EAG) methods achieve the optimal …

On the linear convergence of extragradient methods for nonconvex–nonconcave minimax problems

S Hajizadeh, H Lu, B Grimmer - INFORMS Journal on …, 2024 - pubsonline.informs.org
Recently, minimax optimization has received renewed focus due to modern applications in
machine learning, robust optimization, and reinforcement learning. The scale of these …
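To make the method named in this last entry concrete, here is a minimal sketch of the classical extragradient (EG) step: an extrapolation step followed by an update evaluated at the extrapolated point. The bilinear test problem f(x, y) = x*y is an illustrative assumption; plain GDA cycles on it, while EG converges to the saddle (0, 0).

```python
# Extragradient for min_x max_y f(x, y): first take a trial (half) step,
# then update the actual iterate using gradients at the trial point.

def extragradient(grad_x, grad_y, x, y, eta=0.1, steps=2000):
    for _ in range(steps):
        # extrapolation step
        x_half = x - eta * grad_x(x, y)
        y_half = y + eta * grad_y(x, y)
        # update step, using gradients at the extrapolated point
        x = x - eta * grad_x(x_half, y_half)
        y = y + eta * grad_y(x_half, y_half)
    return x, y

# Bilinear saddle f(x, y) = x*y: df/dx = y, df/dy = x.
x, y = extragradient(lambda x, y: y, lambda x, y: x, 1.0, 1.0)
```

The lookahead gradient is what breaks the rotation that stalls GDA on bilinear problems; the cited paper studies when this scheme retains linear convergence outside the monotone setting.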