T Lin, C Jin, M Jordan - arXiv preprint arXiv:2408.11974, 2024 - arxiv.org
We provide a unified analysis of two-timescale gradient descent ascent (TTGDA) for solving structured nonconvex minimax optimization problems in the form of $\min_{\mathbf{x}}\max$ …
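For reference, the defining feature of TTGDA is that the descent step on $\mathbf{x}$ and the ascent step on $\mathbf{y}$ use different step sizes; the display below is a minimal sketch of one such iteration on an objective $f(\mathbf{x},\mathbf{y})$, with step sizes $\eta_{\mathbf{x}}, \eta_{\mathbf{y}}$ introduced here for illustration rather than taken from the cited paper:
$$
\begin{aligned}
\mathbf{x}_{t+1} &= \mathbf{x}_t - \eta_{\mathbf{x}}\,\nabla_{\mathbf{x}} f(\mathbf{x}_t, \mathbf{y}_t),\\
\mathbf{y}_{t+1} &= \mathbf{y}_t + \eta_{\mathbf{y}}\,\nabla_{\mathbf{y}} f(\mathbf{x}_t, \mathbf{y}_t),
\qquad \eta_{\mathbf{y}} \gg \eta_{\mathbf{x}},
\end{aligned}
$$
where the large ratio $\eta_{\mathbf{y}}/\eta_{\mathbf{x}}$ is what makes the two updates run on separate timescales.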
Y Huang, X Li, Y Shen, N He, J Xu - arXiv preprint arXiv:2406.02939, 2024 - arxiv.org
In this paper, we show that applying adaptive methods directly to distributed minimax problems can result in non-convergence due to inconsistency in locally computed adaptive …
We consider double-regularized nonconvex-strongly concave (NCSC) minimax problems of the form $(P):\ \min_{x\in\mathcal{X}}\max_{y\in\mathcal{Y}}\, g(x)+f(x,y)-h(y)$, where $g$ …
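One standard way to accommodate the two regularizers in $(P)$ is a proximal variant of gradient descent ascent; the update below is an illustrative sketch, assuming the proximal mappings of $g$ over $\mathcal{X}$ and of $h$ over $\mathcal{Y}$ are computable, and is not necessarily the algorithm analyzed in this entry:
$$
x_{t+1} = \operatorname{prox}_{\eta_x g + \delta_{\mathcal{X}}}\!\big(x_t - \eta_x \nabla_x f(x_t, y_t)\big),
\qquad
y_{t+1} = \operatorname{prox}_{\eta_y h + \delta_{\mathcal{Y}}}\!\big(y_t + \eta_y \nabla_y f(x_t, y_t)\big),
$$
where $\delta_{\mathcal{X}}$ and $\delta_{\mathcal{Y}}$ denote the indicator functions of the constraint sets, so each proximal step handles the corresponding regularizer and constraint jointly.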
J Yang, H Zhang, Z Xu - arXiv preprint arXiv:2407.21372, 2024 - arxiv.org
Because minimax problems are important in various emerging applications, efficient algorithms for solving them have recently received increasing attention. However, many existing …
K Kim, D Kim - Forty-first International Conference on Machine … - openreview.net
In nonconvex-nonconcave minimax optimization, two-timescale gradient methods have shown their potential to find local minimax (optimal) points, provided that the timescale …
Y Huang, Y Cheng, Y Liang, L Huang - Transactions on Machine Learning … - openreview.net
Online min-max optimization has recently gained considerable interest due to its rich applications to game theory, multi-agent reinforcement learning, online robust learning, etc …