- Biased stochastic first-order methods for conditional stochastic optimization and applications in meta learning. Y. Hu, S. Zhang, X. Chen, N. He. Advances in Neural Information Processing Systems 33, 2759–2770, 2020. Cited by 76*.
- The complexity of nonconvex-strongly-concave minimax optimization. S. Zhang, J. Yang, C. Guzmán, N. Kiyavash, N. He. Uncertainty in Artificial Intelligence, 482–492, 2021. Cited by 67.
- On the convergence rate of stochastic mirror descent for nonsmooth nonconvex optimization. S. Zhang, N. He. arXiv preprint arXiv:1806.04781, 2018. Cited by 61.
- A catalyst framework for minimax optimization. J. Yang, S. Zhang, N. Kiyavash, N. He. Advances in Neural Information Processing Systems 33, 5667–5678, 2020. Cited by 58.
- First-Order Optimization Inspired from Finite-Time Convergent Flows. S. Zhang, M. Benosman, O. Romero, A. Cherian. arXiv preprint arXiv:2010.02990, 2020. Cited by 4*.
- Generalization Bounds of Nonconvex-(Strongly)-Concave Stochastic Minimax Optimization. S. Zhang, Y. Hu, L. Zhang, N. He. International Conference on Artificial Intelligence and Statistics, 694–702, 2024. Cited by 3*.
- Communication-efficient gradient descent-accent methods for distributed variational inequalities: Unified analysis and local updates. S. Zhang, S. Choudhury, S. U. Stich, N. Loizou. arXiv preprint arXiv:2306.05100, 2023. Cited by 2.
- ProxSkip for Stochastic Variational Inequalities: A Federated Learning Algorithm for Provable Communication Acceleration. S. Zhang, N. Loizou. 2022.