| Publication | Cited by | Year |
| --- | --- | --- |
| Greedy quasi-Newton methods with explicit superlinear convergence. A Rodomanov, Y Nesterov. SIAM Journal on Optimization 31 (1), 785-811, 2021 | 61 | 2021 |
| Rates of superlinear convergence for classical quasi-Newton methods. A Rodomanov, Y Nesterov. Mathematical Programming, 1-32, 2022 | 56 | 2022 |
| Putting MRFs on a tensor train. A Novikov, A Rodomanov, A Osokin, D Vetrov. International Conference on Machine Learning, 811-819, 2014 | 54 | 2014 |
| New results on superlinear convergence of classical quasi-Newton methods. A Rodomanov, Y Nesterov. Journal of Optimization Theory and Applications 188, 744-769, 2021 | 53 | 2021 |
| A superlinearly-convergent proximal Newton-type method for the optimization of finite sums. A Rodomanov, D Kropotov. International Conference on Machine Learning, 2597-2605, 2016 | 47 | 2016 |
| Primal-dual method for searching equilibrium in hierarchical congestion population games. P Dvurechensky, A Gasnikov, E Gasnikova, S Matsievsky, A Rodomanov, et al. arXiv preprint arXiv:1606.08988, 2016 | 36 | 2016 |
| A randomized coordinate descent method with volume sampling. A Rodomanov, D Kropotov. SIAM Journal on Optimization 30 (3), 1878-1904, 2020 | 13 | 2020 |
| Smoothness parameter of power of Euclidean norm. A Rodomanov, Y Nesterov. Journal of Optimization Theory and Applications 185, 303-326, 2020 | 8 | 2020 |
| Subgradient ellipsoid method for nonsmooth convex problems. A Rodomanov, Y Nesterov. Mathematical Programming 199 (1), 305-341, 2023 | 3 | 2023 |
| Quasi-Newton methods with provable efficiency guarantees. A Rodomanov. PhD thesis, UCL-Université Catholique de Louvain, 2022 | 2 | 2022 |
| Polynomial preconditioning for gradient methods. N Doikov, A Rodomanov. International Conference on Machine Learning, 8162-8187, 2023 | 1 | 2023 |
| Universality of AdaGrad stepsizes for stochastic optimization: inexact oracle, acceleration and variance reduction. A Rodomanov, X Jiang, S Stich. arXiv preprint arXiv:2406.06398, 2024 | | 2024 |
| Global complexity analysis of BFGS. A Rodomanov. arXiv preprint arXiv:2404.15051, 2024 | | 2024 |
| Federated optimization with doubly regularized drift correction. X Jiang, A Rodomanov, SU Stich. arXiv preprint arXiv:2404.08447, 2024 | | 2024 |
| Non-convex stochastic composite optimization with Polyak momentum. Y Gao, A Rodomanov, SU Stich. arXiv preprint arXiv:2403.02967, 2024 | | 2024 |
| Universal gradient methods for stochastic convex optimization. A Rodomanov, A Kavis, Y Wu, K Antonakopoulos, V Cevher. arXiv preprint arXiv:2402.03210, 2024 | | 2024 |
| Gradient methods for stochastic optimization in relative scale. Y Nesterov, A Rodomanov. arXiv preprint arXiv:2301.08352, 2023 | | 2023 |
| Linear coupling of gradient and mirror descent: version for composite functions with adaptive estimation of the Lipschitz constant. A Rodomanov, 2016 | | 2016 |