| Title | Authors | Venue | Cited by | Year |
|---|---|---|---|---|
| Stochastic Subspace Cubic Newton Method | F Hanzely, N Doikov, P Richtárik, Y Nesterov | ICML 2020 (International Conference on Machine Learning) | 49 | 2020 |
| Randomized Block Cubic Newton Method | N Doikov, P Richtárik | ICML 2018 (International Conference on Machine Learning) | 42 | 2018 |
| Minimizing Uniformly Convex Functions by Cubic Regularization of Newton Method | N Doikov, Y Nesterov | Journal of Optimization Theory and Applications | 39 | 2021 |
| Gradient regularization of Newton method with Bregman distances | N Doikov, Y Nesterov | Mathematical Programming, 1-25 | 34 | 2023 |
| Contracting Proximal Methods for Smooth Convex Optimization | N Doikov, Y Nesterov | SIAM Journal on Optimization 30 (4) | 30 | 2020 |
| Local convergence of tensor methods | N Doikov, Y Nesterov | Mathematical Programming | 27 | 2021 |
| Inexact Tensor Methods with Dynamic Accuracies | N Doikov, Y Nesterov | ICML 2020 (International Conference on Machine Learning) | 25 | 2020 |
| Super-Universal Regularized Newton Method | N Doikov, K Mishchenko, Y Nesterov | SIAM Journal on Optimization 34 (1), 27-56 | 20 | 2024 |
| Second-order optimization with lazy Hessians | N Doikov, EM Chayti, M Jaggi | ICML 2023 (International Conference on Machine Learning) | 14 | 2022 |
| Affine-invariant contracting-point methods for Convex Optimization | N Doikov, Y Nesterov | Mathematical Programming, 1-23 | 13 | 2022 |
| High-Order Optimization Methods for Fully Composite Problems | N Doikov, Y Nesterov | SIAM Journal on Optimization 32 (3), 2402-2427 | 8 | 2022 |
| Convex optimization based on global lower second-order models | N Doikov, Y Nesterov | NeurIPS 2020 (Advances in Neural Information Processing Systems 33) | 8 | 2020 |
| New second-order and tensor methods in Convex Optimization | N Doikov | Université catholique de Louvain | 7 | 2021 |
| On Convergence of Incremental Gradient for Non-Convex Smooth Functions | A Koloskova, N Doikov, SU Stich, M Jaggi | ICML 2024 (International Conference on Machine Learning) | 5* | 2023 |
| First and zeroth-order implementations of the regularized Newton method with lazy approximated Hessians | N Doikov, GN Grapiglia | arXiv preprint arXiv:2309.02412 | 3 | 2023 |
| Multi-criteria and multimodal probabilistic topic models of text document collections | KV Vorontsov, AA Potapenko, AI Frey, MA Apishev, NV Doikov, ... | 10th International Conference ИОИ, 198 | 3 | 2014 |
| Linearization Algorithms for Fully Composite Optimization | ML Vladarean, N Doikov, M Jaggi, N Flammarion | COLT 2023 (Conference on Learning Theory) | 2 | 2023 |
| Optimization Methods for Fully Composite Problems | N Doikov, Y Nesterov | arXiv preprint arXiv:2103.12632 | 2 | 2021 |
| Cubic regularized subspace Newton for non-convex optimization | J Zhao, A Lucchi, N Doikov | arXiv preprint arXiv:2406.16666 | 1 | 2024 |
| Minimizing Quasi-Self-Concordant Functions by Gradient Regularization of Newton Method | N Doikov | arXiv preprint arXiv:2308.14742 | 1 | 2023 |