Amit Attia
Verified email at mail.tau.ac.il - Homepage
Title · Cited by · Year
SGD with AdaGrad stepsizes: Full adaptivity with high probability to unknown parameters, unbounded gradients and affine variance
A Attia, T Koren
International Conference on Machine Learning, 1147-1171, 2023
15 · 2023
Algorithmic instabilities of accelerated gradient descent
A Attia, T Koren
Advances in Neural Information Processing Systems 34, 1204-1214, 2021
11 · 2021
Uniform stability for first-order empirical risk minimization
A Attia, T Koren
Conference on Learning Theory, 3313-3332, 2022
5 · 2022
How Free is Parameter-Free Stochastic Optimization?
A Attia, T Koren
arXiv preprint arXiv:2402.03126, 2024
2 · 2024
Faster Stochastic Optimization with Arbitrary Delays via Asynchronous Mini-Batching
A Attia, O Gaash, T Koren
arXiv preprint arXiv:2408.07503, 2024
· 2024
A Note on High-Probability Analysis of Algorithms with Exponential, Sub-Gaussian, and General Light Tails
A Attia, T Koren
arXiv preprint arXiv:2403.02873, 2024
· 2024
Articles 1–6