Optimal Complexity in Decentralized Training

Y Lu, C De Sa - International Conference on Machine Learning (ICML), 2021 - proceedings.mlr.press (also available as arXiv preprint arXiv:2006.08085, 2020 - arxiv.org)

Decentralization is a promising method of scaling up parallel machine learning systems. In this paper, we provide a tight lower bound on the iteration complexity for such methods in a …
