R Bitar, M Wootters… - IEEE Journal on Selected …, 2020 - ieeexplore.ieee.org
We consider distributed gradient descent in the presence of stragglers. Recent work on gradient coding and approximate gradient coding has shown how to add redundancy in …
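As background for this entry, the sketch below illustrates the basic gradient-coding idea with a fractional repetition code in the style of Tandon et al.: groups of s+1 workers share the same s+1 data partitions, so the master recovers the exact gradient sum despite any s stragglers. This is a minimal illustration under assumed toy parameters, not the specific scheme of the paper above.

```python
import numpy as np

rng = np.random.default_rng(0)
n, s, d = 6, 2, 4                 # n workers, tolerate any s stragglers, gradient dim d
assert n % (s + 1) == 0

# one data partition per worker; g[i] is the partial gradient of partition i
g = rng.standard_normal((n, d))

# fractional repetition assignment: each group of s+1 workers is assigned
# the same s+1 partitions and sends the sum of those partial gradients
group = lambda w: w // (s + 1)
messages = np.stack([
    g[group(w) * (s + 1):(group(w) + 1) * (s + 1)].sum(axis=0)
    for w in range(n)
])

# any s workers straggle; every group still has at least one survivor
stragglers = set(rng.choice(n, size=s, replace=False))
decoded = np.zeros(d)
for gid in range(n // (s + 1)):
    survivor = next(w for w in range(gid * (s + 1), (gid + 1) * (s + 1))
                    if w not in stragglers)
    decoded += messages[survivor]

assert np.allclose(decoded, g.sum(axis=0))  # exact gradient sum recovered
```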
Coded computation techniques provide robustness against straggling workers in distributed computing. However, most of the existing schemes require exact provisioning of the …
When gradient descent (GD) is scaled to many parallel workers for large-scale machine learning applications, its per-iteration computation time is limited by straggling workers …
T Jahani-Nezhad… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
One of the major challenges in using distributed learning to train complicated models on large data sets is dealing with the straggler effect. As a solution, coded computation has been …
T Jahani-Nezhad… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
In this paper, we propose CodedSketch, a distributed straggler-resistant scheme to compute an approximation of the product of two massive matrices. The objective is to …
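CodedSketch combines count-sketches with structured codes; the toy below shows only the sketching ingredient, not the paper's construction. A count-sketch matrix S with one random ±1 entry per row satisfies E[S Sᵀ] = I, so (A S)(Sᵀ B) is an unbiased approximation of A B. The dimensions and sketch size are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 1000, 200                  # inner dimension d sketched down to k
A = rng.standard_normal((50, d))
B = rng.standard_normal((d, 40))

# count-sketch matrix: one random +/-1 entry per row, so E[S @ S.T] = I
S = np.zeros((d, k))
S[np.arange(d), rng.integers(0, k, size=d)] = rng.choice([-1.0, 1.0], size=d)

approx = (A @ S) @ (S.T @ B)      # unbiased estimate of A @ B
exact = A @ B
rel_err = np.linalg.norm(approx - exact) / np.linalg.norm(exact)
print(f"relative Frobenius error: {rel_err:.3f}")
```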
S Dutta, J Wang, G Joshi - IEEE Journal on Selected Areas in …, 2021 - ieeexplore.ieee.org
Distributed Stochastic Gradient Descent (SGD), when run in a synchronous manner, suffers from delays in runtime as it waits for the slowest workers (stragglers). Asynchronous …
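A toy simulation of the runtime side of this trade-off, assuming i.i.d. exponential worker delays (an assumption for illustration, not the paper's model): fully synchronous SGD pays the maximum of n delays per iteration, while a K-sync variant that waits only for the K fastest workers pays the K-th order statistic.

```python
import numpy as np

rng = np.random.default_rng(0)
n, iters = 20, 10_000
delays = rng.exponential(1.0, size=(iters, n))  # per-worker iteration times

# K-sync SGD: each iteration ends when the K fastest workers have finished;
# K = n is fully synchronous SGD (wait for the slowest worker)
for k in (n, 15, 10, 5):
    k_sync = np.sort(delays, axis=1)[:, k - 1].mean()
    print(f"K={k:2d}: mean per-iteration time {k_sync:.2f}")
```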
Federated Learning is an emerging learning paradigm that allows training models from samples distributed across a large network of clients while respecting privacy and …
M Glasgow, M Wootters - IEEE Journal on Selected Areas in …, 2021 - ieeexplore.ieee.org
Gradient codes use data replication to mitigate the effect of straggling machines in distributed machine learning. Approximate gradient codes consider codes where the data …
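To make "approximate" concrete: a minimal sketch, assuming random r-fold replication of data partitions (an illustrative layout, not the codes analyzed in the paper), estimating what fraction of the gradient sum survives as the number of stragglers grows beyond what replication can guarantee.

```python
import numpy as np

rng = np.random.default_rng(1)
n_workers, n_parts, r = 30, 30, 3   # each partition replicated r times

# random replication: each partition is stored on r distinct workers
holders = [rng.choice(n_workers, size=r, replace=False) for _ in range(n_parts)]

for n_straggle in (0, 5, 10, 15, 20):
    fracs = []
    for _ in range(2000):
        down = set(rng.choice(n_workers, size=n_straggle, replace=False))
        # a partition contributes iff at least one of its holders survives
        alive = sum(any(w not in down for w in h) for h in holders)
        fracs.append(alive / n_parts)
    print(f"{n_straggle:2d} stragglers -> {np.mean(fracs):.3f} of partitions recovered")
```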
H Jeong, A Devulapalli, VR Cadambe… - IEEE Journal on …, 2021 - ieeexplore.ieee.org
We study coded distributed matrix multiplication from an approximate recovery viewpoint. We consider a system of computation nodes where each node stores a fraction of each multiplicand …
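For exact-recovery background to this entry, the sketch below implements a polynomial code in the style of Yu, Maddah-Ali, and Avestimehr, where each worker stores a 1/p fraction of A and a 1/q fraction of B and any pq responses suffice to recover A @ B. The splits, evaluation points, and responder set are assumptions for illustration; this is not the approximate scheme studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
p, q = 2, 2                       # A split into p row blocks, B into q column blocks
n_workers, need = 6, p * q        # any p*q of the n workers suffice

A = rng.standard_normal((4, 8))   # (M x K)
B = rng.standard_normal((8, 6))   # (K x N)
A_blocks = np.split(A, p, axis=0)
B_blocks = np.split(B, q, axis=1)

# encode: worker k stores A~(x_k) = sum_i A_i x^i and B~(x_k) = sum_j B_j x^(p*j)
xs = np.arange(1.0, n_workers + 1)
enc_A = [sum(Ai * x**i for i, Ai in enumerate(A_blocks)) for x in xs]
enc_B = [sum(Bj * x**(p * j) for j, Bj in enumerate(B_blocks)) for x in xs]

# each worker multiplies its two small coded blocks
results = [a @ b for a, b in zip(enc_A, enc_B)]

# master: take any `need` responders and interpolate the degree-(p*q - 1)
# matrix polynomial whose coefficients are exactly the blocks A_i @ B_j
survivors = [0, 2, 3, 5]          # pretend these four responded first
V = np.vander(xs[survivors], need, increasing=True)
stacked = np.stack([results[k].ravel() for k in survivors])
coeffs = np.linalg.solve(V, stacked)

blk = results[0].shape
C_hat = np.block([[coeffs[i + p * j].reshape(blk) for j in range(q)]
                  for i in range(p)])
assert np.allclose(C_hat, A @ B)  # exact product from any p*q responses
```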