We review the use of differential privacy (DP) for privacy protection in machine learning (ML). We show that, driven by the aim of preserving the accuracy of the learned models, DP …
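Since this snippet concerns the mechanics of DP in ML training, a minimal sketch of the usual clip-and-noise gradient step may help; the clipping bound, noise multiplier, and the function name `dp_noisy_mean_gradient` are illustrative assumptions, not details taken from the review.

```python
import numpy as np

def dp_noisy_mean_gradient(per_example_grads, clip_norm=1.0,
                           noise_multiplier=1.1, rng=None):
    """Clip each per-example gradient to `clip_norm`, average, and add
    Gaussian noise calibrated to the clipping bound (the DP-SGD recipe)."""
    if rng is None:
        rng = np.random.default_rng(0)
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    mean = np.mean(clipped, axis=0)
    # Gaussian mechanism on the mean: std = sigma * C / batch_size.
    noise = rng.normal(0.0, noise_multiplier * clip_norm / len(clipped),
                       size=mean.shape)
    return mean + noise
```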
The growing volume of data generated by smartphones and IoT devices has motivated the development of Federated Learning (FL), a framework for on-device collaborative training of …
We introduce ProxSkip, a surprisingly simple and provably efficient method for minimizing the sum of a smooth ($f$) and an expensive nonsmooth proximable ($\psi$) function. The …
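For concreteness, here is a sketch of the ProxSkip-style iteration in the abstract's setting (smooth $f$ plus proximable $\psi$): a gradient step on $f$ at every iteration, but the expensive prox only with probability $p$, corrected by a control variate. The L1 regularizer standing in for $\psi$, the toy quadratic $f$, and all hyperparameters are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, tau):
    """Prox of tau * ||.||_1, a cheap stand-in for the proximable psi."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proxskip(grad_f, prox, x0, gamma, p=0.2, iters=1000, seed=0):
    """ProxSkip-style loop: the prox is applied only with probability p,
    and a control variate h keeps the skipped steps from biasing the
    fixed point."""
    rng = np.random.default_rng(seed)
    x, h = x0.copy(), np.zeros_like(x0)
    for _ in range(iters):
        x_hat = x - gamma * (grad_f(x) - h)
        if rng.random() < p:                    # rare, expensive prox step
            x_new = prox(x_hat - (gamma / p) * h, gamma / p)
        else:                                   # cheap skipped step
            x_new = x_hat
        h = h + (p / gamma) * (x_new - x_hat)   # control-variate update
        x = x_new
    return x

# Toy usage: minimize 0.5*||A x - b||^2 + lam*||x||_1.
A = np.random.default_rng(1).normal(size=(20, 5))
b = A @ np.ones(5)
lam = 0.1
gamma = 1.0 / np.linalg.norm(A.T @ A, 2)        # step size <= 1/L
x_star = proxskip(lambda x: A.T @ (A @ x - b),
                  lambda v, tau: soft_threshold(v, lam * tau),
                  x0=np.zeros(5), gamma=gamma)
```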
Federated learning (FL) is a machine learning setting where many clients (e.g., mobile devices or whole organizations) collaboratively train a model under the orchestration of a …
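A minimal sketch of one such orchestrated round, in the FedAvg style commonly used in this setting, is below; the least-squares local objective, client count, and local epoch count are illustrative assumptions rather than details from this abstract.

```python
import numpy as np

def local_sgd(w, X, y, lr=0.01, epochs=5):
    """Client-side update: a few epochs of gradient descent on the
    client's local least-squares objective."""
    for _ in range(epochs):
        w = w - lr * X.T @ (X @ w - y) / len(y)
    return w

def fedavg_round(w_global, clients, lr=0.01):
    """One round: each client trains locally starting from the global
    model; the server averages the results weighted by local data size."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_sgd(w_global.copy(), X, y, lr))
        sizes.append(len(y))
    return np.average(updates, axis=0, weights=np.array(sizes, dtype=float))

# Toy usage: three clients with heterogeneous local data.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(30, 4)), rng.normal(size=30)) for _ in range(3)]
w = np.zeros(4)
for _ in range(10):
    w = fedavg_round(w, clients)
```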
Decentralized stochastic optimization methods have recently attracted considerable attention, mainly because of their cheap per-iteration cost, data locality, and communication efficiency. In …
Many large-scale machine learning (ML) applications need to perform decentralized learning over datasets generated at different devices and locations. Such datasets pose a …
We consider decentralized stochastic optimization with the objective function (e.g., data samples for machine learning tasks) being distributed over n machines that can only …
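The neighbor-only communication model in the last three snippets can be made concrete with a short decentralized SGD sketch: each machine alternates a local gradient step with a gossip average over its neighbors. The ring topology, its mixing matrix, and the quadratic local losses are illustrative assumptions.

```python
import numpy as np

def ring_mixing_matrix(n):
    """Doubly stochastic mixing matrix for a ring: each node averages
    itself with its two neighbors."""
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 0.5
        W[i, (i - 1) % n] = 0.25
        W[i, (i + 1) % n] = 0.25
    return W

def decentralized_sgd(grads, x0, n, gamma=0.05, iters=500):
    """Each machine keeps its own iterate; one step = local gradient
    step followed by neighbor-only (gossip) averaging."""
    W = ring_mixing_matrix(n)
    X = np.tile(x0, (n, 1))                 # row i = iterate of machine i
    for _ in range(iters):
        G = np.stack([grads[i](X[i]) for i in range(n)])
        X = W @ (X - gamma * G)             # mix only with neighbors
    return X.mean(axis=0)

# Toy usage: local quadratics centered at c_i; the consensus solution
# is the mean of the centers.
n, d = 5, 3
centers = np.random.default_rng(2).normal(size=(n, d))
grads = [lambda x, c=centers[i]: x - c for i in range(n)]
x_bar = decentralized_sgd(grads, np.zeros(d), n)
```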
Recent developments on large-scale distributed machine learning applications, e.g., deep neural networks, benefit enormously from the advances in distributed non-convex …
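A common communication-efficient scheme in this line of work is local momentum SGD with periodic averaging of both models and momentum buffers; the sketch below assumes an averaging period, momentum coefficient, and toy quadratic losses that are illustrative, not taken from the paper.

```python
import numpy as np

def periodic_momentum_sgd(grads, x0, n, lr=0.05, beta=0.9,
                          period=8, rounds=50):
    """Each of n workers runs momentum SGD locally; every `period` steps
    all workers average both models and momentum buffers, cutting
    communication to 1/period of fully synchronous SGD."""
    X = np.tile(x0, (n, 1))
    M = np.zeros_like(X)                    # per-worker momentum buffers
    for t in range(rounds * period):
        for i in range(n):
            M[i] = beta * M[i] + grads[i](X[i])
            X[i] = X[i] - lr * M[i]
        if (t + 1) % period == 0:           # infrequent synchronization
            X[:] = X.mean(axis=0)
            M[:] = M.mean(axis=0)
    return X.mean(axis=0)

# Toy usage: quadratic losses with worker-specific minima.
n, d = 4, 3
centers = np.random.default_rng(3).normal(size=(n, d))
grads = [lambda x, c=centers[i]: x - c for i in range(n)]
x_out = periodic_momentum_sgd(grads, np.zeros(d), n)
```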
Training foundation models, such as GPT-3 and PaLM, can be extremely expensive, often involving tens of thousands of GPUs running continuously for months. These models are …