Byzantine-robust distributed learning: Towards optimal statistical rates

D Yin, Y Chen, R Kannan… - … conference on machine …, 2018 - proceedings.mlr.press
In this paper, we develop distributed optimization algorithms that are provably robust against
Byzantine failures—arbitrary and potentially adversarial behavior, in distributed computing …
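Yin et al.'s algorithms aggregate worker gradients with coordinate-wise median (and trimmed-mean) rules rather than averaging, so that a minority of arbitrarily corrupted reports cannot drag the update. A minimal sketch of coordinate-wise median aggregation (pure Python; the worker values are illustrative, not from the paper):

```python
from statistics import median

def coordinate_wise_median(gradients):
    """Aggregate worker gradient vectors by taking the median of each
    coordinate independently; a minority of Byzantine workers sending
    arbitrary vectors cannot move the result far from the honest ones."""
    return [median(coords) for coords in zip(*gradients)]

# Three honest workers near the true gradient, one Byzantine worker:
honest = [[0.9, 2.0], [1.0, 2.1], [1.1, 1.9]]
byzantine = [[1e9, -1e9]]
agg = coordinate_wise_median(honest + byzantine)
# agg stays close to the honest gradients despite the outlier.
```

With simple averaging the single Byzantine report would dominate both coordinates; the median discards it at each coordinate independently.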

Byzantine-resilient secure federated learning

J So, B Güler, AS Avestimehr - IEEE Journal on Selected Areas …, 2020 - ieeexplore.ieee.org
Secure federated learning is a privacy-preserving framework to improve machine learning
models by training over large volumes of data collected by mobile users. This is achieved …

Machine learning with adversaries: Byzantine tolerant gradient descent

P Blanchard, EM El Mhamdi… - Advances in neural …, 2017 - proceedings.neurips.cc
We study the resilience to Byzantine failures of distributed implementations of Stochastic
Gradient Descent (SGD). So far, distributed machine learning frameworks have largely …
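Blanchard et al.'s proposed aggregation rule, Krum, selects the single worker update whose summed squared distance to its n − f − 2 nearest other updates is smallest, tolerating up to f Byzantine workers. A minimal sketch (pure Python; the example vectors are illustrative):

```python
def krum(vectors, f):
    """Krum selection: return the update whose summed squared distance
    to its n - f - 2 closest other updates is smallest, so updates far
    from the honest cluster (up to f Byzantine ones) are never chosen."""
    n = len(vectors)
    scores = []
    for i, v in enumerate(vectors):
        dists = sorted(
            sum((a - b) ** 2 for a, b in zip(v, w))
            for j, w in enumerate(vectors)
            if j != i
        )
        scores.append(sum(dists[: n - f - 2]))
    return vectors[min(range(n), key=scores.__getitem__)]

# Three clustered honest updates and one distant Byzantine update:
updates = [[1.0, 1.0], [1.1, 0.9], [0.9, 1.1], [100.0, 100.0]]
chosen = krum(updates, f=1)  # an honest update from the cluster
```

The Byzantine vector scores poorly because even its nearest neighbours are far away, so Krum always returns one of the clustered honest updates.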

Distributed statistical machine learning in adversarial settings: Byzantine gradient descent

Y Chen, L Su, J Xu - Proceedings of the ACM on Measurement and …, 2017 - dl.acm.org
We consider the distributed statistical learning problem over decentralized systems that are
prone to adversarial attacks. This setup arises in many practical applications, including …

Robust federated learning in a heterogeneous environment

A Ghosh, J Hong, D Yin, K Ramchandran - arXiv preprint arXiv …, 2019 - arxiv.org
We study a recently proposed large-scale distributed learning paradigm, namely Federated
Learning, where the worker machines are end users' own devices. Statistical and …

When machine learning meets blockchain: A decentralized, privacy-preserving and secure design

X Chen, J Ji, C Luo, W Liao, P Li - 2018 IEEE international …, 2018 - ieeexplore.ieee.org
With the onset of the big data era, there is a dire need for efficient and effective machine learning
algorithms to analyze large-scale data. In practice, data is typically generated …

Eiffel: Ensuring integrity for federated learning

A Roy Chowdhury, C Guo, S Jha… - Proceedings of the 2022 …, 2022 - dl.acm.org
Federated learning (FL) enables clients to collaborate with a server to train a machine
learning model. To ensure privacy, the server performs secure aggregation of updates from …

ByRDiE: Byzantine-resilient distributed coordinate descent for decentralized learning

Z Yang, WU Bajwa - IEEE Transactions on Signal and …, 2019 - ieeexplore.ieee.org
Distributed machine learning algorithms enable learning of models from datasets that are
distributed over a network without gathering the data at a centralized location. While efficient …

Coded computing: Mitigating fundamental bottlenecks in large-scale distributed computing and machine learning

S Li, S Avestimehr - Foundations and Trends® in …, 2020 - nowpublishers.com
We introduce the concept of “coded computing”, a novel computing paradigm that utilizes
coding theory to effectively inject and leverage data/computation redundancy to mitigate …

Distributed gradient descent algorithm robust to an arbitrary number of Byzantine attackers

X Cao, L Lai - IEEE Transactions on Signal Processing, 2019 - ieeexplore.ieee.org
Due to the growth of modern dataset size and the desire to harness computing power of
multiple machines, there is a recent surge of interest in the design of distributed machine …