We consider the critical problems of distributed computing and learning over data while keeping it private from the computational servers. The state-of-the-art approaches to this …
How can a machine learning model be trained while keeping the data private and secure? We present CodedPrivateML, a fast and scalable approach to this critical problem …
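CodedPrivateML keeps the training data hidden from the compute servers by secret-sharing it before dispatch (the paper uses a Lagrange-coded scheme; the snippet below is a simpler additive-sharing sketch of the same idea, with all names and the modulus chosen for illustration). No single server's share reveals the data, yet the servers can jointly evaluate linear operations on it:

```python
import secrets

P = 2**61 - 1  # large prime modulus (illustrative choice)

def share(x, n):
    """Additively secret-share integer x mod P among n servers."""
    parts = [secrets.randbelow(P) for _ in range(n - 1)]
    parts.append((x - sum(parts)) % P)
    return parts

def reconstruct(parts):
    """Recover the secret by summing all shares mod P."""
    return sum(parts) % P

# Each server applies a linear map to its own share; summing the
# servers' outputs reconstructs the linear map of the secret.
x = 123456789
shares = share(x, 4)
assert reconstruct(shares) == x
assert reconstruct([(3 * s) % P for s in shares]) == (3 * x) % P
```

Nonlinear steps (e.g. the sigmoid in logistic regression) are what coded schemes like CodedPrivateML additionally handle, typically via polynomial approximation.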
Two major challenges in distributed learning and estimation are 1) preserving the privacy of the local samples; and 2) communicating them efficiently to a central server, while achieving …
K Talwar - arXiv preprint arXiv:2202.10618, 2022 - arxiv.org
Computing the noisy sum of real-valued vectors is an important primitive in differentially private learning and statistics. In private federated learning applications, these vectors are …
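The noisy-sum primitive referenced here is usually realized with the Gaussian mechanism: each vector is clipped to a fixed L2 norm (bounding the sum's sensitivity) and Gaussian noise scaled to that bound is added to the total. A minimal sketch, with function and parameter names chosen for illustration rather than taken from the paper:

```python
import numpy as np

def noisy_sum(vectors, clip_norm=1.0, noise_mult=1.0, rng=None):
    """Clip each vector to L2 norm clip_norm, sum, add Gaussian noise.

    Noise std = noise_mult * clip_norm, since clip_norm bounds the
    L2 sensitivity of the clipped sum to any one vector.
    """
    rng = np.random.default_rng(rng)
    clipped = []
    for v in vectors:
        norm = np.linalg.norm(v)
        scale = min(1.0, clip_norm / norm) if norm > 0 else 1.0
        clipped.append(v * scale)
    total = np.sum(clipped, axis=0)
    return total + rng.normal(0.0, noise_mult * clip_norm, size=total.shape)
```

In private federated learning these vectors are per-client model updates, and much of the work (including the paper above) concerns encoding them compactly before aggregation.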
G Yan, T Li, T Lan, K Wu, L Song - arXiv preprint arXiv:2312.07060, 2023 - arxiv.org
Next-generation wireless networks, such as edge intelligence and wireless distributed learning, face two critical challenges: communication efficiency and privacy protection. In …
Strict privacy is of paramount importance in distributed machine learning. Federated learning, with the main idea of communicating only what is needed for learning, has been …
Distributed learning allows a group of independent data owners to collaboratively learn a model over their data sets without exposing their private data. We present a distributed …
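The "communicate only what is needed" idea behind these federated approaches can be sketched as federated averaging: each data owner runs local gradient steps and ships only its updated weights, never its raw samples, and the server averages the models. This is a generic illustration (least-squares objective, hypothetical names), not the specific protocol of either paper:

```python
import numpy as np

def local_update(w, X, y, lr=0.1, steps=10):
    """One client's local gradient steps on a least-squares loss.

    Only the updated weight vector leaves the client; (X, y) stay local.
    """
    w = w.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg_round(w, clients):
    """Server broadcasts w, collects local updates, and averages them."""
    updates = [local_update(w, X, y) for X, y in clients]
    return np.mean(updates, axis=0)
```

Note that the shared updates can still leak information about the local data, which is why the schemes above layer secret sharing or differential privacy on top.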
C Li, P Zhou, L Xiong, Q Wang… - IEEE transactions on …, 2018 - ieeexplore.ieee.org
In the big data era, the generation of data presents some new characteristics, including wide distribution, high velocity, high dimensionality, and privacy concerns. To address these …