Authors
Ruixuan Liu, Yang Cao, Masatoshi Yoshikawa, Hong Chen
Publication date
2020
Conference
Database Systems for Advanced Applications: 25th International Conference, DASFAA 2020, Jeju, South Korea, September 24–27, 2020, Proceedings, Part I
Pages
485-501
Publisher
Springer International Publishing
Description
As massive data are produced from small gadgets, federated learning on mobile devices has become an emerging trend. In the federated setting, Stochastic Gradient Descent (SGD) has been widely used in federated learning for various machine learning models. To prevent privacy leakages from gradients that are calculated on users’ sensitive data, local differential privacy (LDP) has been considered as a privacy guarantee in federated SGD recently. However, the existing solutions have a dimension dependency problem: the injected noise is substantially proportional to the dimension d. In this work, we propose a two-stage framework FedSel for federated SGD under LDP to relieve this problem. Our key idea is that not all dimensions are equally important so that we privately select Top-k dimensions according to their contributions in each iteration of federated SGD. Specifically, we propose three private …
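To make the two-stage idea sketched in the abstract concrete, below is a minimal client-side illustration, assuming NumPy, an exponential-mechanism-style selection stage and a Duchi-style one-bit value perturbation. The function name `ldp_topk_sgd_step`, the even privacy-budget split, and the specific mechanisms are illustrative assumptions; they are not the paper's three proposed selection mechanisms.

```python
import numpy as np

def ldp_topk_sgd_step(grad, k, epsilon, clip=1.0, rng=None):
    """One client-side step of LDP federated SGD with private dimension
    selection, in the spirit of a two-stage (select, then perturb) design.
    Illustrative sketch only, not the paper's exact mechanisms."""
    rng = rng or np.random.default_rng()
    d = grad.shape[0]

    # Naive even split of the privacy budget between the two stages (assumption).
    eps_sel, eps_val = epsilon / 2, epsilon / 2

    # Stage 1: privately pick one dimension, biased toward the Top-k
    # largest-magnitude coordinates (exponential-mechanism-style scores).
    topk = np.argsort(np.abs(grad))[-k:]
    scores = np.where(np.isin(np.arange(d), topk), 1.0, 0.0)
    probs = np.exp(eps_sel * scores / 2)
    probs /= probs.sum()
    j = rng.choice(d, p=probs)

    # Stage 2: clip the selected value and randomize it with a standard
    # one-bit LDP mechanism, scaled so the report is unbiased.
    v = np.clip(grad[j], -clip, clip)
    p = 0.5 + (v / (2 * clip)) * (np.exp(eps_val) - 1) / (np.exp(eps_val) + 1)
    bit = 1.0 if rng.random() < p else -1.0
    v_tilde = bit * clip * (np.exp(eps_val) + 1) / (np.exp(eps_val) - 1)

    # The client reports only one (dimension, noisy value) pair, so the
    # injected noise no longer scales with the full dimension d.
    return j, v_tilde
```

The point of the sketch is the dimension-dependency argument from the abstract: because each client reports a single privately selected coordinate rather than a noisy d-dimensional vector, the perturbation error does not grow proportionally with d.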
Total citations
Citations by year: 2020: 2, 2021: 20, 2022: 36, 2023: 39, 2024: 25
Scholar articles
R Liu, Y Cao, M Yoshikawa, H Chen - Database Systems for Advanced Applications: 25th …, 2020