Authors
Olusola Odeyomi, Gergely Zaruba
Publication date
2021/7/12
Conference
2021 IEEE International Symposium on Information Theory (ISIT)
Pages
1308-1313
Publisher
IEEE
Description
This paper discusses a fully decentralized online federated learning setting with long-term constraints. The fully decentralized setting removes the communication and computational bottlenecks associated with a central server communicating with a large number of clients. Online learning is also introduced to the federated learning setting to capture situations with time-varying data distributions. Practical federated learning settings are subject to long-term constraints such as energy, monetary cost, and time constraints. The clients are not obligated to satisfy any per-round constraint, but they must satisfy these long-term constraints. To provide privacy for the shared local model updates of the clients, local differential privacy is introduced. An online mirror descent-based algorithm is proposed and its regret bound is obtained. The regret bound is compared with the regret bound of a differentially-private …
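The description mentions three ingredients: an online mirror descent update, long-term constraints that need not hold every round, and local differential privacy applied to shared model updates. The sketch below is a hypothetical illustration of how these pieces typically fit together on a single client, not the paper's algorithm; the function names, the Euclidean mirror map, the primal-dual treatment of the constraint, the toy quadratic losses, and the Gaussian noise scale are all assumptions made here for illustration.

```python
# Minimal sketch (assumed, not the authors' method): online mirror descent with a
# Euclidean mirror map, a dual variable tracking a long-term constraint, and
# Gaussian noise added to the model before it is shared with neighbors.
import numpy as np

rng = np.random.default_rng(0)

def omd_step(x, grad_loss, grad_constraint, lam, eta=0.1):
    """One primal step on the Lagrangian L(x, lam) = f_t(x) + lam * g(x).

    With the Euclidean mirror map, the mirror descent update reduces to a
    gradient step, so this is the simplest possible instantiation.
    """
    return x - eta * (grad_loss + lam * grad_constraint)

def dual_step(lam, g_value, mu=0.1):
    """Grow the dual variable when the long-term constraint g(x) <= 0 is violated,
    so violations accumulated over rounds are eventually paid back."""
    return max(0.0, lam + mu * g_value)

def privatize(x, noise_scale=0.1):
    """Add Gaussian noise before sharing the local model (local differential
    privacy; in practice the scale would be calibrated to (epsilon, delta))."""
    return x + rng.normal(scale=noise_scale, size=x.shape)

# Toy usage: time-varying losses f_t(x) = ||x - c_t||^2 with constraint ||x||_1 <= 1.
d, T = 5, 50
x, lam = np.zeros(d), 0.0
for t in range(T):
    c_t = rng.normal(size=d)               # data for round t (time-varying)
    grad_loss = 2.0 * (x - c_t)            # gradient of f_t at x
    g_value = np.abs(x).sum() - 1.0        # long-term constraint value g(x)
    grad_constraint = np.sign(x)           # subgradient of g at x
    x = omd_step(x, grad_loss, grad_constraint, lam)
    lam = dual_step(lam, g_value)
    shared = privatize(x)                  # noisy model sent to neighboring clients
```

In a fully decentralized setting each client would additionally average the noisy models received from its neighbors before the next round; that gossip step is omitted here to keep the sketch to a single client.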
Total citations
Scholar articles