Authors
Jie Zhang, Song Guo, Zhihao Qu, Deze Zeng, Yufeng Zhan, Qifeng Liu, Rajendra Akerkar
Publication date
2021/7/26
Journal
IEEE Transactions on Computers
Volume
71
Issue
7
Pages
1655-1667
Publisher
IEEE
Description
Federated learning (FL) has been widely recognized as a promising approach that enables individual end-devices to cooperatively train a global model without exposing their own data. One of the key challenges in FL is non-independent and identically distributed (Non-IID) data across the clients, which decreases the efficiency of the stochastic gradient descent (SGD)-based training process. Moreover, clients with different data distributions may bias the global model update, resulting in degraded model accuracy. To tackle the Non-IID problem in FL, we aim to optimize the local training process and global aggregation simultaneously. For local training, we analyze the effect of hyperparameters (e.g., the batch size, the number of local updates) on the training performance of FL. Guided by a toy example and theoretical analysis, we are motivated to mitigate the negative impacts incurred by Non-IID data …
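The abstract describes the standard FL loop this work builds on: each client runs a few local SGD steps (governed by the batch size and the number of local updates), and a server aggregates the results into a global model. As a rough illustration only, below is a minimal FedAvg-style sketch on a toy least-squares objective; all function names (`local_update`, `fedavg_round`) and hyperparameter values are hypothetical and do not reflect the paper's proposed method.

```python
import numpy as np

def local_update(w, X, y, lr=0.01, batch_size=8, num_local_steps=5, rng=None):
    """One client's local mini-batch SGD on a least-squares loss (illustrative).

    batch_size and num_local_steps are the hyperparameters the abstract
    highlights as affecting FL performance under Non-IID data.
    """
    rng = rng or np.random.default_rng(0)
    w = w.copy()
    for _ in range(num_local_steps):
        idx = rng.choice(len(X), size=min(batch_size, len(X)), replace=False)
        Xb, yb = X[idx], y[idx]
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(Xb)  # gradient of ||Xw - y||^2
        w -= lr * grad
    return w

def fedavg_round(w_global, clients, **kwargs):
    """One communication round: local training, then sample-weighted averaging."""
    updates, sizes = [], []
    for X, y in clients:
        updates.append(local_update(w_global, X, y, **kwargs))
        sizes.append(len(X))
    weights = np.array(sizes) / sum(sizes)
    return sum(wk * uk for wk, uk in zip(weights, updates))
```

In a Non-IID simulation, each client would draw its features from a different distribution (e.g., a different mean), which is exactly the setting where naive averaging of biased local updates degrades accuracy.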