Enhancing decentralized federated learning for non-iid data on heterogeneous devices

M Chen, Y Xu, H Xu, L Huang - 2023 IEEE 39th International Conference on Data Engineering (ICDE), 2023 - ieeexplore.ieee.org
Data generated at the network edge can be processed locally by leveraging the emerging technology of Federated Learning (FL). However, non-IID local data leads to degraded model accuracy, and the heterogeneity of edge nodes inevitably slows down model training. Moreover, to avoid the potential communication bottleneck of parameter-server-based FL, we focus on Decentralized Federated Learning (DFL), which performs distributed model training in a Peer-to-Peer (P2P) manner. To address these challenges, we propose an asynchronous DFL system incorporating neighbor selection and gradient push, termed AsyNG. Specifically, we require each edge node to push gradients only to a subset of its neighbors for resource efficiency. We first give a theoretical convergence analysis of AsyNG under the complicated non-IID and heterogeneous scenario, and then design a priority-based algorithm that dynamically selects neighbors for each edge node so as to achieve a trade-off between communication cost and model performance. We evaluate the performance of AsyNG through extensive experiments on a physical platform. Evaluation results show that, compared to the baselines, AsyNG reduces communication cost by 60% and completion time by about 30% while achieving the same test accuracy.
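The abstract's core mechanism can be illustrated with a minimal sketch: each node scores its neighbors, keeps only the top-k, and pushes its gradient to them. The scoring rule below (a weighted trade-off between a neighbor's model divergence and its communication cost) and all function names are illustrative assumptions, not the paper's actual priority formula.

```python
import numpy as np

def select_neighbors(neighbors, comm_cost, divergence, k, alpha=0.5):
    """Rank neighbors by a priority score trading off usefulness against cost.

    Higher model divergence suggests a more informative exchange; higher
    communication cost makes a neighbor less attractive. This linear score
    is a hypothetical stand-in for AsyNG's priority function.
    """
    scores = {j: alpha * divergence[j] - (1 - alpha) * comm_cost[j]
              for j in neighbors}
    return sorted(neighbors, key=lambda j: scores[j], reverse=True)[:k]

def push_gradients(local_grad, selected, models, lr=0.1):
    """Push the local gradient to each selected neighbor, which applies it
    to its own model copy (asynchrony is simulated sequentially here)."""
    for j in selected:
        models[j] = models[j] - lr * local_grad
    return models
```

For example, with three neighbors where neighbor 2 is both costly and low-divergence, `select_neighbors([1, 2, 3], ..., k=2)` would keep neighbors 1 and 3 and skip 2, so only two gradient pushes occur per round instead of three, which is the source of the communication savings the paper reports.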