Overhead-free noise-tolerant federated learning: A new baseline

S Lin, D Zhai, F Zhang, J Jiang, X Liu, X Ji - Machine Intelligence Research, 2024 - Springer
Abstract
Federated learning (FL) is a promising decentralized machine learning approach that enables multiple distributed clients to train a model jointly while keeping their data private. However, in real-world scenarios, the supervised training data stored on local clients inevitably suffer from imperfect annotations, resulting in subjective, inconsistent, and biased labels. These noisy labels can harm the collaborative aggregation process of FL by inducing inconsistent decision boundaries. Unfortunately, few attempts have been made towards noise-tolerant federated learning, and most of them rely on transmitting extra messages to assist noisy label detection and correction, which increases both the communication burden and the privacy risk. In this paper, we propose a simple yet effective method for noise-tolerant FL based on the well-established co-training framework. Our method leverages the inherent discrepancy in the learning ability of the local and global models in FL, which can be regarded as two complementary views. By iteratively exchanging samples together with their high-confidence predictions, the two models “teach each other” to suppress the influence of noisy labels. The proposed scheme is entirely overhead-free and can serve as a robust and efficient baseline for noise-tolerant federated learning. Experimental results demonstrate that our method outperforms existing approaches.
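
To make the co-training idea concrete, below is a minimal PyTorch sketch of one local round in which the frozen global model acts as the complementary view and supplies corrected targets. The confidence threshold, the relabeling rule, and the helper names (cotrain_targets, local_update) are illustrative assumptions based on the abstract, not the authors' exact procedure.

```python
# Sketch of confidence-based label exchange between the global and local
# models, assuming a standard classification setup. Hypothetical details:
# the 0.9 threshold and the "replace only on confident disagreement" rule.
import torch
import torch.nn.functional as F

CONF_THRESHOLD = 0.9  # assumed cutoff for a "high-confidence" prediction

def cotrain_targets(teacher_logits, labels, threshold=CONF_THRESHOLD):
    """Replace a sample's (possibly noisy) label with the teacher model's
    prediction when the teacher is highly confident and disagrees."""
    probs = F.softmax(teacher_logits, dim=1)
    conf, pred = probs.max(dim=1)
    replace = (conf > threshold) & (pred != labels)
    return torch.where(replace, pred, labels)

def local_update(local_model, global_model, loader, optimizer):
    """One local training pass: the global model 'teaches' the local model
    by correcting suspect labels; no extra messages are transmitted."""
    global_model.eval()
    local_model.train()
    for x, y in loader:
        with torch.no_grad():
            targets = cotrain_targets(global_model(x), y)
        loss = F.cross_entropy(local_model(x), targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

Per the abstract, the teaching is mutual: the updated local model can symmetrically supply high-confidence targets for the global view in the next exchange, and since only the usual model updates travel between client and server, the scheme stays overhead-free.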