Hwamei: A learning-based synchronization scheme for hierarchical federated learning

T Qi, Y Zhan, P Li, J Guo, Y Xia - 2023 IEEE 43rd International …, 2023 - ieeexplore.ieee.org
Federated learning (FL) enables collaborative model training among distributed devices
without data sharing, but existing FL suffers from poor scalability because of global model …

Arena: A Learning-based Synchronization Scheme for Hierarchical Federated Learning--Technical Report

T Qi, Y Zhan, P Li, J Guo, Y Xia - arXiv preprint arXiv:2308.10298, 2023 - arxiv.org
Federated learning (FL) enables collaborative model training among distributed devices
without data sharing, but existing FL suffers from poor scalability because of global model …

FLOAT: Federated Learning Optimizations with Automated Tuning

AF Khan, AA Khan, AM Abdelmoniem… - Proceedings of the …, 2024 - dl.acm.org
Federated Learning (FL) has emerged as a powerful approach that enables collaborative
distributed model training without the need for data sharing. However, FL grapples with …

Ranking-based Client Selection with Imitation Learning for Efficient Federated Learning

C Tian, Z Shi, X Qin, L Li, C Xu - arXiv preprint arXiv:2405.04122, 2024 - arxiv.org
Federated Learning (FL) enables multiple devices to collaboratively train a shared model
while ensuring data privacy. The selection of participating devices in each training round …

FedGCS: A Generative Framework for Efficient Client Selection in Federated Learning via Gradient-based Optimization

Z Ning, C Tian, M Xiao, W Fan, P Wang, L Li… - arXiv preprint arXiv …, 2024 - arxiv.org
Federated Learning faces significant challenges in statistical and system heterogeneity,
along with high energy consumption, necessitating efficient client selection strategies …

Breaking the Memory Wall for Heterogeneous Federated Learning with Progressive Training

Y Wu, L Li, C Tian, C Xu - arXiv preprint arXiv:2404.13349, 2024 - arxiv.org
This paper presents ProFL, a novel progressive FL framework to effectively break the
memory wall. Specifically, ProFL divides the model into different blocks based on its original …

Heterogeneity-Aware Memory Efficient Federated Learning via Progressive Layer Freezing

Y Wu, L Li, C Tian, T Chang, C Lin, C Wang… - arXiv preprint arXiv …, 2024 - arxiv.org
In this paper, we propose SmartFreeze, a framework that effectively reduces the memory
footprint by conducting the training in a progressive manner. Instead of updating the full …

FedCoop: Cooperative Federated Learning for Noisy Labels

K Tam, L Li, Y Zhao, C Xu - ECAI 2023, 2023 - ebooks.iospress.nl
Federated Learning coordinates multiple clients to collaboratively train a shared model
while preserving data privacy. However, the training data with noisy labels located on the …

How Few Davids Improve One Goliath: Federated Learning in Resource-Skewed Edge Computing Environments

J Zhang, S Li, H Huang, Z Wang, X Fu, D Hong… - Proceedings of the …, 2024 - dl.acm.org
Real-world deployment of federated learning requires orchestrating clients with widely
varied compute resources, from strong enterprise-grade devices in data centers to weak …

Bridging the Data Gap in Federated Preference Learning with AIGC

C Wang, Z Zhou, X Zhang… - 2024 IEEE 44th …, 2024 - ieeexplore.ieee.org
Federated learning (FL), a decentralized machine learning approach, enables privacy-
preserving and collaborative model training without centralizing sensitive data. It has been …