Authors
Ahmet M Elbir, Sinem Coleri, Anastasios K Papazafeiropoulos, Pandelis Kourtessis, Symeon Chatzinotas
Publication date
2022
Journal
IEEE Transactions on Cognitive Communications and Networking
Publisher
Institute of Electrical and Electronics Engineers, United States
Abstract
Many machine learning tasks rely on centralized learning (CL), which requires the transmission of local datasets from the clients to a parameter server (PS), entailing huge communication overhead. To overcome this, federated learning (FL) has emerged as a promising tool, wherein the clients send only their model updates to the PS instead of the whole dataset. However, FL demands powerful computational resources from the clients; clients that lack sufficient computational resources cannot participate in training. To address this issue, we introduce a more practical approach called hybrid federated and centralized learning (HFCL), wherein only the clients with sufficient resources employ FL, while the remaining ones send their datasets to the PS, which computes the model on their behalf. Then, the model parameters corresponding to all clients are aggregated at the PS. To improve the efficiency of dataset transmission, we propose two different techniques: increased computation-per-client and sequential data transmission. The HFCL frameworks outperform FL with up to 20% improvement in learning accuracy when only half of the clients perform FL, while incurring 50% less communication overhead than CL, since all the clients collaborate on the learning process with their datasets.
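The hybrid scheme described above can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a toy linear-regression task, plain gradient descent as each client's local solver, and FedAvg-style mean aggregation at the PS. The key point it demonstrates is that active clients train locally (FL), while the PS trains models for the passive clients on their uploaded datasets (CL), and all per-client models are then averaged.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_on_dataset(w, X, y, lr=0.1, epochs=50):
    # Local solver: gradient descent on the mean-squared-error loss.
    # Run by an active client itself (FL) or by the PS for a passive client (CL).
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Synthetic noiseless data split across 4 clients; true model w* = [1, -2].
w_true = np.array([1.0, -2.0])
clients = [(X := rng.normal(size=(32, 2)), X @ w_true) for _ in range(4)]

w_global = np.zeros(2)
for _ in range(10):  # communication rounds
    updates = []
    # Active clients (indices 0, 1): train locally, send only model updates (FL).
    for X, y in clients[:2]:
        updates.append(train_on_dataset(w_global.copy(), X, y))
    # Passive clients (indices 2, 3): have sent their datasets once;
    # the PS computes their models on their behalf (CL).
    for X, y in clients[2:]:
        updates.append(train_on_dataset(w_global.copy(), X, y))
    # PS aggregates all per-client models (FedAvg-style mean).
    w_global = np.mean(updates, axis=0)

print(w_global)  # should be close to [1, -2]
```

In this sketch every client's data contributes to the aggregated model, which is the source of the accuracy gain over pure FL claimed in the abstract; the passive clients pay a one-time dataset upload instead of per-round updates, which is where the communication trade-off against pure CL arises.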
Total citations
Scholar articles
AM Elbir, S Coleri, AK Papazafeiropoulos, P Kourtessis… - IEEE Transactions on Cognitive Communications and …, 2022