M Venkatasubramanian, AH Lashkari, S Hakak - IEEE Access, 2023 - ieeexplore.ieee.org
The Internet of Things (IoT) has paved the way to a highly connected society where all things are interconnected and exchanging information has become more accessible through the …
Deep neural networks have shown the ability to extract universal feature representations from data such as images and text that have been useful for a variety of learning tasks …
We introduce ProxSkip—a surprisingly simple and provably efficient method for minimizing the sum of a smooth ($f$) and an expensive nonsmooth proximable ($\psi$) function. The …
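The problem class in this snippet—minimizing a smooth $f$ plus a proximable $\psi$—can be illustrated with a standard proximal-gradient step. This is a minimal sketch of that generic template (not the ProxSkip algorithm itself, which additionally skips prox evaluations), assuming a least-squares $f$ and an $\ell_1$-norm $\psi$ whose prox is soft-thresholding:

```python
import numpy as np

def soft_threshold(z, tau):
    """Prox of tau * ||.||_1: shrink each coordinate toward zero by tau."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def prox_gradient(A, b, lam, gamma, iters=200):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient.

    Illustrative setup only: f(x) = 0.5*||Ax - b||^2 is the smooth part,
    psi(x) = lam*||x||_1 the nonsmooth proximable part.
    """
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                           # gradient of f
        x = soft_threshold(x - gamma * grad, gamma * lam)  # prox step on psi
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
x = prox_gradient(A, b, lam=0.1, gamma=0.01)
obj = 0.5 * np.sum((A @ x - b) ** 2) + 0.1 * np.sum(np.abs(x))
```

ProxSkip's contribution, per the abstract, is making the prox call cheap to amortize by invoking it only occasionally; the fixed per-iteration prox above is exactly the cost it avoids.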
DA Tarzanagh, M Li… - … on Machine Learning, 2022 - proceedings.mlr.press
Standard federated optimization methods successfully apply to stochastic problems with single-level structure. However, many contemporary ML problems, including adversarial …
The Federated Averaging (FedAvg) algorithm, which consists of alternating between a few local stochastic gradient updates at client nodes, followed by a model …
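The alternation the snippet describes—local stochastic gradient steps per client, then server-side model aggregation—can be sketched as follows. This is a hedged toy version, assuming quadratic client losses and plain gradient steps rather than any specific paper's setup:

```python
import numpy as np

def local_sgd(w, X, y, lr=0.1, steps=5):
    """A few local gradient steps on one client's least-squares loss."""
    w = w.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg(clients, w0, rounds=20):
    """FedAvg pattern: alternate local updates with server-side averaging."""
    w = w0
    for _ in range(rounds):
        local_models = [local_sgd(w, X, y) for X, y in clients]
        w = np.mean(local_models, axis=0)  # server aggregates client models
    return w

# Synthetic demo: four clients sharing one underlying linear model.
rng = np.random.default_rng(1)
w_true = np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(4):
    X = rng.standard_normal((30, 3))
    clients.append((X, X @ w_true + 0.01 * rng.standard_normal(30)))
w = fedavg(clients, np.zeros(3))
```

Running several local steps between communications is what gives FedAvg its communication savings over synchronizing after every gradient step.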
J Jin, J Ren, Y Zhou, L Lyu, J Liu… - … on Machine Learning, 2022 - proceedings.mlr.press
The federated learning (FL) framework enables edge clients to collaboratively learn a shared inference model while keeping their training data private on the clients. Recently, many …
Federated learning (FL) is a useful tool in distributed machine learning that utilizes users' local datasets in a privacy-preserving manner. When deploying FL in a constrained wireless …
Z Li, H Zhao, B Li, Y Chi - Advances in Neural Information …, 2022 - proceedings.neurips.cc
To enable large-scale machine learning in bandwidth-hungry environments such as wireless networks, significant progress has been made recently in designing communication …
We study distributed optimization methods based on the local training (LT) paradigm, i.e., methods which achieve communication efficiency by performing richer local gradient …