Efficient joint optimization of layer-adaptive weight pruning in deep neural networks

K Xu, Z Wang, X Geng, M Wu, X Li… - Proceedings of the …, 2023 - openaccess.thecvf.com
In this paper, we propose a novel layer-adaptive weight-pruning approach for Deep Neural
Networks (DNNs) that addresses the challenge of optimizing the output distortion …
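Below is a minimal, generic sketch of layer-adaptive magnitude pruning, i.e., giving each layer its own sparsity ratio and zeroing its smallest-magnitude weights. It is only an illustration of the setting; the layer names and ratios are assumptions, and it does not reproduce the joint optimization of output distortion proposed in the paper.

```python
import numpy as np

def prune_layer(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude weights so that `sparsity` fraction is removed."""
    k = int(weights.size * sparsity)
    if k == 0:
        return weights.copy()
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return weights * (np.abs(weights) > threshold)

# Hypothetical layers and per-layer sparsity ratios (the "layer-adaptive" part).
rng = np.random.default_rng(0)
layers = {"conv1": rng.standard_normal((64, 27)),
          "conv2": rng.standard_normal((128, 576)),
          "fc": rng.standard_normal((10, 512))}
ratios = {"conv1": 0.3, "conv2": 0.6, "fc": 0.8}

pruned = {name: prune_layer(w, ratios[name]) for name, w in layers.items()}
for name, w in pruned.items():
    print(name, "sparsity:", float((w == 0).mean()))
```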

Sparse random networks for communication-efficient federated learning

B Isik, F Pase, D Gunduz, T Weissman… - arXiv preprint arXiv …, 2022 - arxiv.org
One main challenge in federated learning is the large communication cost of exchanging
weight updates from clients to the server at each round. While prior work has made great …
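As a rough sketch of the idea suggested by the title, clients could communicate a 1-bit mask over a frozen random network instead of dense 32-bit weight updates. The mask-selection rule and server aggregation below are simple stand-ins chosen for illustration, not the paper's training procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 100_000                         # parameters in the frozen random network
frozen_weights = rng.standard_normal(d)

# Fake a client's choice of which random weights to keep (illustration only).
mask = rng.random(d) > 0.5          # 1 bit per parameter

print("dense float32 update:", 32 * d / 8 / 1024, "KiB")
print("binary mask payload :", d / 8 / 1024, "KiB")

# The server could combine client masks, e.g. by averaging them into keep-probabilities.
client_masks = np.stack([rng.random(d) > 0.5 for _ in range(5)])
keep_prob = client_masks.mean(axis=0)
server_model = frozen_weights * (keep_prob > 0.5)
print("fraction of weights kept at server:", float((server_model != 0).mean()))
```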

ResFed: Communication-efficient federated learning with deep compressed residuals

R Song, L Zhou, L Lyu, A Festag… - IEEE Internet of Things …, 2023 - ieeexplore.ieee.org
Federated learning allows for cooperative training among distributed clients by sharing their
locally learned model parameters, such as weights or gradients. However, as model size …
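A minimal sketch of the general residual idea implied by the title: a client transmits only a compressed difference between its new local model and a reference both sides already share, rather than the full parameter vector. The top-k sparsifier used here is a common stand-in and an assumption, not necessarily the paper's coder.

```python
import numpy as np

def top_k_sparsify(x: np.ndarray, k: int) -> np.ndarray:
    """Keep only the k largest-magnitude entries of x (a common residual compressor)."""
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]
    out[idx] = x[idx]
    return out

rng = np.random.default_rng(2)
reference_model = rng.standard_normal(10_000)                          # shared by client and server
local_model = reference_model + 0.01 * rng.standard_normal(10_000)     # after local training

residual = local_model - reference_model        # what actually needs to be communicated
compressed = top_k_sparsify(residual, k=500)    # ~5% of the entries survive

server_model = reference_model + compressed     # server reconstructs approximately
err = np.linalg.norm(server_model - local_model) / np.linalg.norm(local_model)
print("relative reconstruction error:", float(err))
```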

Exact optimality of communication-privacy-utility tradeoffs in distributed mean estimation

B Isik, WN Chen, A Ozgur… - Advances in Neural …, 2024 - proceedings.neurips.cc
We study the mean estimation problem under communication and local differential privacy
constraints. While previous work has proposed order-optimal algorithms for the same …
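The toy simulation below only sets up the problem the abstract describes: each client holds a vector, privatizes and quantizes it locally, and the server estimates the mean. The Gaussian-noise-plus-stochastic-quantization scheme shown is a generic baseline for illustration, not the exactly optimal mechanism studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
n_clients, d = 50, 16
data = rng.uniform(-1.0, 1.0, size=(n_clients, d))   # each row is one client's vector

sigma = 0.5        # local noise scale (privacy knob, assumed for illustration)
levels = 16        # 4-bit quantizer (communication constraint)

def privatize_and_quantize(x):
    noisy = x + rng.normal(0.0, sigma, size=x.shape)  # local noise for privacy
    clipped = np.clip(noisy, -2.0, 2.0)
    # Unbiased stochastic quantization of the clipped value onto a grid in [-2, 2].
    grid = np.linspace(-2.0, 2.0, levels)
    step = grid[1] - grid[0]
    lower = np.clip(np.floor((clipped - grid[0]) / step), 0, levels - 2).astype(int)
    p_up = (clipped - grid[lower]) / step
    up = rng.random(x.shape) < p_up
    return grid[lower + up.astype(int)]

reports = np.stack([privatize_and_quantize(x) for x in data])
estimate = reports.mean(axis=0)
true_mean = data.mean(axis=0)
print("mean squared error of the estimate:", float(np.mean((estimate - true_mean) ** 2)))
```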

LVAC: Learned volumetric attribute compression for point clouds using coordinate-based networks

B Isik, PA Chou, SJ Hwang, N Johnston… - Frontiers in Signal …, 2022 - frontiersin.org
We consider the attributes of a point cloud as samples of a vector-valued volumetric function
at discrete positions. To compress the attributes given the positions, we compress the …
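A minimal sketch of the coordinate-based ingredient: a small MLP that maps a 3D position to an attribute (here a stand-in "color"), fitted to the point cloud. The architecture and training loop are generic assumptions; the actual method further compresses the network parameters, which is omitted here.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy point cloud: N positions in [0, 1]^3 with attributes derived from position.
N = 2048
positions = torch.rand(N, 3)
attributes = torch.sin(6.28 * positions)   # stand-in "colors" in [-1, 1]

# Small coordinate-based MLP: position -> attribute.
model = nn.Sequential(
    nn.Linear(3, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 3),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(500):
    loss = nn.functional.mse_loss(model(positions), attributes)
    opt.zero_grad()
    loss.backward()
    opt.step()

print("final attribute MSE:", loss.item())
# Compressing the attributes then amounts to compressing the MLP's parameters,
# since the positions are assumed available at the decoder.
```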

NOLA: Networks as linear combination of low rank random basis

SA Koohpayegani, KL Navaneet… - UMBC Faculty …, 2023 - mdsoar.org
Large Language Models (LLMs) have recently gained popularity due to their impressive few-
shot performance across various downstream tasks. However, fine-tuning all parameters …
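A toy sketch of the idea in the title: instead of learning a low-rank update directly, represent the weight update as a linear combination of frozen random low-rank basis matrices and train only the mixing coefficients. The shapes, basis count, and rank below are arbitrary illustrations, not the paper's configuration.

```python
import torch

torch.manual_seed(0)
d_out, d_in, rank, k = 256, 256, 4, 32   # k frozen random low-rank basis matrices

# Frozen random bases A_i (d_out x rank) and B_i (rank x d_in); never trained.
A = torch.randn(k, d_out, rank)
B = torch.randn(k, rank, d_in)

# Only the k mixing coefficients are trainable (and would be what gets stored).
alpha = torch.zeros(k, requires_grad=True)

def delta_w(alpha):
    # sum_i alpha_i * (A_i @ B_i): a full-size update parameterized by k numbers.
    return torch.einsum("k,kor,kri->oi", alpha, A, B)

# Toy objective: approximate some target update with the k coefficients.
target = torch.randn(d_out, d_in)
opt = torch.optim.Adam([alpha], lr=0.05)
for _ in range(200):
    loss = torch.nn.functional.mse_loss(delta_w(alpha), target)
    opt.zero_grad()
    loss.backward()
    opt.step()

print("trainable numbers:", alpha.numel(), "vs full update:", target.numel())
print("fit error:", loss.item())
```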

LPViT: Low-power semi-structured pruning for vision transformers

K Xu, Z Wang, C Chen, X Geng, J Lin, X Yang… - … on Computer Vision, 2025 - Springer
Vision transformers (ViTs) have emerged as a promising alternative to convolutional neural
networks (CNNs) for various image analysis tasks, offering comparable or superior …
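As a generic illustration of semi-structured pruning, the snippet below enforces the common 2:4 pattern, where in every group of four consecutive weights only the two largest in magnitude are kept. The pattern choice is an assumption for illustration; the paper's pruning criterion and power model are not reproduced here.

```python
import numpy as np

def prune_2_of_4(weight: np.ndarray) -> np.ndarray:
    """Enforce 2:4 semi-structured sparsity along the last dimension."""
    rows, cols = weight.shape
    assert cols % 4 == 0, "columns must be a multiple of 4 for the 2:4 pattern"
    groups = weight.reshape(rows, cols // 4, 4)
    # Rank weights inside each group of 4 and zero the two smallest magnitudes.
    order = np.argsort(np.abs(groups), axis=-1)
    mask = np.ones_like(groups, dtype=bool)
    np.put_along_axis(mask, order[..., :2], False, axis=-1)
    return (groups * mask).reshape(rows, cols)

rng = np.random.default_rng(4)
W = rng.standard_normal((8, 16))        # e.g. a small projection matrix in a ViT block
W_pruned = prune_2_of_4(W)
print("overall sparsity:", float((W_pruned == 0).mean()))   # exactly 0.5
```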

Txt2Vid: Ultra-low bitrate compression of talking-head videos via text

P Tandon, S Chandak… - IEEE Journal on …, 2022 - ieeexplore.ieee.org
Video represents the majority of internet traffic today, driving a continual race between the
generation of higher quality content, transmission of larger file sizes, and the development of …
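A schematic sketch of the pipeline implied by the title: transmit only a text transcript (plus a one-time speaker reference) and resynthesize the talking-head video at the receiver. The helpers `transcribe`, `synthesize_speech`, and `animate_face` are hypothetical placeholders, not a real API, and stand in for a speech recognizer, a TTS engine, and a talking-head generator.

```python
# Schematic sender/receiver sketch with hypothetical placeholder functions.

def transcribe(audio_frames: list) -> str:
    return "hello world"                 # placeholder transcript

def synthesize_speech(text: str) -> bytes:
    return text.encode()                 # placeholder waveform

def animate_face(reference_image: bytes, speech: bytes) -> list:
    return [reference_image] * 30        # placeholder video frames

# --- Sender: send text instead of encoded video frames ---
audio = [b"\x00"] * 48000
transcript = transcribe(audio)
payload = transcript.encode("utf-8")     # a few bytes per second of speech

# --- Receiver: regenerate audio and video locally from the tiny payload ---
reference = b"one-time reference image of the speaker"
speech = synthesize_speech(payload.decode("utf-8"))
frames = animate_face(reference, speech)

print("payload size (bytes):", len(payload))
print("frames regenerated  :", len(frames))
```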

Fed-QSSL: A Framework for Personalized Federated Learning under Bitwidth and Data Heterogeneity

Y Chen, H Vikalo, C Wang - Proceedings of the AAAI Conference on …, 2024 - ojs.aaai.org
Motivated by high resource costs of centralized machine learning schemes as well as data
privacy concerns, federated learning (FL) emerged as an efficient alternative that relies on …
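A toy example of the bitwidth-heterogeneity aspect only: clients quantize their model updates with different bitwidths before uploading, and the server dequantizes and averages them. The uniform quantizer and bitwidth assignments are assumptions for illustration; the self-supervised and personalization components of the framework are not represented.

```python
import numpy as np

rng = np.random.default_rng(5)

def quantize(x: np.ndarray, bits: int):
    """Uniform quantization of x to 2**bits levels over its own range."""
    lo, hi = x.min(), x.max()
    levels = 2 ** bits - 1
    q = np.round((x - lo) / (hi - lo) * levels).astype(np.uint32)
    return q, lo, hi, levels

def dequantize(q, lo, hi, levels):
    return lo + q.astype(np.float64) / levels * (hi - lo)

true_update = rng.standard_normal(10_000)

# Heterogeneous devices: each client supports a different bitwidth.
client_bits = [2, 4, 8, 8, 16]
recovered = []
for bits in client_bits:
    noisy_local = true_update + 0.05 * rng.standard_normal(true_update.shape)
    recovered.append(dequantize(*quantize(noisy_local, bits)))

aggregate = np.mean(recovered, axis=0)
print("aggregation error:", float(np.mean((aggregate - true_update) ** 2)))
```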

AirNet: Neural network transmission over the air

M Jankowski, D Gündüz… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
State-of-the-art performance for many edge applications is achieved by deep neural
networks (DNNs). Often, these DNNs are location- and time-sensitive, and must be delivered …
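A minimal sketch of the underlying setting, delivering network parameters over a noisy wireless channel under an average power constraint: the AWGN simulation below is a generic stand-in for illustration and does not reproduce the paper's joint training or coding scheme.

```python
import numpy as np

rng = np.random.default_rng(6)

weights = rng.standard_normal(4096)        # DNN parameters to deliver to the edge device

# Power-normalize the transmitted symbols to satisfy an average power constraint.
power = 1.0
scale = np.sqrt(np.mean(weights ** 2))
symbols = weights / scale * np.sqrt(power)

snr_db = 10.0                              # assumed channel quality
noise_var = power / (10 ** (snr_db / 10))
received = symbols + rng.normal(0.0, np.sqrt(noise_var), symbols.shape)

# Receiver rescales back to the weight domain (assuming the scale is known).
recovered = received * scale / np.sqrt(power)
print("weight MSE after over-the-air delivery:", float(np.mean((recovered - weights) ** 2)))
```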