Transfer learning via fine-tuning pre-trained transformer models has achieved significant success, delivering state-of-the-art results across various NLP tasks. In the absence of …
Recently, foundation models have exhibited remarkable advancements in multi-modal learning. These models, equipped with millions (or billions) of parameters, typically require a …
Vision Transformers (ViT) and Visual Prompt Tuning (VPT) achieve state-of-the-art performance with improved efficiency in various computer vision tasks. This suggests a …
Pre-trained language models (PLMs) have revolutionized the NLP landscape, achieving stellar performance across diverse tasks. These models, while benefiting from vast training …
The integration of Foundation Models (FMs) with Federated Learning (FL) presents a transformative paradigm in Artificial Intelligence (AI), offering enhanced capabilities while …
MF Elvebakken, A Iosifidis, L Esterle - arXiv preprint arXiv:2302.02949, 2023 - arxiv.org
Federated Learning offers a way to train deep neural networks in a distributed fashion. While this addresses limitations related to distributed data, it incurs a communication overhead as …
IJ Liu, CS Lin, FE Yang, YCF Wang - … of the AAAI Conference on Artificial …, 2024 - ojs.aaai.org
Federated Learning (FL) is an emerging paradigm that enables multiple users to collaboratively train a robust model in a privacy-preserving manner without sharing their …
PVNP Srihitha, M Verma, MVNK Prasad - International Conference on …, 2023 - Springer
Building an efficient deep learning-based Facial Expression Recognition (FER) system is challenging due to the requirement for large amounts of personal data and the rise in data …