A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT

C Zhou, Q Li, C Li, J Yu, Y Liu, G Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
Pretrained Foundation Models (PFMs) are regarded as the foundation for various
downstream tasks with different data modalities. A PFM (e.g., BERT, ChatGPT, and GPT-4) is …

Facer: Contrastive attention based expression recognition via smartphone earpiece speaker

G Wang, Q Yan, S Patrarungrong… - IEEE INFOCOM 2023 …, 2023 - ieeexplore.ieee.org
Facial expression recognition has enormous potential for downstream applications by
revealing users' emotional status when interacting with digital content. Previous studies …

Large language models (LLMs): survey, technical frameworks, and future challenges

P Kumar - Artificial Intelligence Review, 2024 - Springer
Artificial intelligence (AI) has significantly impacted various fields. Large language models
(LLMs) like GPT-4, BARD, PaLM, Megatron-Turing NLG, Jurassic-1 Jumbo, etc., have …

Protecting Activity Sensing Data Privacy Using Hierarchical Information Dissociation

G Wang, H Guo, Y Wang, B Chen, C Zhou… - arXiv preprint arXiv …, 2024 - arxiv.org
Smartphones and wearable devices have been integrated into our daily lives, offering
personalized services. However, many apps become overprivileged as their collected …

Data streaming and sharing based on authorized multimedia using artificial intelligence multimedia wireless sensor …

R Varatharajan - Neural, Parallel, and Scientific …, 2020 - dynamicpublishers.org
With their rapid development, multimedia wireless sensor networks (MWSNs) and
communication devices are becoming more and more common nowadays. Design and …