Parameter-efficient transfer learning for NLP

N Houlsby, A Giurgiu, S Jastrzebski… - International …, 2019 - proceedings.mlr.press
Fine-tuning large pretrained models is an effective transfer mechanism in NLP. However, in
the presence of many downstream tasks, fine-tuning is parameter inefficient: an entire new …
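
For context on the adapter architecture this line of work introduces: a minimal PyTorch sketch of a bottleneck adapter follows. The layer names, GELU nonlinearity, and bottleneck size are illustrative assumptions, not the paper's exact configuration.

    import torch
    import torch.nn as nn

    class Adapter(nn.Module):
        """Bottleneck adapter: down-project, nonlinearity, up-project, residual."""
        def __init__(self, hidden_size: int, bottleneck_size: int = 64):
            super().__init__()
            self.down = nn.Linear(hidden_size, bottleneck_size)
            self.act = nn.GELU()  # assumed nonlinearity; the paper tried several
            self.up = nn.Linear(bottleneck_size, hidden_size)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # The residual path means a near-zero-initialized adapter starts
            # close to the identity, leaving pretrained behavior intact.
            return x + self.up(self.act(self.down(x)))

Only the adapters (and a task head) are trained per downstream task; the pretrained transformer weights stay frozen, which is what makes the transfer parameter-efficient.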

Offensive language detection in Tamil YouTube comments by adapters and cross-domain knowledge transfer

M Subramanian, R Ponnusamy, S Benhur… - Computer Speech & …, 2022 - Elsevier
Over the past few years, researchers have been focusing on the identification of offensive
language on social networks. In places where English is not the primary language, social …

One person, one model, one world: Learning continual user representation without forgetting

F Yuan, G Zhang, A Karatzoglou, J Jose… - Proceedings of the 44th …, 2021 - dl.acm.org
Learning user representations is a vital technique for effective user modeling and
personalized recommender systems. Existing approaches often derive an individual set of …

Adaptable multi-domain language model for Transformer ASR

T Lee, MJ Lee, TG Kang, S Jung… - ICASSP 2021-2021 …, 2021 - ieeexplore.ieee.org
We propose an adapter-based multi-domain Transformer language model (LM) for
Transformer ASR. The model consists of a large common LM and small adapters …
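
A hypothetical sketch of the shared-LM-plus-per-domain-adapters layout the abstract describes, reusing the Adapter module sketched earlier; the ModuleDict keys and routing logic are assumptions for illustration, not the authors' implementation.

    import torch.nn as nn

    class MultiDomainBlock(nn.Module):
        """A frozen shared LM layer followed by a small per-domain adapter."""
        def __init__(self, shared_layer: nn.Module, hidden_size: int, domains: list):
            super().__init__()
            self.shared = shared_layer
            for p in self.shared.parameters():
                p.requires_grad = False  # the common LM stays frozen
            self.adapters = nn.ModuleDict(
                {d: Adapter(hidden_size) for d in domains}
            )

        def forward(self, x, domain: str):
            # Route the shared representation through the adapter that
            # matches the target domain of the utterance.
            return self.adapters[domain](self.shared(x))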

Whisper Multilingual Downstream Task Tuning Using Task Vectors

JH Kang, JH Lee, MH Lee, JH Chang - Proc. Interspeech 2024, 2024 - isca-archive.org
Recently, the size of automatic speech recognition (ASR) models has been increasing,
similar to large language models (LLMs), and efficient tuning to enhance the performance of …
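
Task vectors, in the general sense introduced by Ilharco et al., are the element-wise difference between fine-tuned and pretrained weights. A minimal sketch of computing and applying one over model state dicts follows; the checkpoint paths and scaling factor are generic placeholders, not the paper's exact recipe.

    import torch

    def task_vector(pretrained: dict, finetuned: dict) -> dict:
        """tau = theta_finetuned - theta_pretrained, per parameter tensor."""
        return {k: finetuned[k] - pretrained[k] for k in pretrained}

    def apply_task_vector(pretrained: dict, tau: dict, scale: float = 1.0) -> dict:
        """theta_new = theta_pretrained + scale * tau."""
        return {k: pretrained[k] + scale * tau[k] for k in pretrained}

    # Hypothetical usage: transfer a task adaptation onto a base checkpoint.
    # base = torch.load("whisper_base.pt")
    # tuned = torch.load("whisper_tuned.pt")
    # merged = apply_task_vector(base, task_vector(base, tuned), scale=0.5)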

Sentiment Analysis of COVID-19 Tweets: How Does BERT Perform?

K Sadia, S Basak - Proceedings of International Joint Conference on …, 2021 - Springer
In a world where microblogging is a regular part of modern life, COVID-19, a sudden
pandemic, also made its way into public opinion on social networking sites. People …

Block-to-scene pre-training for point cloud hybrid-domain masked autoencoders

Y Zha, T Dai, Y Wang, H Guo, T Zhang… - arXiv preprint arXiv …, 2024 - arxiv.org
Point clouds, as a primary representation of 3D data, can be categorized into scene domain
point clouds and object domain point clouds based on the modeled content. Masked …

A study of fine tuning pre-trained Korean BERT for question answering performance development

CH Lee, YJ Lee, DH Lee - Journal of Information Technology …, 2020 - koreascience.kr
Language models such as BERT have been an important component of deep learning-
based natural language processing. Pre-training transformer-based language models …
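
As background for this snippet: extractive QA fine-tuning adds a span-prediction head on top of a pretrained BERT. A minimal Hugging Face sketch follows; the checkpoint name "klue/bert-base" is an illustrative Korean BERT, not necessarily the one the study used.

    import torch
    from transformers import AutoTokenizer, AutoModelForQuestionAnswering

    name = "klue/bert-base"  # illustrative Korean BERT checkpoint
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForQuestionAnswering.from_pretrained(name)

    question, context = "수도는 어디인가?", "대한민국의 수도는 서울이다."
    inputs = tokenizer(question, context, return_tensors="pt")
    outputs = model(**inputs)

    # The QA head predicts start/end logits over context tokens; fine-tuning
    # minimizes cross-entropy against the gold answer-span positions.
    start = torch.argmax(outputs.start_logits)
    end = torch.argmax(outputs.end_logits)
    answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])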

On finetuning Adapter-based Transformer models for classifying Abusive Social Media Tamil Comments

M Subramanian, K Shanmugavadivel, N Subbarayan… - 2023 - scholar.archive.org
Speaking or expressing oneself in an abusive manner is a form of verbal abuse that targets
individuals or groups on the basis of their membership in a particular social group, which is …

Deep Learning Based Chatbot in Fintech Applications

Q Xie - 2023 - search.proquest.com
Recent advancements in artificial intelligence, particularly in natural language processing
(NLP) and large language models (LLM), have utilized deep neural networks trained with …