Deep learning in sentiment analysis: Recent architectures

T Abdullah, A Ahmet - ACM Computing Surveys, 2022 - dl.acm.org
Humans are increasingly integrated with devices that enable the collection of vast
unstructured opinionated data. Accurately analysing subjective information from this data is …

Balancing discriminability and transferability for source-free domain adaptation

JN Kundu, AR Kulkarni, S Bhambri… - International …, 2022 - proceedings.mlr.press
Conventional domain adaptation (DA) techniques aim to improve domain transferability by
learning domain-invariant representations, while concurrently preserving the task …

GPL: Generative pseudo labeling for unsupervised domain adaptation of dense retrieval

K Wang, N Thakur, N Reimers, I Gurevych - arXiv preprint arXiv …, 2021 - arxiv.org
Dense retrieval approaches can overcome the lexical gap and lead to significantly improved
search results. However, they require large amounts of training data, which is not available …

COCO-DR: Combating distribution shifts in zero-shot dense retrieval with contrastive and distributionally robust learning

Y Yu, C Xiong, S Sun, C Zhang, A Overwijk - arXiv preprint arXiv …, 2022 - arxiv.org
We present a new zero-shot dense retrieval (ZeroDR) method, COCO-DR, to improve the
generalization ability of dense retrieval by combating the distribution shifts between source …

Low-resource dense retrieval for open-domain question answering: A comprehensive survey

X Shen, S Vakulenko, M Del Tredici… - arXiv preprint arXiv …, 2022 - arxiv.org
Dense retrieval (DR) approaches based on powerful pre-trained language models (PLMs)
have achieved significant advances and have become a key component of modern open-domain …

Domain adaptation for deep entity resolution

J Tu, J Fan, N Tang, P Wang, C Chai, G Li… - Proceedings of the …, 2022 - dl.acm.org
Entity resolution (ER) is a core problem of data integration. The state-of-the-art (SOTA)
results on ER are achieved by deep learning (DL) based methods, trained with a lot of …

On the domain adaptation and generalization of pretrained language models: A survey

X Guo, H Yu - arXiv preprint arXiv:2211.03154, 2022 - arxiv.org
Recent advances in NLP have been driven by a range of large-scale pretrained language models
(PLMs). These PLMs have delivered significant performance gains across many NLP tasks …

VIBE: Topic-driven temporal adaptation for Twitter classification

Y Zhang, J Li, W Li - arXiv preprint arXiv:2310.10191, 2023 - arxiv.org
Language features evolve in real-world social media, causing text classification performance
to deteriorate over time. To address this challenge, we study temporal …

Domain confused contrastive learning for unsupervised domain adaptation

Q Long, T Luo, W Wang, SJ Pan - arXiv preprint arXiv:2207.04564, 2022 - arxiv.org
In this work, we study Unsupervised Domain Adaptation (UDA) under a challenging self-
supervised setting. One of the difficulties is how to learn task discrimination in the …

ClimateGPT: Towards AI synthesizing interdisciplinary research on climate change

D Thulke, Y Gao, P Pelser, R Brune, R Jalota… - arXiv preprint arXiv …, 2024 - arxiv.org
This paper introduces ClimateGPT, a model family of domain-specific large language
models that synthesize interdisciplinary research on climate change. We trained two 7B …