R Zhang, Y Ji, Y Zhang… - Proceedings of the 2022 …, 2022 - aclanthology.org
Current NLP models heavily rely on effective representation learning algorithms. Contrastive learning is one such technique to learn an embedding space such that similar data sample …
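The snippet describes contrastive learning as learning an embedding space in which similar samples sit close together. A minimal self-supervised sketch of this idea is the InfoNCE (NT-Xent) objective: each sample's two augmented views form a positive pair, and the rest of the batch serves as negatives. This is a generic illustration with numpy, not the specific method of the cited paper.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.1):
    """InfoNCE / NT-Xent loss for a batch of positive pairs.

    z1, z2: (N, d) embeddings; row i of z1 and row i of z2 are two
    views of the same sample (the positive pair), and all other rows
    in the batch act as negatives.
    """
    # L2-normalize so the dot product is cosine similarity
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / temperature  # (N, N) similarity matrix
    # Cross-entropy with the diagonal (the true pair) as the target
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

Minimizing this loss pulls each positive pair together while pushing the in-batch negatives apart, which is the "similar samples end up close" property the snippet refers to.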
JJ Peper, L Wang - arXiv preprint arXiv:2211.07743, 2022 - arxiv.org
Generative models have demonstrated impressive results on Aspect-based Sentiment Analysis (ABSA) tasks, particularly for the emerging task of extracting Aspect-Category …
Sentence representation learning is a crucial task in natural language processing, as the quality of learned representations directly influences downstream tasks, such as sentence …
Supervised contrastive learning (SCL) frameworks treat each class as independent and thus consider all classes to be equally important. This neglects the common scenario in which …
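A standard supervised contrastive (SupCon-style) loss, of the kind the snippet critiques, treats every sample sharing the anchor's label as a positive and weights all classes identically. A small numpy sketch of that baseline formulation (an illustration, not this paper's proposed variant):

```python
import numpy as np

def supcon_loss(z, labels, temperature=0.1):
    """Supervised contrastive loss: for each anchor, every *other*
    sample with the same label is a positive; all classes are treated
    as equally important, which is the assumption the snippet questions."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    n = len(labels)
    self_mask = np.eye(n, dtype=bool)
    pos_mask = np.equal.outer(labels, labels) & ~self_mask
    # log-softmax over all other samples (the anchor itself is excluded)
    logits = np.where(self_mask, -np.inf, z @ z.T / temperature)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # average log-probability over each anchor's positives
    pos_counts = pos_mask.sum(axis=1)
    per_anchor = -np.where(pos_mask, log_prob, 0.0).sum(axis=1) \
        / np.maximum(pos_counts, 1)
    return per_anchor[pos_counts > 0].mean()
```

Because the positive mask is built purely from label equality, class identity is the only structure the loss sees; relationships *between* classes (e.g. label hierarchies or class imbalance) are invisible to it.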
Despite being spoken by a large population of speakers worldwide, Cantonese is under-resourced in terms of data scale and diversity compared to other major languages. This …

W Zhu, P Wang, X Wang, Y Ni… - ICASSP 2023-2023 IEEE …, 2023 - ieeexplore.ieee.org
Contrastive learning (CL) has achieved great success in various fields with self-supervised learning. However, CL under the supervised setting is not fully explored, especially how to …
The financial sector, and especially the insurance industry, collects vast volumes of text on a daily basis and through multiple channels (their agents, customer care centers, emails, social …
H Ye, R Sunderraman, S Ji - IEEE Transactions on Knowledge …, 2024 - ieeexplore.ieee.org
eXtreme Multi-label text Classification (XMC) refers to training a classifier that assigns a text sample with relevant labels from an extremely large-scale label set (e.g., millions of …
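With a label set in the millions, XMC systems typically report a short top-k ranking per sample rather than a full score vector. A minimal sketch under a simple linear one-vs-rest scoring assumption (`xmc_predict` and its weight layout are illustrative, not the cited paper's architecture):

```python
import numpy as np

def xmc_predict(X, W, k=3):
    """Score all labels with a linear one-vs-rest model and keep top-k.

    X: (n, d) sample features; W: (L, d) per-label weight vectors,
    where L (the label-set size) can be extremely large in XMC.
    """
    scores = X @ W.T  # (n, L) label scores
    # argpartition avoids fully sorting L scores per sample
    topk = np.argpartition(-scores, k - 1, axis=1)[:, :k]
    # sort only the retained k labels by score, descending
    order = np.argsort(-np.take_along_axis(scores, topk, axis=1), axis=1)
    return np.take_along_axis(topk, order, axis=1)
```

Using `argpartition` before the final sort keeps prediction cost near O(L) per sample instead of O(L log L), which matters when L is in the millions.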
S Mwongela, J Patel, S Rajasekharan… - International …, 2023 - pml4dc.github.io
The leading approaches in modern Natural Language Processing (NLP) are notoriously data-hungry. A good example is Transformer models, which achieve surging and state-of …