CLSEP: Contrastive learning of sentence embedding with prompt

Q Wang, W Zhang, T Lei, Y Cao, D Peng… - Knowledge-Based …, 2023 - Elsevier
Sentence embedding, which aims to learn an effective representation of the sentence, is
beneficial for downstream tasks. By utilizing contrastive learning, most recent sentence …

Contrastive learning models for sentence representations

L Xu, H Xie, Z Li, FL Wang, W Wang, Q Li - ACM Transactions on …, 2023 - dl.acm.org
Sentence representation learning is a crucial task in natural language processing, as the
quality of learned representations directly influences downstream tasks, such as sentence …

Nugget: Neural agglomerative embeddings of text

G Qin, B Van Durme - International Conference on Machine …, 2023 - proceedings.mlr.press
Embedding text sequences is a widespread requirement in modern language
understanding. Existing approaches focus largely on constant-size representations. This is …

A semantic-enhancement-based social network user-alignment algorithm

Y Huang, P Zhao, Q Zhang, L Xing, H Wu, H Ma - Entropy, 2023 - mdpi.com
User alignment can associate multiple social network accounts of the same user. It has
important research implications. However, the same user has various behaviors and friends …

Medical question summarization with entity-driven contrastive learning

S Wei, W Lu, X Peng, S Wang, YF Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
By summarizing longer consumer health questions into shorter and essential ones, medical
question answering (MQA) systems can more accurately understand consumer intentions …

Knowledge Graph‐Based Hierarchical Text Semantic Representation

Y Wu, X Pan, J Li, S Dou, J Dong… - International journal of …, 2024 - Wiley Online Library
Document representation is the basis of language modeling. Its goal is to turn flowing natural
language text into a structured form that can be stored and processed by a …

Semantic-Aware Contrastive Sentence Representation Learning with Large Language Models

H Wang, L Cheng, Z Li, DW Soh, L Bing - arXiv preprint arXiv:2310.10962, 2023 - arxiv.org
Contrastive learning has been proven to be effective in learning better sentence
representations. However, to train a contrastive learning model, large numbers of labeled …

SimCSE++: Improving contrastive learning for sentence embeddings from two perspectives

J Xu, W Shao, L Chen, L Liu - arXiv preprint arXiv:2305.13192, 2023 - arxiv.org
This paper improves contrastive learning for sentence embeddings from two perspectives:
handling dropout noise and addressing feature corruption. Specifically, for the first …
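
The contrastive objective these entries build on (InfoNCE over in-batch negatives, with SimCSE-style dropout views as positives) can be sketched minimally in NumPy; the function name, temperature value, and toy data below are illustrative assumptions, not any paper's exact implementation:

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.05):
    """InfoNCE loss: z1[i] and z2[i] are two views of sentence i
    (e.g. two dropout passes, as in SimCSE); other rows in the
    batch serve as negatives."""
    # L2-normalise so dot products become cosine similarities
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / temperature  # (batch, batch) similarity matrix
    # Cross-entropy with the diagonal (matching pairs) as targets
    log_probs = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
# Slightly perturbed copy stands in for a second dropout view
loss_pos = info_nce_loss(z + 0.01 * rng.normal(size=z.shape), z)
# Unrelated embeddings: the loss should be clearly higher
loss_rand = info_nce_loss(rng.normal(size=(8, 16)), z)
```

With aligned views the diagonal dominates each row of the similarity matrix, so `loss_pos` comes out much lower than `loss_rand`; the dropout-noise issue SimCSE++ targets arises because the two views of the same sentence are never exactly identical.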

Large Language Models can Contrastively Refine their Generation for Better Sentence Representation Learning

H Wang, Z Li, L Cheng, L Bing - … of the 2024 Conference of the …, 2024 - aclanthology.org
Recently, large language models (LLMs) have emerged as a groundbreaking technology
and their unparalleled text generation capabilities have sparked interest in their application …

Identical and Fraternal Twins: Fine-Grained Semantic Contrastive Learning of Sentence Representations

Q Xiao, S Li, L Chen - ECAI 2023, 2023 - ebooks.iospress.nl
Unsupervised learning of sentence representations has been significantly enhanced by the
use of contrastive learning. This approach clusters the …