[Book] Pretrained transformers for text ranking: BERT and beyond

J Lin, R Nogueira, A Yates - 2022 - books.google.com
The goal of text ranking is to generate an ordered list of texts retrieved from a corpus in
response to a query. Although the most common formulation of text ranking is search …

Transfer learning approaches for building cross-language dense retrieval models

S Nair, E Yang, D Lawrie, K Duh, P McNamee… - … on Information Retrieval, 2022 - Springer
The advent of transformer-based models such as BERT has led to the rise of neural ranking
models. These models have improved the effectiveness of retrieval systems well beyond that …

Cross-language information retrieval

P Galuščáková, DW Oard, S Nair - arXiv preprint arXiv:2111.05988, 2021 - arxiv.org
Two key assumptions shape the usual view of ranked retrieval: (1) that the searcher can
choose words for their query that might appear in the documents that they wish to see, and …

On cross-lingual retrieval with multilingual text encoders

R Litschko, I Vulić, SP Ponzetto, G Glavaš - Information Retrieval Journal, 2022 - Springer
Pretrained multilingual text encoders based on neural transformer architectures, such as
multilingual BERT (mBERT) and XLM, have recently become a default paradigm for cross …

Improving cross-lingual information retrieval on low-resource languages via optimal transport distillation

Z Huang, P Yu, J Allan - Proceedings of the Sixteenth ACM International …, 2023 - dl.acm.org
Benefiting from transformer-based pre-trained language models, neural ranking models
have made significant progress. More recently, the advent of multilingual pre-trained …

C3: Continued pretraining with contrastive weak supervision for cross language ad-hoc retrieval

E Yang, S Nair, R Chandradevan… - Proceedings of the 45th …, 2022 - dl.acm.org
Pretrained language models have improved effectiveness on numerous tasks, including
ad-hoc retrieval. Recent work has shown that continuing to pretrain a language model with …

Cross-lingual language model pretraining for retrieval

P Yu, H Fei, P Li - Proceedings of the Web Conference 2021, 2021 - dl.acm.org
Existing research on cross-lingual retrieval cannot take good advantage of large-scale
pretrained language models such as multilingual BERT and XLM. We hypothesize that the …

Simple yet effective neural ranking and reranking baselines for cross-lingual information retrieval

J Lin, D Alfonso-Hermelo, V Jeronymo… - arXiv preprint arXiv …, 2023 - arxiv.org
The advent of multilingual language models has generated a resurgence of interest in
cross-lingual information retrieval (CLIR), which is the task of searching documents in one …

Steering large language models for cross-lingual information retrieval

P Guo, Y Ren, Y Hu, Y Cao, Y Li, H Huang - Proceedings of the 47th …, 2024 - dl.acm.org
In today's digital age, accessing information across language barriers poses a significant
challenge, with conventional search systems often struggling to interpret and retrieve …

Soft Prompt Decoding for Multilingual Dense Retrieval

Z Huang, H Zeng, H Zamani, J Allan - Proceedings of the 46th …, 2023 - dl.acm.org
In this work, we explore a Multilingual Information Retrieval (MLIR) task, where the collection
includes documents in multiple languages. We demonstrate that applying state-of-the-art …