[Book] Pretrained Transformers for Text Ranking: BERT and Beyond

J Lin, R Nogueira, A Yates - 2022 - books.google.com
The goal of text ranking is to generate an ordered list of texts retrieved from a corpus in
response to a query. Although the most common formulation of text ranking is search …

Document ranking with a pretrained sequence-to-sequence model

R Nogueira, Z Jiang, J Lin - arXiv preprint arXiv:2003.06713, 2020 - arxiv.org
This work proposes a novel adaptation of a pretrained sequence-to-sequence model to the
task of document ranking. Our approach is fundamentally different from a commonly …

Passage Re-ranking with BERT

R Nogueira, K Cho - arXiv preprint arXiv:1901.04085, 2019 - arxiv.org
Recently, neural models pretrained on a language modeling task, such as ELMo (Peters et
al., 2017), OpenAI GPT (Radford et al., 2018), and BERT (Devlin et al., 2018), have achieved …
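As a concrete illustration of the cross-encoder re-ranking setup this line of work popularized, the sketch below scores each query–passage pair jointly with a BERT sequence-classification head and sorts passages by relevance probability. This is a minimal sketch, assuming a Hugging Face Transformers environment; the checkpoint name is a placeholder, not the authors' released model, and a relevance-fine-tuned checkpoint is assumed in practice.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder checkpoint; a BERT-style model fine-tuned for query-passage
# relevance would be loaded the same way.
model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)
model.eval()

def rerank(query: str, passages: list[str]) -> list[tuple[str, float]]:
    """Score each (query, passage) pair jointly and sort passages by score."""
    scores = []
    with torch.no_grad():
        for passage in passages:
            # Query and passage are packed into a single input:
            # [CLS] query [SEP] passage [SEP]
            inputs = tokenizer(query, passage, truncation=True,
                               max_length=512, return_tensors="pt")
            logits = model(**inputs).logits
            # Probability of the "relevant" class serves as the ranking score.
            scores.append(torch.softmax(logits, dim=-1)[0, 1].item())
    return sorted(zip(passages, scores), key=lambda x: x[1], reverse=True)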

Multi-stage document ranking with BERT

R Nogueira, W Yang, K Cho, J Lin - arXiv preprint arXiv:1910.14424, 2019 - arxiv.org
The advent of deep neural networks pre-trained via language modeling tasks has spurred a
number of successful applications in natural language processing. This work explores one …

Utilizing BERT for Information Retrieval: Survey, Applications, Resources, and Challenges

J Wang, JX Huang, X Tu, J Wang, AJ Huang… - ACM Computing …, 2024 - dl.acm.org
Recent years have witnessed a substantial increase in the use of deep learning to solve
various natural language processing (NLP) problems. Early deep learning models were …

CEDR: Contextualized embeddings for document ranking

S MacAvaney, A Yates, A Cohan… - Proceedings of the 42nd …, 2019 - dl.acm.org
Although considerable attention has been given to neural ranking architectures recently, far
less attention has been paid to the term representations that are used as input to these …

A deep look into neural ranking models for information retrieval

J Guo, Y Fan, L Pang, L Yang, Q Ai, H Zamani… - Information Processing …, 2020 - Elsevier
Ranking models lie at the heart of research on information retrieval (IR). During the past
decades, different techniques have been proposed for constructing ranking models, from …

PARADE: Passage Representation Aggregation for Document Reranking

C Li, A Yates, S MacAvaney, B He, Y Sun - ACM Transactions on …, 2023 - dl.acm.org
Pre-trained transformer models, such as BERT and T5, have been shown to be highly effective at
ad hoc passage and document ranking. Due to the inherent sequence length limits of these …

Rethinking search: making domain experts out of dilettantes

D Metzler, Y Tay, D Bahri, M Najork - ACM SIGIR Forum, 2021 - dl.acm.org
When experiencing an information need, users want to engage with a domain expert, but
often turn to an information retrieval system, such as a search engine, instead. Classical …

An introduction to neural information retrieval

B Mitra, N Craswell - Foundations and Trends® in Information …, 2018 - nowpublishers.com
Neural ranking models for information retrieval (IR) use shallow or deep neural networks to
rank search results in response to a query. Traditional learning to rank models employ …