This work proposes a novel adaptation of a pretrained sequence-to-sequence model to the task of document ranking. Our approach is fundamentally different from a commonly …
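The entry above frames document ranking as a sequence-to-sequence problem: the model reads a query-document prompt and relevance is scored by the probability of generating one target word rather than another. A minimal sketch of that general recipe follows; the prompt template, the t5-base checkpoint, and the "true"/"false" target tokens are illustrative assumptions, and a checkpoint that has not been fine-tuned for ranking will not produce meaningful scores.

```python
import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Sketch: score relevance as P("true") vs. P("false") for the first decoded token.
# Prompt template and target words are assumptions, not taken from the entry above.
tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base").eval()

def seq2seq_relevance(query: str, doc: str) -> float:
    prompt = f"Query: {query} Document: {doc} Relevant:"
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=512)
    start = torch.tensor([[model.config.decoder_start_token_id]])
    with torch.no_grad():
        logits = model(**inputs, decoder_input_ids=start).logits[0, 0]
    true_id = tokenizer.encode("true")[0]    # first sentencepiece token of "true"
    false_id = tokenizer.encode("false")[0]  # first sentencepiece token of "false"
    probs = torch.softmax(logits[[true_id, false_id]], dim=0)
    return probs[0].item()  # probability mass assigned to "true"
```

Candidate documents are then sorted by this score, so the generation head doubles as a ranking head.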
Passage Re-ranking with BERT
R Nogueira, K Cho - arXiv preprint arXiv:1901.04085, 2019 - arxiv.org
Recently, neural models pretrained on a language modeling task, such as ELMo (Peters et al., 2017), OpenAI GPT (Radford et al., 2018), and BERT (Devlin et al., 2018), have achieved …
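The re-ranking setup this entry describes feeds the query and a candidate passage jointly into BERT and reads a relevance score off the classification head. It can be outlined as below; the bert-base-uncased checkpoint is a stand-in, since the paper fine-tunes BERT on MS MARCO and an untuned classification head is randomly initialized.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Sketch of cross-encoder re-ranking: pack query and passage into one input,
# take the classifier's probability of the "relevant" class as the score.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # head must be fine-tuned on relevance labels
).eval()

def rerank(query: str, passages: list[str]) -> list[tuple[float, str]]:
    scored = []
    for passage in passages:
        inputs = tokenizer(query, passage, return_tensors="pt",
                           truncation=True, max_length=512)
        with torch.no_grad():
            logits = model(**inputs).logits
        scored.append((torch.softmax(logits, dim=-1)[0, 1].item(), passage))
    return sorted(scored, reverse=True)  # highest relevance first
```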
The advent of deep neural networks pre-trained via language modeling tasks has spurred a number of successful applications in natural language processing. This work explores one …
J Wang, JX Huang, X Tu, J Wang, AJ Huang… - ACM Computing Surveys, 2024 - dl.acm.org
Recent years have witnessed a substantial increase in the use of deep learning to solve various natural language processing (NLP) problems. Early deep learning models were …
Although considerable attention has been given to neural ranking architectures recently, far less attention has been paid to the term representations that are used as input to these …
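The contrast this entry draws, ranking architectures versus the term representations fed into them, is the core of contextualized-embedding rankers: swap static word vectors for one contextual vector per token from a pretrained encoder. A minimal sketch, with the checkpoint and the single-sequence encoding left as assumptions:

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Sketch: produce contextualized term representations to feed a downstream
# ranking architecture, in place of static word embeddings.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased").eval()

def term_representations(text: str) -> torch.Tensor:
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        hidden = encoder(**inputs).last_hidden_state  # (1, seq_len, hidden_dim)
    return hidden[0]  # one context-dependent vector per input token
```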
Ranking models lie at the heart of research on information retrieval (IR). During the past decades, different techniques have been proposed for constructing ranking models, from …
Pre-trained transformer models, such as BERT and T5, have been shown to be highly effective at ad hoc passage and document ranking. Due to the inherent sequence length limits of these …
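The length-limit workaround this entry refers to is usually implemented by windowing: split the document into overlapping passages, score each against the query, and aggregate. The sketch below uses max-pooling over passage scores, a common baseline rather than the aggregation method of any particular paper; the window size and stride are arbitrary choices.

```python
# Sketch: passage-level scoring for documents longer than the model's input limit.
def split_passages(doc: str, size: int = 150, stride: int = 75) -> list[str]:
    words = doc.split()
    # Overlapping word windows; a short document yields a single passage.
    return [" ".join(words[i:i + size])
            for i in range(0, max(len(words) - size, 0) + 1, stride)]

def document_score(query: str, doc: str, score_passage) -> float:
    # score_passage(query, passage) -> float can be any passage-level re-ranker,
    # e.g. the cross-encoder sketched earlier in this list.
    return max(score_passage(query, p) for p in split_passages(doc))
```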
When experiencing an information need, users want to engage with a domain expert, but often turn to an information retrieval system, such as a search engine, instead. Classical …
An Introduction to Neural Information Retrieval
B Mitra, N Craswell - Foundations and Trends® in Information Retrieval, 2018 - nowpublishers.com
Neural ranking models for information retrieval (IR) use shallow or deep neural networks to rank search results in response to a query. Traditional learning-to-rank models employ …
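The traditional learning-to-rank setup mentioned here is commonly trained with a pairwise objective: given one relevant and one non-relevant document for a query, push the relevant score higher by a margin. A self-contained sketch, with an illustrative feed-forward scorer over hand-crafted features standing in for any real model:

```python
import torch

# Sketch of pairwise learning to rank with a margin loss over feature vectors.
scorer = torch.nn.Sequential(
    torch.nn.Linear(10, 32),  # 10 hand-crafted IR features, chosen arbitrarily
    torch.nn.ReLU(),
    torch.nn.Linear(32, 1),
)
optimizer = torch.optim.Adam(scorer.parameters(), lr=1e-3)
loss_fn = torch.nn.MarginRankingLoss(margin=1.0)

def train_step(pos_feats: torch.Tensor, neg_feats: torch.Tensor) -> float:
    # pos_feats / neg_feats: (batch, 10) features of relevant / non-relevant docs.
    pos = scorer(pos_feats).squeeze(-1)
    neg = scorer(neg_feats).squeeze(-1)
    target = torch.ones_like(pos)  # +1: pos should score above neg by the margin
    loss = loss_fn(pos, neg, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```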