[Book] Pretrained transformers for text ranking: BERT and beyond

J Lin, R Nogueira, A Yates - 2022 - books.google.com
The goal of text ranking is to generate an ordered list of texts retrieved from a corpus in
response to a query. Although the most common formulation of text ranking is search …

Large language models can accurately predict searcher preferences

P Thomas, S Spielman, N Craswell… - Proceedings of the 47th …, 2024 - dl.acm.org
Much of the evaluation and tuning of a search system relies on relevance labels:
annotations that say whether a document is useful for a given search and searcher. Ideally …

Improving efficient neural ranking models with cross-architecture knowledge distillation

S Hofstätter, S Althammer, M Schröder… - arXiv preprint arXiv …, 2020 - arxiv.org
Retrieval and ranking models are the backbone of many applications such as web search,
open domain QA, or text-based recommender systems. The latency of neural ranking …

Utilizing BERT for Information Retrieval: Survey, Applications, Resources, and Challenges

J Wang, JX Huang, X Tu, J Wang, AJ Huang… - ACM Computing …, 2024 - dl.acm.org
Recent years have witnessed a substantial increase in the use of deep learning to solve
various natural language processing (NLP) problems. Early deep learning models were …

Learning passage impacts for inverted indexes

A Mallia, O Khattab, T Suel, N Tonellotto - Proceedings of the 44th …, 2021 - dl.acm.org
Neural information retrieval systems typically use a cascading pipeline, in which a first-stage
model retrieves a candidate set of documents and one or more subsequent stages re-rank …

PARADE: Passage Representation Aggregation for Document Reranking

C Li, A Yates, S MacAvaney, B He, Y Sun - ACM Transactions on …, 2023 - dl.acm.org
Pre-trained transformer models, such as BERT and T5, have been shown to be highly effective at
ad hoc passage and document ranking. Due to the inherent sequence length limits of these …

BERT-QE: contextualized query expansion for document re-ranking

Z Zheng, K Hui, B He, X Han, L Sun, A Yates - arXiv preprint arXiv …, 2020 - arxiv.org
Query expansion aims to mitigate the mismatch between the language used in a query and
in a document. However, query expansion methods can suffer from introducing non-relevant …

Efficient and effective tree-based and neural learning to rank

S Bruch, C Lucchese, FM Nardini - Foundations and Trends® …, 2023 - nowpublishers.com
As information retrieval researchers, we not only develop algorithmic solutions to hard
problems, but we also insist on a proper, multifaceted evaluation of ideas. The literature on …

Societal biases in retrieved contents: Measurement framework and adversarial mitigation of BERT rankers

N Rekabsaz, S Kopeinik, M Schedl - … of the 44th International ACM SIGIR …, 2021 - dl.acm.org
Societal biases resonate in the retrieved contents of information retrieval (IR) systems,
resulting in reinforcing existing stereotypes. Approaching this issue requires established …

ORCAS: 18 million clicked query-document pairs for analyzing search

N Craswell, D Campos, B Mitra, E Yilmaz… - Proceedings of the 29th …, 2020 - dl.acm.org
Users of Web search engines reveal their information needs through queries and clicks,
making click logs a useful asset for information retrieval. However, click logs have not been …