Cross-domain sentence modeling for relevance transfer with BERT

Z Akkalyoncu Yilmaz - 2019 - uwspace.uwaterloo.ca
Standard bag-of-words term-matching techniques in document retrieval fail to exploit rich
semantic information embedded in the document texts. One promising recent trend in …

Cross-domain modeling of sentence-level evidence for document retrieval

ZA Yilmaz, W Yang, H Zhang, J Lin - Proceedings of the 2019 …, 2019 - aclanthology.org
This paper applies BERT to ad hoc document retrieval on news articles, which requires
addressing two challenges: relevance judgments in existing test collections are typically …

Highlighting exact matching via marking strategies for ad hoc document ranking with pretrained contextualized language models

L Boualili, JG Moreno, M Boughanem - Information Retrieval Journal, 2022 - Springer
Pretrained language models (PLMs) exemplified by BERT have proven to be remarkably
effective for ad hoc ranking. As opposed to pre-BERT models that required specialized …

Document retrieval using deep learning

S Choudhary, H Guttikonda… - 2020 Systems and …, 2020 - ieeexplore.ieee.org
Document Retrieval has seen significant advancements in the last few decades. Latest
developments in Natural Language Processing have made it possible to incorporate context …

Applying BERT to document retrieval with Birch

ZA Yilmaz, S Wang, W Yang, H Zhang… - Proceedings of the 2019 …, 2019 - aclanthology.org
We present Birch, a system that applies BERT to document retrieval via integration with the
open-source Anserini information retrieval toolkit to demonstrate end-to-end search over …
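The Birch paper combines a first-stage (e.g., BM25) document score with the scores of the top-k BERT-scored sentences in that document. A minimal sketch of that scoring combination, with hypothetical alpha and per-sentence weights (the actual values are tuned in the paper):

```python
def birch_score(doc_score, sentence_scores, alpha=0.5, weights=(1.0, 0.5, 0.3)):
    """Interpolate a first-stage retrieval score with top-k sentence scores.

    doc_score:       score from the initial ranker (e.g., BM25 via Anserini)
    sentence_scores: BERT relevance scores for each sentence in the document
    alpha, weights:  illustrative values only; tuned on held-out data in Birch
    """
    # Take the k highest-scoring sentences (k = number of weights).
    top_k = sorted(sentence_scores, reverse=True)[: len(weights)]
    # Weighted sum of sentence evidence, interpolated with the document score.
    sentence_evidence = sum(w * s for w, s in zip(weights, top_k))
    return alpha * doc_score + (1 - alpha) * sentence_evidence
```

Documents are then re-ranked by this combined score, so sentence-level relevance evidence refines, rather than replaces, the initial ranking.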

Contextualized offline relevance weighting for efficient and effective neural retrieval

X Chen, B He, K Hui, Y Wang, L Sun… - Proceedings of the 44th …, 2021 - dl.acm.org
Online search latency is a major bottleneck in deploying large-scale pre-trained language
models, e.g., BERT, in retrieval applications. Inspired by the recent advances in transformer …

Cross-lingual relevance transfer for document retrieval

P Shi, J Lin - arXiv preprint arXiv:1911.02989, 2019 - arxiv.org
Recent work has shown the surprising ability of multi-lingual BERT to serve as a zero-shot
cross-lingual transfer model for a number of language processing tasks. We combine this …

An in-depth analysis of passage-level label transfer for contextual document ranking

K Rudra, ZT Fernando, A Anand - Information Retrieval Journal, 2023 - Springer
Pre-trained contextual language models such as BERT, GPT, and XLNet work quite well for
document retrieval tasks. Such models are fine-tuned based on the query-document/query …

Brown University at TREC Deep Learning 2019

G Zerveas, R Zhang, L Kim, C Eickhoff - arXiv preprint arXiv:2009.04016, 2020 - arxiv.org
This paper describes Brown University's submission to the TREC 2019 Deep Learning track.
We followed a 2-phase method for producing a ranking of passages for a given input query …

Multi-Layer Contextual Passage Term Embedding for Ad-Hoc Retrieval

W Cai, Z Hu, Y Luo, D Liang, Y Feng, J Chen - Information, 2022 - mdpi.com
Nowadays, pre-trained language models such as Bidirectional Encoder Representations
from Transformer (BERT) are becoming a basic building block in Information Retrieval tasks …