C Li, A Yates, S MacAvaney, B He, Y Sun - arXiv e-prints, 2020 - ui.adsabs.harvard.edu
Pretrained transformer models, such as BERT and T5, have been shown to be highly effective at
ad-hoc passage and document ranking. Due to inherent sequence length limits of these …