L. Gao, J. Callan. arXiv preprint arXiv:2104.08253, 2021.
Pre-trained language models (LMs) have become the go-to text representation encoders. Prior
research used deep LMs to encode text sequences such as sentences and passages into …