Reproduces Nogueira, R., Cho, K., 2019. Passage Re-ranking with BERT. https://arxiv.org/abs/1901.04085
monoMLM/monobert
: training with bert-base-uncased. Hugging Face: xpmir/monobert (a scoring sketch follows the list below)
Some derivatives:
monoMLM/monoelectra
: same setup, training with an ELECTRA model instead of BERT
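MonoBERT scores each (query, passage) pair jointly with a cross-encoder: the two texts are concatenated and the classification head produces a relevance score. The following is a minimal sketch of that scoring scheme using the Hugging Face transformers library, not the xpmir API; the model name and the scoring helper are assumptions, and a freshly initialized classification head must be fine-tuned (e.g. on MS MARCO) before the scores are meaningful.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Cross-encoder setup: query and passage are fed jointly to BERT and the
# classification head yields a relevance score (monoBERT-style scoring).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)
model.eval()

def relevance(query: str, passage: str) -> float:
    """Probability that `passage` is relevant to `query` (class index 1)."""
    inputs = tokenizer(
        query, passage, truncation=True, max_length=512, return_tensors="pt"
    )
    with torch.no_grad():
        logits = model(**inputs).logits
    return torch.softmax(logits, dim=-1)[0, 1].item()

# Re-rank a candidate list by descending relevance score.
candidates = ["BERT is a transformer encoder.", "Paris is in France."]
ranked = sorted(candidates, key=lambda p: relevance("what is BERT", p), reverse=True)
```

For the monoelectra derivative, the same sketch applies with an ELECTRA checkpoint in place of bert-base-uncased.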
Reproduces Nogueira, R., Jiang, Z., Lin, J., 2020. Document Ranking with a Pretrained Sequence-to-Sequence Model. https://arxiv.org/abs/2003.06713
monot5/normal
: training with t5-base, using the "true" and "false" tokens for the answers (see the scoring sketch below)
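monoT5 casts relevance as text generation: the model receives "Query: … Document: … Relevant:" and the score is the probability of the token "true" against "false" at the first decoder step. Below is a minimal sketch of that scoring scheme with plain transformers; the prompt template follows the paper, but the helper itself is an illustrative assumption, not the xpmir implementation.

```python
import torch
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")
model.eval()

# Ids of the answer tokens used by monoT5 ("true" = relevant).
TRUE_ID = tokenizer.encode("true")[0]
FALSE_ID = tokenizer.encode("false")[0]

def relevance(query: str, document: str) -> float:
    """P(true) over {true, false} at the first decoder position."""
    prompt = f"Query: {query} Document: {document} Relevant:"
    inputs = tokenizer(prompt, return_tensors="pt", truncation=True)
    # T5's decoder starts from its decoder_start_token_id (the pad token).
    start = torch.tensor([[model.config.decoder_start_token_id]])
    with torch.no_grad():
        logits = model(**inputs, decoder_input_ids=start).logits
    pair = logits[0, 0, [FALSE_ID, TRUE_ID]]
    return torch.softmax(pair, dim=-1)[1].item()
```

Note that an off-the-shelf t5-base produces arbitrary scores here; as in the paper, the model must first be fine-tuned to emit "true"/"false" for relevant/non-relevant pairs.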