Cross-Encoders

MonoBERT

Reproduces Passage Re-ranking with BERT (Rodrigo Nogueira and Kyunghyun Cho, 2019). https://arxiv.org/abs/1901.04085

  • monoMLM/monobert: trained with bert-base-uncased; available on Hugging Face as xpmir/monobert (usage sketch below)

Some derivatives:

  • monoMLM/monoelectra: trained with ELECTRA
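For reference, below is a minimal scoring sketch using the Hugging Face transformers library. It assumes the xpmir/monobert checkpoint loads as a standard BERT sequence-classification model; the example query, passage, and the way the relevance score is read from the logits are illustrative assumptions, not this repository's own evaluation code.

```python
# Minimal monoBERT-style re-ranking sketch.
# Assumption: the checkpoint exposes a standard sequence-classification head
# loadable with AutoModelForSequenceClassification.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("xpmir/monobert")
model = AutoModelForSequenceClassification.from_pretrained("xpmir/monobert")
model.eval()

query = "what causes rainbows"  # illustrative query/passage pair
passage = "Rainbows are caused by the refraction of sunlight in water droplets."

# monoBERT scores a (query, passage) pair by feeding both segments to BERT.
inputs = tokenizer(query, passage, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits

# With a 2-class head, use the probability of the "relevant" class as the score;
# with a single regression output, the raw logit can be used directly.
score = logits.softmax(-1)[0, -1].item() if logits.shape[-1] > 1 else logits.item()
print(score)
```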

MonoT5

Reproduces Document Ranking with a Pretrained Sequence-to-Sequence Model (Rodrigo Nogueira, Zhiying Jiang, and Jimmy Lin, 2020). https://arxiv.org/abs/2003.06713

  • monot5/normal: trained with t5-base, using the true and false tokens as relevance answers (scoring sketch below)
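For reference, a minimal scoring sketch in the monoT5 style. The hub id of the trained checkpoint is not given above, so t5-base is used here as a placeholder; the prompt format and the reading of the true/false logits follow the paper's description and are assumptions, not this repository's code.

```python
# Minimal monoT5-style scoring sketch.
# Placeholder checkpoint: t5-base -- substitute the fine-tuned monot5/normal checkpoint.
import torch
from transformers import T5Tokenizer, T5ForConditionalGeneration

model_name = "t5-base"  # placeholder, not the trained checkpoint
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)
model.eval()

query = "what causes rainbows"
passage = "Rainbows are caused by the refraction of sunlight in water droplets."

# monoT5 casts re-ranking as text generation: the model is asked whether the
# document is relevant, and the score is the probability of generating "true".
text = f"Query: {query} Document: {passage} Relevant:"
inputs = tokenizer(text, return_tensors="pt", truncation=True)

true_id = tokenizer.encode("true", add_special_tokens=False)[0]
false_id = tokenizer.encode("false", add_special_tokens=False)[0]

decoder_input_ids = torch.full(
    (1, 1), model.config.decoder_start_token_id, dtype=torch.long
)
with torch.no_grad():
    logits = model(**inputs, decoder_input_ids=decoder_input_ids).logits[0, -1]

# Relevance score: log-probability of "true" after a softmax over {true, false}.
score = torch.log_softmax(logits[[true_id, false_id]], dim=0)[0].item()
print(score)
```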
