Releases: allanj/pytorch_neural_crf
Fine-Tuning LSTM-CRF model with HuggingFace's Transformers
On top of the previous version, we add an option to fine-tune BERT/RoBERTa/etc.-based LSTM-CRF models using HuggingFace's Transformers package.
Our latest benchmarks show that we obtain SOTA results on both the CoNLL-2003 and OntoNotes datasets, and that RoBERTa-CRF outperforms BERT-CRF.
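The decoding step shared by all these model variants (BERT-CRF, RoBERTa-CRF, BiLSTM-CRF) is Viterbi search over the CRF layer's scores. A minimal pure-Python sketch of that step, assuming per-token emission scores (e.g. from a linear layer over BiLSTM/transformer features) and a tag-transition matrix; this is illustrative only, not the repo's actual implementation:

```python
def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence and its score.

    emissions: list of [num_tags] score lists, one per token
        (assumed produced by a linear layer over encoder features).
    transitions[i][j]: score of moving from tag i to tag j.
    """
    num_tags = len(emissions[0])
    # score[j] = best score of any path ending in tag j at the current step
    score = list(emissions[0])
    backpointers = []
    for emit in emissions[1:]:
        new_score, bp = [], []
        for j in range(num_tags):
            # best previous tag to arrive at tag j
            best_prev, best_val = max(
                ((i, score[i] + transitions[i][j]) for i in range(num_tags)),
                key=lambda t: t[1],
            )
            new_score.append(best_val + emit[j])
            bp.append(best_prev)
        score = new_score
        backpointers.append(bp)
    # backtrace from the best final tag
    best_tag = max(range(num_tags), key=lambda j: score[j])
    path = [best_tag]
    for bp in reversed(backpointers):
        path.append(bp[path[-1]])
    path.reverse()
    return path, max(score)
```

The same routine applies regardless of which encoder produced the emission scores, which is why swapping BERT for RoBERTa under the CRF requires no change to decoding.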
LSTM-CRF model with static context embedding
As we are about to push the version with the Transformers package (by HuggingFace) integrated, it is necessary to release the current version as a base, so that everyone who uses this repo can get back to this version if they want to start from there.
Major functionality:
- State-of-the-art BiLSTM-CRF architecture
- Support for incorporating static context embeddings
- Saving the trained model for use in other downstream tasks
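Training the BiLSTM-CRF in the first bullet minimizes the CRF negative log-likelihood, whose normalizer is the log-partition over all tag sequences, computed with the forward algorithm. A minimal pure-Python sketch, assuming per-token emission scores and a tag-transition matrix (illustrative only, not the repo's actual code):

```python
import math

def crf_log_partition(emissions, transitions):
    """Log of the sum of exp(score) over all possible tag sequences
    (the normalizer in the CRF negative log-likelihood).

    emissions: list of [num_tags] score lists, one per token.
    transitions[i][j]: score of moving from tag i to tag j.
    """
    num_tags = len(emissions[0])
    # alpha[j] = logsumexp of scores of all paths ending in tag j so far
    alpha = list(emissions[0])
    for emit in emissions[1:]:
        new_alpha = []
        for j in range(num_tags):
            terms = [alpha[i] + transitions[i][j] for i in range(num_tags)]
            m = max(terms)  # subtract the max for numerical stability
            new_alpha.append(
                m + math.log(sum(math.exp(t - m) for t in terms)) + emit[j]
            )
        alpha = new_alpha
    m = max(alpha)
    return m + math.log(sum(math.exp(a - m) for a in alpha))
```

The loss for a gold sequence is then its path score minus this log-partition; the dynamic program makes that tractable in O(n · num_tags²) instead of enumerating all num_tags^n sequences.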