Fix knowledge_distillation=False error #1264
Open
When I was fine-tuning `bge-en-icl`, I got an error when using `--knowledge_distillation False`.
The error looks like:
I found that the cause is that when `knowledge_distillation=False`, `teacher_scores` becomes `None`:
FlagEmbedding/FlagEmbedding/finetune/embedder/decoder_only/icl/dataset.py
Lines 92 to 100 in 777a9ac
FlagEmbedding/FlagEmbedding/finetune/embedder/decoder_only/icl/dataset.py
Lines 128 to 129 in 777a9ac
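The fix is to guard the score collection on the flag so that list methods are never called on `None`. Below is a minimal, self-contained sketch of that pattern; the function name and the "pos_scores" / "neg_scores" keys are illustrative assumptions, not the actual FlagEmbedding dataset code:

```python
from typing import Dict, List, Optional


def collect_teacher_scores(
    data: Dict[str, list],
    knowledge_distillation: bool,
) -> Optional[List[float]]:
    """Collect teacher scores for one training example.

    Returns None when knowledge distillation is disabled, so callers
    must also check for None before using the scores.
    """
    if not knowledge_distillation:
        # Without this early return, teacher_scores stays None and the
        # .extend() calls below raise AttributeError, which is the
        # failure reported in this issue.
        return None

    teacher_scores: List[float] = []
    teacher_scores.extend(data.get("pos_scores", []))
    teacher_scores.extend(data.get("neg_scores", []))
    return teacher_scores


# Usage: with --knowledge_distillation False the scores are simply skipped.
example = {"pos_scores": [0.9], "neg_scores": [0.1, 0.2]}
print(collect_teacher_scores(example, knowledge_distillation=False))  # None
print(collect_teacher_scores(example, knowledge_distillation=True))   # [0.9, 0.1, 0.2]
```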