[docs] update SoftmaxLoss arguments (UKPLab#2894)
The model's sentence embedding dimension and the number of labels were missing from the `SoftmaxLoss` example; both arguments are mandatory, so the snippet as written raised an error.
KiLJ4EdeN authored Aug 20, 2024
1 parent c0fc0e8 commit add421f
Showing 1 changed file with 1 addition and 1 deletion.
docs/sentence_transformer/training_overview.md (1 addition, 1 deletion)
@@ -586,7 +586,7 @@ Training on multiple datasets looks like this:
 # (anchor, positive), (anchor, positive, negative)
 mnrl_loss = MultipleNegativesRankingLoss(model)
 # (sentence_A, sentence_B) + class
-softmax_loss = SoftmaxLoss(model)
+softmax_loss = SoftmaxLoss(model, model.get_sentence_embedding_dimension(), 3)
 # (sentence_A, sentence_B) + score
 cosent_loss = CoSENTLoss(model)
