From add421f21508cd2baf4cd32af31624c63b355a1d Mon Sep 17 00:00:00 2001
From: Abdolkarim Saeedi
Date: Tue, 20 Aug 2024 09:49:05 +0330
Subject: [PATCH] [`docs`] update SoftmaxLoss arguments (#2894)

The model's sentence embedding dimension and the number of labels are
mandatory arguments but were missing, leading to errors.
---
 docs/sentence_transformer/training_overview.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/sentence_transformer/training_overview.md b/docs/sentence_transformer/training_overview.md
index 35be59100..80aa2c86b 100644
--- a/docs/sentence_transformer/training_overview.md
+++ b/docs/sentence_transformer/training_overview.md
@@ -586,7 +586,7 @@ Training on multiple datasets looks like this:
     # (anchor, positive), (anchor, positive, negative)
     mnrl_loss = MultipleNegativesRankingLoss(model)
     # (sentence_A, sentence_B) + class
-    softmax_loss = SoftmaxLoss(model)
+    softmax_loss = SoftmaxLoss(model, model.get_sentence_embedding_dimension(), 3)
     # (sentence_A, sentence_B) + score
     cosent_loss = CoSENTLoss(model)
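For context on why both extra arguments are mandatory: `SoftmaxLoss` builds a linear classification head over a concatenation of the two sentence embeddings, so it must know the embedding dimension and the label count at construction time. A minimal sketch (not the library's actual code) of the head's input size, assuming the documented default concatenation of u, v, and |u - v|:

```python
# Sketch only: SoftmaxLoss concatenates sentence-embedding features and feeds
# them to a Linear(in_features, num_labels) head. This helper mirrors how the
# in_features count depends on the embedding dimension and the (assumed
# default) concatenation flags.
def softmax_head_input_dim(
    embedding_dim: int,
    concatenation_sent_rep: bool = True,              # include u and v
    concatenation_sent_difference: bool = True,       # include |u - v|
    concatenation_sent_multiplication: bool = False,  # include u * v
) -> int:
    """Number of features fed to the classification head."""
    n_vectors = (
        2 * concatenation_sent_rep
        + concatenation_sent_difference
        + concatenation_sent_multiplication
    )
    return n_vectors * embedding_dim

# With the defaults, a 384-dim model yields a 3 * 384 = 1152-feature head.
print(softmax_head_input_dim(384))  # 1152
```

This is why `SoftmaxLoss(model)` alone fails: without `model.get_sentence_embedding_dimension()` and `num_labels` (3 for typical NLI data), the head's shape cannot be determined.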