Issues: UKPLab/sentence-transformers
#3136: Memory leaked when the model and trainer were reinitialized (opened Dec 14, 2024 by earthlovebpt)
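A common cleanup pattern between reinitializations (a sketch, not a confirmed fix for the leak reported in #3136) is to drop every reference to the old model, run garbage collection, and clear the CUDA allocator cache; the checkpoint name below is just an example.

```python
import gc

import torch
from sentence_transformers import SentenceTransformer

for run in range(3):
    # Reinitialize the model on each run, as in the issue title.
    model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
    model.encode(["a quick smoke-test sentence"])

    # Workaround sketch: drop the reference, collect garbage, and release
    # cached GPU memory before the next reinitialization.
    del model
    gc.collect()
    if torch.cuda.is_available():
        torch.cuda.empty_cache()
```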
#3134: How to set a proper batch size when using CachedMultipleNegativesRankingLoss? (opened Dec 13, 2024 by awmoe)
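For #3134, CachedMultipleNegativesRankingLoss is designed to decouple the two sizes: per_device_train_batch_size sets the effective batch (and therefore the number of in-batch negatives), while mini_batch_size only caps peak GPU memory. A minimal sketch, assuming an all-MiniLM-L6-v2 checkpoint and a hypothetical output directory:

```python
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainingArguments
from sentence_transformers.losses import CachedMultipleNegativesRankingLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# mini_batch_size only bounds peak GPU memory (the forward/backward passes are
# chunked); it does not change the loss itself.
loss = CachedMultipleNegativesRankingLoss(model, mini_batch_size=32)

args = SentenceTransformerTrainingArguments(
    output_dir="output/cached-mnrl",    # hypothetical output path
    per_device_train_batch_size=1024,   # large logical batch = many in-batch negatives
)
```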
#3130: grad_norm 0.0 while finetuning using the group_by_label batch sampler (opened Dec 10, 2024 by AmoghM)
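For context on #3130: the group_by_label sampler is selected through the training arguments and is intended for losses that need several samples per label within one batch; if a batch ends up with no valid triplets, the loss (and hence the gradient norm) can drop to zero. A toy sketch with an illustrative dataset and output path:

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import BatchAllTripletLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# A tiny illustrative dataset with a "label" column, as required by the
# Batch*TripletLoss family.
train_dataset = Dataset.from_dict({
    "sentence": ["a cat", "a kitten", "a car", "a truck"],
    "label": [0, 0, 1, 1],
})

args = SentenceTransformerTrainingArguments(
    output_dir="output/group-by-label",          # hypothetical output path
    per_device_train_batch_size=4,
    batch_sampler=BatchSamplers.GROUP_BY_LABEL,  # groups rows so labels repeat within a batch
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=BatchAllTripletLoss(model),
)
trainer.train()
```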
#3128: MSMARCO training with SentenceTransformerTrainer instead of the deprecated training scripts (opened Dec 10, 2024 by sirCamp)
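For #3128, the modern replacement for the old MSMARCO scripts is the SentenceTransformerTrainer API. A minimal sketch, assuming a (query, positive, negative) triplet dataset; the dataset id, base checkpoint, and output path below are placeholders:

```python
from datasets import load_dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("microsoft/mpnet-base")  # illustrative base checkpoint

# Placeholder id: substitute an MS MARCO triplet dataset with
# (query, positive, negative) columns; the trainer feeds columns to the loss in order.
train_dataset = load_dataset("your-namespace/msmarco-triplets", split="train")

args = SentenceTransformerTrainingArguments(
    output_dir="output/msmarco-mnrl",  # hypothetical output path
    per_device_train_batch_size=64,
    num_train_epochs=1,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=MultipleNegativesRankingLoss(model),
)
trainer.train()
```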
#3118: Missing documentation for encode() with image embedding models (opened Dec 4, 2024 by KennethEnevoldsen)
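On #3118: image embedding checkpoints such as the CLIP models accept PIL images directly in encode(); a brief sketch (the image paths are placeholders):

```python
from PIL import Image
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("clip-ViT-B-32")  # a CLIP-based image/text embedding model

# encode() accepts PIL images for image models and strings for text,
# producing embeddings in the same vector space.
image_embeddings = model.encode([Image.open("cat.jpg"), Image.open("dog.jpg")])
text_embeddings = model.encode(["a photo of a cat", "a photo of a dog"])

print(image_embeddings.shape, text_embeddings.shape)
```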
#3116: Adding new tokens to the tokenizer without disturbing the base model's embedding weights for existing tokens (opened Dec 4, 2024 by riyajatar37003)
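On #3116, one common approach (a sketch, not an official recipe) is to add the tokens on the model's tokenizer and then resize the token embedding matrix; resize_token_embeddings keeps the existing rows intact and only appends newly initialized rows for the added tokens:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# The Hugging Face tokenizer and backbone live on the first (Transformer) module.
transformer = model._first_module()

# Add domain-specific tokens (illustrative strings), then grow the embedding
# matrix; existing token embeddings are left untouched, while the new rows are
# randomly initialized and still need fine-tuning.
num_added = model.tokenizer.add_tokens(["[DOMAIN_TERM_1]", "[DOMAIN_TERM_2]"])
if num_added > 0:
    transformer.auto_model.resize_token_embeddings(len(model.tokenizer))
```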
#3111 [bug]: save function saves modules.json incorrectly for jina-embeddings-v3 (opened Dec 3, 2024 by guenthermi)
#3090 [bug]: Parse arguments with HfArgumentParser (opened Nov 27, 2024 by chocoded)
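For reference on #3090, the intended pattern (which the report suggests does not currently work as expected) combines the training arguments with any extra dataclasses and parses them from the command line; ModelArguments here is a hypothetical example:

```python
from dataclasses import dataclass, field

from transformers import HfArgumentParser
from sentence_transformers import SentenceTransformerTrainingArguments

@dataclass
class ModelArguments:
    # Hypothetical extra options parsed alongside the training arguments.
    model_name_or_path: str = field(default="sentence-transformers/all-MiniLM-L6-v2")

# e.g.  python train.py --output_dir output/run1 --per_device_train_batch_size 64
parser = HfArgumentParser((ModelArguments, SentenceTransformerTrainingArguments))
model_args, training_args = parser.parse_args_into_dataclasses()
```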
#3087: SentenceTransformer._first_module() and issues created with bespoke architectures (opened Nov 26, 2024 by slobstone)
#3064: Add normalize_embeddings argument to SentenceTransformer for simplified embedding normalization (opened Nov 17, 2024 by AIMacGyver)
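For #3064, normalization is currently requested per call; the issue asks for a model-level default so the flag does not have to be repeated on every encode() invocation. Current usage, with an illustrative checkpoint:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# Today: a per-call flag that L2-normalizes each embedding to unit length,
# which makes dot product equivalent to cosine similarity.
embeddings = model.encode(
    ["normalize me", "and me as well"],
    normalize_embeddings=True,
)
```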
#3056 [bug, good first issue]: FileNotFoundError when using SentenceTransformerTrainingArguments(load_best_model_at_end=True) and PEFT (opened Nov 14, 2024 by GTimothee)
#3054 [question]: 'scale' hyperparameter in MultipleNegativesRankingLoss (opened Nov 14, 2024 by gnatesan)
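On #3054: scale multiplies the similarity scores before the cross-entropy over in-batch negatives, i.e. it acts as an inverse temperature. The values shown below are the library's defaults (20.0 with cosine similarity); with dot-product similarity a scale of 1.0 is commonly used instead:

```python
from sentence_transformers import SentenceTransformer, util
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# A higher scale sharpens the softmax over (positive + in-batch negative) scores;
# a lower scale softens it.
loss = MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)
```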