RuntimeError: CUDA out of memory #18
Comments
Try lowering the batch size. Setting batch_size_gpl=16 might work, though training will take longer. The reason is the amount of GPU RAM consumed by each batch: 32 examples at once might be too big, but 16 might fit.
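When it is unclear in advance which batch size fits, a common workaround is to halve the batch size until one training step succeeds. The sketch below illustrates that pattern in plain Python; find_max_batch_size and fake_step are hypothetical names, and fake_step is only a stand-in for a real GPU training step that raises a CUDA out-of-memory RuntimeError.

```python
def find_max_batch_size(try_step, start=32, floor=1):
    """Halve the batch size until try_step runs without an OOM error."""
    bs = start
    while bs >= floor:
        try:
            try_step(bs)       # attempt one training step at this batch size
            return bs          # it fit in memory
        except RuntimeError as e:
            if "out of memory" not in str(e):
                raise          # unrelated error: don't swallow it
            bs //= 2           # OOM: try half the batch size
    raise RuntimeError("no batch size fits in GPU memory")

# Stand-in for a real step: pretend anything above 16 exhausts GPU RAM.
def fake_step(bs):
    if bs > 16:
        raise RuntimeError("CUDA out of memory")

print(find_max_batch_size(fake_step))  # prints 16
```

In practice the largest batch size found this way can then be passed to the training script (e.g. as batch_size_gpl).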
How can I use multiple GPUs in this training script?
Cross-encoder and sentence-transformer do not officially support multi-GPU training, though there are some (very old) forks that have been experimenting with this feature: UKPLab/sentence-transformers#1215
I'm getting the same error even with that setting.
Hi,
When trying to generate intermediate results with the following command:
I got the following error:
My corpus consists of small paragraphs of 3-4 lines, and I used the use_amp option. How can I deal with this error?