-
It has 10 GB already allocated and it's asking for 8 GB more; there's no way that's just 512x512. Make sure you don't have hires-fix or upscale settings enabled somewhere.
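To see why an 8 GiB request can't be a plain 512x512 generation, a back-of-the-envelope estimate of the biggest intermediate tensor helps. The sketch below uses illustrative assumptions (a Stable Diffusion-style UNet with 8 attention heads, fp16, latents at 1/8 image resolution); the function name and the exact head count are hypothetical, but the quadratic scaling with token count is the point:

```python
# Rough memory footprint of one self-attention map in a SD-style UNet.
# Assumptions (illustrative, not the exact model config):
#   - latent resolution = image resolution / 8
#   - 8 attention heads, fp16 (2 bytes per element)
def attn_map_gib(image_px: int, heads: int = 8, bytes_per: int = 2) -> float:
    tokens = (image_px // 8) ** 2            # number of latent tokens
    return tokens * tokens * heads * bytes_per / 2**30

print(f"512x512:   {attn_map_gib(512):.2f} GiB")    # 0.25 GiB
print(f"1024x1024: {attn_map_gib(1024):.2f} GiB")   # 4.00 GiB
```

Quadrupling the pixel count multiplies the attention map by 16x, which is why a stray hires or upscale pass can silently turn a quarter-gigabyte allocation into a multi-gigabyte one.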
-
Hi folks, I had this distribution purring along... then I deleted some old directories, which turned out to include the version of Python I was pointing at. I reinstalled Python, but now my install of VLAD crashes with memory errors above 512x512. I was doing 1024x1024 with no problem before, so I know my 3080 Ti can do it...
TL;DR: Can anyone tell me how to resolve this error? Would love to get back to where I was :( thank you!!
RuntimeError: CUDA out of memory. Tried to allocate 8.00 GiB (GPU 0; 12.00 GiB total capacity; 10.18 GiB already
allocated; 0 bytes free; 10.21 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting
max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
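For reference, the suggestion in the error message can be tried by setting the PyTorch caching-allocator config in the environment before launching the web UI. `512` here is just an example value to experiment with, not a recommendation from the original thread:

```shell
# Cap the size of cached allocator blocks to reduce fragmentation,
# as suggested by the OOM message. Set this before starting the UI.
export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:512
```

Note this only mitigates fragmentation; if the run genuinely needs more memory than the card has (e.g. an accidental hires pass), it won't help.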