8GB Flux LoRA training - is it possible? #1544
-
OK, so at the moment this configuration works on my NVIDIA RTX 2070 Max-Q 8GB GPU. It uses bf16 mixed precision and is EXTREMELY slow (40s/it). Sadly, fp16 mixed precision gave avr_loss=nan, or else it would be much faster. I'll keep experimenting.
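A plausible explanation for the fp16 NaN is dynamic range rather than precision: fp16 tops out at ~65504, while bf16 keeps fp32's full exponent range, so large activations or gradients overflow to inf in fp16 and then poison the loss. A minimal sketch in plain PyTorch (not kohya-specific) demonstrating the difference:

```python
import torch

# fp16 has a max finite value of ~65504; anything larger overflows to inf,
# which then propagates through the loss as NaN. bf16 trades mantissa bits
# for fp32's exponent range, so the same value survives (rounded).
x = torch.tensor([70000.0])
print(x.to(torch.float16))   # tensor([inf], dtype=torch.float16)
print(x.to(torch.bfloat16))  # tensor([70144.], dtype=torch.bfloat16)
```

The usual fp16 workaround is loss scaling (torch.cuda.amp.GradScaler). Note also that Turing-era GPUs like the 20-series lack native bf16 tensor-core support, which likely contributes to the 40s/it speed.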
-
There is a 4-bit model that takes ~8.5GB, and its author used it for fine-tuning, but it is probably not supported out of the box in kohya.
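For context, the general 4-bit loading technique looks roughly like the sketch below using diffusers (>=0.31) with a bitsandbytes NF4 config. This is an illustration only: the checkpoint name is a placeholder, the thread's specific 4-bit model may be packaged differently, and kohya would still need its own integration.

```python
import torch
from diffusers import FluxTransformer2DModel, BitsAndBytesConfig

# NF4 4-bit quantization config (bitsandbytes backend).
nf4_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# Repo name is a stand-in for illustration; the 4-bit checkpoint
# mentioned above may be distributed differently.
transformer = FluxTransformer2DModel.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    subfolder="transformer",
    quantization_config=nf4_config,
    torch_dtype=torch.bfloat16,
)
```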
-
I can't even run a proper SDXL training on 8GB, so what do you think?
-
As per the title, is it possible to squeeze a LoRA training run into 8GB of VRAM?
Also, what are the options for using less VRAM (apart from lowering the network dim and using 512px images)?
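For reference, the usual levers beyond network dim and resolution are gradient checkpointing, bf16 autocast, an 8-bit optimizer, and gradient accumulation. A minimal, self-contained sketch in plain PyTorch of those techniques (a toy model, not Flux itself; kohya sd-scripts exposes equivalents as command-line options):

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint_sequential
import bitsandbytes as bnb  # assumed installed; provides 8-bit optimizers

# Toy stand-in for the network -- NOT Flux, just to make the sketch runnable.
model = nn.Sequential(
    nn.Linear(1024, 4096), nn.GELU(), nn.Linear(4096, 1024)
).cuda()

# 8-bit optimizer states (roughly 4x smaller than fp32 Adam states).
optimizer = bnb.optim.AdamW8bit(model.parameters(), lr=1e-4)

accum_steps = 4  # gradient accumulation: batch of 1, effective batch of 4
for step in range(16):
    x = torch.randn(1, 1024, device="cuda")
    # bf16 autocast shrinks activation memory; gradient checkpointing
    # recomputes activations in backward instead of storing them all.
    with torch.autocast(device_type="cuda", dtype=torch.bfloat16):
        out = checkpoint_sequential(model, segments=2, input=x, use_reentrant=False)
        loss = out.pow(2).mean() / accum_steps
    loss.backward()
    if (step + 1) % accum_steps == 0:
        optimizer.step()
        optimizer.zero_grad(set_to_none=True)
```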