I tried training a LoRA for FLUX, and I'm on my second attempt. The first one took longer than expected: the process started around 8:00 PM and finished around 4:00–5:00 AM. What factors influence the training time? Is it the size (resolution) of the images in the dataset or the number of images? If both, which has the bigger impact? I noticed in the code that there is some image resizing (though I'm not sure that's what it refers to), but to what size or percentage? Is it better to use more, smaller images or fewer, larger ones? Also, when running the suggested command, a sample image is generated at the 600th iteration. After that generation, I noticed a drop in performance compared to the first 600 iterations; is that normal?
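To make the resizing question concrete, here is my rough understanding of what aspect-ratio-bucketing style resizing usually does before training. This is only my own sketch of the general idea; the function name `bucket_size` and the `target_res`/`step` parameters are mine, not something I found in this repo's code:

```python
# Rough sketch of how aspect-ratio bucketing typically resizes images before training.
# My own illustration of the general idea, not the actual code from this repository.
import math

def bucket_size(width: int, height: int, target_res: int = 1024, step: int = 64):
    """Scale an image so its area is roughly target_res**2, keeping the aspect
    ratio, then snap each edge down to a multiple of `step`."""
    scale = target_res / math.sqrt(width * height)   # shrink/grow toward the target area
    w = int(width * scale) // step * step
    h = int(height * scale) // step * step
    return max(w, step), max(h, step)

# A 4000x3000 photo and a 1600x1200 photo end up in the same bucket,
# so the original pixel count barely changes the per-step cost:
print(bucket_size(4000, 3000))   # -> (1152, 832)
print(bucket_size(1600, 1200))   # -> (1152, 832)
```

If that is roughly what happens here, then the original resolution of each photo would matter much less than the target training resolution and the total number of steps (number of images × repeats × epochs); but please correct me if this repo handles it differently.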
PS: Thank you so much for this fantastic update, I've been waiting for it for a long time!