Hi, when running your code `representation_learning_trainer.py`, I was confused about the `step` parameter. For example, to train "FFHQ128-130M-z512-64M" with `batch_size = 128`, how is the `step` parameter related to the "64M" training samples? Since FFHQ contains 70,000 images, does it indicate 64,000,000 / 70,000 iterations, and does the `step` parameter stand for the number of training epochs?
Could you let me know if I have any misunderstanding. Thank you :)
Thanks for your attention.
A step means one step of optimization, and training_samples = batch_size * step.
We don't explicitly have a parameter to stop the training process, so it can run indefinitely.
So "FFHQ128-130M-z512-64M" with `batch_size = 128` means you need to train until step $\ge$ 64,000,000 / 128 = 500,000 and then manually stop it.