It seems like on line 80 of train.py, the step is advanced only when used_sample > phase * 2. This makes sense for resolutions >= 16x16, because each step consists of a transition plus a stabilization.
But at 8x8, alpha = 1 the whole time. So if I understand correctly, every transition and every stabilization is trained for "phase" (600,000) samples, except the 8x8 stabilization, which is trained for phase * 2 (1,200,000) samples, right?
Is this behavior consistent with the original implementation? I expected the 8x8 stabilization to also train for "phase" samples.
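For reference, here is a minimal sketch of the schedule as I read it (this is my own illustration, not the repo's code; the counter names and the batch size are mine, and I'm assuming step 1 corresponds to 8x8 with phase = 600,000):

```python
# Sketch of the progressive-growing schedule described above (assumptions:
# phase = 600,000 samples, step 1 = 8x8, alpha fixed at 1 for the first step).
phase = 600_000
max_step = 7
batch_size = 16  # illustrative only

def alpha_for(step, used_sample, phase):
    # At the initial 8x8 step there is no lower resolution to fade in from,
    # so alpha stays at 1; at higher resolutions alpha ramps 0 -> 1 over
    # `phase` samples (the transition), then stays at 1 (the stabilization).
    if step == 1:
        return 1.0
    return min(1.0, used_sample / phase)

step = 1
used_sample = 0
while step <= max_step:
    # The step only advances after phase * 2 samples, i.e. transition +
    # stabilization. At 8x8 that whole 2 * phase budget is spent with
    # alpha = 1, which is what looks like a double-length stabilization.
    if used_sample > phase * 2:
        used_sample = 0
        step += 1
        continue
    alpha = alpha_for(step, used_sample, phase)
    # ... train one batch at this step/alpha, then:
    used_sample += batch_size
```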
Also, thank you for this amazing repo!