Line 872: Epoch 0 all_batch_num 510 loss 2.18 acc 0.61 0.8349 time 225.83
Line 1418: Epoch 1 all_batch_num 510 loss 2.70 acc 0.75 0.8872 time 224.98
Line 1964: Epoch 2 all_batch_num 510 loss 2.43 acc 0.83 0.9069 time 221.33
Line 2510: Epoch 3 all_batch_num 510 loss 2.96 acc 0.87 0.9104 time 221.77
Line 3056: Epoch 4 all_batch_num 510 loss 3.74 acc 0.88 0.8987 time 216.13
Line 3602: Epoch 5 all_batch_num 510 loss 4.38 acc 0.89 0.9288 time 210.82
Line 4148: Epoch 6 all_batch_num 510 loss 5.18 acc 0.90 0.9268 time 218.51
Line 4694: Epoch 7 all_batch_num 510 loss 6.28 acc 0.90 0.9312 time 210.94
Line 5240: Epoch 8 all_batch_num 510 loss 7.62 acc 0.91 0.9326 time 218.76
Line 5786: Epoch 9 all_batch_num 510 loss 8.98 acc 0.91 0.9245 time 221.23
Line 6332: Epoch 10 all_batch_num 510 loss 10.36 acc 0.91 0.9182 time 212.32
Line 6878: Epoch 11 all_batch_num 510 loss 11.74 acc 0.92 0.9292 time 211.89
@FrankWork When I train the adversarial network after rebuilding it in TF, the domain_loss decreases while loss_adv increases.
But because the discriminator then drives loss_adv back down, domain_loss rises again and the accuracy drops.
So the training fails to converge. What can I do to prevent this, or have I built the network incorrectly?
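One common way to stop the two objectives from oscillating against each other (not necessarily how this repo implements it) is to couple them through a gradient reversal layer, DANN-style: you optimize a single domain-classification loss, and the shared encoder receives the negated gradient, so there is no separate domain_loss and loss_adv chasing each other across training steps. A minimal TF 2.x sketch; `encoder`, `domain_clf`, the layer sizes, and `lam` are hypothetical placeholders, not names from this codebase:

```python
import tensorflow as tf  # sketch against TF 2.x; TF 1.x variants override the gradient instead


def gradient_reversal(lam=1.0):
    """Identity in the forward pass; scales the gradient by -lam in the
    backward pass, so the shared encoder is pushed to fool the domain
    discriminator while the discriminator itself is trained normally."""
    @tf.custom_gradient
    def _reverse(x):
        def grad(dy):
            return -lam * dy
        return tf.identity(x), grad
    return _reverse


# Hypothetical dimensions and module names, for illustration only.
encoder = tf.keras.Sequential(
    [tf.keras.layers.Dense(128, activation="relu", input_shape=(300,))])
domain_clf = tf.keras.Sequential(
    [tf.keras.layers.Dense(2, input_shape=(128,))])  # 2 domains
optimizer = tf.keras.optimizers.Adam(1e-3)
xent = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)


def adversarial_step(x, domain_labels, lam):
    with tf.GradientTape() as tape:
        feats = encoder(x)
        # A single adversarial loss: the discriminator minimizes it, and the
        # reversed gradient makes the encoder maximize it, so there is no
        # separate domain_loss fighting a separate loss_adv between steps.
        logits = domain_clf(gradient_reversal(lam)(feats))
        loss_adv = xent(domain_labels, logits)
    variables = encoder.trainable_variables + domain_clf.trainable_variables
    grads = tape.gradient(loss_adv, variables)
    optimizer.apply_gradients(zip(grads, variables))
    return loss_adv
```

In DANN-style training the reversal weight lam is also usually ramped up from 0 toward 1 over the first epochs, which keeps the encoder from being punished by a still-untrained discriminator and further helps stability early on.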