
Why does training stop after one iter? #307

Open
XXMxxm220 opened this issue May 23, 2023 · 7 comments

Comments

@XXMxxm220

Why does training stop after one iter?

@XXMxxm220
Author

Training stops after one epoch and does not continue.

@zhanghaowei01

Hello, has the problem been solved?

@XXMxxm220
Author

> Hello, has the problem been solved?

Solved!

@zhanghaowei01

Could you share it?

@XXMxxm220
Author

> Could you share it?

Hi, my approach was to wrap the `for it` loop in an outer `for epoch in range(epochs):` loop, where `epochs` is the number of training epochs I set myself. I also moved the `save_model` statement inside the `for epoch` loop, so the model is saved once per epoch.
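The fix described above can be sketched roughly like this (a minimal sketch: `train`, the dummy dataloader, and the `save_model` callback are hypothetical stand-ins for the repo's actual training loop and checkpointing code):

```python
# Sketch of wrapping the original per-iteration loop in an epoch loop,
# saving the model once per epoch. Names here are illustrative only.
def train(dataloader, epochs, save_model):
    history = []
    for epoch in range(epochs):          # outer epoch loop added around the original `for it` loop
        for it, batch in enumerate(dataloader):
            history.append((epoch, it))  # placeholder for the original training step
        save_model(epoch)                # checkpoint inside the epoch loop: once per epoch
    return history

# usage with a dummy "dataloader" of 3 batches, trained for 2 epochs
saved = []
hist = train(dataloader=[0, 1, 2], epochs=2, save_model=saved.append)
```

With 2 epochs over 3 batches, the loop runs 6 iterations and checkpoints twice, once at the end of each epoch.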

@CoinCheung
Owner

Why do you need this? You can compute the total iterations as `len(dataset) * n_epoches / batch_size`, and set the total number of iterations to that result.
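That computation can be sketched as follows (`total_iters` is a hypothetical helper; the `math.ceil` is an assumption that the loader keeps the last partial batch, so use plain integer division instead if it drops incomplete batches):

```python
import math

def total_iters(n_samples, n_epochs, batch_size):
    # One epoch covers ceil(n_samples / batch_size) iterations
    # (assuming the final partial batch is not dropped).
    iters_per_epoch = math.ceil(n_samples / batch_size)
    return iters_per_epoch * n_epochs

# e.g. a 1000-sample dataset, 10 epochs, batch size 32
n = total_iters(n_samples=1000, n_epochs=10, batch_size=32)
```

Setting the trainer's total-iteration count to this value makes the iteration-based loop run for exactly the desired number of epochs, without restructuring the loop itself.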

@XXMxxm220
Author

> Why do you need this? You can compute the total iterations as `len(dataset) * n_epoches / batch_size`, and set the total number of iterations to that result.

You are right, thank you for your response. I just wanted to use epochs rather than iters to describe a training run, so I chose that approach. It also does not affect the final output.
