-
My training stops at 1000 epochs on Google Colab. I have enough credits. Any ideas on how to train to 10k epochs without getting interrupted?

Replies: 2 comments 2 replies
-
Well, it depends on your dataset: unless you have something like ~170 hours of data, 10k epochs is not needed. You can just interrupt your training session at any time once you feel it's enough.
0 replies
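Along the lines of the suggestion above, a rough illustration of why stopping early is safe: if training saves checkpoints periodically, interrupting the session (or being disconnected by Colab) loses at most the epochs since the last save, and the next run can resume from the checkpoint. This is only a minimal PyTorch-style sketch with a toy model, data, and file name as placeholders, not the actual code of whatever notebook is being used:

```python
import os
import torch
import torch.nn as nn

# Placeholder model and data; stand-ins for whatever the notebook actually trains.
model = nn.Linear(10, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
data = [(torch.randn(8, 10), torch.randn(8, 1)) for _ in range(4)]

CKPT = "checkpoint.pt"   # placeholder path
TOTAL_EPOCHS = 10_000
SAVE_EVERY = 100         # save often so an interruption loses at most 100 epochs

# Resume from the last checkpoint if one exists.
start_epoch = 0
if os.path.exists(CKPT):
    state = torch.load(CKPT)
    model.load_state_dict(state["model"])
    optimizer.load_state_dict(state["optimizer"])
    start_epoch = state["epoch"] + 1

for epoch in range(start_epoch, TOTAL_EPOCHS):
    for x, y in data:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()

    # Periodically persist everything needed to resume.
    if (epoch + 1) % SAVE_EVERY == 0:
        torch.save(
            {"model": model.state_dict(),
             "optimizer": optimizer.state_dict(),
             "epoch": epoch},
            CKPT,
        )
```

If the notebook mounts Google Drive, pointing the checkpoint path at the Drive mount keeps it across Colab sessions.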
-
Just edit your
2 replies
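Assuming the edit being suggested here is the epoch limit in the notebook's training settings, the change would presumably be a single value along these lines (the name is a placeholder, not the notebook's actual variable or flag):

```python
# Hypothetical: raise the epoch limit before launching training.
# The real setting lives wherever your notebook or config defines it.
TOTAL_EPOCHS = 10_000  # placeholder name
```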