First of all, I tried to find similar questions, but the only one that sounded similar was about a bug during hyperparameter tuning.
I am trying to train yolov5x6 on a fairly large dataset.
Training on this dataset takes about 2 hours per epoch.
Since tuning time scales with training time (as far as I know), tuning can take very long if I choose a number of epochs high enough to reach a decent mAP50-95.
The big question is: how can I hyperparameter-tune efficiently and in a timely manner, and how does YOLOv5 support this?
Should I train on only a fraction of my dataset (is there an option to use only a subset of the samples; see the sketch after these questions)?
Should I reduce my number of epochs?
If yes, how do I determine the minimum number of epochs for tuning?
Is there a rule of thumb like having at least X% of mAP at that epoch?
I already did some test runs with hyperparameter tuning, so I could infer that point from my performance graphs.
(L model for comparison, 20 epochs: mAP50 = 0.95; mAP50-95 = 0.681)
Or do I simply use 1 epoch to get as many results as possible in my limited time?
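As far as I can tell, train.py has no built-in subset flag, but a data YAML's train: entry can point at a .txt list of image paths, so one workaround is to generate a reduced list and a second YAML just for tuning. A hypothetical sketch (subsample_list, train_full.txt, and train_subset.txt are made-up names):

```python
# Hypothetical helper: sample a fraction of a YOLOv5 image-list file so that
# hyperparameter tuning runs on a subset of the data. Assumes your data YAML's
# `train:` entry points at a .txt list of image paths.
import random

def subsample_list(src_txt: str, dst_txt: str, fraction: float = 0.2, seed: int = 0) -> None:
    with open(src_txt) as f:
        paths = [line.strip() for line in f if line.strip()]
    random.Random(seed).shuffle(paths)  # deterministic shuffle for reproducibility
    keep = paths[: max(1, int(len(paths) * fraction))]
    with open(dst_txt, "w") as f:
        f.write("\n".join(keep))

subsample_list("train_full.txt", "train_subset.txt", fraction=0.2)
```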
Can I hyperparameter-tune on a YOLOv5n or YOLOv5s model and then transfer those hyperparameters to the x model?
They may not translate perfectly to the larger model, but getting closer to good hyperparameters would at least be a good start (see the command sketch below).
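As a rough sketch of that workflow, assuming the standard train.py flags (mydata.yaml, the epoch counts, and the generation count are placeholders, not recommendations):

```
# Hypothetical workflow: evolve hyperparameters on the small model with a short
# schedule, then reuse the evolved hyp file when training the big model.
python train.py --weights yolov5s.pt --data mydata.yaml --epochs 10 --evolve 300
python train.py --weights yolov5x6.pt --data mydata.yaml --epochs 100 \
    --hyp runs/evolve/exp/hyp_evolve.yaml  # results path may differ by version
```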
Lastly, I couldn't find information on how to tune only specific hyperparameters. Is there an option during evolve? Can I create or modify any files to restrict changing certain parameters?
After searching the forum extensively, I found a solution to "lock" certain hyperparameters:
Open train.py, go to the meta section, and set the mutation scale to 0 for the hyperparameters you don't want to evolve. #787 (comment)
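For anyone who lands here later: in the YOLOv5 releases I checked, the meta dict in train.py maps each hyperparameter to a (mutation scale, lower limit, upper limit) tuple, and evolution only mutates entries with a non-zero scale. A minimal sketch of the edit, assuming that layout (the values shown are illustrative):

```python
# Excerpt of the `meta` dict in train.py; format: (mutation scale, lower, upper).
# A mutation scale of 0 keeps that hyperparameter fixed during --evolve.
meta = {
    'lr0': (1, 1e-5, 1e-1),      # initial learning rate: evolves normally
    'lrf': (1, 0.01, 1.0),       # final OneCycleLR learning rate fraction
    'momentum': (0, 0.6, 0.98),  # locked: mutation scale set to 0
    # ... remaining entries left as in the original file
}
```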
Thank you in advance for taking the time to answer this.