Replies: 1 comment
-
Hi @IvanVC21, good point! From what I remember, we tune based on a time budget, not a number of iterations, so it is hard to guess in advance what the total number of iterations during tuning will be. If you would like to tune with Optuna for better performance, I think it is better to use the raw Optuna package and write all the ML code manually (a rough sketch follows). The 10 startup trials is a pretty good default.
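A minimal sketch of what "raw Optuna with manual ML code" could look like, assuming a scikit-learn-style dataset and an illustrative XGBoost search space; the dataset, parameter ranges, and trial count below are my assumptions, not anything from this project:

```python
# Hypothetical example: tuning XGBoost with plain Optuna.
import optuna
import xgboost as xgb
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)  # illustrative dataset

def objective(trial):
    # Illustrative search space; adjust ranges to your problem.
    params = {
        "max_depth": trial.suggest_int("max_depth", 2, 10),
        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
        "n_estimators": trial.suggest_int("n_estimators", 50, 500),
        "subsample": trial.suggest_float("subsample", 0.5, 1.0),
    }
    model = xgb.XGBClassifier(**params, eval_metric="logloss")
    # Mean cross-validated accuracy is the value Optuna maximizes.
    return cross_val_score(model, X, y, cv=3, scoring="accuracy").mean()

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=100)
print(study.best_params)
```

Writing the loop yourself gives you full control over the sampler, pruning, and the number of trials, which is exactly what a time-budgeted AutoML wrapper hides from you.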
-
I was looking at your XGBoost Optuna hyperparameter optimization code and it looks like you don't specify a particular sampler for the Optuna study, so it falls back to the default (the TPE sampler). This TPE sampler has a parameter named n_startup_trials, which controls the number of trials with random hyperparameters at the beginning of the optimization; its default is 10.
Let's say that I use your optimizer and run it for 2000 trials. If I do that, at trial 10 it would already start optimizing its search for hyperparameters, without allowing the random search to explore the whole hyperparameter space. So if the user defines a large hyperparameter space, it won't have the chance to be properly explored. That is why I suggest you modify n_startup_trials to be around 10-20% of the number of trials, to give the random phase a chance to explore the hyperparameter space before the optimization kicks in, as in the sketch below.
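To make the suggestion concrete, here is a hedged sketch of passing an explicit TPESampler with a scaled n_startup_trials; the 10% ratio is just the heuristic proposed above, not an Optuna recommendation:

```python
import optuna

n_trials = 2000
# Warm up with ~10% random trials (200 here) before TPE starts modeling
# the search space, instead of the sampler's default n_startup_trials=10.
sampler = optuna.samplers.TPESampler(n_startup_trials=max(10, n_trials // 10))
study = optuna.create_study(direction="maximize", sampler=sampler)
# study.optimize(objective, n_trials=n_trials)  # objective defined by the user
```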