[BUG] KeyError: 'params/eta' #1244
Comments
Thanks. Would you like to create a PR?
Yes. #1246
One way I worked around this bug is to ensure that the constants do not share keys with the sampled search space. If you are using the function trainable API, consider splitting out constants from the tuned config. I believe this bug happens when trying to merge the constants with the sampled hyperparameters in `search_thread.py`.
Thank you @yxtay.
Bug details
I am running HPO for XGBoost with Ray and BlendSearch.
At flaml/tune/searcher/search_thread.py#L66, in my case the incoming `config` holds the values sampled from the search space under a nested `config['params']` dict, and `self._const` also holds constant entries under a `params` key. After the update step `config.update(self._const)`, the values in `config['params']` sampled from the search space are all dropped: the flat `dict.update` replaces the entire nested `params` dict with the constants instead of merging into it, which later surfaces as `KeyError: 'params/eta'`.
How to solve
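To make the merge bug concrete, here is a minimal sketch; the key names (`eta`, `max_depth`, `objective`) and values are illustrative assumptions, not the dicts from the original report:

```python
# Sketch of the flat-update problem (illustrative values).
sampled_config = {
    "params": {"eta": 0.1, "max_depth": 6},  # sampled from the search space
}
const = {
    "params": {"objective": "reg:squarederror"},  # constants to merge in
    "num_boost_round": 100,
}

# dict.update replaces the nested "params" dict wholesale ...
sampled_config.update(const)

# ... so the sampled keys are gone, which later surfaces as KeyError: 'params/eta'.
print(sampled_config["params"])          # {'objective': 'reg:squarederror'}
print("eta" in sampled_config["params"])  # False
```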
I solved it by recursively updating `config`: replace `config.update(self._const)` with `recursive_update(config, self._const)`, so that nested dicts are merged key by key instead of being overwritten wholesale.
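A minimal sketch of such a `recursive_update` helper (reconstructed here, since the original snippet was not preserved in this copy):

```python
def recursive_update(base: dict, updates: dict) -> dict:
    """Merge `updates` into `base` in place, descending into nested dicts
    instead of overwriting them wholesale like dict.update does."""
    for key, value in updates.items():
        if isinstance(value, dict) and isinstance(base.get(key), dict):
            recursive_update(base[key], value)
        else:
            base[key] = value
    return base

# Illustrative values, not the original report's dicts.
config = {"params": {"eta": 0.1, "max_depth": 6}}       # sampled values
const = {"params": {"objective": "reg:squarederror"}}   # constants
recursive_update(config, const)
print(config["params"])
# {'eta': 0.1, 'max_depth': 6, 'objective': 'reg:squarederror'}
```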
That way the sampled values in `config['params']` are preserved alongside the constants.
My Traceback