Multiple best values for a list of hyperparameters? #2470
-
I am using Optuna with Hydra for hyperparameter tuning. This is my config file:

```yaml
defaults:
  - override hydra/sweeper: optuna
  - override hydra/sweeper/sampler: tpe

hydra:
  mode: MULTIRUN
  sweeper:
    sampler:
      seed: 123
    direction: minimize
    study_name: sphere
    storage: null
    n_trials: 20
    n_jobs: 1
    params:
      +n: 2,3,4
      x: range(0, 5, step=1.0)
      y: choice(-5, 0, 5)
      +a_lims: "{a_lower : 0.5, a_upper : 1.1, a : 0.91},{a_lower : 0.7, a_upper : 1.15, a : 1.05}, {a_lower : 0.9, a_upper : 1.15, a : 1.09}, {a_lower : 0.97, a_upper : 1.05, a : 1.01}"

x: 1
y: 1
```

Here I am trying to optimise my training loss for each combination of `n` and `a_lims`. Can I do this without writing a parent python script controlling what value of `n` and `a_lims` gets passed to each run?
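For context, here is a minimal sketch of the kind of task function such a config could drive; the function name, the loss formula, and the way `n` and `a_lims` are consumed are illustrative assumptions, not part of the original post:

```python
# train.py -- hypothetical task function for the config above
import hydra
from omegaconf import DictConfig


@hydra.main(config_path=".", config_name="config", version_base=None)
def train(cfg: DictConfig) -> float:
    # x and y are the parameters the Optuna sweeper tunes
    x, y = cfg.x, cfg.y
    # n and a_lims are appended via +n / +a_lims and stay fixed within
    # a single optimization; defaults are used here if they are absent
    n = cfg.get("n", 2)
    a_lims = cfg.get("a_lims", {"a": 1.0})
    # made-up loss that also depends on the fixed settings
    loss = (x**2 + y**2) ** (n / 2) + float(a_lims.get("a", 1.0))
    return loss  # the sweeper minimizes this return value


if __name__ == "__main__":
    train()
```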
Replies: 1 comment
-
So you want to run 12 different optimizations, right? You want to optimize `x` and `y` once for each combination of `n` and `a_lims`?

I don't think Hydra's Optuna sweeper supports this use-case. The Optuna sweeper is designed to run global optimization on all sweep parameters.

Wrapping your routine with a parent python script would work. Another idea is to use a for-loop in python to loop over values of `n` and `a_lims`, then use a utility like `hydra_zen.launch` to run the Hydra app once for each value of `n` and `a_lims`.

Here is an example using `hydra_zen.launch`:

```shell
pip install hydra-zen
```
```python
# script.py
from hydra_zen import launch
from omegaconf import DictConfig

yaml_data = """
defaults:
  - override hydra/sweeper: optuna
  - override hydra/sweeper/sampler: tpe

hydra:
  mode: MULTIRUN
  sweeper:
    sampler:
      seed: 123
    direction: minimize
    study_name: sphere
    storage: null
    n_trials: 20
    n_jobs: 1
    params:
      x: range(0, 5, step=1.0)
      y: choice(-5, 0, 5)

x: 1
y: 1
"""


def sphere(cfg: DictConfig) -> float:
    x: float = cfg.x
    y: float = cfg.y
    if cfg.get("error", False):
        raise RuntimeError("cfg.error is True")
    return x**2 + y**2


for n in (2, 3, 4):
    for a_lims in (
        "{a_lower : 0.5, a_upper : 1.1, a : 0.91}",
        "{a_lower : 0.7, a_upper : 1.15, a : 1.05}",
        "{a_lower : 0.9, a_upper : 1.15, a : 1.09}",
        "{a_lower : 0.97, a_upper : 1.05, a : 1.01}",
    ):
        launch(
            config=yaml_data,
            task_function=sphere,
            multirun=True,
            overrides=[f"+n={n}", f"+a_lims={a_lims}"],
        )
```

```shell
python script.py
...  # one optimization run for each combination of n/a_lims
```
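If you would rather avoid the extra dependency, the "parent python script" idea can also be sketched with the standard library by shelling out to the Hydra app once per combination; the app name `my_app.py` and the override spellings below are assumptions for illustration:

```python
# run_all.py -- hypothetical parent script that launches a Hydra CLI app
import subprocess

a_lims_choices = [
    "{a_lower: 0.5, a_upper: 1.1, a: 0.91}",
    "{a_lower: 0.7, a_upper: 1.15, a: 1.05}",
    "{a_lower: 0.9, a_upper: 1.15, a: 1.09}",
    "{a_lower: 0.97, a_upper: 1.05, a: 1.01}",
]

for n in (2, 3, 4):
    for a_lims in a_lims_choices:
        # one full Optuna sweep over x/y per fixed (n, a_lims) combination
        subprocess.run(
            ["python", "my_app.py", "--multirun", f"+n={n}", f"+a_lims={a_lims}"],
            check=True,
        )
```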