
[Major] Combined future regressor layer arguments into one single integer list regressors_layers like ar_layers #1604

Merged: 5 commits, Jul 26, 2024

Conversation

kneureither
Contributor

🔬 Background

🔮 Key changes

Removed the arguments future_regressors_num_hidden_layers and future_regressors_d_hidden and replaced them with a single argument future_regressors_layers : Optional[List[int]], following the convention of ar_layers.

  • the list's length gives the number of hidden layers
  • its values give the dimension of each layer
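A minimal sketch of how such a layers list implies the network shape (a hypothetical helper for illustration, not NeuralProphet code):

```python
from typing import List, Optional, Tuple


def hidden_layer_shapes(
    d_in: int, layers: Optional[List[int]], d_out: int = 1
) -> List[Tuple[int, int]]:
    """Return the (in, out) dims of each linear layer implied by `layers`.

    `layers=[4, 4]` means two hidden layers of width 4; `[]` or `None`
    means a single linear map from inputs to outputs (no hidden layers).
    """
    dims = [d_in] + list(layers or []) + [d_out]
    return list(zip(dims[:-1], dims[1:]))
```

For example, `hidden_layer_shapes(3, [4, 4])` yields three linear layers, `[(3, 4), (4, 4), (4, 1)]`, while `hidden_layer_shapes(3, [])` collapses to `[(3, 1)]`.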

📋 Review Checklist

  • I have performed a self-review of my own code.
  • I have commented my code, added docstrings and data types to function definitions.
  • I have added pytests to check whether my feature / fix works.

@kneureither kneureither changed the title Combined future regressor layer arguments into one single integer list regressors_layers like ar_layers [fix] Combined future regressor layer arguments into one single integer list regressors_layers like ar_layers Jun 28, 2024
@kneureither kneureither changed the title [fix] Combined future regressor layer arguments into one single integer list regressors_layers like ar_layers [minor] Combined future regressor layer arguments into one single integer list regressors_layers like ar_layers Jun 28, 2024
@ourownstory ourownstory changed the title [minor] Combined future regressor layer arguments into one single integer list regressors_layers like ar_layers [Major] Combined future regressor layer arguments into one single integer list regressors_layers like ar_layers Jul 1, 2024
@ourownstory
Owner

Labelled as Major as it is a breaking change.


@ourownstory ourownstory left a comment


Thank you - Looking good overall!
I left a few suggestions.

Number of hidden layers in the neural network model for future regressors.
Ignored if ``future_regressors_model`` is ``linear``.
future_regressors_layers: list of int
array of hidden layer dimensions of the future regressor nets. Specifies number of hidden layers (number of entries)
Owner


To avoid confusion, phrase "array of ..." as "list of ...".
Should also state default values.
Looks like we currently have [4, 4]. We might want to roll this back to no hidden layers by default. Thoughts?

@@ -424,8 +421,7 @@ def __init__(
     season_global_local: np_types.SeasonGlobalLocalMode = "global",
     seasonality_local_reg: Optional[Union[bool, float]] = False,
     future_regressors_model: np_types.FutureRegressorsModel = "linear",
-    future_regressors_d_hidden: int = 4,
-    future_regressors_num_hidden_layers: int = 2,
+    future_regressors_layers: Optional[list] = [4, 4],
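For users migrating existing code, the removed pair of arguments maps onto the new list directly: num_hidden_layers copies of d_hidden. A sketch of that mapping (a hypothetical migration helper, not part of NeuralProphet):

```python
from typing import List


def migrate_regressor_args(num_hidden_layers: int, d_hidden: int) -> List[int]:
    """Translate the removed argument pair into the new single list.

    The old defaults (num_hidden_layers=2, d_hidden=4) become [4, 4],
    which is the default shown in the diff above.
    """
    return [d_hidden] * num_hidden_layers
```

So a previous call using future_regressors_num_hidden_layers=2 and future_regressors_d_hidden=4 becomes future_regressors_layers=[4, 4], and zero hidden layers becomes the empty list [].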
Owner


The default for this should be identical to or lower than ar_layers and lagged_reg_layers.

Contributor Author


I simply used the previous default, but I agree it makes sense to set the default to []. Done.

@MaiBe-ctrl MaiBe-ctrl merged commit 18e3e4d into ourownstory:main Jul 26, 2024
8 of 10 checks passed

Model Benchmark

| Benchmark | Metric | main | current | diff |
|---|---|---|---|---|
| AirPassengers | MAE_val | 30.8317 | 30.8317 | 0.0% |
| AirPassengers | RMSE_val | 31.8177 | 31.8177 | 0.0% |
| AirPassengers | Loss_val | 0.01301 | 0.01301 | 0.0% |
| AirPassengers | train_loss | 0.00071 | 0.00071 | 0.0% |
| AirPassengers | reg_loss | 0 | 0 | 0.0% |
| AirPassengers | MAE | 6.88164 | 6.88164 | 0.0% |
| AirPassengers | RMSE | 8.92097 | 8.92097 | 0.0% |
| AirPassengers | Loss | 0.00076 | 0.00076 | 0.0% |
| AirPassengers | time | 17.5903 | 18.01 | 2.39% |
| EnergyPriceDaily | MAE_val | 5.64183 | 5.64183 | 0.0% |
| EnergyPriceDaily | RMSE_val | 7.19915 | 7.19915 | 0.0% |
| EnergyPriceDaily | Loss_val | 0.02893 | 0.02893 | 0.0% |
| EnergyPriceDaily | train_loss | 0.02957 | 0.02957 | 0.0% |
| EnergyPriceDaily | reg_loss | 0 | 0 | 0.0% |
| EnergyPriceDaily | MAE | 6.39657 | 6.39657 | 0.0% |
| EnergyPriceDaily | RMSE | 8.56213 | 8.56213 | 0.0% |
| EnergyPriceDaily | Loss | 0.02936 | 0.02936 | 0.0% |
| EnergyPriceDaily | time | 85.4153 | 89.36 | 4.62% ⚠️ |
| YosemiteTemps | MAE_val | 0.59751 | 0.59751 | 0.0% |
| YosemiteTemps | RMSE_val | 0.88859 | 0.88859 | 0.0% |
| YosemiteTemps | Loss_val | 0.00046 | 0.00046 | 0.0% |
| YosemiteTemps | train_loss | 0.00126 | 0.00126 | 0.0% |
| YosemiteTemps | reg_loss | 0 | 0 | 0.0% |
| YosemiteTemps | MAE | 0.97464 | 0.97464 | 0.0% |
| YosemiteTemps | RMSE | 1.70813 | 1.70813 | 0.0% |
| YosemiteTemps | Loss | 0.00126 | 0.00126 | 0.0% |
| YosemiteTemps | time | 334.987 | 350.09 | 4.51% ⚠️ |
| PeytonManning | MAE_val | 0.35542 | 0.35542 | 0.0% |
| PeytonManning | RMSE_val | 0.50403 | 0.50403 | 0.0% |
| PeytonManning | Loss_val | 0.01803 | 0.01803 | 0.0% |
| PeytonManning | train_loss | 0.01466 | 0.01466 | 0.0% |
| PeytonManning | reg_loss | 0 | 0 | 0.0% |
| PeytonManning | MAE | 0.34755 | 0.34755 | 0.0% |
| PeytonManning | RMSE | 0.49449 | 0.49449 | 0.0% |
| PeytonManning | Loss | 0.01465 | 0.01465 | 0.0% |
| PeytonManning | time | 100.415 | 110.75 | 10.29% |
Model Training plots

### PeytonManning
![](https://asset.cml.dev/4f97574d46ef2d0c0bfed957461e33cf42e0ef85?cml=svg%2Bxml&cache-bypass=ac81b40e-35bc-414c-8b61-dc228b1e60f1)

### YosemiteTemps
![](https://asset.cml.dev/6d0e9107934a60b10516d34271843a60d26e56a5?cml=svg%2Bxml&cache-bypass=dc66f907-ebbb-4061-8e5c-32a31c4928cd)

### AirPassengers
![](https://asset.cml.dev/883586b8ccb26eeb23da09d0ccb248f42b1999b3?cml=svg%2Bxml&cache-bypass=de9eedad-d8e4-4208-9641-a164d3514bf7)

### EnergyPriceDaily
![](https://asset.cml.dev/e6a0dbda784dea6c3b0a6124cf00ea2a6732a75c?cml=svg%2Bxml&cache-bypass=64fd0eaa-6b0c-4221-9c99-194af3461bc8)
