gp output scaler #309

Merged
jduerholt merged 10 commits into main from gp-output-scaler on Dec 7, 2023
Conversation

simonsung06
Contributor

Added an output scaler option for GPs.

There is repeated code for the Pydantic validator, so we probably need to think about a better way to handle this in the future.
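
For context, the duplication looks roughly like this (a self-contained sketch; the second class and the actual check are illustrative, not the exact code in this PR):

```python
from enum import Enum

from pydantic import BaseModel, validator


class ScalerEnum(str, Enum):
    IDENTITY = "IDENTITY"
    NORMALIZE = "NORMALIZE"
    STANDARDIZE = "STANDARDIZE"


class SingleTaskGPSurrogate(BaseModel):
    output_scaler: ScalerEnum = ScalerEnum.STANDARDIZE

    @validator("output_scaler")
    def validate_output_scaler(cls, value):
        # illustrative check, written out in this model ...
        if value == ScalerEnum.NORMALIZE:
            raise ValueError("NORMALIZE is not supported as an output scaler")
        return value


class MixedSingleTaskGPSurrogate(BaseModel):
    output_scaler: ScalerEnum = ScalerEnum.STANDARDIZE

    @validator("output_scaler")
    def validate_output_scaler(cls, value):
        # ... and copied verbatim into every other GP data model that gets the new field
        if value == ScalerEnum.NORMALIZE:
            raise ValueError("NORMALIZE is not supported as an output scaler")
        return value
```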

@simonsung06 simonsung06 requested a review from jduerholt November 9, 2023 08:03
Contributor

@jduerholt jduerholt left a comment


Hi Simon,

I think I have an idea: all data models to which we added the scaler share the following inheritance:

`class SingleTaskGPSurrogate(BotorchSurrogate, TrainableSurrogate):`

What do you think about creating a new data model, `TrainableBotorchSurrogate`, adding the scaler fields including the validator there, and inheriting from it?

There is also the possibility to reuse validators; maybe you can have a look at that. In pydantic 2 it should also be easier to reuse them.
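
Roughly, I am thinking of something like this (just a sketch with stand-in base classes, illustrative defaults, and a placeholder validator, not the exact code):

```python
from enum import Enum

from pydantic import BaseModel, validator


# stand-ins so the sketch is self-contained; these base classes already exist in the codebase
class BotorchSurrogate(BaseModel):
    pass


class TrainableSurrogate(BaseModel):
    pass


class ScalerEnum(str, Enum):
    IDENTITY = "IDENTITY"
    NORMALIZE = "NORMALIZE"
    STANDARDIZE = "STANDARDIZE"


class TrainableBotorchSurrogate(BotorchSurrogate, TrainableSurrogate):
    # the scaler fields and their validator live here exactly once
    scaler: ScalerEnum = ScalerEnum.NORMALIZE
    output_scaler: ScalerEnum = ScalerEnum.STANDARDIZE

    @validator("output_scaler")
    def validate_output_scaler(cls, value):
        # move the check that is currently duplicated across the GP data models here
        return value


# the GP data models then just inherit and drop their own copies
class SingleTaskGPSurrogate(TrainableBotorchSurrogate):
    pass
```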

Best,

Johannes

@simonsung06
Contributor Author

Hi Johannes. That idea would be suitable for now, I think. I'm not sure how pydantic 2 works, though, and how it changes things...

For now, I'll make your suggested change and commit it when I get the chance.

@simonsung06
Contributor Author

simonsung06 commented Dec 4, 2023

Hi @jduerholt,

I've made the suggested change by adding a new `TrainableBotorchSurrogate` data model.

In addition, the random forest data model was also modified, because it originally inherited from `BotorchSurrogate` and `TrainableSurrogate` too. Whilst this was not strictly necessary, I changed it to inherit from `TrainableBotorchSurrogate`, so `RandomForestSurrogate` now also takes `scaler` and `output_scaler` keyword arguments, both of which default to `ScalerEnum.IDENTITY`. This meant changing the `loads` and `dumps` functions so that torch-type scalers can be dumped and loaded if they are used.
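
To illustrate the new hierarchy and defaults (a self-contained sketch; the shared base is simplified and required fields such as the input/output features are omitted):

```python
from enum import Enum

from pydantic import BaseModel


class ScalerEnum(str, Enum):
    IDENTITY = "IDENTITY"
    NORMALIZE = "NORMALIZE"
    STANDARDIZE = "STANDARDIZE"


# shared base introduced in this PR (simplified; validator and other fields omitted)
class TrainableBotorchSurrogate(BaseModel):
    scaler: ScalerEnum = ScalerEnum.NORMALIZE
    output_scaler: ScalerEnum = ScalerEnum.STANDARDIZE


class RandomForestSurrogate(TrainableBotorchSurrogate):
    # both scalers default to the identity scaler, as described above
    scaler: ScalerEnum = ScalerEnum.IDENTITY
    output_scaler: ScalerEnum = ScalerEnum.IDENTITY


rf = RandomForestSurrogate()
assert rf.scaler == ScalerEnum.IDENTITY
assert rf.output_scaler == ScalerEnum.IDENTITY
```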

@jduerholt
Contributor

Thx, I will have a look over the course of the week!

Contributor

@jduerholt jduerholt left a comment


Looks good to me. Many thanks!

@jduerholt jduerholt merged commit 094b320 into main Dec 7, 2023
@jduerholt jduerholt deleted the gp-output-scaler branch December 7, 2023 15:51