
Regression tests for model outputs #59

Open
joeloskarsson opened this issue Jun 12, 2024 · 0 comments
Labels
enhancement New feature or request

Comments

@joeloskarsson (Collaborator)

Something that would be really nice to have is regression testing for model outputs. In short, whenever we refactor something in the models we want them to still be able to load checkpoints (or well, see #48) and give exactly the same output when fed the same data.

One way to achieve this could be to

  1. Check out main branch
  2. Run some example data through the model and save the predictions (potentially also some internal representation tensors, but that is likely unnecessary and hard to do in practice)
  3. Check out PR
  4. Run the same example data through the model and compare outputs to saved predictions.

I'm not familiar enough with pytest and the GitHub workflows to know all the details of how to do this. @SimonKamuk, @leifdenby do you think something like this is doable? Or are there better ways to achieve this?
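To make the idea concrete, here is a minimal sketch of what such a pytest could look like. Everything in it is an assumption for illustration: `run_model`, `make_example_batch`, and `REFERENCE_PATH` are hypothetical stand-ins for the real model forward pass, a fixed-seed example batch, and wherever the saved predictions from the main branch would live. The test saves a reference on its first run (step 2 above) and compares against it on later runs (step 4):

```python
from pathlib import Path

import numpy as np

# Hypothetical location for predictions saved from the main branch (step 2).
REFERENCE_PATH = Path("tests/data/reference_predictions.npy")


def run_model(batch):
    """Placeholder for the real model forward pass (deterministic here)."""
    return batch * 2.0


def make_example_batch():
    """Fixed seed so every run feeds the model exactly the same data."""
    rng = np.random.default_rng(seed=0)
    return rng.standard_normal((4, 8))


def test_predictions_match_reference():
    predictions = run_model(make_example_batch())

    if not REFERENCE_PATH.exists():
        # On the main branch: save the predictions as the reference.
        REFERENCE_PATH.parent.mkdir(parents=True, exist_ok=True)
        np.save(REFERENCE_PATH, predictions)
        return

    # On the PR branch: compare against the saved reference.
    # rtol=0/atol=0 demands bit-identical outputs, matching "exactly the
    # same output"; a small tolerance may be needed across hardware.
    reference = np.load(REFERENCE_PATH)
    np.testing.assert_allclose(predictions, reference, rtol=0, atol=0)
```

In a CI workflow, the reference file could be produced by a job that checks out main, runs this test once, and uploads the `.npy` as an artifact for the PR job to compare against.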

@joeloskarsson joeloskarsson added the enhancement New feature or request label Jun 12, 2024