NUM : missing abs in metrics evaluation #180

Open
2 of 7 tasks
LexABzH opened this issue Aug 15, 2023 · 0 comments
Labels
bug Something isn't working

Comments


LexABzH commented Aug 15, 2023

Describe the bug

In the NUM template, get_and_save_metrics is missing abs calls in some of its error evaluations.

Affected template

  • NLP template
  • NUM template
  • VISION template
  • API template
  • How templates are generated - Jinja

To Reproduce

See get_and_save_metrics in https://github.com/OSS-Pole-Emploi/gabarit/blob/main/gabarit/template_num/num_project/package_name/models_training/regressors/model_regressor.py

Expected behavior

abs_err should apply the abs function, and the same probably applies to rel_err.
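
For illustration, a minimal sketch of the expected computation, assuming the true and predicted values are handled as numpy arrays; the actual variable names inside get_and_save_metrics may differ:

    import numpy as np

    # Hypothetical values for illustration only, not taken from the template
    y_true = np.array([5.0, 10.0, 3.0])
    y_pred = np.array([7.0, 9.0, 3.5])

    # The absolute error should wrap the difference in abs()
    abs_err = np.abs(y_true - y_pred)                   # array([2. , 1. , 0.5])

    # The relative error most likely needs the absolute difference as well
    rel_err = np.abs(y_true - y_pred) / np.abs(y_true)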

Actual behavior

We can currently get negative values for an "absolute error".
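
A minimal illustration of the symptom, with hypothetical values that are not taken from the template:

    # Without abs(), the "absolute error" comes out negative
    y_true, y_pred = 5.0, 7.0
    abs_err = y_true - y_pred   # -2.0 instead of 2.0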

Is this a regression?

That is, did this use to work the way you expected in the past?

  • Yes
  • No

Debug info

  • Gabarit version: latest
  • Python version: any
  • OS version: any
LexABzH added the bug label on Aug 15, 2023