[ML analysis] adjust start and end dates automatically based on model usage date extent #215

nathanielrindlaub opened this issue Jun 12, 2024 · 0 comments
scripts/analyzeMLObjectLevel.js and scripts/analyzeMLSequenceLevel.js both assume that the model being analyzed was in use for the entire duration of the date range.

The scripts (and Animl in general) don't know when a model was deployed, when it was renamed, or when automation rules were applied, and we currently do not store inference-request data at the image level (though we should), so it's up to the user to ensure the model was actually used for the entire date range.

Currently, if Animl never requested inference from the model being analyzed for some image(s) in the date range, but there are validating labels on those images, those images will be counted as false negatives, which will significantly skew the results (the model will appear to have worse recall than it actually does).

We could minimize this as much as possible by adding a step early on in the scripts that queries the Project for all images that have ML-generated labels from whatever model we're analyzing, and uses the first and last instances of the model's use to adjust the START_DATE and END_DATE.

We'd definitely want to notify the user in the console that this adjustment was being made.
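
Something like the following could serve as that pre-step. This is just a minimal sketch; the `Image` Mongoose model, the `objects.labels.mlModel` path, and the `dateTimeOriginal` field are assumptions about the Animl schema, not confirmed:

```js
// Hypothetical sketch -- field names and the `Image` model are assumptions
// about the Animl schema, not confirmed API.
async function clampDateRangeToModelUsage(Image, mlModelId, startDate, endDate) {
  // Find the earliest and latest images that carry a label produced by this model
  const [extent] = await Image.aggregate([
    { $match: { 'objects.labels.mlModel': mlModelId } },
    {
      $group: {
        _id: null,
        firstUse: { $min: '$dateTimeOriginal' },
        lastUse: { $max: '$dateTimeOriginal' },
      },
    },
  ]);

  // If the model never produced a label in this Project, leave the range alone
  if (!extent) return { startDate, endDate };

  const newStart = extent.firstUse > startDate ? extent.firstUse : startDate;
  const newEnd = extent.lastUse < endDate ? extent.lastUse : endDate;

  // Let the user know the analysis window was narrowed
  if (newStart > startDate || newEnd < endDate) {
    console.warn(
      `Adjusting analysis window to the model's usage extent: ` +
        `${new Date(newStart).toISOString()} to ${new Date(newEnd).toISOString()}`
    );
  }

  return { startDate: newStart, endDate: newEnd };
}
```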

Related: #76
