Support inference on multiple models simultaneously #73

Open
drewoldag opened this issue Sep 27, 2024 · 0 comments
Labels
enhancement (New feature or request)

Comments

@drewoldag
Collaborator

An open question for the science users: would it be beneficial to be able to define a set of models to use for inference?

Given a collection of models that must all be used to predict results for a given data set, we could either run the data through each model individually (or manually in parallel with a collection of scripts), or we could parallelize this within FIBAD so that the input data is accessed once and processed in parallel by many models. A rough sketch of the second option is below.
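As a minimal sketch of the "read once, run many models" idea (this is not FIBAD's actual API; the model names, the toy `nn.Linear` models, and the random data set are placeholders), each batch is loaded a single time and then handed to every model:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-ins for a set of trained models and an input data set.
models = {
    "model_a": torch.nn.Linear(8, 2),
    "model_b": torch.nn.Sequential(
        torch.nn.Linear(8, 4), torch.nn.ReLU(), torch.nn.Linear(4, 2)
    ),
}
dataset = TensorDataset(torch.randn(64, 8))
loader = DataLoader(dataset, batch_size=16)

results = {name: [] for name in models}
with torch.no_grad():
    for (batch,) in loader:                  # each batch is read once...
        for name, model in models.items():   # ...and fed to every model
            results[name].append(model(batch))

predictions = {name: torch.cat(chunks) for name, chunks in results.items()}
```

In this sketch the models are evaluated sequentially per batch; the same structure could dispatch each model to its own device or worker so they run concurrently, while still reading the input data only once.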

@drewoldag drewoldag added the enhancement New feature or request label Sep 27, 2024