[Bot] Update Inference Providers documentation #1723
Conversation
….com:huggingface/hub-docs into update-inference-providers-docs-automated-pr
Force-pushed from 48356c5 to e838067, then from e838067 to e100967
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Hopefully we'll have at least one model deployed for each task in the coming days (via either the automated or manual hf-inference deploys), cc @Vaibhavs10 @tomaarsen @oOraph
Yes! Deploying models as we speak. We're skipping the following tasks from automated deploys:
GPU_ONLY_TASKS = [
"image-to-image",
"text-to-image",
"text-to-video",
"text-generation",
"automatic-speech-recognition",
"object-detection",
]
NOT_IMPLEMENTED_TASKS = [
"text-ranking",
]
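To illustrate how these skip lists might be used, here is a minimal Python sketch that filters candidate deployments; the should_auto_deploy helper and the example (model, task) pairs are hypothetical, not part of the actual hf-inference deploy tooling.

# Hypothetical sketch: filter automated deploys using the skip lists defined above.
SKIPPED_TASKS = set(GPU_ONLY_TASKS) | set(NOT_IMPLEMENTED_TASKS)

def should_auto_deploy(task: str) -> bool:
    """Return True if a model for this task is eligible for automated deployment."""
    return task not in SKIPPED_TASKS

# Made-up (model, task) pairs, purely for illustration.
candidates = [
    ("example/asr-model", "automatic-speech-recognition"),
    ("example/embedding-model", "feature-extraction"),
]
for model_id, task in candidates:
    if should_auto_deploy(task):
        print(f"auto-deploying {model_id} for {task}")
    else:
        print(f"skipping {model_id}: {task} is GPU-only or not yet implemented")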
For GPU_ONLY_TASKS we can just deploy one of each manually, imo
(for the ones where we don't already have a third party provider)
This PR automatically upgrades the @huggingface/tasks and @huggingface/inference packages and regenerates the Inference Providers documentation by running:

cd scripts/inference-providers
pnpm update @huggingface/tasks@latest @huggingface/inference@latest
pnpm run generate
This PR was automatically created by the Update Inference Providers Documentation workflow.
Please review the changes before merging.
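As a rough illustration of what the automated workflow runs, here is a hedged Python sketch that invokes the same regeneration commands via subprocess; the actual workflow is a GitHub Action, and this script is only an assumption for illustration.

import subprocess

# Hypothetical sketch of the regeneration steps listed in the PR description.
commands = [
    ["pnpm", "update", "@huggingface/tasks@latest", "@huggingface/inference@latest"],
    ["pnpm", "run", "generate"],
]
for cmd in commands:
    # Run each command inside scripts/inference-providers and fail fast on errors.
    subprocess.run(cmd, cwd="scripts/inference-providers", check=True)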