Switch to the Granite model provided by RHEL AI rather than Hugging Face #152
Comments
Sadly
IMO it would be much better if we could use https://github.com/containers/ramalama or The most demanding part would be to re-create/undo the symlinking, since we can't rely on that in KFP.
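One way to handle the symlinking concern above is to materialize the links before the model directory is passed between pipeline steps. This is only a sketch under the assumption that the downloaded cache stores blobs behind symlinks (as Hugging Face caches do) and that KFP artifact passing does not preserve them; `materialize_symlinks` is a hypothetical helper, not part of any existing library:

```python
import shutil
from pathlib import Path

def materialize_symlinks(model_dir: str) -> None:
    """Replace every symlink under model_dir with a copy of its target.

    Snapshot the listing first so we don't mutate the tree while
    iterating over it.
    """
    for path in list(Path(model_dir).rglob("*")):
        if path.is_symlink():
            target = path.resolve()  # follow the link to the real blob
            path.unlink()            # drop the symlink itself
            shutil.copy2(target, path)  # put a real file in its place
```

The inverse ("re-create the symlinking") would need a recorded mapping of link paths to blob paths, which is the part described above as the most demanding.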
Additionally,
@tumido, a small clarification: the token is not needed if we download an image from the InstructLab repository. By the way, I'm curious how other projects handle the token.
Yeah, correct. However, so far we've been using
FTR, we took this back to stakeholders, since we're not sure at the moment how relevant OCI Artifacts, or even Hugging Face, are as model sources for Phase 3, which should stay close to the Phase 2 experience. Our offer is to revert to sourcing the model from object storage in Phase 3 as well.
ilab model download --repository docker://registry.redhat.io/rhelai1/granite-7b-starter --release latest
Transition the above command into the pipeline rather than using Hugging Face.
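A minimal sketch of what that transition could look like: a pipeline step that shells out to the same `ilab model download` invocation shown above, so the pipeline stays identical to the documented CLI workflow. The function name and the idea of parameterizing `repository`/`release` are assumptions for illustration, not an existing API:

```python
import subprocess

def download_granite(
    repository: str = "docker://registry.redhat.io/rhelai1/granite-7b-starter",
    release: str = "latest",
) -> list:
    # Build the exact ilab invocation from the issue description.
    cmd = [
        "ilab", "model", "download",
        "--repository", repository,
        "--release", release,
    ]
    # In a real pipeline step this would be executed, assuming the
    # ilab CLI is present in the step's container image, e.g.:
    # subprocess.run(cmd, check=True)
    return cmd
```

Shelling out (rather than re-implementing the registry pull) keeps the step aligned with the RHEL AI workflow, at the cost of requiring `ilab` and registry credentials in the image.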