Which version of vllm should be installed? #5
Comments
You should be fine with this (at least for vllm 0.5.0). Let me know if you have errors when running the code.
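In practice, that suggestion amounts to something like the following (a sketch, not an exact recipe from this thread; the pins are taken from the error output quoted later in the issue):
pip install vllm==0.5.0.post1
pip install transformers==4.36.2   # SPPO's pin; pip prints the resolver warning shown below, which can be ignored per the comment above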
I am afraid the version of transformers does matter.
Both errors may be caused by the tokenizer version issue.
Can you look into this issue? @angelahzyuan
@swyoon Hi, the setup instructions are for Llama3 and Mistral only. Gemma-2 is a newly released model, and the issues are due to compatibility with transformers and vllm. We suggest trying other models first. If you do want to use Gemma-2, you will likely need the most recent versions of the following dependencies: the latest transformers from git (pip install git+https://github.com/huggingface/transformers.git), the latest vllm built from source, and up-to-date accelerate and trl (pip install -U); see the sketch below.
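A rough version of those install steps (a sketch: the vllm repository URL and the editable install follow vllm's own from-source instructions and are not spelled out in this thread):
pip install git+https://github.com/huggingface/transformers.git   # latest transformers from git
pip install -U accelerate trl                                     # up-to-date accelerate and trl
git clone https://github.com/vllm-project/vllm.git                # latest vllm, built from source
cd vllm
pip install -e .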
@angelahzyuan thank you so much for a very prompt answer.
Hi, when I follow the default steps to set up the environment:
pip install vllm
it automatically installs vllm 0.5.0.post1, which requires transformers>=4.40.0.
When installing SPPO (which requires transformers==4.36.2), I got the following errors:
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
vllm 0.5.0.post1 requires tokenizers>=0.19.1, but you have tokenizers 0.15.2 which is incompatible.
vllm 0.5.0.post1 requires transformers>=4.40.0, but you have transformers 4.36.2 which is incompatible.
Should I downgrade the vllm version or ignore this error? How can I fix it?
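For reference, one quick way to check which versions actually ended up installed (a sketch; it simply prints each package's reported version):
python -c "import vllm, transformers, tokenizers; print(vllm.__version__, transformers.__version__, tokenizers.__version__)"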