Updates for vllm v0.6.4 #183

Merged: 3 commits merged into main from update-v0.6.4 on Dec 2, 2024

Conversation

@tjohnson31415 (Contributor) commented on Nov 15, 2024:

vLLM v0.6.4 has been released. This PR updates the adapter to support the new release.

TODO: so far I have only noticed this one change and made a commit for it. More testing is needed to determine whether additional changes are required.

Description

  • remove use of the deprecated LLMInputs, which is now just a type rather than a class (REF); see the sketch below
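
For context, here is a minimal sketch of the kind of change this implies, assuming the adapter builds token-based inputs the way a typical vLLM caller does (the variable names and call sites are illustrative, not the actual adapter diff):

    # Hypothetical placeholder inputs, not taken from the adapter code.
    text = "hello world"
    token_ids = [15339, 1917]

    # Before vLLM v0.6.4 the inputs could be constructed via the LLMInputs class:
    #   from vllm.inputs import LLMInputs
    #   inputs = LLMInputs(prompt_token_ids=token_ids, prompt=text)
    #
    # With vLLM v0.6.4, LLMInputs remains only as a deprecated type, so the same
    # dict-shaped inputs are built with the token_inputs() factory instead
    # (assumed here to be the replacement the adapter switched to):
    from vllm.inputs import token_inputs

    inputs = token_inputs(prompt_token_ids=token_ids, prompt=text)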

How Has This Been Tested?

Manually tested by running the changed code with vLLM v0.6.4 installed.

Merge criteria:

  • The commits are squashed in a cohesive manner and have meaningful messages.
  • Testing instructions have been added in the PR body (for PRs involving changes that are not immediately obvious).
  • The developer has manually tested the changes and verified that they work.

@codecov-commenter commented on Nov 15, 2024:

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 62.46%. Comparing base (2c80c72) to head (702ef9a).

Additional details and impacted files
@@           Coverage Diff           @@
##             main     #183   +/-   ##
=======================================
  Coverage   62.46%   62.46%           
=======================================
  Files          28       28           
  Lines        1721     1721           
  Branches      211      211           
=======================================
  Hits         1075     1075           
  Misses        544      544           
  Partials      102      102           


@prashantgupta24 marked this pull request as ready for review on November 19, 2024 at 18:17.
@dtrifiro (Contributor) commented:

The opendatahub-io/vllm@main tests are failing because we haven't pushed the 0.6.4.post1 tag yet, meaning that we're building a vllm version which is < 0.6.4. When installing vllm-tgis-adapter, the >=0.6.4 requirement causes the installed CPU version to be uninstalled and vllm==0.6.4.post1 to be installed instead. From the failure logs:

 Attempting uninstall: vllm
    Found existing installation: vllm 0.6.2+cpu
    Uninstalling vllm-0.6.2+cpu:
      Removing file or directory /home/runner/.nox/tests-3-12/bin/vllm
      Removing file or directory /home/runner/.nox/tests-3-12/lib/python3.12/site-packages/vllm-0.6.2+cpu.dist-info/
      Removing file or directory /home/runner/.nox/tests-3-12/lib/python3.12/site-packages/vllm/
      Successfully uninstalled vllm-0.6.2+cpu
...
...
Collecting vllm>=0.6.4 (from vllm-tgis-adapter==0.5.4.dev12+gd64bdc6)
...

The sync PR is here: opendatahub-io/vllm#235. I'll be merging it by next week.
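
To make the version comparison concrete, here is a small illustrative check using the packaging library (not part of the CI setup; it just shows why pip treats the locally built CPU wheel as too old for the adapter's requirement):

    from packaging.specifiers import SpecifierSet
    from packaging.version import Version

    requirement = SpecifierSet(">=0.6.4")         # vllm-tgis-adapter's vllm requirement
    local_build = Version("0.6.2+cpu")            # the locally built CPU wheel

    print(local_build in requirement)             # False -> pip uninstalls vllm 0.6.2+cpu
    print(Version("0.6.4.post1") in requirement)  # True  -> pip installs vllm 0.6.4.post1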

@fialhocoelho (Contributor) commented:

@dtrifiro I encountered issues during the build with the mistral_common version, which was set to 1.4.4 upstream. Adjusting it to mistral_common[opencv] >= 1.5.0 resolved the problem.

@dtrifiro added this pull request to the merge queue on Dec 2, 2024
Merged via the queue into main with commit 8487c4e on Dec 2, 2024
3 checks passed
@dtrifiro deleted the update-v0.6.4 branch on December 2, 2024 at 14:43
Labels: None yet
Projects: None yet
5 participants