
vLLM requirement to be relaxed #989

Closed
A-F-V opened this issue Jun 19, 2024 · 3 comments · Fixed by #1005

Comments


A-F-V commented Jun 19, 2024

What behavior of the library made you think about the improvement?

I am trying to contribute on macOS, but I run into the following error while setting up my dev environment:

pip install -e ".[test]"
pre-commit install

Collecting vllm (from outlines==0.0.46.dev3+g7d8269f)
  Using cached vllm-0.5.0.post1.tar.gz (743 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error
  
  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [18 lines of output]
...
      AssertionError: vLLM only supports Linux platform (including WSL).
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

This makes it difficult for macOS and non-WSL Windows developers to contribute.

How would you like it to behave?

Installation should not fail when the vLLM requirement cannot be met, so that macOS and non-WSL Windows developers can still set up a dev environment.
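One common way to relax a platform-specific dependency like this (the actual fix landed in #1005 and may differ) is a PEP 508 environment marker on the requirement, so pip only resolves vllm on Linux. A minimal sketch of such a requirement line:

```
# Illustrative requirement line (requirements file or extras list);
# the marker tells pip to skip vllm entirely on non-Linux platforms.
vllm; sys_platform == "linux"
```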

@A-F-V A-F-V changed the title vLLM Linux constraint to be relaxed vLLM requirement to be relaxed Jun 19, 2024

lapp0 commented Jun 21, 2024

I'm not sure how this can be accomplished without breaking vLLM tests in CI. Do you know of a good way to do this?

Otherwise, I think you can just run pip install -e . --no-deps and then ignore vLLM test failures until macOS / ARM support in vLLM is merged.

We could also skip vLLM tests when the platform is unsupported, as we do with mlxlm here: https://github.com/outlines-dev/outlines/blob/main/tests/generate/conftest.py#L6
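For concreteness, the skip approach could look like the following conftest.py sketch. This is an assumption-laden illustration, not the project's actual conftest: the helper name `vllm_available` and the nodeid-matching rule are invented here for clarity.

```python
# Sketch of a conftest.py hook that skips vLLM tests on unsupported
# platforms, in the spirit of the mlxlm conftest linked above.
# The helper name and matching rule are illustrative assumptions.
import importlib.util
import sys

import pytest


def vllm_available() -> bool:
    """True only on Linux (including WSL) with the vllm package importable."""
    return sys.platform == "linux" and importlib.util.find_spec("vllm") is not None


def pytest_collection_modifyitems(config, items):
    # When vLLM can't run here, mark every vLLM test as skipped instead
    # of letting collection or the tests themselves fail.
    if vllm_available():
        return
    skip = pytest.mark.skip(reason="vLLM only supports Linux (including WSL)")
    for item in items:
        if "vllm" in item.nodeid.lower():
            item.add_marker(skip)
```

With this in place, a macOS contributor installed via `--no-deps` would see vLLM tests reported as skipped rather than erroring.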


lapp0 commented Jun 25, 2024

@A-F-V could you please try this PR's branch and let me know whether it works on your end?

#1005

It doesn't attempt to install vllm on my end.


A-F-V commented Jun 25, 2024

I can confirm this PR did solve my issue, and I was also able to run the pytest suite. Thank you :)
