
[CI/Build] mypy: Resolve some errors from checking vllm/engine #9267

Merged 7 commits from the mypy branch into vllm-project:main on Oct 16, 2024

Conversation

russellb (Collaborator) commented Oct 11, 2024

This PR is a chunk of progress on mypy checking.

  1. First, I got everything to pass when running mypy with
     --follow-imports skip (though some new issues have crept back in
     since that point).

  2. I then started expanding the coverage of what can be checked using
     --follow-imports silent. Ongoing progress will come in chunks to
     make it easier to review. I started with 100 errors; it's down to
     60 right now.

Part of issue #3680


👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs do not trigger a full CI run by default. Instead, they only run fastcheck CI, which runs a small and essential subset of CI tests to quickly catch errors. You can run other CI tests on top of those by going to your fastcheck build on the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can do one of these:

  • Add the ready label to the PR
  • Enable auto-merge.

🚀

@russellb russellb marked this pull request as draft October 11, 2024 01:27
@russellb russellb marked this pull request as ready for review October 11, 2024 01:30
@DarkLight1337 DarkLight1337 added the ready label (ONLY add when PR is ready to merge/full CI is needed) Oct 11, 2024
vllm/config.py (outdated review thread, resolved)
DarkLight1337 (Member) left a comment

Overall looks ok, PTAL at my comments though.

njhill (Member) left a comment

Thanks @russellb!

vllm/engine/output_processor/util.py (outdated review thread, resolved)
vllm/engine/multiprocessing/client.py (outdated review thread, resolved)
vllm/core/scheduler.py (outdated review thread, resolved)
vllm/engine/output_processor/stop_checker.py (outdated review thread, resolved)
russellb (Collaborator, Author) commented

@njhill @DarkLight1337 Thank you both for the close review! I will let you know once I have addressed the feedback.

@russellb russellb marked this pull request as draft October 11, 2024 23:45
@russellb russellb force-pushed the mypy branch 2 times, most recently from ac36d1b to eb164bd, October 15, 2024 17:51
@russellb russellb marked this pull request as ready for review October 15, 2024 17:53
@russellb russellb marked this pull request as draft October 16, 2024 12:32
russellb (Collaborator, Author) commented

Marked as a draft to fix some things - will mark ready for review again when done.

I had all directories listed in `tools/mypy.sh` to ensure they worked
with `--follow-imports skip`. This change includes the last tweaks
necessary for all of those to pass.

As a next step, remove the entries that are already being checked with
`--follow-imports silent`, based on the config in `pyproject.toml`.
Subsequent commits will start resolving the errors that occur as more
coverage is added here.

Signed-off-by: Russell Bryant <[email protected]>
This fixes 27 of the 100 errors that occur when running:

    mypy --follow-imports silent vllm/engine

It is submitted as a checkpoint to make it easier to review.

Signed-off-by: Russell Bryant <[email protected]>
`backend` was used in two different places in this function as two
different types. There was no runtime error here, but `mypy` flagged
the rebinding (sketched below).

Signed-off-by: Russell Bryant <[email protected]>
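
A minimal sketch of the pattern (made-up names, not the actual vLLM
code): mypy reports an incompatible assignment when a variable is
rebound to a different type, even though the code runs fine.

    import enum

    class Backend(enum.Enum):
        FLASH_ATTN = enum.auto()
        XFORMERS = enum.auto()

    def pick_backend(name: str) -> Backend:
        # Rebinding `backend` from str to Backend runs fine, but mypy
        # reports: Incompatible types in assignment (expression has
        # type "Backend", variable has type "str")
        backend = name.upper()
        backend = Backend[backend]
        return backend

    def pick_backend_fixed(name: str) -> Backend:
        # The fix: give each type its own variable name.
        backend_name = name.upper()
        return Backend[backend_name]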
The code here previously triggered a mypy error because the `forward`
method may not exist on the decorated class. Add explicit validation
of this requirement (a sketch follows below).

Signed-off-by: Russell Bryant <[email protected]>
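
A rough sketch of that validation (the decorator name matches, but the
body is simplified and assumed, not the real implementation):

    from typing import TypeVar

    _T = TypeVar("_T", bound=type)

    def support_torch_compile(cls: _T) -> _T:
        # Check up front that the decorated class defines a callable
        # `forward`, instead of letting a missing method surface
        # later as an AttributeError.
        forward = getattr(cls, "forward", None)
        if not callable(forward):
            raise TypeError(
                f"{cls.__name__} must define a 'forward' method")
        # ... the real decorator installs its compile hooks here ...
        return cls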
In the support_torch_compile() decorator, mypy doesn't like the lines
that override class methods. The code looks fine to me, so just ignore
the warnings here (the general shape is sketched below).

Signed-off-by: Russell Bryant <[email protected]>
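
The general shape of the pattern (a sketch, not the actual decorator
body):

    def support_torch_compile(cls):
        original_forward = cls.forward

        def patched_forward(self, *args, **kwargs):
            # A real implementation would invoke torch.compile
            # machinery; this stub just delegates to the original.
            return original_forward(self, *args, **kwargs)

        # mypy objects to rebinding a method on a class ("Cannot
        # assign to a method") even though it works at runtime, so
        # the warning is suppressed on this line.
        cls.forward = patched_forward  # type: ignore
        return cls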
In several places, the code assumes a value is a more specific type
than the generic one declared in the function parameters. The parameter
types come from a more generic parent class, so they can't be narrowed
in the signatures here (a typical fix is sketched below).

Signed-off-by: Russell Bryant <[email protected]>
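
A typical fix (illustrative names, not the real vLLM signatures): an
isinstance assert narrows the type inside the body without changing
the inherited signature.

    class BaseRequest:
        """Generic type used in the parent class's signature."""

    class EngineRequest(BaseRequest):
        def __init__(self, request_id: str) -> None:
            self.request_id = request_id

    def process_request(request: BaseRequest) -> str:
        # The parameter must stay generic to match the inherited
        # signature, but this path only ever receives EngineRequest.
        # The assert narrows the type for mypy and doubles as a
        # runtime check.
        assert isinstance(request, EngineRequest)
        return request.request_id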
@russellb russellb marked this pull request as ready for review October 16, 2024 15:40
@russellb russellb added and removed the ready (ONLY add when PR is ready to merge/full CI is needed) label Oct 16, 2024
russellb (Collaborator, Author) commented

Looks like there's a real failure in a metrics test. I'm looking at it now.

mypy pointed out that a couple of variables are Optional (could be
None) but were being added to a float or int. Ensure that we don't try
to add None (the usual guard pattern is sketched below).

Signed-off-by: Russell Bryant <[email protected]>
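
A minimal sketch of that guard (illustrative names, not the actual
metrics code):

    from typing import Optional

    def accumulate(total: float, sample: Optional[float]) -> float:
        # Without the guard, mypy reports: Unsupported operand types
        # for + ("float" and "None"). Check for None before adding.
        if sample is not None:
            total += sample
        return total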
DarkLight1337 (Member) left a comment

Should be good to go now, thanks again for your efforts!

@DarkLight1337 DarkLight1337 merged commit 776dbd7 into vllm-project:main Oct 16, 2024
65 checks passed
@@ -92,7 +93,7 @@ class RequestOutput:
     def __init__(
         self,
         request_id: str,
-        prompt: Optional[str],
+        prompt: Optional[PromptType],
njhill (Member) commented

Sorry for the after-the-fact comment and for missing this in review... I think this one might not be the right change: it breaks things in other places and is also part of the API, so it probably shouldn't change in this way.

I think the reason it was changed is the recent beam_search logic, which still needs a lot of work and which I think is incorrectly assigning a var of PromptType to this field.

I can address this in another PR that I'm working on right now.
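
For context, a simplified sketch (stand-in types, not vLLM's actual
definitions) of why widening the field breaks callers that assumed a
plain string:

    from typing import Optional, TypedDict, Union

    class TokensPrompt(TypedDict):
        prompt_token_ids: list[int]

    # Simplified stand-in for vLLM's PromptType union, which admits
    # dict-style prompts as well as plain strings.
    PromptType = Union[str, TokensPrompt]

    def render_prompt(prompt: Optional[PromptType]) -> str:
        # Code written against Optional[str] could simply do
        # `return prompt or ""`; once dict-style prompts are allowed,
        # string operations on `prompt` no longer type-check.
        if isinstance(prompt, str):
            return prompt
        return ""  # dict-style or missing prompts need other handling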

russellb (Collaborator, Author) commented

@njhill sorry about that. I think it was OK at one point, and then when I rebased on main some new errors started happening and I didn't fix them before merge. Let me know if you'd like me to just revert this for a quick fix.

njhill (Member) commented
Thanks @russellb, no problem. I have a PR almost ready.

njhill added a commit to njhill/vllm that referenced this pull request Oct 17, 2024
This reverts a recent typing change that was made in vllm-project#9267, which was due to another recent incorrect change in the "external" beam search implementation. It would be a breaking change to the type of RequestOutput.prompt in the API (and caused some typing breakages in the front-end API code that references this field).

I have made some related changes/fixes to the beam search impl including properly returning logprobs, but it still needs quite a bit more work overall (for subsequent PRs).
charlifu pushed a commit to charlifu/vllm that referenced this pull request Oct 23, 2024
vrdn-23 pushed a commit to vrdn-23/vllm that referenced this pull request Oct 23, 2024
Alvant pushed a commit to compressa-ai/vllm that referenced this pull request Oct 26, 2024
garg-amit pushed a commit to garg-amit/vllm that referenced this pull request Oct 28, 2024
FerdinandZhong pushed a commit to FerdinandZhong/vllm that referenced this pull request Oct 29, 2024
sumitd2 pushed a commit to sumitd2/vllm that referenced this pull request Nov 14, 2024
KuntaiDu pushed a commit to KuntaiDu/vllm that referenced this pull request Nov 20, 2024
mfournioux pushed a commit to mfournioux/vllm that referenced this pull request Nov 20, 2024
tlrmchlsmth pushed a commit to neuralmagic/vllm that referenced this pull request Nov 23, 2024