
[Bugfix] Fix MiniCPMV and Mllama BNB bug #9917

Merged
merged 7 commits into vllm-project:main from minicpmv-bnb-support
Nov 4, 2024

Conversation

jeejeelee
Collaborator

@jeejeelee jeejeelee commented Nov 1, 2024

FILL IN THE PR DESCRIPTION HERE

FIX #9914

ping @mgoin

cc @chenqianfzh as well. Regarding multimodal models, additional BNB implementation logic may be required.

Signed-off-by: Jee Jee Li <[email protected]>

github-actions bot commented Nov 1, 2024

👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small, essential subset of tests to catch errors quickly. You can run additional CI tests on top of these by going to your fastcheck build on the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can do one of these:

  • Add ready label to the PR
  • Enable auto-merge.

🚀

@jeejeelee
Collaborator Author

Now, after fixing this, I can generate reasonable results using my local image.

@jeejeelee
Collaborator Author

Hmm, it seems that BNB has issues handling weights in ReplicatedLinear when TP > 1. @chenqianfzh

Signed-off-by: Jee Jee Li <[email protected]>
@jeejeelee jeejeelee requested a review from mgoin November 1, 2024 15:29
Signed-off-by: Jee Jee Li <[email protected]>
@jeejeelee jeejeelee changed the title [Bugfix] Fix MiniCPMV BNB bug [Bugfix] Fix MiniCPMV and Mllama BNB bug Nov 1, 2024
@mgoin mgoin added the ready ONLY add when PR is ready to merge/full CI is needed label Nov 2, 2024
Signed-off-by: Jee Jee Li <[email protected]>
@jeejeelee jeejeelee force-pushed the minicpmv-bnb-support branch from c0264e6 to a12c16c Compare November 2, 2024 16:52
@jeejeelee
Collaborator Author

> Hmm, it seems that BNB has issues handling weights in ReplicatedLinear when TP > 1. @chenqianfzh

I have handled this issue. @mgoin, please review it again, thanks!
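The TP > 1 pitfall discussed here can be illustrated with a minimal, framework-free sketch (the shard() helper and shapes below are hypothetical, not vLLM code): a tensor-parallel loader slices sharded layers per rank, but a replicated layer such as ReplicatedLinear must keep the full weight on every rank, so the loader has to skip sharding for it.

```python
# Hypothetical illustration of the TP > 1 pitfall: sharded layers take a
# per-rank slice of the weight, while a replicated layer must keep the
# full weight on every rank. A loader that unconditionally shards breaks
# the replicated case. shard() and the shapes below are illustrative only.

def shard(weight: list, tp_size: int, rank: int) -> list:
    """Return this rank's contiguous slice of the sharded dimension."""
    chunk = len(weight) // tp_size
    return weight[rank * chunk:(rank + 1) * chunk]

full_weight = [0, 1, 2, 3]
tp_size, rank = 2, 1

# Column-parallel layer: each rank holds only its shard.
sharded = shard(full_weight, tp_size, rank)

# Replicated layer: every rank must hold the full weight,
# so the loader must not call shard() here.
replicated = list(full_weight)

print(sharded, replicated)
```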

Member

@mgoin mgoin left a comment


Looks reasonable to me, thanks. Just curious about two things:

vllm/model_executor/model_loader/loader.py Outdated Show resolved Hide resolved
@@ -1005,16 +1007,21 @@ def _unquantized_generator(self, hf_weights_files, use_safetensors,
if any(target_module in weight_name for target_module in
self.target_modules) and weight_name.endswith(".weight"):
weight_name = weight_name.replace(".weight", ".qweight")
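The rename step in the diff above can be sketched roughly as follows (the target-module list here is illustrative, not vLLM's actual configuration): weight names that contain a quantization target module and end in ".weight" are mapped to the ".qweight" parameter name that the BNB linear method registers.

```python
# Sketch of the rename logic in the diff above: checkpoint weights whose
# names contain a quantization target module are stored with a ".weight"
# suffix on disk but registered as ".qweight" on the quantized layer.
# TARGET_MODULES below is illustrative, not vLLM's actual configuration.
TARGET_MODULES = ["q_proj", "k_proj", "v_proj", "o_proj"]

def rename_for_bnb(weight_name: str) -> str:
    """Map a checkpoint weight name to the parameter name BNB expects."""
    if any(m in weight_name for m in TARGET_MODULES) and weight_name.endswith(".weight"):
        return weight_name.replace(".weight", ".qweight")
    return weight_name

print(rename_for_bnb("model.layers.0.self_attn.q_proj.weight"))
# -> model.layers.0.self_attn.q_proj.qweight
print(rename_for_bnb("model.embed_tokens.weight"))
# -> model.embed_tokens.weight (unchanged; not a target module)
```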
Member


A general question I have about BNB: why do we use .qweight in BNBLinearMethod when the model checkpoints actually use .weight? It seems we could avoid some logic by having the quant method use .weight directly.

Collaborator Author


Perhaps it's to maintain consistency with other quantization algorithms, such as the GPTQ implementation; see:

layer.register_parameter("qweight", qweight)

Member


Okay, let's try to remove this in the future if possible! The "consistency" is just coincidental; we usually aim for parameters to have the same name as in the checkpoint format to keep weight loading simple.
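The naming point can be illustrated with a minimal, framework-free sketch (all names below are hypothetical): when a layer registers its parameter under the same name the checkpoint uses, the loader matches keys directly; a different name like qweight forces an extra rename step.

```python
# Minimal illustration of why matching parameter names to checkpoint keys
# simplifies weight loading. All names here are hypothetical examples.

checkpoint = {"layer.weight": [1.0, 2.0]}  # keys as stored on disk

# Case 1: the layer registers its parameter under the checkpoint's name.
params_matching = {"layer.weight": None}

# Case 2: the layer registers "qweight" (as GPTQ does), so the loader
# needs an extra rename step before the keys line up.
params_renamed = {"layer.qweight": None}

def load(params: dict, ckpt: dict, rename=lambda k: k) -> dict:
    """Copy checkpoint tensors into params, applying an optional rename."""
    for key, value in ckpt.items():
        target = rename(key)
        if target not in params:
            raise KeyError(f"no parameter named {target}")
        params[target] = value
    return params

load(params_matching, checkpoint)  # matches with no extra mapping
load(params_renamed, checkpoint,
     rename=lambda k: k.replace(".weight", ".qweight"))  # needs the mapping
```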

@jeejeelee jeejeelee requested a review from mgoin November 3, 2024 01:29
@jeejeelee jeejeelee force-pushed the minicpmv-bnb-support branch from efe89d8 to e410d9c Compare November 4, 2024 02:13
Collaborator

@Isotr0py Isotr0py left a comment


LGTM too!

@Isotr0py Isotr0py enabled auto-merge (squash) November 4, 2024 02:58
@Isotr0py Isotr0py merged commit c49f040 into vllm-project:main Nov 4, 2024
62 checks passed
@jeejeelee jeejeelee deleted the minicpmv-bnb-support branch November 4, 2024 03:37
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Nov 4, 2024
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Nov 4, 2024
richardsliu pushed a commit to richardsliu/vllm that referenced this pull request Nov 4, 2024
bigPYJ1151 pushed a commit to bigPYJ1151/vllm that referenced this pull request Nov 5, 2024
DarkLight1337 pushed a commit that referenced this pull request Nov 5, 2024
hissu-hyvarinen pushed a commit to ROCm/vllm that referenced this pull request Nov 6, 2024
JC1DA pushed a commit to JC1DA/vllm that referenced this pull request Nov 11, 2024
sumitd2 pushed a commit to sumitd2/vllm that referenced this pull request Nov 14, 2024
KuntaiDu pushed a commit to KuntaiDu/vllm that referenced this pull request Nov 20, 2024
mfournioux pushed a commit to mfournioux/vllm that referenced this pull request Nov 20, 2024
tlrmchlsmth pushed a commit to neuralmagic/vllm that referenced this pull request Nov 23, 2024
Signed-off-by: Jee Jee Li <[email protected]>
Signed-off-by: Tyler Michael Smith <[email protected]>
sleepwalker2017 pushed a commit to sleepwalker2017/vllm that referenced this pull request Dec 13, 2024
Labels
ready ONLY add when PR is ready to merge/full CI is needed
Projects
None yet
Development

Successfully merging this pull request may close these issues.

[Bug]: minicpmv2.6 BNB in-flight quantization error
3 participants