Issues: vllm-project/llm-compressor

Issues list

KeyError: 'model.layers.0.self_attn.k_scale' [bug]
#967 opened Dec 10, 2024 by wxsms
Error when quantizing Llama 3.3 70B to FP8 [bug]
#963 opened Dec 6, 2024 by Syst3m1cAn0maly
GPT-J GPTQ quantization bug [bug]
#961 opened Dec 6, 2024 by yemyhdtrc6088
Qwen2-VL FP8_DYNAMIC quantization fails [bug]
#951 opened Dec 4, 2024 by LugerW-A
Quantization + sparsification: model outputs zeros [bug]
#942 opened Nov 28, 2024 by nirey10
Multiple wandb initializations [bug]
#934 opened Nov 26, 2024 by fzyzcjy
Error when loading a 2of4 model with vLLM [bug]
#926 opened Nov 19, 2024 by jiangjiadi
W8A8 quantization for GPT-J fails [bug]
#909 opened Nov 12, 2024 by zhouyuan
Any plans for W4A8 support? [enhancement]
#873 opened Oct 29, 2024 by Arcmoon-Hu
SmoothQuant doesn't respect ignored modules for VLMs [bug]
#687 opened Sep 26, 2024 by mgoin
KV cache quantization example causes an error [bug]
#660 opened Sep 25, 2024 by weicheng59
[USAGE] FP8 W8A8 (+KV cache) with LoRA adapters [enhancement]
#164 opened Sep 11, 2024 by paulliwog
Layers not skipped with ignore=["re:.*"] [bug]
#91 opened Aug 15, 2024 by horheynm