
add gptqmodel support #2247

Open · wants to merge 4 commits into main

Conversation

jiqing-feng
Contributor

This PR adds support for gptqmodel; we plan to replace auto-gptq with gptqmodel in the future.

@BenjaminBossan
Member

Thanks for this PR to add support for gptqmodel. Let's wait for the transformers PR to be merged before proceeding with this one.

Signed-off-by: jiqing-feng <[email protected]>
@jiqing-feng
Contributor Author

jiqing-feng commented Dec 4, 2024

> Thanks for this PR to add support for gptqmodel. Let's wait for the transformers PR to be merged before proceeding with this one.

Hi @BenjaminBossan, there is a weird pytest error I have never encountered before:

(idp) root@sprocean:/home/jiqingfe/peft# pytest
ERROR: usage: pytest [options] [file_or_dir] [file_or_dir] [...]
pytest: error: unrecognized arguments: --cov=src/peft --cov-report=term-missing
  inifile: /home/jiqingfe/peft/pyproject.toml
  rootdir: /home/jiqingfe/peft

(idp) root@sprocean:/home/jiqingfe/peft# pip list | grep peft
peft                        0.13.3.dev0            /home/jiqingfe/peft

pytest only works after I remove that configuration. Do you know why this happens?

@jiqing-feng
Contributor Author

jiqing-feng commented Dec 4, 2024

The testing changes include:

  1. Remove the GPU limitation for the GPTQ tests.
  2. GPTQ library: the @require_gptq decorator means these tests can run with either gptqmodel or auto-gptq.

To run the GPTQ tests:

pytest tests/test_gpu_examples.py::PeftGPTQGPUTests
pytest tests/test_common_gpu.py::PeftCommonTests::test_lora_gptq_quantization_from_pretrained_safetensors
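The @require_gptq decorator itself is not shown in this thread; here is a minimal sketch of how such a skip decorator could work. The helper names (is_gptqmodel_available, is_auto_gptq_available) are assumptions for illustration, not the actual PEFT internals:

```python
import importlib.util
import unittest


def is_gptqmodel_available():
    # Hypothetical availability check; PEFT's real helpers live elsewhere.
    return importlib.util.find_spec("gptqmodel") is not None


def is_auto_gptq_available():
    return importlib.util.find_spec("auto_gptq") is not None


def require_gptq(test_case):
    """Skip the test unless either gptqmodel or auto-gptq is installed."""
    return unittest.skipUnless(
        is_gptqmodel_available() or is_auto_gptq_available(),
        "test requires gptqmodel or auto-gptq",
    )(test_case)
```

With a decorator like this, the same test module runs unchanged whichever of the two GPTQ backends is installed, and is skipped cleanly when neither is.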

@BenjaminBossan
Member

> there is a weird pytest error I never met,

This comes from a pytest plugin we use to monitor code coverage. Running python -m pip install pytest-cov should fix that for you.

(By the way, code coverage is great for checking whether the new code you added is covered by unit tests.)
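For reference, the unrecognized arguments in the error come from pytest's addopts setting in PEFT's pyproject.toml, which pytest applies on every run; the snippet below is a rough reconstruction based on the error message, not the exact file contents:

```toml
[tool.pytest.ini_options]
addopts = "--cov=src/peft --cov-report=term-missing"
```

Because these flags are injected automatically, pytest fails at startup whenever the pytest-cov plugin that defines them is not installed.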

jiqing-feng and others added 2 commits December 4, 2024 16:21
Signed-off-by: jiqing-feng <[email protected]>
* add get_gptq_quant_linear

* cleanup

* rename to quant_linear

* rename to get_gptqmodel_quant_linear

* rename to QuantLinear

* fix get device_map

* import hf_select_quant_linear

* pass checkpoint_format

* fix lora

* if is_gptqmodel_available, pass backend auto_trainable

* pass backend auto_trainable

* cleanup

* Update gptq.py

---------

Co-authored-by: Qubitium-ModelCloud <[email protected]>
@jiqing-feng jiqing-feng marked this pull request as ready for review December 7, 2024 03:56
@Qubitium

Qubitium commented Dec 10, 2024

@BenjaminBossan This PR is ready and tested on CPU, Intel/XPU, and NVIDIA/CUDA. However, please note that this is one PR in a cumulative set of three. The primary PR is the Optimum PR huggingface/optimum#2064 (awaiting review/approval), while this PR and the Transformers PR huggingface/transformers#35012 depend on the Optimum PR being merged first.

@jiqing-feng jiqing-feng changed the title [WIP]add gptqmodel support add gptqmodel support Dec 10, 2024
@BenjaminBossan
Member

Thanks for the update. Let's wait for the Optimum and Transformers PRs to be merged first, since changes there could affect this PR. Feel free to ping me as soon as those PRs have been merged.
