
Add support for more sizes of LLaMA #169

Open
cnbeining opened this issue Apr 27, 2023 · 5 comments
Labels: enhancement, help wanted

Comments

@cnbeining (Contributor)

Hey folks,

Trying to get the 13B/30B models working with 4-bit fine-tuning. Any chance you folks could release the script used to convert the 7B model to 4-bit?

Thanks,

@shreyansh26

That would be really helpful!

@shreyansh26

I think this is the standard repo people use: https://github.com/qwopqwop200/GPTQ-for-LLaMa
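
For reference, that repo's README (as of spring 2023) documents a quantization invocation along these lines; the flags change between revisions, so treat this as illustrative and check the README for the current options:

```
CUDA_VISIBLE_DEVICES=0 python llama.py /path/to/llama-13b-hf c4 \
    --wbits 4 --groupsize 128 --save llama13b-4bit-128g.pt
```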

@StochasticRomanAgeev (Contributor)

Hi @cnbeining @shreyansh26,
We have a contributing guide for model additions.
We would appreciate any help on this front!

@StochasticRomanAgeev added the enhancement and help wanted labels on May 8, 2023
@StochasticRomanAgeev (Contributor)

Hi again, @cnbeining @shreyansh26,
In the latest release we added a Generic model; you can use it for llama-13b models in our library!
The falcon-7b model, which xturing now supports, may also be a good option for you.
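
A minimal sketch of what this looks like, assuming `GenericModel` accepts a Hugging Face model path; the checkpoint id and dataset path below are illustrative, not from this thread:

```python
from xturing.datasets import InstructionDataset
from xturing.models import GenericModel

dataset = InstructionDataset("./alpaca_data")   # any instruction dataset in xturing's format
model = GenericModel("huggyllama/llama-13b")    # hypothetical 13B checkpoint id

model.finetune(dataset=dataset)
output = model.generate(texts=["What sizes does LLaMA come in?"])
print(output)
```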

@StochasticRomanAgeev (Contributor)

We are also working on adding k-bit quantization to the generic model, which should be released soon.
You will then be able to use llama-13b, or any other model, with 4-bit quantization.
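
Not confirmed in this thread, but based on xturing's `Generic*` naming convention, the k-bit path might look something like this once released (the class name and arguments are assumptions; check the release notes for the actual API):

```python
from xturing.models import GenericLoraKbitModel  # hypothetical class name

model = GenericLoraKbitModel("huggyllama/llama-13b")  # hypothetical checkpoint id
model.finetune(dataset=dataset)                       # dataset as in the sketch above
```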
