Question: is it compatible with quantized model? #4

Open
cfrancois7 opened this issue Jan 12, 2024 · 1 comment

Comments

@cfrancois7

Hi,

We're running a project at the CivicLab in Grenoble, France. The project is LLM-based and is being built as an open-source project.
We're trying to build, test, prototype and evaluate an assistant for public consultation.

We're looking for a CPT framework and also for opportunities to form new partnerships or relationships for advice.

At this stage I only have one GPU with 8 GB of RAM, so I'm using a quantized Mistral-7B-based model.
Soon I'll get access to an A5-family GPU (around 24 GB) thanks to a contributor.

So my question is: is your framework compatible with quantized models and QLoRA? Is it model agnostic?

Also, we're interested in new contributors to help us make this prototype come true. It would be a pleasure to discuss the project with the OpenLLM-France community.

@Jeronymous
Member

Hello @cfrancois7 and welcome here

> is your framework compatible with quantized models and QLoRA? Is it model agnostic?

In this repo, it is possible to:

  • Train models with LoRA
  • Quantize trained models (following the instructions in the README)

All Claire models trained using this repository can be found in several quantized forms.

I don't think QLoRA is supported. But we should check whether it is supported in lit-gpt now.
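For context, QLoRA essentially means loading the base model with 4-bit quantization and training LoRA adapters on top of it. Below is a minimal sketch using the Hugging Face transformers/peft/bitsandbytes stack (not this repository's lit-gpt-based scripts); the model ID, LoRA hyperparameters and target modules are illustrative assumptions only:

```python
# Illustrative QLoRA setup outside this repo (Hugging Face stack);
# model name and hyperparameters are assumptions, not this project's settings.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# Load the base model quantized to 4-bit (NF4) to fit on a small GPU.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1",  # hypothetical choice of base model
    quantization_config=bnb_config,
    device_map="auto",
)

# Attach trainable LoRA adapters on top of the frozen quantized weights.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```

Whether an equivalent path exists through this repo's scripts depends on the lit-gpt support mentioned above.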

Thanks for your interest.
If you haven't done it already, I encourage you to join OpenLLM France on Discord: https://discord.gg/tZf7BR4dY7
