
Any alternate local model? #13

Open
SouravaBehera opened this issue Aug 13, 2024 · 2 comments

Comments

@SouravaBehera

Token indices sequence length is longer than the specified maximum sequence length for this model (2816 > 2048). Running this sequence through the model will result in indexing errors
2024-08-13 11:31:16,523 - WARNING - Prompt is too long for LLM. Chunking the input.

I am getting this warning for the mentioned model in the repo.

@Dicklesworthstone
Owner

I would use the APIs rather than the local model. But you can also try decreasing the chunk size.
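The chunking the warning describes can be sketched roughly as follows. This is a minimal illustration, not the repo's actual implementation: the real pipeline counts tokens with the model's tokenizer, whereas this sketch uses whitespace-separated words as a stand-in, and `chunk_prompt` is a hypothetical name.

```python
def chunk_prompt(text: str, max_tokens: int = 2048) -> list[str]:
    """Split text into pieces of at most max_tokens rough 'tokens'.

    Whitespace words approximate model tokens here; a real implementation
    would tokenize with the model's own tokenizer before splitting.
    """
    words = text.split()
    return [
        " ".join(words[start:start + max_tokens])
        for start in range(0, len(words), max_tokens)
    ]

# Mimic the 2816-token prompt from the warning in the log above:
prompt = "word " * 2816
pieces = chunk_prompt(prompt, max_tokens=2048)
print(len(pieces))  # 2 chunks: 2048 words + 768 words
```

Decreasing the chunk size (the `max_tokens` budget here) produces more, smaller pieces, each of which fits within the model's 2048-token context window.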

@Shahin-rmz

Hi,
thanks for a very interesting project.
I would like to use my own fine-tuned local LLM (Llama 3.1).
How can I do that?
Best
