LLAMA 3 #679

Hi Team,
Can LLAMA 3 be used with this framework?
Thanks

Comments
It does function right now, but in my experience rather poorly, likely due to the unusual prompt formatting that Llama 3 uses.
If llama3 is being overly verbose, add "<|eot_id|>" to the AntiPrompts. I've had good results with it so far and it seems to have more personality than Mistral Instruct v0.2.
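For anyone looking for where that setting lives, here is a minimal sketch of passing anti-prompts through LLamaSharp's InferenceParams; property names reflect recent LLamaSharp releases and may differ slightly in yours:

```csharp
using System.Collections.Generic;
using LLama.Common;

// Minimal sketch: stop generation whenever the model emits
// the literal text "<|eot_id|>" (Llama 3's end-of-turn marker).
var inferenceParams = new InferenceParams
{
    AntiPrompts = new List<string> { "<|eot_id|>" },
    MaxTokens = 256
};
```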
I added <|eot_id|>, but it seems to have no effect.
I'm not sure our antiprompt detection properly handles special tokens like that. I know there's special-case handling for EOS in some places. That could be a good improvement to the antiprompt processing for someone to make.
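A hedged sketch of what such an improvement could look like: comparing the sampled token id against the id of the special token, instead of matching detokenized text (which may come back empty or as "�" for special tokens). The helper name and the `special: true` flag are assumptions for illustration, not necessarily the library's actual API at the time of this thread:

```csharp
using LLama;

// Hypothetical helper (illustrative only): tokenize the special token by itself;
// if it maps to a single id, compare token ids rather than text, which
// sidesteps the detokenization problems that break text-based anti-prompts.
static bool IsStopToken(LLamaContext context, int sampledTokenId)
{
    var ids = context.Tokenize("<|eot_id|>", addBos: false, special: true);
    return ids.Length == 1 && (int)ids[0] == sampledTokenId;
}
```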
Hi, #708 has added an example of a LLaMA3 chat session. I'd appreciate it if you could try it and report any problems to us. To run it, please pull the latest code from the master branch and run the example project. :)
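If you can't pull the example project right away, the session has roughly the following shape. Treat this as a sketch: the model path is a placeholder and method signatures vary a little across LLamaSharp releases:

```csharp
using System;
using System.Collections.Generic;
using LLama;
using LLama.Common;

// Placeholder path; point this at your local Llama 3 GGUF file.
var parameters = new ModelParams("path/to/Meta-Llama-3-8B-Instruct.gguf")
{
    ContextSize = 4096
};
using var model = LLamaWeights.LoadFromFile(parameters);
using var context = model.CreateContext(parameters);
var executor = new InteractiveExecutor(context);
var session = new ChatSession(executor);

var inferenceParams = new InferenceParams
{
    // Stop at Llama 3's end-of-turn marker, per the suggestion above.
    AntiPrompts = new List<string> { "<|eot_id|>" },
    MaxTokens = 256
};

// Stream the reply token-by-token to the console.
await foreach (var text in session.ChatAsync(
    new ChatHistory.Message(AuthorRole.User, "Hello, who are you?"),
    inferenceParams))
{
    Console.Write(text);
}
```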
Will PR #6920 from llama.cpp resolve this issue?
@adammikulis I'll update the native library binaries following ggerganov/llama.cpp#6920. Besides, you could also try the current master branch, which already provides an example of LLaMA3.
I think
That's right, we'll add such things soon. For now, you could use "�" as an anti-prompt as a temporary workaround.
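Concretely, that workaround is just a different entry in the same AntiPrompts list shown in the sketch above:

```csharp
using System.Collections.Generic;
using LLama.Common;

// Stopgap sketch: stop on the replacement character that appears when the
// special token is detokenized incorrectly, until special tokens are handled.
var inferenceParams = new InferenceParams
{
    AntiPrompts = new List<string> { "�" }
};
```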
This has actually been done in PR #712 (along with updating the binaries).
Llama 3 has been supported for a while, so I'll close this issue now.