
idea: add toppy, use min p #2

Open
awtrisk opened this issue Dec 11, 2023 · 3 comments

Comments


awtrisk commented Dec 11, 2023

Toppy's a great 7B model - the best, in my opinion - and many others have also loved it for RP.
Aside from that, Min P is a sampler that has been implemented in every open-source LLM backend except transformers.
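
For reference, Min P keeps only the tokens whose probability is at least min_p times the probability of the most likely token, then renormalizes before sampling. A standalone numpy sketch (parameter names and defaults here are illustrative, not any backend's actual API):

```python
import numpy as np

def min_p_filter(logits, min_p=0.05):
    """Mask out tokens whose probability is below min_p * p(most likely token)."""
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                        # softmax over the vocabulary
    cutoff = min_p * probs.max()                # threshold scales with the top probability
    return np.where(probs < cutoff, -np.inf, logits)

def sample_min_p(logits, min_p=0.05, rng=None):
    """Sample a token id from the min-p-filtered distribution."""
    if rng is None:
        rng = np.random.default_rng()
    filtered = min_p_filter(logits, min_p)
    probs = np.exp(filtered - np.max(filtered))  # exp(-inf) -> 0 for masked tokens
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

# Example: with min_p = 0.1, any token whose probability is less than 10% of the
# top token's probability is removed before sampling.
print(sample_min_p(np.array([2.0, 1.0, -3.0]), min_p=0.1))
```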

SayanoAI (Owner) commented:

You can install the model yourself by putting it in the models/LLM folder and creating your own model config. Last time I checked, Min P hadn't been added to the main koboldcpp release package yet. Let me know if that has changed.
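
Purely as an illustration of that workflow (the field names below are assumptions, not this repo's actual schema; copying an existing config under models/LLM and adjusting it is the real path), creating a config for a GGUF build of Toppy might look roughly like:

```python
# Hypothetical sketch only: the real schema is whatever the existing configs in
# models/LLM use, so copy one of those and adjust it for Toppy.
import json
from pathlib import Path

toppy_config = {
    "name": "Toppy-M-7B",                               # assumed field names
    "model_file": "models/LLM/toppy-m-7b.Q4_K_M.gguf",  # local GGUF path
    "max_context": 4096,
}

Path("models/LLM").mkdir(parents=True, exist_ok=True)
Path("models/LLM/toppy-m-7b.json").write_text(json.dumps(toppy_config, indent=2))
```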


awtrisk commented Dec 18, 2023

> You can install the model yourself by putting it in the models/LLM folder and creating your own model config. Last time I checked, Min P hadn't been added to the main koboldcpp release package yet. Let me know if that has changed.

...when did you last check? They added Min P on November 4.


awtrisk commented Dec 18, 2023

Also, I meant adding a config for Toppy, since it's iirc the most used and most praised Mistral-based 7B for RP.
