
Prompt format to follow + Multi tools for Hermes-2-Theta-Llama-3-8B with Ollama #26

Open
aiseei opened this issue May 18, 2024 · 5 comments

Comments

@aiseei

aiseei commented May 18, 2024

Hi - great work and thanks for this source!

I am not clear on which prompt format to use with this model and Ollama. Is it:

  1. the one at
     https://huggingface.co/NousResearch/Hermes-2-Theta-Llama-3-8B
  2. the one in the examples: https://github.com/NousResearch/Hermes-Function-Calling/blob/main/examples/ollama-multiple-fn.ipynb
     or
  3. https://github.com/NousResearch/Hermes-Function-Calling/blob/main/prompt_assets/sys_prompt.yml
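(For reference, the Hermes 2 models use ChatML with an XML-tagged function-calling convention; the exact system-prompt wording used for training lives in `prompt_assets/sys_prompt.yml`. A minimal sketch of building such a prompt, with simplified wording, so this is illustrative rather than the canonical template:)

```python
import json

def build_system_prompt(tools):
    """Build a Hermes-style function-calling system prompt in ChatML.

    Simplified sketch: the canonical wording is in
    prompt_assets/sys_prompt.yml; only the structure matters here.
    """
    tool_json = "\n".join(json.dumps(t) for t in tools)
    return (
        "<|im_start|>system\n"
        "You are a function calling AI model. You are provided with function "
        "signatures within <tools></tools> XML tags. For each function call, "
        "return a JSON object with function name and arguments within "
        "<tool_call></tool_call> XML tags.\n"
        f"<tools>\n{tool_json}\n</tools><|im_end|>\n"
    )

# Hypothetical tool definition in OpenAI-style JSON schema, as the
# repo's examples use:
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

prompt = build_system_prompt([weather_tool])
```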

Any inputs will be greatly appreciated!

Thanks

@kiiwee
Contributor

kiiwee commented May 21, 2024

I am currently trying to make tooling with Ollama work, but as mentioned, the model was trained with a tool role, which Ollama still doesn't support.

@aiseei
Author

aiseei commented May 29, 2024

@kiiwee - we are trying to make it work directly with the llama.cpp server: https://github.com/ggerganov/llama.cpp/tree/master/examples/server

Have you had any experience with that?

@kiiwee
Contributor

kiiwee commented May 29, 2024

@aiseei I managed to make it work with the llama-cpp-python lib by inserting the tool template into the conversation and feeding the tool's output back: when the tool responds, wrap the result in the XML template and return it with a tool role.
As for the llama.cpp server, I'm going to try function calling (not the OpenAI way) from a Node app soon, but I don't see why it wouldn't work.

I'm still quite puzzled why Ollama decided to strictly allow only 3 roles (system, assistant and user).
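(The feed-back loop described above can be sketched as follows: pull the `<tool_call>` JSON out of the completion, run the tool, then wrap the result in `<tool_response>` tags and return it as a tool-role turn. The regex and helper names are mine, not from the repo:)

```python
import json
import re

def extract_tool_calls(text):
    """Pull <tool_call> JSON payloads out of a model completion."""
    pattern = r"<tool_call>\s*(\{.*?\})\s*</tool_call>"
    return [json.loads(m) for m in re.findall(pattern, text, re.DOTALL)]

def tool_response_turn(result):
    """Wrap a tool result as described above: XML template, tool role."""
    body = f"<tool_response>\n{json.dumps(result)}\n</tool_response>"
    return {"role": "tool", "content": body}

# Example completion as the model would emit it:
completion = (
    '<tool_call>\n'
    '{"name": "get_weather", "arguments": {"city": "Paris"}}\n'
    '</tool_call>'
)
calls = extract_tool_calls(completion)
# ...dispatch calls[0] to the real tool, then feed the result back:
turn = tool_response_turn({"temp_c": 18})
```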

@aiseei
Author

aiseei commented May 30, 2024

@kiiwee
Contributor

kiiwee commented Jun 1, 2024

@aiseei check pull request #29. I think it should work the same with the Theta version.
I used the Ollama example and added a callback to the model with the tool response, since I always needed that.
