
Ollama implementation error caused by FunctionMessage #2

Open
qqgeogor opened this issue May 11, 2024 · 0 comments

Comments

@qqgeogor

When I switch from the OpenAI model to a ChatOllama model such as llama3, the FunctionMessage triggers a message-conversion error, since ChatOllama only supports the system, user, and assistant roles. Is there an easy way to fix this in the code?
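One possible workaround (a sketch, not part of ChatOllama's API — the dict message shapes and the `fold_function_messages` helper below are hypothetical illustrations, not LangChain types) is to rewrite any function-role message into a plain user message before passing the history to the model, since Ollama only understands the system, user, and assistant roles:

```python
# Sketch of a workaround: fold function-role messages into user messages
# so every message uses a role that ChatOllama accepts.
# Messages are plain dicts here for illustration only.

def fold_function_messages(messages):
    """Rewrite messages with role 'function' as user messages."""
    converted = []
    for msg in messages:
        if msg["role"] == "function":
            converted.append({
                "role": "user",
                "content": (
                    f"Result of calling `{msg.get('name', 'function')}`:\n"
                    f"{msg['content']}"
                ),
            })
        else:
            converted.append(msg)  # system/user/assistant pass through unchanged
    return converted

history = [
    {"role": "system", "content": "You are a data analyst."},
    {"role": "user", "content": "Show the first rows of the dataframe."},
    {"role": "function", "name": "df_head", "content": "col_a col_b\n1 2"},
]

safe_history = fold_function_messages(history)
# Every message in safe_history now has role system, user, or assistant.
```

The same idea applies to LangChain message objects: convert each `FunctionMessage` into a `HumanMessage` (or append its content to the preceding assistant turn) before invoking the ChatOllama model.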

Also, after switching the model to llama3, even the first step of inspecting the dataframe with .head() fails: the model does not generate a response that calls the functions, even when I bind them with OllamaFunctions().bind_tools().
