You need to enter your API key in the Models tab of Settings in order to chat. #36
Comments
@andy8992 any updates, or can I close this?
I haven't had the issue in a bit; I'll look at that. Not sure why, but it wasn't obvious that the "more" button contained settings. Perhaps the direction arrow and its placement near the scrollbar made me think it was related to additional commands rather than settings.
It looks like I'm having the same issue on a fresh install: my Ollama base URL is in the form https://ollama.example.com, and I know it's connecting successfully because I'm able to view all of the installed models.
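A quick way to confirm that kind of connectivity independently of the app is to list the installed models directly against the server's `/api/tags` endpoint. This is a minimal sketch; the base URL is the example one from the comment above, and the helper names are my own:

```python
import json
import urllib.request


def tags_url(base_url: str) -> str:
    """Build the model-listing endpoint from a base URL, tolerating a trailing slash."""
    return base_url.rstrip("/") + "/api/tags"


def list_models(base_url: str) -> list[str]:
    """Return the names of the models installed on an Ollama server."""
    with urllib.request.urlopen(tags_url(base_url)) as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]


# Usage (against a reachable server):
#   list_models("https://ollama.example.com")
```

If this returns your model list but the app still complains about an API key, the problem is on the app side rather than the connection.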
Can you download the latest pre-release from here:

From the main chat window (the one in your screenshot), open the Help menu and there should be a "Go to Log Folder" entry. Then open

Thanks
Ollama seems to be responding OK on that version. However, at first I was getting a bunch of errors that looked like this: [error screenshots omitted]
I was able to fix it by disabling every plugin. It seems multiple plugins are enabled out of the box, so querying my local Ollama instance with text would trigger a bunch of tool usage that would error out (since none of the tool plugins are configured with API keys). I assume anyone installing the app for the first time would run into the same issue.
I understand that, but if you do not have API keys, only two plugins are enabled by default: "Download" and "YouTube"... Not sure what to do about this: I need to think it through.
All of a sudden I'm being asked this when running commands.
I have no intention of using anything other than Ollama (which IS set up). This was working fine before.
I can chat normally but if I run one of my custom commands it's clearly not using the model I have chosen/used last.
I've changed no settings since this began happening. Presumably update-related.
Forcing it to use a specific Ollama model works but it ought to be using my most recent model.