Ollama options not displayed in chat #3238
-
What happened?
I have configured Ollama in the librechat.yaml file, but it is not appearing in the drop-down to choose from. See the screenshot: "Ollama" is not there, and I cannot find any of the Ollama models.

What browsers are you seeing the problem on?
Chrome

Relevant log output
No response
Replies: 3 comments 2 replies
-
Add an API key for Ollama:
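A minimal sketch of what that looks like in librechat.yaml, following the Ollama endpoint docs linked at the end of the thread — the baseURL and model name here are assumptions for a default local Ollama install in Docker, so adjust them to your setup:

```yaml
# librechat.yaml — hypothetical minimal Ollama endpoint config
endpoints:
  custom:
    - name: "Ollama"
      # Ollama ignores the key's value, but LibreChat requires the field to be set
      apiKey: "ollama"
      # assumed default: local Ollama reachable from the LibreChat container
      baseURL: "http://host.docker.internal:11434/v1/"
      models:
        default:
          - "llama3"
        # fetch the model list from the Ollama server at startup
        fetch: true
```

After editing the file, restart LibreChat so the new endpoint is picked up in the drop-down.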
-
Thank you Danny, will do. Just wanted to quickly respond here to thank you for the rapid response. We're pulling an all-nighter on a project and this will help.
-
That has worked perfectly. One thing to add, in case others are looking for it: https in the Ollama URL doesn't work — it needs to be http. But as it's locally hosted, it should be secure behind your firewalls or on your VPN, so that shouldn't matter.
Example:
https://www.librechat.ai/docs/configuration/librechat_yaml/ai_endpoints/ollama