Ollama not showing up in the menu on Librechat #5837
kuya-gonzales started this conversation in Help Wanted
Replies: 1 comment 1 reply
-
Check your ENDPOINTS environment variable in the .env file. It should include "custom".
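For reference, a minimal sketch of that line, assuming a fairly stock .env; the exact set of other endpoints doesn't matter as long as custom is in the list:

ENDPOINTS=openAI,anthropic,google,azureOpenAI,assistants,custom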
-
Hey there,
I've been scratching my head all day trying to get Ollama to show up in the menu in LibreChat.
Ollama is running in the compose stack, coupled in via the config override.
This is my librechat.yaml setup:
name: "Ollama"
apiKey: "ollama"
baseURL: "http://host.docker.internal:11434/v1/chat/completions"
models:
  default: ["deepseek-r1:14b"]
  fetch: false # fetching list of models is not supported
titleConvo: true
titleModel: "current_model"
summarize: false
summaryModel: "current_model"
forcePrompt: false
modelDisplayLabel: "Ollama"
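For context, this block sits under the custom endpoints list in the file; a rough sketch of the surrounding structure, assuming the documented librechat.yaml layout (the version value here is only illustrative):

version: 1.1.4
cache: true
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"
      baseURL: "http://host.docker.internal:11434/v1/chat/completions"
      # ...remaining fields as shown above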
and my docker-compose override:
services:
  api:
    volumes:
      - type: bind
        source: ./librechat.yaml
        target: /app/librechat.yaml
  ollama:
    image: ollama/ollama:latest
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              capabilities: [compute, utility]
    ports:
      - "11434:11434"
    volumes:
      - ./ollama:/root/.ollama
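For reference, since ollama and api run in the same compose project, the api container should also be able to reach Ollama by service name, and on a Linux host host.docker.internal only resolves inside the container if it is mapped explicitly. Rough sketches of both, assuming the services share the default compose network:

# alternative baseURL in librechat.yaml, using the compose service name
baseURL: "http://ollama:11434/v1/chat/completions"

# or, to keep host.docker.internal working on a Linux host, add to the api service
api:
  extra_hosts:
    - "host.docker.internal:host-gateway"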
Ollama also seems to be up and running correctly.
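A quick way to double-check that, both from the host and from inside the api container (assuming curl is available in that image):

# from the host
curl http://localhost:11434/api/tags

# from inside the LibreChat api container
docker compose exec api curl http://host.docker.internal:11434/api/tags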
Any help would be lovely!