App Submission: Open WebUI #1977
base: master
Conversation
Thanks for yet another submission @al-lac!
I think what you've done here makes sense. Users get easy access to an ollama instance + a web UI that can manage model downloads and use them. I've just taken a quick look at the compose file and I think we could stick the web-ui behind the app proxy (with or without umbrelOS authentication) and then even remove the host-bind port for Ollama so that Docker uses its internal DNS to connect containers. Let me give it a quick test and let you know my findings so we can decide how to move forward. Also, we've gone ahead and created the gallery assets in the meantime: |
@al-lac I've committed some atomic changes that you can follow. What do you think of this set up for Open Web UI? Feel free to revert anything that you feel doesn't make sense. I've explained some of my changes below.
```yaml
# We could uncomment this if we want to expose the ollama port to the host
# for other external apps/programs to use.
# ports:
#   - 11434:11434
```
The `open-webui` container connects to the `ollama` container using Docker's internal DNS via the `OLLAMA_BASE_URL`, so we don't need to bind this port to the host if Open WebUI is the only application connecting to Ollama.
However, if users want to connect to their Ollama instance via some other external application, then we'd need to have this exposed on the host. But I think that for that use case it probably makes sense to have ollama running as its own standalone service, like you mention:
> @nmfretz It could make sense to split ollama into a separate app.
What are your thoughts on removing this port binding for now? That way this app is a pre-packaged Web UI + Ollama instance that is easy for the average user to use. In the future we can consider either exposing ollama's port on the host or including a standalone ollama instance as another app.
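With the host binding removed, the wiring could look roughly like this (a sketch only; the image tags and exact service layout are assumptions, not the committed compose file):

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main  # assumed tag
    environment:
      # Docker's internal DNS resolves the service name "ollama",
      # so no host port binding is needed on the ollama side
      OLLAMA_BASE_URL: "http://ollama:11434"
  ollama:
    image: ollama/ollama:latest  # assumed tag
    # note: no "ports:" entry — only containers on this compose
    # network can reach port 11434
```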
Makes sense!
I am actually leaning towards moving ollama into its own app, or at least keeping the port binding. This would enable users to connect from their workstations or use it with other services such as Lobe Chat, Home Assistant, Mattermost, and more.
We could also keep ollama up to date more easily if it were a separate app.
Sadly, the ollama web page only indicates that it is running, without any guide on how to connect. So the downside would for sure be that it is not as easy to use as it is now.
But I am fine with whatever you think is for the best here!
Ya, I hear you. It would be nice to have Ollama as a standalone service that other services could connect to.
Then Open WebUI could just be shipped as the `open-webui` container, and if users optionally want Ollama functionality they would download the separate Ollama app and connect through Open WebUI's UI. Another option would be to have ollama be `required` in Open WebUI's manifest, so that it must first be installed by the user; then, when they install Open WebUI, it will be pre-configured with Ollama... but it might actually be nice for Ollama to be entirely optional for Open WebUI. For example, a user may only want to use Open WebUI with an OpenAI API.
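If we did go the required-dependency route, the manifest change might look something like this (a hedged sketch; it assumes `umbrel-app.yml`'s `dependencies` field works here the way it does for other Umbrel apps, and the app id is assumed):

```yaml
# open-webui/umbrel-app.yml (sketch, not the actual manifest)
id: open-webui
name: Open WebUI
dependencies:
  - ollama  # app id assumed
```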
I see what you mean though about the less-than-ideal status page for Ollama:
Ideally we'd be able to ship Ollama without having to include a custom frontend like we do for things like Back That Mac Up, Tor Snowflake Proxy, etc., since that just adds overhead when updating the app (e.g., needing to check whether the frontend needs updating).
Let me run this by the team and see what we think about having Ollama run separately with just its simple status html. Potentially we could allow this and then include some general instructions in the app description on how to connect to other apps.
Proof of concept for ollama as a standalone app: #1977 (comment)
@al-lac, this branch is a proof-of-concept for having Ollama as a standalone app: https://github.com/getumbrel/umbrel-apps/tree/standalone-ollama. In that branch I've removed ollama from Open WebUI's compose file (https://github.com/getumbrel/umbrel-apps/blob/standalone-ollama/open-webui/docker-compose.yml) and added ollama as a standalone app (https://github.com/getumbrel/umbrel-apps/blob/standalone-ollama/ollama/docker-compose.yml). Feel free to give it a test and see what you think. Here's a screen recording:
standalone-ollama.mp4
Hey @nmfretz, this is great! I really prefer it this way. I tested pretty much the same things as you did, and also tried it in Mattermost (Copilot) and Lobe Chat. For Lobe Chat to be able to connect, this environment variable needs to be set on ollama:

```yaml
environment:
  OLLAMA_ORIGINS: "*"
```
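In compose terms, that would sit on the standalone ollama service, roughly like this (a sketch; the image tag and port binding are assumptions rather than the exact branch contents):

```yaml
services:
  ollama:
    image: ollama/ollama:latest  # assumed tag
    environment:
      # allow cross-origin requests from browser-based clients
      # such as Lobe Chat
      OLLAMA_ORIGINS: "*"
    ports:
      - "11434:11434"  # exposed so external apps can reach the API
```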
App Submission
Open WebUI
@nmfretz It could make sense to split ollama into a separate app. But it has no UI other than showing that it is running when you access the port.
Both parts do not work properly without the root user.
256x256 SVG icon
https://filetransfer.io/data-package/N7NbYURd#link
Gallery images
I have tested my app on: