App Submission: Open WebUI #1977
GitHub Actions / Lint apps — notice on line 1 in open-webui/docker-compose.yml: Potentially using unsafe user in service "ollama"
GitHub Actions / Lint apps — notice on line 1 in open-webui/docker-compose.yml: Potentially using unsafe user in service "web"
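This lint notice is typically raised when a service does not declare a non-root user. As a sketch only (whether these particular images support running unprivileged depends on the images themselves, and the `user` values here are assumptions, not taken from the PR), the usual remedy looks like:

```yaml
# Hypothetical fix sketch for an "unsafe user" lint notice:
# run each service as a non-root UID/GID via the compose "user" key.
services:
  ollama:
    image: ollama/ollama:latest
    user: "1000:1000"   # assumed non-root UID:GID; adjust to the host's app user
  web:
    image: ghcr.io/open-webui/open-webui:main
    user: "1000:1000"
```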
The open-webui container connects to the ollama container using Docker's internal DNS via the OLLAMA_BASE_URL, so we don't need to bind this port to the host if Open WebUI is the only application connecting to Ollama. However, if users want to connect to their Ollama instance from some other external application, then we'd need to have it exposed on the host. But I think that for that use case it probably makes sense to have Ollama running as its own standalone service, like you mention:
What are your thoughts on removing this port binding for now? That way this app is a pre-packaged Web UI + Ollama instance that is easy for the average user to use. In the future we can consider either exposing Ollama's port on the host or including a standalone Ollama instance as another app.
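For context, the setup being described would look roughly like the sketch below (image tags and the exact compose layout are assumptions, not the actual file in this PR): the web service reaches Ollama by service name over Docker's internal DNS, so the ollama service needs no host port binding at all.

```yaml
# Hypothetical docker-compose.yml sketch of the no-host-binding setup
services:
  ollama:
    image: ollama/ollama:latest
    # no "ports:" entry -> Ollama is reachable only by other services
    # on this compose network, not from the host or LAN
  web:
    image: ghcr.io/open-webui/open-webui:main
    environment:
      # "ollama" resolves via Docker's internal DNS;
      # 11434 is Ollama's default API port
      OLLAMA_BASE_URL: "http://ollama:11434"
    depends_on:
      - ollama
```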
Makes sense!
I am actually leaning towards splitting Ollama into its own app, or at least keeping the port binding. This would enable users to connect from their workstations or use it with other services such as Lobe Chat, Home Assistant, Mattermost, and more.
We could also update Ollama more easily if it were a separate app.
Sadly, the Ollama web page is just an indication that it is running, without any guide on how to connect. So the downside would certainly be that it is not as easy to use as it is now.
But I am fine with whatever you think is for the best here!
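The alternative being proposed here, keeping the binding so external clients can connect, would be a one-line difference in the compose file. A sketch (the image tag is an assumption; 11434 is Ollama's default API port):

```yaml
# Hypothetical sketch: keep a host port binding so external applications
# (workstations, Lobe Chat, Home Assistant, Mattermost, etc.) can reach
# Ollama directly at http://<host>:11434
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"   # host:container
```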
Ya, I hear you. It would be nice to have Ollama as a standalone service that other services could connect to.
Then Open WebUI could just be shipped as the open-webui container, and if users optionally want Ollama functionality they would install the separate Ollama app and connect through Open WebUI's UI. Another option would be to have Ollama required in Open WebUI's manifest, so that it must first be installed by the user; then, when they install Open WebUI, it would be pre-configured with Ollama. But it might actually be nice for Ollama to be entirely optional for Open WebUI: for example, a user may only want to use Open WebUI with an OpenAI API.
I see what you mean, though, about the less-than-ideal status page for Ollama:
Ideally we'd be able to ship Ollama without having to include some custom frontend like we do for things like Back That Mac Up, Tor Snowflake Proxy, etc, since that just adds overhead when updating the app (e.g., needing to check if the frontend needs updating).
Let me run this by the team and see what we think about having Ollama run separately with just its simple status html. Potentially we could allow this and then include some general instructions in the app description on how to connect to other apps.
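If the required-dependency route were taken, the manifest change might look roughly like this sketch. Assumptions here: `dependencies` is the umbrel-app.yml field used to declare hard app dependencies, and `ollama` is a hypothetical app id for the standalone Ollama app, which does not exist yet.

```yaml
# Hypothetical umbrel-app.yml fragment for Open WebUI
id: open-webui
name: Open WebUI
dependencies:
  - ollama   # hypothetical standalone Ollama app id; must be installed first
```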
Proof of concept for ollama as a standalone app: #1977 (comment)
GitHub Actions / Lint apps — warning on line 25 in open-webui/umbrel-app.yml: "icon" and "gallery" needs to be empty for new app submissions
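As I understand this lint rule, new submissions are expected to leave those manifest fields blank (the maintainers add the gallery assets later), roughly:

```yaml
# umbrel-app.yml fragment satisfying the lint rule for new submissions
icon: ""
gallery: []
```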