
App Submission: Open WebUI #1977

Open · wants to merge 9 commits into base: master
Conversation

@al-lac (Contributor) commented Dec 21, 2024

App Submission

Open WebUI

@nmfretz It could make sense to split ollama into a separate app. But it has no UI other than showing that it is running when you access the port.

Neither part works properly without the root user.

256x256 SVG icon

https://filetransfer.io/data-package/N7NbYURd#link

Gallery images

open-webui-1
open-webui-2
open-webui-3
open-webui-4

I have tested my app on:

  • umbrelOS on a Raspberry Pi
  • umbrelOS on an Umbrel Home
  • umbrelOS on a Linux VM

@nmfretz (Contributor) commented Jan 13, 2025

Thanks for yet another submission @al-lac!

> @nmfretz It could make sense to split ollama into a separate app. But it has no UI other than showing that it is running when you access the port.

I think what you've done here makes sense. Users get easy access to an ollama instance + a web UI that can manage model downloads and use them.

I've just taken a quick look at the compose file, and I think we could put the web UI behind the app proxy (with or without umbrelOS authentication) and then even remove the host port binding for Ollama, so that containers connect via Docker's internal DNS. Let me give it a quick test and let you know my findings so we can decide how to move forward.

Also, we've gone ahead and created the gallery assets in the meantime:
https://github.com/getumbrel/umbrel-apps-gallery/tree/master/open-webui

@nmfretz (Contributor) left a comment

@al-lac I've committed some atomic changes that you can follow. What do you think of this setup for Open WebUI? Feel free to revert anything that you feel doesn't make sense. I've explained some of my changes below.

open-webui/docker-compose.yml (resolved)
Comment on lines +14 to +16:

```yaml
# We could uncomment this if we want to expose the ollama port to the host for other external apps/programs to use.
# ports:
#   - 11434:11434
```

The open-webui container connects to the ollama container using Docker's internal DNS via the OLLAMA_BASE_URL so we don't need to bind this port to the host if Open WebUI is the only application connecting to Ollama.

However, if users want to connect to their Ollama instance from some other external application, then we'd need to expose this port on the host. But I think that for that use case it probably makes sense to have ollama running as its own standalone service, like you mention:

> @nmfretz It could make sense to split ollama into a separate app.

What are your thoughts on removing this port binding for now? That way this app is a pre-packaged web UI + Ollama instance that is easy for the average user to use. In the future we can consider either exposing Ollama's port on the host or shipping a standalone Ollama instance as a separate app.
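To make the internal-DNS option concrete, here is a minimal compose sketch, not the exact file in this PR: the image tags, service names, and volume path are illustrative assumptions, while `OLLAMA_BASE_URL` is the Open WebUI setting mentioned above.

```yaml
services:
  ollama:
    image: ollama/ollama:latest                 # tag is an assumption
    # No "ports:" entry: only containers on this compose network can reach Ollama.
    volumes:
      - ${APP_DATA_DIR}/data/ollama:/root/.ollama

  web:
    image: ghcr.io/open-webui/open-webui:main   # tag is an assumption
    environment:
      # Docker's embedded DNS resolves the service name "ollama" inside the network,
      # so no host port binding is needed when Open WebUI is the only client.
      OLLAMA_BASE_URL: "http://ollama:11434"
```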

@al-lac (Contributor, author) replied Jan 13, 2025

Makes sense!

I am actually leaning towards giving ollama its own app, or at least keeping the port binding. That would let users connect from their workstations or use it with other services such as Lobe Chat, Home Assistant, Mattermost, and more.

Also, we could update ollama more easily if it were a separate app.

Sadly, the ollama web page is just an indication that it is running, without any guide on how to connect. So the downside would for sure be that it is not as easy to use as it is now.

But I am fine with whatever you think is for the best here!

@nmfretz (Contributor) replied Jan 13, 2025

Ya, I hear you. It would be nice to have Ollama as a standalone service that other services could connect to.

Then Open WebUI could be shipped as just the open-webui container, and users who optionally want Ollama functionality would install the separate Ollama app and connect to it through Open WebUI's UI. Another option would be to list ollama as a required dependency in Open WebUI's manifest, so that it must be installed first and Open WebUI comes pre-configured with it... but it might actually be nice for Ollama to be entirely optional for Open WebUI. For example, a user may only want to use Open WebUI with an OpenAI API.
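As a sketch of the "required dependency" option, a hypothetical umbrel-app.yml fragment might look like the following; the `dependencies` field name follows the umbrelOS app-manifest convention, but the exact values here are assumptions, not this PR's actual manifest.

```yaml
# open-webui/umbrel-app.yml (fragment, hypothetical)
manifestVersion: 1
id: open-webui
name: Open WebUI
dependencies:
  - ollama  # umbrelOS would require the Ollama app to be installed first
```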

I see what you mean though about the less-than-ideal status page for Ollama:
[screenshot of Ollama's status page]

Ideally we'd be able to ship Ollama without having to include a custom frontend, as we do for apps like Back That Mac Up, Tor Snowflake Proxy, etc., since that just adds overhead when updating the app (e.g., needing to check whether the frontend needs updating).

Let me run this by the team and see what we think about having Ollama run separately with just its simple status html. Potentially we could allow this and then include some general instructions in the app description on how to connect to other apps.

Proof of concept for ollama as a standalone app: #1977 (comment)

open-webui/docker-compose.yml (resolved)
open-webui/umbrel-app.yml (resolved)
@nmfretz (Contributor) commented Jan 13, 2025

[image attachment]

@nmfretz (Contributor) commented Jan 14, 2025

@al-lac, this branch is a proof-of-concept for having Ollama as a standalone app: https://github.com/getumbrel/umbrel-apps/tree/standalone-ollama.

In that branch I've removed ollama from Open WebUI's compose file (https://github.com/getumbrel/umbrel-apps/blob/standalone-ollama/open-webui/docker-compose.yml) and added ollama as a standalone app (https://github.com/getumbrel/umbrel-apps/blob/standalone-ollama/ollama/docker-compose.yml).

Feel free to give it a test and see what you think. Here's a screen recording of:

  • connecting to ollama from Open WebUI using the docker container name ollama_ollama_1 (resolved by Docker's DNS)
  • connecting to ollama from Open WebUI using umbrel's Docker gateway IP 10.21.0.1
  • connecting to ollama from another computer on the same local network using umbrel.local
[screen recording: standalone-ollama.mp4]
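For reference, the three connection paths demonstrated above correspond to three base URLs that can be entered in Open WebUI's Ollama connection settings (11434 is Ollama's default API port):

```
http://ollama_ollama_1:11434   # Docker internal DNS, same umbrelOS host
http://10.21.0.1:11434         # umbrel's Docker gateway IP
http://umbrel.local:11434      # another machine on the same local network
```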

@al-lac (Contributor, author) commented Jan 14, 2025

Hey @nmfretz, this is great! I really prefer it this way.

I tested pretty much the same things you did, and also tried it with Mattermost (Copilot) and Lobe Chat.

For Lobe Chat to be able to connect, this environment variable needs to be set on ollama:

```yaml
environment:
  OLLAMA_ORIGINS: "*"
```
