App Submission: Open WebUI #1977

Open · al-lac wants to merge 9 commits into master
Empty file added open-webui/data/ollama/.gitkeep
27 changes: 27 additions & 0 deletions open-webui/docker-compose.yml
@@ -0,0 +1,27 @@
version: '3.7'

Check notice on line 1 in open-webui/docker-compose.yml (GitHub Actions / Lint apps): Potentially using unsafe user in service "ollama"

The default container user "root" can lead to security vulnerabilities. If you are using the root user, please try to specify a different user (e.g. "1000:1000") in the compose file or try to set the UID/PUID and GID/PGID environment variables to 1000.

Check notice on line 1 in open-webui/docker-compose.yml (GitHub Actions / Lint apps): Potentially using unsafe user in service "web"

The default container user "root" can lead to security vulnerabilities. If you are using the root user, please try to specify a different user (e.g. "1000:1000") in the compose file or try to set the UID/PUID and GID/PGID environment variables to 1000.
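As a minimal sketch of the two approaches the linter suggests (whether the ollama and open-webui images actually honor PUID/PGID at startup is an assumption, not something verified in this PR):

```yaml
services:
  ollama:
    # Option 1: run the container process as a non-root uid:gid
    user: "1000:1000"
    # Option 2: for images that read these variables at startup (assumption)
    environment:
      PUID: "1000"
      PGID: "1000"
```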

services:
  app_proxy:
    environment:
      APP_HOST: open-webui_web_1
      APP_PORT: 8080
      PROXY_AUTH_ADD: "false"

  ollama:
    image: ollama/ollama:0.5.4@sha256:18bfb1d605604fd53dcad20d0556df4c781e560ebebcd923454d627c994a0e37
    volumes:
      - ${APP_DATA_DIR}/data/ollama:/root/.ollama
    # We could uncomment this if we want to expose the ollama port to the host for other external apps/programs to use.
    # ports:
    #   - 11434:11434
Comment on lines +14 to +16
Contributor:
The open-webui container connects to the ollama container using Docker's internal DNS (via the OLLAMA_BASE_URL environment variable), so we don't need to bind this port to the host if Open WebUI is the only application connecting to Ollama.

However, if users want to connect to their Ollama instance via some other external application, then we'd need to have this exposed on the host. But I think that for that use case it probably makes sense to have ollama running as its own standalone service, like you mention:

> @nmfretz It could make sense to split ollama into a separate app.

What are your thoughts on removing this port binding for now? That way this app is a pre-packaged Web UI + Ollama instance that is easy for the average user to use. In the future we can consider either exposing ollama's port on the host or including a standalone ollama instance as another app.
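(For reference, the internal wiring described here is just the web service's environment entry from this PR's compose file; Docker's embedded DNS resolves the ollama container's name inside the app's network:)

```yaml
web:
  environment:
    # Resolved by Docker's internal DNS; no host port binding is required
    # for traffic that stays inside the app's network.
    OLLAMA_BASE_URL: "http://open-webui_ollama_1:11434"
```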

al-lac (Contributor, Author) · Jan 13, 2025:

Makes sense!

I am actually leaning towards splitting ollama into its own app, or at least keeping the port binding. This would enable users to connect from their workstations, or to use it with other services such as Lobe Chat, Home Assistant, Mattermost, and more.

Also, we could update ollama more easily if it were a separate app.

Sadly, the ollama web page only indicates that it is running, without any guide on how to connect. So the downside would certainly be that it is not as easy to use as it is now.

But I am fine with whatever you think is best here!
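(Keeping the binding would just mean uncommenting the ports block from the diff above — a minimal sketch, assuming external clients can reach the device over the LAN:)

```yaml
ollama:
  image: ollama/ollama:0.5.4@sha256:18bfb1d605604fd53dcad20d0556df4c781e560ebebcd923454d627c994a0e37
  ports:
    - 11434:11434   # Ollama's default API port, now reachable by other apps/devices
```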

nmfretz (Contributor) · Jan 13, 2025:

Ya, I hear you. It would be nice to have Ollama as a standalone service that other services could connect to.

Then Open WebUI could just be shipped as the open-webui container, and if users optionally want Ollama functionality they would download the separate Ollama app and connect through Open WebUI's UI. Another option would be to have ollama be required in Open WebUI's manifest, so that it must first be installed by the user; then, when they install Open WebUI, it will come pre-configured with Ollama... but it might actually be nice for Ollama to be entirely optional for Open WebUI. For example, a user may only want to use Open WebUI with an OpenAI API.

I see what you mean, though, about the less-than-ideal status page for Ollama:
[screenshot: Ollama's plain status page]

Ideally we'd be able to ship Ollama without having to include some custom frontend like we do for things like Back That Mac Up, Tor Snowflake Proxy, etc., since that just adds overhead when updating the app (e.g., needing to check if the frontend needs updating).

Let me run this by the team and see what we think about having Ollama run separately with just its simple status html. Potentially we could allow this and then include some general instructions in the app description on how to connect to other apps.
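(To make the "required in Open WebUI's manifest" option concrete: umbrel-app.yml manifests carry a dependencies list — this PR's manifest below declares `dependencies: []` — so a hard dependency might hypothetically look like the following, assuming an app with id "ollama" existed in the store:)

```yaml
# Hypothetical fragment of open-webui/umbrel-app.yml
dependencies:
  - ollama
```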

Contributor:
Proof of concept for ollama as a standalone app: #1977 (comment)
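(A rough sketch of what such a standalone app's compose file might look like — hypothetical, reusing this PR's ollama service with the port exposed; the actual proof of concept is in the linked comment:)

```yaml
# Hypothetical compose file for a standalone "ollama" app (not part of this PR)
services:
  server:
    image: ollama/ollama:0.5.4@sha256:18bfb1d605604fd53dcad20d0556df4c781e560ebebcd923454d627c994a0e37
    volumes:
      - ${APP_DATA_DIR}/data/ollama:/root/.ollama
    ports:
      - 11434:11434   # exposed so other apps and devices can connect
    restart: on-failure
```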

    restart: on-failure

  web:
    image: ghcr.io/open-webui/open-webui:v0.5.4@sha256:42e8fa544facc38d731e3d516fbf478abe435bb4b80798e0934930afea6c5bab
    volumes:
      - ${APP_DATA_DIR}/data/open-webui:/app/backend/data
    depends_on:
      - ollama
    environment:
      OLLAMA_BASE_URL: "http://open-webui_ollama_1:11434"
    restart: on-failure
33 changes: 33 additions & 0 deletions open-webui/umbrel-app.yml
@@ -0,0 +1,33 @@
manifestVersion: 1
id: open-webui
name: Open WebUI
tagline: User-friendly AI Interface
category: ai
version: "0.5.4"
port: 2876
description: >-
  🌐 Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline.


  🤖 It supports various LLM runners, including Ollama and OpenAI-compatible APIs.


  🦙 This app comes packaged with [Ollama](https://ollama.com/) pre-configured to work out of the box.


  To start downloading models via Ollama, simply type a model name into the model search bar and click "Pull from Ollama.com".
  A list of models can be found at https://ollama.com/search.
developer: Ollama
website: https://openwebui.com/
submitter: al-lac
submission: https://github.com/getumbrel/umbrel-apps/pull/1977
repo: https://github.com/open-webui/open-webui
support: https://github.com/open-webui/open-webui/issues
gallery:

Check warning on line 25 in open-webui/umbrel-app.yml (GitHub Actions / Lint apps): "icon" and "gallery" needs to be empty for new app submissions

The "icon" and "gallery" fields must be empty for new app submissions as it is being created by the Umbrel team.

  - 1.jpg
  - 2.jpg
  - 3.jpg
defaultUsername: ""
defaultPassword: ""
dependencies: []
releaseNotes: ""
path: ""