Merge branch 'open-webui:main' into rocm-compose

Author: Justin Hayes
Date: 2024-04-16 16:00:17 -04:00 (committed by GitHub)
6 changed files with 329 additions and 8 deletions

@@ -0,0 +1,33 @@
# Environment Variable Configuration
## App/Backend
Here is a list of environment variables supported by `backend/config.py` that configure Open WebUI at startup. See also the [logging environment variables](/getting-started/logging#appbackend).
| Environment Variable               | Description                                                                   |
| --------------------------------- | --------------------------------------------------------------------------- |
| `CUSTOM_NAME` | Sets `WEBUI_NAME` but polls _api.openwebui.com_ for metadata |
| `DEFAULT_MODELS` | Set a default Language Model, default: `None` |
| `ENABLE_SIGNUP` | Toggle user account creation, default: `"True"` |
| `ENV` | Environment setting, default: `"dev"` |
| `K8S_FLAG` | Support Kubernetes style Ollama hostname `.svc.cluster.local` |
| `MODEL_FILTER_ENABLED` | Toggle Language Model filtering, default: `"False"` |
| `MODEL_FILTER_LIST` | Set Language Model filter list |
| `OLLAMA_API_BASE_URL` | Deprecated, see `OLLAMA_BASE_URL` |
| `OLLAMA_BASE_URL` | Configure Ollama backend URL, default: `"http://localhost:11434"` |
| `OLLAMA_BASE_URLS` | Configure load balanced Ollama backend hosts, see `OLLAMA_BASE_URL` |
| `OPENAI_API_KEY` | Set OpenAI API key |
| `OPENAI_API_KEYS`                  | Support multiple OpenAI API keys                                              |
| `OPENAI_API_BASE_URL` | Configure OpenAI base API URL |
| `OPENAI_API_BASE_URLS`             | Support load-balanced OpenAI base API URLs                                    |
| `RAG_EMBEDDING_MODEL` | Configure a Sentence-Transformer model, default: `"all-MiniLM-L6-v2"` |
| `RAG_EMBEDDING_MODEL_AUTO_UPDATE` | Toggle automatic update of the Sentence-Transformer model, default: `False` |
| `USE_CUDA_DOCKER`                  | Build Docker image with NVIDIA CUDA support, default: `False`                 |
| `USE_OLLAMA_DOCKER` | Build Docker image with bundled Ollama instance, default: `"false"` |
| `USER_PERMISSIONS_CHAT_DELETION` | Toggle user permission to delete chats, default: `"True"` |
| `WEBHOOK_URL` | Set webhook for integration with Slack/Microsoft Teams |
| `WEBUI_AUTH_TRUSTED_EMAIL_HEADER` | Define trusted request header for authentication |
| `WEBUI_NAME` | Main WebUI name, default: `"Open WebUI"` |
| `WEBUI_SECRET_KEY` | Override randomly generated string used for JSON Web Token |
| `WEBUI_VERSION` | Override WebUI version, default: `"v1.0.0-alpha.100"` |
| `WHISPER_MODEL_AUTO_UPDATE` | Toggle automatic update of the Whisper model, default: `False` |
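
For example, several of these variables can be passed with `-e` flags on `docker run`. A minimal sketch; the model name and WebUI name below are placeholders for illustration, not recommendations:

```bash
# Illustrative startup configuration: placeholder values, adjust to taste
docker run -d -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -e DEFAULT_MODELS=llama2 \
  -e ENABLE_SIGNUP=False \
  -e WEBUI_NAME="My WebUI" \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```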

@@ -198,7 +198,13 @@ For more details on networking in Docker and addressing common connectivity issu
<details>
<summary>Rootless (Podman) local-only Open WebUI with Systemd service and auto-update</summary>
- **Important:** Consult the Docker documentation because much of the configuration and syntax is interchangeable with [Podman](https://github.com/containers/podman). See also [rootless_tutorial](https://github.com/containers/podman/blob/main/docs/tutorials/rootless_tutorial.md). This example requires the [slirp4netns](https://github.com/rootless-containers/slirp4netns) network backend to facilitate server listen and Ollama communication over localhost only.
:::note
Consult the Docker documentation because much of the configuration and syntax is interchangeable with [Podman](https://github.com/containers/podman). See also [rootless_tutorial](https://github.com/containers/podman/blob/main/docs/tutorials/rootless_tutorial.md). This example requires the [slirp4netns](https://github.com/rootless-containers/slirp4netns) network backend to facilitate server listen and Ollama communication over localhost only.
:::
:::warning
Rootless container execution with Podman (and Docker/containerd) does **not** support [AppArmor confinement](https://github.com/containers/podman/pull/19303). This may increase the attack surface due to the [requirement of user namespaces](https://rootlesscontaine.rs/caveats). Exercise caution and weigh this trade-off (in contrast to the root daemon) against your threat model.
:::
1. Pull the latest image:
```bash
@@ -206,12 +212,22 @@ For more details on networking in Docker and addressing common connectivity issu
podman pull ghcr.io/open-webui/open-webui:main
```
2. Create a new container using the desired configuration:
**Note:** `-p 127.0.0.1:3000:8080` ensures that we listen only on localhost, `--network slirp4netns:allow_host_loopback=true` permits the container to access Ollama when it also listens strictly on localhost. `--add-host=ollama.local:10.0.2.2 --env 'OLLAMA_BASE_URL=http://ollama.local:11434'` adds a hosts record to the container and configures open-webui to use the friendly hostname. `10.0.2.2` is the default slirp4netns address used for localhost mapping. `--env 'ANONYMIZED_TELEMETRY=False'` isn't necessary since Chroma telemetry has been disabled in the code but is included as an example.
:::note
`-p 127.0.0.1:3000:8080` ensures that we listen only on localhost, `--network slirp4netns:allow_host_loopback=true` permits the container to access Ollama when it also listens strictly on localhost. `--add-host=ollama.local:10.0.2.2 --env 'OLLAMA_BASE_URL=http://ollama.local:11434'` adds a hosts record to the container and configures open-webui to use the friendly hostname. `10.0.2.2` is the default slirp4netns address used for localhost mapping. `--env 'ANONYMIZED_TELEMETRY=False'` isn't necessary since Chroma telemetry has been disabled in the code but is included as an example.
:::
```bash
podman create -p 127.0.0.1:3000:8080 --network slirp4netns:allow_host_loopback=true --add-host=ollama.local:10.0.2.2 --env 'OLLAMA_BASE_URL=http://ollama.local:11434' --env 'ANONYMIZED_TELEMETRY=False' -v open-webui:/app/backend/data --label io.containers.autoupdate=registry --name open-webui ghcr.io/open-webui/open-webui:main
```
:::note
[Podman 5.0](https://www.redhat.com/en/blog/podman-50-unveiled) has updated the default rootless network backend to use the more performant [pasta](https://passt.top/passt/about/). While `slirp4netns:allow_host_loopback=true` still achieves the same local-only intention, it's now recommended to use a simple TCP forward instead, such as: `--network=pasta:-T,11434 --add-host=ollama.local:127.0.0.1`. Full example:
:::
```bash
podman create -p 127.0.0.1:3000:8080 --network=pasta:-T,11434 --add-host=ollama.local:127.0.0.1 --env 'OLLAMA_BASE_URL=http://ollama.local:11434' --env 'ANONYMIZED_TELEMETRY=False' -v open-webui:/app/backend/data --label io.containers.autoupdate=registry --name open-webui ghcr.io/open-webui/open-webui:main
```
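Before moving on, it can help to sanity-check the container outside of systemd; a brief aside (this start/logs/stop cycle is illustrative, not part of the original steps):
```bash
podman start open-webui    # start the container created above
podman logs -f open-webui  # confirm the server comes up; Ctrl+C to stop following
podman stop open-webui     # stop it again so systemd can take over management
```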
3. Prepare for systemd user service:
```bash
mkdir -p ~/.config/systemd/user/
@@ -241,6 +257,22 @@ For more details on networking in Docker and addressing common connectivity issu
podman auto-update --dry-run
```
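To apply updates unattended rather than via a manual dry run, Podman ships a user-level systemd timer that periodically runs `podman auto-update`; a minimal sketch, assuming your distribution packages the unit:
```bash
systemctl --user enable --now podman-auto-update.timer
```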
:::tip
This process is compatible with Windows 11 WSL deployments when using Ollama within the WSL environment or using the Ollama Windows Preview. When using the native Ollama Windows Preview version, one additional step is required: enable [mirrored networking mode](https://learn.microsoft.com/en-us/windows/wsl/networking#mirrored-mode-networking).
:::
### Enabling Windows 11 mirrored networking
1. Populate `%UserProfile%\.wslconfig` with:
```
[wsl2]
networkingMode=mirrored
```
2. Restart WSL:
```
wsl --shutdown
```
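3. Optionally, verify the mode once WSL restarts; a hedged check, since `wslinfo` is only available on recent WSL releases (it should print `mirrored`):
```
wslinfo --networking-mode
```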
</details>
### Alternative Installation Methods

@@ -18,6 +18,12 @@ If you're experiencing connection issues, its often due to the WebUI docker c
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
If you're experiencing SSL connection errors with huggingface.co, first check whether the Hugging Face server is down. If it is, you can set `HF_ENDPOINT` to `https://hf-mirror.com/` in the `docker run` command:
```bash
docker run -d -p 3000:8080 -e HF_ENDPOINT=https://hf-mirror.com/ --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
### General Connection Errors
**Ensure Ollama Version is Up-to-Date**: Always start by checking that you have the latest version of Ollama. Visit [Ollama's official site](https://ollama.com/) for the latest updates.
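
For example, a quick way to print the installed version and compare it against the latest release listed on the site:

```bash
ollama --version
```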