From 696c3550997188bfc1d17b7b04a8e5d58295f1f1 Mon Sep 17 00:00:00 2001
From: Justin Hayes
Date: Wed, 1 May 2024 12:24:55 -0400
Subject: [PATCH 1/3] Fix GPU instructions

---
 docs/getting-started/index.md | 29 ++---------------------------
 1 file changed, 2 insertions(+), 27 deletions(-)

diff --git a/docs/getting-started/index.md b/docs/getting-started/index.md
index 936a739..e6a752e 100644
--- a/docs/getting-started/index.md
+++ b/docs/getting-started/index.md
@@ -128,37 +128,12 @@ When using Docker to install Open WebUI, make sure to include the `-v open-webui
 
 #### Nvidia CUDA
 
-To run Ollama with Nvidia GPU support, utilize the Nvidia-docker tool for GPU access, and set the appropriate environment variables for CUDA support:
+To run Open WebUI with Nvidia GPU support:
 
 ```bash
-docker run -d -p 3000:8080 \
---gpus all \
---add-host=host.docker.internal:host-gateway \
---volume open-webui:/app/backend/data \
---name open-webui \
---restart always \
-ghcr.io/open-webui/open-webui:main
+docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
 ```
 
-#### AMD ROCm
-
-To run Ollama with AMD GPU support, set the `HSA_OVERRIDE_GFX_VERSION` environment variable and ensure the Docker container can access the GPU:
-
-```bash
-docker run -d -p 3000:8080 \
--e HSA_OVERRIDE_GFX_VERSION=11.0.0 \
---device /dev/kfd \
---device /dev/dri \
---group-add video \
---add-host=host.docker.internal:host-gateway \
---volume open-webui:/app/backend/data \
---name open-webui \
---restart always \
-ghcr.io/open-webui/open-webui:main
-```
-
-Replace `HSA_OVERRIDE_GFX_VERSION=11.0.0` with the version appropriate for your AMD GPU model as described in the earlier sections. This command ensures compatibility and optimal performance with AMD GPUs.
-
 #### Open WebUI: Server Connection Error
 
 Encountering connection issues between the Open WebUI Docker container and the Ollama server? This problem often arises because distro-packaged versions of Docker—like those from the Ubuntu repository—do not support the `host.docker.internal` alias for reaching the host directly. Inside a container, referring to `localhost` or `127.0.0.1` typically points back to the container itself, not the host machine.

From 02d28b43e13f81b36b8bf32f253b769f1a871827 Mon Sep 17 00:00:00 2001
From: Justin Hayes
Date: Wed, 1 May 2024 12:25:46 -0400
Subject: [PATCH 2/3] Fix enable image generation variable name

---
 docs/tutorial/images.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/tutorial/images.md b/docs/tutorial/images.md
index 877525a..889efec 100644
--- a/docs/tutorial/images.md
+++ b/docs/tutorial/images.md
@@ -20,7 +20,7 @@ Open WebUI supports image generation through the **AUTOMATIC1111** [API](https:/
    ```
 3. For Docker installation of WebUI with the environment variables preset, use the following command:
    ```
-   docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -e AUTOMATIC1111_BASE_URL=http://host.docker.internal:7860/ -e IMAGE_GENERATION_ENABLED=True -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
+   docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -e AUTOMATIC1111_BASE_URL=http://host.docker.internal:7860/ -e ENABLE_IMAGE_GENERATION=True -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
    ```
 
 ### Configuring Open WebUI
@@ -49,7 +49,7 @@ ComfyUI provides an alternative interface for managing and interacting with imag
    ```
 3. For Docker installation of WebUI with the environment variables preset, use the following command:
    ```
-   docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -e COMFYUI_BASE_URL=http://host.docker.internal:7860/ -e IMAGE_GENERATION_ENABLED=True -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
+   docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -e COMFYUI_BASE_URL=http://host.docker.internal:7860/ -e ENABLE_IMAGE_GENERATION=True -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
    ```
 
 ### Configuring Open WebUI

From fd2f9573a4e8385ba709e53e68c5a3451a58285d Mon Sep 17 00:00:00 2001
From: Justin Hayes
Date: Wed, 1 May 2024 12:34:04 -0400
Subject: [PATCH 3/3] Fix install instructions layout

---
 docs/getting-started/index.md | 24 ++++++++++--------------
 1 file changed, 10 insertions(+), 14 deletions(-)

diff --git a/docs/getting-started/index.md b/docs/getting-started/index.md
index e6a752e..2ddba94 100644
--- a/docs/getting-started/index.md
+++ b/docs/getting-started/index.md
@@ -108,7 +108,13 @@ When using Docker to install Open WebUI, make sure to include the `-v open-webui
   docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
   ```
 
-### Installing Both Open WebUI and Ollama Together
+- **To run Open WebUI with Nvidia GPU support**, use this command:
+
+  ```bash
+  docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
+  ```
+
+### Installing Ollama
 
 - **With GPU Support**, Use this command:
 
@@ -122,19 +128,9 @@ When using Docker to install Open WebUI, make sure to include the `-v open-webui
   docker run -d -p 3000:8080 -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
   ```
 
-- After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
+After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
 
-### GPU Support
-
-#### Nvidia CUDA
-
-To run Open WebUI with Nvidia GPU support:
-
-```bash
-docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
-```
-
-#### Open WebUI: Server Connection Error
+### Open WebUI: Server Connection Error
 
 Encountering connection issues between the Open WebUI Docker container and the Ollama server? This problem often arises because distro-packaged versions of Docker—like those from the Ubuntu repository—do not support the `host.docker.internal` alias for reaching the host directly. Inside a container, referring to `localhost` or `127.0.0.1` typically points back to the container itself, not the host machine.
 
@@ -158,7 +154,7 @@ For more details on networking in Docker and addressing common connectivity issu
   docker compose up -d --build
   ```
 
-- **For GPU Support:** Use an additional Docker Compose file:
+- **For Nvidia GPU Support:** Use an additional Docker Compose file:
 
   ```bash
   docker compose -f docker-compose.yaml -f docker-compose.gpu.yaml up -d --build
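
---

Not part of the patches themselves, but after applying the series with `git am`, the fixes can be sanity-checked from the shell. This is a hedged sketch, not a prescribed workflow: it assumes the container is named `open-webui` as in the commands above, and it only inspects the one file PATCH 2/3 touches.

```shell
# After PATCH 2/3, the old variable spelling should no longer appear in the file it edits;
# git grep exits non-zero when nothing matches, so the echo fires on success.
git grep -n "IMAGE_GENERATION_ENABLED" docs/tutorial/images.md || echo "no stale occurrences"

# Confirm a running container actually received the renamed variable
# (assumes the container name used in the docs' docker run commands).
docker exec open-webui env | grep ENABLE_IMAGE_GENERATION

# Confirm the :cuda image can see the GPU; requires the container to have
# been started with --gpus all as shown in the patched instructions.
docker exec open-webui nvidia-smi
```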