From 6744dc2d9747cdbbd9e5bef5bb2641fa7a9088ec Mon Sep 17 00:00:00 2001
From: "Timothy J. Baek"
Date: Wed, 21 Feb 2024 01:00:35 -0800
Subject: [PATCH] doc: troubleshooting

---
 docs/getting-started/installation.md    |  6 ++---
 docs/getting-started/troubleshooting.md | 32 +++++++++++++++++++++++++
 docs/intro.md                           |  4 ++++
 3 files changed, 39 insertions(+), 3 deletions(-)
 create mode 100644 docs/getting-started/troubleshooting.md

diff --git a/docs/getting-started/installation.md b/docs/getting-started/installation.md
index 8986193..2a3389e 100644
--- a/docs/getting-started/installation.md
+++ b/docs/getting-started/installation.md
@@ -1,4 +1,4 @@
-# Installation
+# Alternative Installation
 
 ### Installing Both Ollama and Open WebUI Using Kustomize
 
@@ -25,13 +25,13 @@ helm package ./kubernetes/helm/
 For cpu-only pod
 
 ```bash
-helm install ollama-webui ./ollama-webui-*.tgz
+helm install open-webui ./open-webui-*.tgz
 ```
 
 For gpu-enabled pod
 
 ```bash
-helm install ollama-webui ./ollama-webui-*.tgz --set ollama.resources.limits.nvidia.com/gpu="1"
+helm install open-webui ./open-webui-*.tgz --set ollama.resources.limits.nvidia.com/gpu="1"
 ```
 
 Check the `kubernetes/helm/values.yaml` file to know which parameters are available for customization
diff --git a/docs/getting-started/troubleshooting.md b/docs/getting-started/troubleshooting.md
new file mode 100644
index 0000000..b1e2c4a
--- /dev/null
+++ b/docs/getting-started/troubleshooting.md
@@ -0,0 +1,32 @@
+# Troubleshooting
+
+## Understanding the Open WebUI Architecture
+
+The Open WebUI system is designed to streamline interactions between the client (your browser) and the Ollama API. At the heart of this design is a backend reverse proxy, which enhances security and resolves CORS issues.
+
+- **How it Works**: Open WebUI interacts with the Ollama API through a specific route. When a request is made from the WebUI to Ollama, it is not sent directly to the Ollama API. Initially, the request is sent to the Open WebUI backend via the `/ollama/api` route. From there, the backend forwards the request to the Ollama API, using the URL specified in the `OLLAMA_API_BASE_URL` environment variable. Therefore, a request made to `/ollama/api` in the WebUI is effectively the same as a request to `OLLAMA_API_BASE_URL` in the backend. For instance, a request to `/ollama/api/tags` in the WebUI is equivalent to `OLLAMA_API_BASE_URL/tags` in the backend.
+
+- **Security Benefits**: This design prevents direct exposure of the Ollama API to the frontend, safeguarding against potential CORS (Cross-Origin Resource Sharing) issues and unauthorized access. Requiring authentication to access the Ollama API further enhances this security layer.
+
+## Open WebUI: Server Connection Error
+
+If you're experiencing connection issues, it’s often because the WebUI Docker container cannot reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) from inside the container. Use the `--network=host` flag in your Docker command to resolve this. Note that the port changes from 3000 to 8080, so the WebUI becomes available at `http://localhost:8080`.
+
+**Example Docker Command**:
+
+```bash
+docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main
+```
+
+### General Connection Errors
+
+**Ensure Ollama Version is Up-to-Date**: Always start by checking that you have the latest version of Ollama. Visit [Ollama's official site](https://ollama.com/) for the latest updates.
+
+**Troubleshooting Steps**:
+
+1. **Verify Ollama URL Format**:
+   - When running the Web UI container, ensure the `OLLAMA_API_BASE_URL` is correctly set, including the `/api` suffix (e.g., `http://192.168.1.1:11434/api` when Ollama runs on a different host).
+   - In the Open WebUI, navigate to "Settings" > "General".
+   - Confirm that the Ollama Server URL is correctly set to `[OLLAMA URL]/api` (e.g., `http://localhost:11434/api`), including the `/api` suffix.
+
+Following these troubleshooting steps should resolve most connection issues. For further assistance or queries, feel free to reach out to us on our community Discord.
diff --git a/docs/intro.md b/docs/intro.md
index fbd090d..0e90188 100644
--- a/docs/intro.md
+++ b/docs/intro.md
@@ -75,4 +75,8 @@ Don't forget to explore our sibling project, [Open WebUI Community](https://open
 docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
 ```
 
+## Troubleshooting
+
+If you're facing issues such as "Open WebUI: Server Connection Error", see [TROUBLESHOOTING](/getting-started/troubleshooting) for guidance, or join our [Open WebUI Discord community](https://discord.gg/5rJgQTnV4s).
+
 Continue with the full [getting started guide](./getting-started.md).
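The `/ollama/api` → `OLLAMA_API_BASE_URL` mapping described in the new troubleshooting doc can be sketched in plain shell. This is an illustration of the routing rule only, not Open WebUI's actual proxy code; the variable names mirror the doc, and the example values (the base URL and `/ollama/api/tags` path) are taken from its examples:

```shell
# Illustrative only: how a WebUI request path maps to the backend Ollama URL.
OLLAMA_API_BASE_URL="http://127.0.0.1:11434/api"  # value passed via the docker -e flag
request_path="/ollama/api/tags"                   # path requested by the WebUI frontend

# Strip the /ollama/api prefix and append the remainder to the base URL,
# matching the doc's rule: /ollama/api/tags -> OLLAMA_API_BASE_URL/tags
backend_url="${OLLAMA_API_BASE_URL}${request_path#/ollama/api}"
echo "$backend_url"   # -> http://127.0.0.1:11434/api/tags
```

The `${request_path#/ollama/api}` parameter expansion removes the route prefix, which is why getting `OLLAMA_API_BASE_URL` right (including the `/api` suffix) matters: everything after `/ollama/api` is appended to it verbatim.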