From a3e41db8d7eb7e8339a264ad7bca19e4c5560f55 Mon Sep 17 00:00:00 2001
From: "Timothy J. Baek"
Date: Wed, 10 Apr 2024 01:26:23 -0700
Subject: [PATCH] chore: formatting

---
 README.md | 22 +++++++++++-----------
 1 file changed, 11 insertions(+), 11 deletions(-)

diff --git a/README.md b/README.md
index 386b00f58..9b4d5b4bc 100644
--- a/README.md
+++ b/README.md
@@ -97,22 +97,22 @@ Don't forget to explore our sibling project, [Open WebUI Community](https://open
 > [!IMPORTANT]
 > When using Docker to install Open WebUI, make sure to include the `-v open-webui:/app/backend/data` in your Docker command. This step is crucial as it ensures your database is properly mounted and prevents any loss of data.
 
-  **If Ollama is on your computer**, use this command:
+**If Ollama is on your computer**, use this command:
 
-  ```bash
-  docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
-  ```
+```bash
+docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
+```
 
-  **If Ollama is on a Different Server**, use this command:
+**If Ollama is on a Different Server**, use this command:
 
-  To connect to Ollama on another server, change the `OLLAMA_BASE_URL` to the server's URL:
+To connect to Ollama on another server, change the `OLLAMA_BASE_URL` to the server's URL:
 
-  ```bash
-  docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
-  ```
+```bash
+docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
+```
+
+After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
 
-After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
-
 #### Open WebUI: Server Connection Error
 
 If you're experiencing connection issues, it's often due to the WebUI docker container not being able to reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) inside the container. Use the `--network=host` flag in your docker command to resolve this. Note that the port changes from 3000 to 8080, resulting in the link: `http://localhost:8080`.