mirror of https://github.com/open-webui/docs (synced 2025-05-20 03:08:56 +00:00)

doc: update

commit 07bb0fa680 (parent a1f49f55ef)
@@ -15,8 +15,8 @@ title: "🚀 Getting Started"
 :::

-## Before You Begin
-
+<details>
+<summary>Before You Begin</summary>

 1. **Installing Docker:**

 - **For Windows and Mac Users:**
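Once Docker is installed, a quick sanity check from a terminal confirms the client and daemon are both working; the commands below are standard Docker CLI and not specific to Open WebUI:

```bash
# Verify the Docker client is installed and on PATH
docker --version

# Verify the daemon is running and reachable; an error here usually
# means Docker Desktop has not been started yet
docker info

# Optional end-to-end test: pulls and runs a tiny throwaway image
docker run --rm hello-world
```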
@@ -48,41 +48,9 @@ title: "🚀 Getting Started"
 3. **Verify Ollama Installation:**

 - After installing Ollama, check if it's working by visiting [http://127.0.0.1:11434/](http://127.0.0.1:11434/) in your web browser. Remember, the port number might be different for you.

-## Installing with Docker 🐳
-
-- **Important:** When using Docker to install Open WebUI, make sure to include the `-v open-webui:/app/backend/data` in your Docker command. This step is crucial as it ensures your database is properly mounted and prevents any loss of data.
-
-- **If Ollama is on your computer**, use this command:
-
-```bash
-docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
-```
-
-- **To build the container yourself**, follow these steps:
-
-```bash
-docker build -t open-webui .
-docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always open-webui
-```
-
-- After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000).
-
-### Using Ollama on a Different Server
-
-- To connect to Ollama on another server, change the `OLLAMA_API_BASE_URL` to the server's URL:
-
-```bash
-docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
-```
-
-Or for a self-built container:
-
-```bash
-docker build -t open-webui .
-docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always open-webui
-```
-
-### Installing Ollama and Open WebUI Together
-
+</details>
+
+## One-line Command to Install Ollama and Open WebUI Together
+
 #### Using Docker Compose
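Both versions of this page stress the `-v open-webui:/app/backend/data` flag, so it is worth knowing how to confirm the named volume exists and survives the container. A short sketch using standard Docker volume commands, assuming the volume name `open-webui` from the commands on this page:

```bash
# Confirm the named volume was created by the docker run command
docker volume ls --filter name=open-webui

# Show where the volume's data lives on the host
docker volume inspect open-webui

# The volume outlives the container: after removing the container,
# re-running docker run with the same -v flag reattaches the same data
docker rm -f open-webui
```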
@@ -130,6 +98,38 @@
 ./run-compose.sh --enable-gpu --build
 ```

+## Quick Start with Docker 🐳
+
+:::info
+When using Docker to install Open WebUI, make sure to include the `-v open-webui:/app/backend/data` flag in your Docker command. This step is crucial as it ensures your database is properly mounted and prevents any loss of data.
+:::
+
+- **If Ollama is on your computer**, use this command:
+
+```bash
+docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
+```
+
+- **If Ollama is on a different server**, change the `OLLAMA_API_BASE_URL` to the server's URL:
+
+```bash
+docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
+```
+
+- After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
+
+#### Open WebUI: Server Connection Error
+
+If you're experiencing connection issues, it's often because the WebUI Docker container cannot reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) from inside the container. Use the `--network=host` flag in your docker command to resolve this. Note that the port changes from 3000 to 8080, so the link becomes `http://localhost:8080`.
+
+**Example Docker Command**:
+
+```bash
+docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main
+```
+
 ## Installing with Podman

 <details>
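When the "Server Connection Error" described above appears, two quick checks help isolate the cause: whether Ollama answers on the host at all, and what the container itself reports. A minimal sketch, assuming the container is named `open-webui` as in the commands above:

```bash
# From the host: a running Ollama server answers its root endpoint
# with the text "Ollama is running"
curl http://127.0.0.1:11434/

# Inspect recent Open WebUI logs for failed requests to the Ollama API
docker logs --tail 50 open-webui
```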
@@ -57,7 +57,11 @@ hide_title: true
 Don't forget to explore our sibling project, [Open WebUI Community](https://openwebui.com/), where you can discover, download, and explore customized Modelfiles. Open WebUI Community offers a wide range of exciting possibilities for enhancing your chat interactions with Ollama! 🚀

-## Installing with Docker 🐳
+### Quick Start with Docker 🐳
+
+:::info
+When using Docker to install Open WebUI, make sure to include the `-v open-webui:/app/backend/data` flag in your Docker command. This step is crucial as it ensures your database is properly mounted and prevents any loss of data.
+:::

 - **If Ollama is on your computer**, use this command:
@@ -65,9 +69,7 @@ Don't forget to explore our sibling project, [Open WebUI Community](https://open
 docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
 ```

-- After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000).
-
-#### Using Ollama on a Different Server
+- **If Ollama is on a Different Server**, use this command:

 - To connect to Ollama on another server, change the `OLLAMA_API_BASE_URL` to the server's URL:
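Before wiring `OLLAMA_API_BASE_URL` to a remote host, it helps to confirm the API is reachable from the machine running Docker; `example.com` below is the same placeholder used in the commands above, and `/api/tags` is Ollama's endpoint for listing installed models:

```bash
# Replace example.com with your Ollama server's address
curl https://example.com/api/tags
# A JSON object listing models means the base URL is correct;
# a timeout or "connection refused" points to a network or firewall issue
```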
@@ -75,6 +77,18 @@ Don't forget to explore our sibling project, [Open WebUI Community](https://open
 docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
 ```

+- After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
+
+#### Open WebUI: Server Connection Error
+
+If you're experiencing connection issues, it's often because the WebUI Docker container cannot reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) from inside the container. Use the `--network=host` flag in your docker command to resolve this. Note that the port changes from 3000 to 8080, so the link becomes `http://localhost:8080`.
+
+**Example Docker Command**:
+
+```bash
+docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main
+```
+
 ## Troubleshooting

 If you're facing various issues like "Open WebUI: Server Connection Error", see [TROUBLESHOOTING](/getting-started/troubleshooting) for information on how to troubleshoot and/or join our [Open WebUI Discord community](https://discord.gg/5rJgQTnV4s).
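Because chat data lives in the `open-webui` volume rather than inside the container, upgrading is a matter of pulling the newer image and recreating the container; a minimal sketch reusing the exact flags from the quick-start command above:

```bash
# Fetch the latest published image
docker pull ghcr.io/open-webui/open-webui:main

# Remove the old container (data persists in the open-webui volume)
docker rm -f open-webui

# Recreate it with the same flags as before
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```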