---
sidebar_position: 0
slug: /
title: 🏡 Home
hide_title: true
---
import { TopBanners } from "@site/src/components/TopBanners";
import { SponsorList } from "@site/src/components/SponsorList";
# Open WebUI
**Open WebUI is an [extensible](https://docs.openwebui.com/features/plugin/), feature-rich, and user-friendly self-hosted AI platform designed to operate entirely offline.** It supports various LLM runners like **Ollama** and **OpenAI-compatible APIs**, with a **built-in inference engine** for RAG, making it a **powerful AI deployment solution**.
![GitHub stars](https://img.shields.io/github/stars/open-webui/open-webui?style=social)
![GitHub forks](https://img.shields.io/github/forks/open-webui/open-webui?style=social)
![GitHub watchers](https://img.shields.io/github/watchers/open-webui/open-webui?style=social)
![GitHub repo size](https://img.shields.io/github/repo-size/open-webui/open-webui)
![GitHub language count](https://img.shields.io/github/languages/count/open-webui/open-webui)
![GitHub top language](https://img.shields.io/github/languages/top/open-webui/open-webui)
![GitHub last commit](https://img.shields.io/github/last-commit/open-webui/open-webui?color=red)
![Hits](https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2Follama-webui%2Follama-webui&count_bg=%2379C83D&title_bg=%23555555&icon=&icon_color=%23E7E7E7&title=hits&edge_flat=false)
[![Discord](https://img.shields.io/badge/Discord-Open_WebUI-blue?logo=discord&logoColor=white)](https://discord.gg/5rJgQTnV4s)
[![](https://img.shields.io/static/v1?label=Sponsor&message=%E2%9D%A4&logo=GitHub&color=%23fe8e86)](https://github.com/sponsors/tjbck)
![Open WebUI Demo](/img/demo.gif)
## Quick Start with Docker 🐳
**If Ollama is on your computer**, use this command:
```bash
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
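Once the container is running, you can optionally confirm it started cleanly with standard Docker commands (nothing Open WebUI-specific is assumed here):
```bash
# Check that the container is up and note the port mapping
docker ps --filter name=open-webui

# Follow the startup logs; press Ctrl+C to stop following
docker logs -f open-webui
```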
**To run Open WebUI with Nvidia GPU support**, use this command:
```bash
docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
```
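GPU passthrough requires the NVIDIA Container Toolkit on the host. Before starting Open WebUI, you can verify that Docker can see your GPU at all; the command below is the standard NVIDIA toolkit smoke test, not an Open WebUI feature:
```bash
# Should print your GPU(s); if this fails, install and configure the NVIDIA Container Toolkit first
docker run --rm --gpus all ubuntu nvidia-smi
```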
### Open WebUI Bundled with Ollama
This installation method uses a single container image that bundles Open WebUI with Ollama, allowing for a streamlined setup via a single command. Choose the appropriate command based on your hardware setup:
- **With GPU Support**:
  Utilize GPU resources by running the following command:

  ```bash
  docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
  ```

- **For CPU Only**:
  If you're not using a GPU, use this command instead:

  ```bash
  docker run -d -p 3000:8080 -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
  ```
Both commands provide a hassle-free installation of both Open WebUI and Ollama, so you can get everything up and running quickly.
After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
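If you prefer to verify from the command line first, a plain HTTP check against the mapped port is enough to confirm the web server is responding (simple curl, no Open WebUI-specific endpoint assumed):
```bash
# Expect a 200 once the server has finished starting up
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3000
```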
### Using the Dev Branch 🌙
:::warning
The `:dev` branch contains the latest unstable features and changes. Use it at your own risk as it may have bugs or incomplete features.
:::
If you want to try out the latest bleeding-edge features and are okay with occasional instability, you can use the `:dev` tag like this:
```bash
docker run -d -p 3000:8080 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:dev
```
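Because the `:dev` tag is a moving target, simply restarting the container will not fetch new changes; pull the image explicitly before recreating the container:
```bash
docker pull ghcr.io/open-webui/open-webui:dev
```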
### Updating Open WebUI
To update the Open WebUI container easily, follow these steps:
#### Manual Update
Use [Watchtower](https://containrrr.dev/watchtower) to update your Docker container manually:
```bash
docker run --rm -v /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --run-once open-webui
```
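If you would rather not use Watchtower at all, an update is simply a pull and re-create of the container; the named volume keeps your data, and the flags below mirror the quick start command above:
```bash
# Pull the latest image
docker pull ghcr.io/open-webui/open-webui:main

# Remove the old container (the open-webui volume is preserved)
docker stop open-webui && docker rm open-webui

# Re-create it with the same options as before
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```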
#### Automatic Updates
Run Watchtower in the background to check for and apply updates every 5 minutes:
```bash
docker run -d --name watchtower --restart unless-stopped -v /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --interval 300 open-webui
```
🔧 **Note**: Replace `open-webui` with your container name if it's different.
## Manual Installation
{/* ### Installation with `uvx`
It is highly recommended to use **`uv`**, a Python runtime manager, to ensure seamless Python environment management and avoid conflicts. Using `uv` is especially helpful when running multiple Python projects with different dependencies.
#### Install `uv`:
Visit the official [`uv` documentation](https://docs.astral.sh/uv/getting-started/installation/) for details. Below are the quick commands for installation based on your operating system:
- **macOS/Linux**:
```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```
- **Windows**:
```powershell
powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
```
Once `uv` is installed and everything is configured, you can proceed with the following command to install and serve Open WebUI:
```bash
uvx --python 3.11 open-webui serve
```
This command will:
1. Download and install the required Python 3.11 runtime if `uv` doesn't already have it.
2. Install Open WebUI along with its necessary dependencies.
3. Start the Open WebUI server.
After running this command, Open WebUI will be served at [http://localhost:8080](http://localhost:8080). Enjoy! 😄 */}
### Installation with `pip`
For users installing Open WebUI with Python's package manager `pip`, **it is strongly recommended to use Python runtime managers like `uv` or `conda`**. These tools help manage Python environments effectively and avoid conflicts.
Python 3.11 is the supported development environment. Python 3.12 appears to work but has not been thoroughly tested. Python 3.13 is entirely untested; **use it at your own risk**.
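For example, one way to follow that recommendation is to create an isolated Python 3.11 environment first; the sketch below uses `conda`, but `uv venv` or `python -m venv` work just as well:
```bash
# Create and activate a dedicated environment with Python 3.11
conda create -n open-webui python=3.11 -y
conda activate open-webui
```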
1. **Install Open WebUI**:
   Open your terminal and run the following command:

   ```bash
   pip install open-webui
   ```

2. **Start Open WebUI**:
   Once installed, start the server using:

   ```bash
   open-webui serve
   ```
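By default the server listens on port 8080. If you need a different address or port, the CLI accepts host and port options; treat the exact flag names below as an assumption and confirm them with `open-webui serve --help` on your install:
```bash
# Assumed flags; verify with `open-webui serve --help`
open-webui serve --host 0.0.0.0 --port 8080
```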
### Updating Open WebUI
To update to the latest version, simply run:
```bash
pip install --upgrade open-webui
```
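After upgrading, you can confirm which version is installed using pip itself (nothing Open WebUI-specific):
```bash
pip show open-webui
```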
The `pip` installation method installs all necessary dependencies and starts Open WebUI, allowing for a simple and efficient setup. After installation, you can access Open WebUI at [http://localhost:8080](http://localhost:8080). Enjoy! 😄
## Other Installation Methods
We offer various installation alternatives, including non-Docker native installation methods, Docker Compose, Kustomize, and Helm. Visit our [Open WebUI Documentation](https://docs.openwebui.com/getting-started/) or join our [Discord community](https://discord.gg/5rJgQTnV4s) for comprehensive guidance.
Continue with the full [getting started guide](/getting-started).
## Sponsors 🙌
<TopBanners />
<SponsorList />
We are incredibly grateful for the generous support of our sponsors. Their contributions help us to maintain and improve our project, ensuring we can continue to deliver quality work to our community. Thank you!