
---
sidebar_position: 0
slug: /
title: 🏡 Home
hide_title: true
---

# Open WebUI

Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs.


*Open WebUI Demo*

:::info Important Note on User Roles and Privacy:

- **Admin Creation:** The first account created on Open WebUI gains Administrator privileges, controlling user management and system settings.

- **User Registrations:** Subsequent sign-ups start with Pending status, requiring Administrator approval for access.

- **Privacy and Data Security:** All your data, including login details, is stored locally on your device. Open WebUI ensures strict confidentiality and makes no external requests, for enhanced privacy and security.

:::

:::tip

**Disabling Login for Single User**

If you want to disable login for a single-user setup, set `WEBUI_AUTH` to `False`. This will bypass the login page.

:::

:::warning

You cannot switch between single-user mode and multi-account mode after this change.

:::
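For a Docker-based setup, `WEBUI_AUTH` can be passed with Docker's `-e` flag. A minimal sketch, reusing the default command from the installation section below:

```bash
docker run -d -p 3000:8080 \
  -e WEBUI_AUTH=False \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main
```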

:::danger

When using Docker to install Open WebUI, make sure to include the `-v open-webui:/app/backend/data` volume mount in your Docker command. This step is crucial, as it ensures your database is properly mounted and prevents any loss of data.

:::
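To confirm that the named volume exists after your first run, you can inspect it with the standard Docker CLI:

```bash
docker volume inspect open-webui
```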

## Installation with Default Configuration

- **If Ollama is on your computer**, use this command:

  ```bash
  docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  ```

- **If Ollama is on a different server**, use this command:

  To connect to Ollama on another server, change `OLLAMA_BASE_URL` to the server's URL (a quick reachability check follows this list):

  ```bash
  docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  ```

- **To run Open WebUI with Nvidia GPU support**, use this command:

  ```bash
  docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
  ```
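If you pointed `OLLAMA_BASE_URL` at a remote server, a quick way to verify it is reachable from the Docker host is to query Ollama's API; a sketch, with `https://example.com` standing in for your own URL (the `/api/tags` endpoint simply lists the models the server has available):

```bash
curl https://example.com/api/tags
```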
    

## Installation for OpenAI API Usage Only

- **If you're only using the OpenAI API**, use this command:

  ```bash
  docker run -d -p 3000:8080 -e OPENAI_API_KEY=your_secret_key -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
  ```
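If you'd rather keep the key out of your shell history, Docker's standard `--env-file` flag can supply it instead. A minimal sketch, assuming a local `.env` file containing a line such as `OPENAI_API_KEY=your_secret_key`:

```bash
docker run -d -p 3000:8080 --env-file .env -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```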
    

## Installing Open WebUI with Bundled Ollama Support

This installation method uses a single container image that bundles Open WebUI with Ollama, allowing for a streamlined setup via a single command. Choose the appropriate command based on your hardware setup:

- **With GPU Support**: Utilize GPU resources by running the following command:

  ```bash
  docker run -d -p 3000:8080 --gpus=all -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
  ```

- **For CPU Only**: If you're not using a GPU, use this command instead:

  ```bash
  docker run -d -p 3000:8080 -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
  ```
    

Both commands provide a hassle-free, all-in-one installation of Open WebUI and Ollama, so you can get everything up and running swiftly.
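Because the `:ollama` image bundles the Ollama runtime inside the container, you can also pre-pull a model from the host once the container is up. A sketch, assuming the bundled image exposes the `ollama` CLI on its PATH and using `llama3` purely as an example model name (models can also be downloaded from the web UI itself):

```bash
docker exec -it open-webui ollama pull llama3
```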

After installation, you can access Open WebUI at http://localhost:3000. Enjoy! 😄
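If the page doesn't come up right away, the container may still be starting; you can check on it with standard Docker commands:

```bash
docker ps --filter name=open-webui   # confirm the container is running
docker logs -f open-webui            # follow the startup logs
```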

## Manual Installation

### Installation with pip (Beta)

For users who prefer to use Python's package manager pip, Open WebUI offers an installation method. Python 3.11 is required for this method.

1. **Install Open WebUI**: Open your terminal and run the following command:

   ```bash
   pip install open-webui
   ```

2. **Start Open WebUI**: Once installed, start the server using:

   ```bash
   open-webui serve
   ```
    

This method installs all necessary dependencies and starts Open WebUI, allowing for a simple and efficient setup. After installation, you can access Open WebUI at http://localhost:8080. Enjoy! 😄
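If you'd like to keep Open WebUI isolated from your system Python, a minimal sketch using a standard virtual environment (this assumes a `python3.11` interpreter is on your PATH):

```bash
python3.11 -m venv open-webui-env      # create an isolated environment
source open-webui-env/bin/activate     # activate it (on Windows: open-webui-env\Scripts\activate)
pip install open-webui
open-webui serve
```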

## Other Installation Methods

We offer various installation alternatives, including non-Docker native installation methods, Docker Compose, Kustomize, and Helm. Visit our Open WebUI Documentation or join our Discord community for comprehensive guidance.

## Troubleshooting

If you're facing issues such as "Open WebUI: Server Connection Error", see TROUBLESHOOTING for information on how to resolve them, or join our Open WebUI Discord community.

## Updating

Check out our full updating guide.

If you want to update your local Docker installation to the latest version, you can do so with Watchtower:

```bash
docker run --rm --volume /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --run-once open-webui
```

In the last part of the command, replace `open-webui` with your container name if it is different.
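If you'd rather update manually instead of using Watchtower, a rough sketch is to pull the new image and recreate the container; your data persists in the `open-webui` volume as long as you keep the `-v open-webui:/app/backend/data` mount. Recreate with whichever run command you originally used; this example mirrors the default one above:

```bash
docker pull ghcr.io/open-webui/open-webui:main   # fetch the latest image
docker rm -f open-webui                          # stop and remove the old container
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:main             # recreate with the same settings
```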

Continue with the full getting started guide.