diff --git a/docs/tutorials/docker-install.md b/docs/tutorials/docker-install.md
index 0b26c3a..23abb92 100644
--- a/docs/tutorials/docker-install.md
+++ b/docs/tutorials/docker-install.md
@@ -1,5 +1,5 @@
 ---
-sidebar_position: 0
+sidebar_position: 5
 title: 🐳 Installing Docker
 ---
diff --git a/docs/tutorials/integrations/https-nginx.md b/docs/tutorials/https-nginx.md
similarity index 93%
rename from docs/tutorials/integrations/https-nginx.md
rename to docs/tutorials/https-nginx.md
index 53f7613..9fc2cc0 100644
--- a/docs/tutorials/integrations/https-nginx.md
+++ b/docs/tutorials/https-nginx.md
@@ -1,4 +1,5 @@
 ---
+sidebar_position: 200
 title: "🔒 HTTPS using Nginx"
 ---
@@ -24,10 +25,6 @@ import SelfSigned from './tab-nginx/SelfSigned.md';
 import LetsEncrypt from './tab-nginx/LetsEncrypt.md';
-
-
-
-
diff --git a/docs/tutorials/integrations/images.md b/docs/tutorials/images.md
similarity index 100%
rename from docs/tutorials/integrations/images.md
rename to docs/tutorials/images.md
diff --git a/docs/tutorials/integrations/apache.md b/docs/tutorials/integrations/apache.md
deleted file mode 100644
index f746725..0000000
--- a/docs/tutorials/integrations/apache.md
+++ /dev/null
@@ -1,223 +0,0 @@
----
-sidebar_position: 7
-title: "🗄️ Hosting UI and Models separately"
----
-
-:::warning
-This tutorial is a community contribution and is not supported by the OpenWebUI team. It serves only as a demonstration of how to customize OpenWebUI for your specific use case. Want to contribute? Check out the contributing tutorial.
-:::
-
-:::note
-If you plan to expose this to the wide area network, consider implementing security measures such as a [network firewall](https://github.com/chr0mag/geoipsets), a [web application firewall](https://github.com/owasp-modsecurity/ModSecurity), and [threat intelligence](https://github.com/crowdsecurity/crowdsec).
-Additionally, it's strongly recommended to enable HSTS, for example with `Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"`, in your **HTTPS** configuration, and to add a redirect of some kind to your **HTTPS URL** in your **HTTP** configuration. For free SSL certificates, [Let's Encrypt](https://letsencrypt.org/) is a good option coupled with [Certbot](https://github.com/certbot/certbot) management.
-:::
-
-Sometimes it's beneficial to host Ollama separately from the UI while retaining the RAG and RBAC support features shared across users:
-
-## UI Configuration
-
-For the UI configuration, you can set up the Apache VirtualHost as follows:
-
-```
-# Assuming you have a website hosting this UI at "server.com"
-<VirtualHost *:80>
-    ServerName server.com
-    DocumentRoot /home/server/public_html
-
-    ProxyPass / http://server.com:3000/ nocanon
-    ProxyPassReverse / http://server.com:3000/
-
-    RewriteEngine on
-    RewriteCond %{HTTP:Upgrade} websocket [NC]
-    RewriteCond %{HTTP:Connection} upgrade [NC]
-    RewriteRule ^/?(.*) "ws://server.com:3000/$1" [P,L]
-</VirtualHost>
-```
-
-Enable the site first before you can request SSL:
-
-:::warning
-Use of the `nocanon` option may [affect the security of your backend](https://httpd.apache.org/docs/2.4/mod/mod_proxy.html#proxypass). It's recommended to enable this only if required by your configuration.
-_Normally, mod_proxy will canonicalise ProxyPassed URLs. But this may be incompatible with some backends, particularly those that make use of PATH_INFO. The optional nocanon keyword suppresses this and passes the URL path "raw" to the backend. Note that this keyword may affect the security of your backend, as it removes the normal limited protection against URL-based attacks provided by the proxy._
-:::
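The VirtualHost above relies on `mod_proxy`, `mod_proxy_http`, `mod_proxy_wstunnel` (for the WebSocket rewrite), `mod_rewrite`, and, for the HTTPS variant further below, `mod_ssl`. These are not all enabled by default; on a Debian/Ubuntu-style Apache layout, a minimal sketch of the commands you would likely need looks like this:

```
# Enable the proxy, websocket-tunnelling, rewrite and SSL modules used by the VirtualHost above
sudo a2enmod proxy proxy_http proxy_wstunnel rewrite ssl
# Restart Apache so the newly enabled modules are loaded
sudo systemctl restart apache2
```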
-
-`a2ensite server.com.conf` # this will enable the site. a2ensite is short for "Apache 2 Enable Site"
-
-```
-# For SSL
-<VirtualHost *:443>
-    ServerName server.com
-    DocumentRoot /home/server/public_html
-
-    ProxyPass / http://server.com:3000/ nocanon
-    ProxyPassReverse / http://server.com:3000/
-
-    RewriteEngine on
-    RewriteCond %{HTTP:Upgrade} websocket [NC]
-    RewriteCond %{HTTP:Connection} upgrade [NC]
-    RewriteRule ^/?(.*) "ws://server.com:3000/$1" [P,L]
-
-    SSLEngine on
-    SSLCertificateFile /etc/ssl/virtualmin/170514456861234/ssl.cert
-    SSLCertificateKeyFile /etc/ssl/virtualmin/170514456861234/ssl.key
-    SSLProtocol all -SSLv2 -SSLv3 -TLSv1 -TLSv1.1
-
-    SSLProxyEngine on
-    SSLCACertificateFile /etc/ssl/virtualmin/170514456865864/ssl.ca
-</VirtualHost>
-```
-
-I'm using Virtualmin here for my SSL clusters, but you can also use Certbot directly or your preferred SSL method. To use SSL:
-
-### Prerequisites
-
-Run the following commands:
-
-`snap install certbot --classic`
-`apt install python3-certbot-apache` (this will install the Apache plugin for Certbot).
-
-Navigate to the Apache sites-available directory:
-
-`cd /etc/apache2/sites-available/`
-
-Create `server.com.conf` if it has not already been created, containing the `<VirtualHost>` configuration above (adjust it to match your setup as necessary). Use the version without SSL first.
-
-Once it's created, run `certbot --apache -d server.com`. This will request and install the SSL keys for you and create `server.com.le-ssl.conf`.
-
-# Configuring Ollama Server
-
-On your latest installation of Ollama, make sure that you have set up your API server as described in the official Ollama reference:
-
-[Ollama FAQ](https://github.com/jmorganca/ollama/blob/main/docs/faq.md)
-
-### TL;DR
-
-The guide doesn't quite match the current service file on Linux, so we will address it here:
-
-Unless you're compiling Ollama from source, installing with the standard install script `curl https://ollama.com/install.sh | sh` creates a file called `ollama.service` in `/etc/systemd/system`. You can use nano to edit the file:
-
-```
-sudo nano /etc/systemd/system/ollama.service
-```
-
-Add the following line:
-
-```
-Environment="OLLAMA_HOST=0.0.0.0:11434" # this line is mandatory. You can also specify a particular address such as 192.168.254.109:DIFFERENT_PORT
-```
-
-For instance:
-
-```
-[Unit]
-Description=Ollama Service
-After=network-online.target
-
-[Service]
-ExecStart=/usr/local/bin/ollama serve
-Environment="OLLAMA_HOST=0.0.0.0:11434" # this line is mandatory. You can also specify 192.168.254.109:DIFFERENT_PORT (IP:PORT format)
-Environment="OLLAMA_ORIGINS=http://192.168.254.106:11434,https://models.server.city" # this line is optional
-User=ollama
-Group=ollama
-Restart=always
-RestartSec=3
-Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/s>
-
-[Install]
-WantedBy=default.target
-```
-
-Save the file by pressing CTRL+S, then exit with CTRL+X.
-
-Once the service restarts (or after a reboot), the Ollama server will be listening on the IP:PORT you specified, in this case 0.0.0.0:11434, which makes it reachable at 192.168.254.106:11434 (or whatever your local IP address is). Make sure that your router is correctly configured to serve pages from that local IP by forwarding port 11434 to your server's local IP.
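To apply the edited unit without rebooting, and to confirm that Ollama is actually reachable on the new address, something along these lines should work (the address and port are the ones assumed in the example above):

```
# Reload systemd so it picks up the edited ollama.service, then restart Ollama
sudo systemctl daemon-reload
sudo systemctl restart ollama
# Should return a JSON list of the locally installed models
curl http://192.168.254.106:11434/api/tags
```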
-
-# Ollama Model Configuration
-
-## For the Ollama model configuration, use the following Apache VirtualHost setup
-
-Navigate to the Apache sites-available directory:
-
-`cd /etc/apache2/sites-available/`
-
-`nano models.server.city.conf` # match this to your Ollama server domain
-
-Add the following VirtualHost configuration (modify as needed):
-
-```
-
-# Assuming you have a website hosting this UI at "models.server.city"
-<VirtualHost *:443>
-    DocumentRoot "/var/www/html/"
-    ServerName models.server.city
-    <Directory "/var/www/html/">
-        Options None
-        Require all granted
-    </Directory>
-
-    ProxyRequests Off
-    ProxyPreserveHost On
-    ProxyAddHeaders On
-    SSLProxyEngine on
-
-    ProxyPass / http://server.city:1000/ nocanon # or port 11434
-    ProxyPassReverse / http://server.city:1000/ # or port 11434
-
-    SSLCertificateFile /etc/letsencrypt/live/models.server.city/fullchain.pem
-    SSLCertificateKeyFile /etc/letsencrypt/live/models.server.city/privkey.pem
-    Include /etc/letsencrypt/options-ssl-apache.conf
-</VirtualHost>
-```
-
-You may need to enable the site first (if you haven't already done so) before you can request SSL:
-
-`a2ensite models.server.city.conf`
-
-#### For the SSL part of the Ollama server
-
-Run the following commands:
-
-Navigate to the Apache sites-available directory:
-
-`cd /etc/apache2/sites-available/`
-`certbot --apache -d models.server.city`
-
-```
-<VirtualHost *:80>
-    DocumentRoot "/var/www/html/"
-    ServerName models.server.city
-    <Directory "/var/www/html/">
-        Options None
-        Require all granted
-    </Directory>
-
-    ProxyRequests Off
-    ProxyPreserveHost On
-    ProxyAddHeaders On
-    SSLProxyEngine on
-
-    ProxyPass / http://server.city:1000/ nocanon # or port 11434
-    ProxyPassReverse / http://server.city:1000/ # or port 11434
-
-    RewriteEngine on
-    RewriteCond %{SERVER_NAME} =models.server.city
-    RewriteRule ^ https://%{SERVER_NAME}%{REQUEST_URI} [END,NE,R=permanent]
-</VirtualHost>
-```
-
-Don't forget to restart/reload Apache with `systemctl reload apache2`.
-
-Open your site at https://server.com!
-
-**Congratulations**, your _**OpenAI-like, ChatGPT-style UI**_ is now serving AI with RAG, RBAC and multimodal features! Download Ollama models if you haven't done so yet!
-
-If you encounter any misconfiguration or errors, please file an issue or join our discussions. There are a lot of friendly developers here to assist you.
-
-Let's make this UI much more user friendly for everyone!
-
-Thanks for making Open WebUI your UI of choice for AI!
-
-This doc was written by **Bob Reyes**, your **Open WebUI** fan from the Philippines.
diff --git a/docs/tutorials/integrations/deepseekr1-dynamic.md b/docs/tutorials/integrations/deepseekr1-dynamic.md
new file mode 100644
index 0000000..e69de29
diff --git a/docs/tutorials/integrations/index.mdx b/docs/tutorials/integrations/index.mdx
index 98e798b..c493561 100644
--- a/docs/tutorials/integrations/index.mdx
+++ b/docs/tutorials/integrations/index.mdx
@@ -1,4 +1,4 @@
 ---
-sidebar_position: 3
+sidebar_position: 2
 title: "🔗 Integrations"
 ---
\ No newline at end of file
diff --git a/docs/tutorials/migration/index.mdx b/docs/tutorials/migration/index.mdx
deleted file mode 100644
index 7cf4d68..0000000
--- a/docs/tutorials/migration/index.mdx
+++ /dev/null
@@ -1,108 +0,0 @@
----
-sidebar_position: 5
-title: "🔄 Migration"
----
-
-## Migrating from Internal to External LiteLLM
-
-Previous versions (pre-0.2) of Open WebUI ran `litellm` internally. To improve modularity and flexibility, we recommend running `litellm` in its own dedicated container. This guide will walk you through the migration process.
-
-### 1. 
Download Your `config.yaml` File - -Before making any changes, download your existing `config.yaml` file from the Open WebUI Admin Settings window. This file contains your current `litellm` configuration. - -![LiteLLM config.yaml Download](/img/migration_litellm_config.png) - -### 2. Run a Standalone `litellm` Container - -Use the following `docker run` command to launch a dedicated `litellm` container: - -```bash -docker run -d \ - -p 4000:4000 \ - --name litellm \ - -v ./config.yaml:/app/config.yaml \ - -e LITELLM_MASTER_KEY=your_secret_key \ - --restart always \ - ghcr.io/berriai/litellm:main-latest \ - --config /app/config.yaml --port 4000 -``` - -- Replace `./config.yaml` with the actual path to the downloaded `config.yaml` file. -- Set a secure API key for `LITELLM_MASTER_KEY`. This ensures controlled access to your `litellm` instance. - -### 3. Connect `litellm` to Open WebUI - -Once the `litellm` container is up and running: - -1. Go to the Open WebUI settings. -2. Under "Connections," add a new "OpenAI" connection. -3. Set the Base URL to `http://host.docker.internal:4000/v1` (adjust the port if necessary). -4. For the API Key, use the `LITELLM_MASTER_KEY` value you defined in step 2 (e.g., `your_secret_key`). - -Congratulations! You've successfully migrated to an external `litellm` setup, enhancing the flexibility and maintainability of your Open WebUI installation. - -## Migrating your contents from Ollama WebUI to Open WebUI - -Given recent name changes from [Ollama WebUI to Open WebUI](https://github.com/open-webui/open-webui/discussions/602), the docker image has been renamed. Additional steps are required to update for those people that used Ollama WebUI previously and want to start using the new images. - -## Updating to Open WebUI without keeping your data - -If you want to update to the new image but don't want to keep any previous data like conversations, prompts, documents, etc. you can perform the following steps: - -:::danger - -Performing these steps will erase all your current configuration options, chat history, etc. Only LLM Models will be preserved - -::: - -```bash -docker rm -f ollama-webui -docker pull ghcr.io/open-webui/open-webui:main -[insert the equivalent command that you used to install with the new Docker image name] -docker volume rm ollama-webui -``` - -For example, for local installation it would be `docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main`. For other installation commands, check the relevant parts of this README document. - -## Updating to Open WebUI while keeping your data - -If you want to update to the new image migrating all your previous settings like conversations, prompts, documents, etc. you can perform the following steps: - -```bash -docker rm -f ollama-webui -docker pull ghcr.io/open-webui/open-webui:main -# Creates a new volume and uses a temporary container to copy from one volume to another as per https://github.com/moby/moby/issues/31154#issuecomment-360531460 -docker volume create --name open-webui -docker run --rm -v ollama-webui:/from -v open-webui:/to alpine ash -c "cd /from ; cp -av . 
/to" -[insert the equivalent command that you used to install with the new Docker image name] -``` - -Once you verify that all the data has been migrated you can erase the old volume using the following command: - -```bash -docker volume rm ollama-webui -``` - -### Coming from a local Git repository - -If you came from a git installation where you used `docker compose up` in the project directory, your volumes will be prefixed with the folder name. -Therefore, if your OpenWebUI path was: `/home/myserver/ollama-webui/`, the volumes would be named "ollama-webui_open-webui" and "ollama-webui_ollama". - -To copy the contents over to a conventional docker installation, you may run similar migration commands. In our particular case, the commands would be: -```bash -docker rm -f open-webui -docker rm -f ollama -docker pull ghcr.io/open-webui/open-webui:main -docker pull ghcr.io/open-webui/open-webui:ollama -docker volume create --name open-webui -docker volume create --name ollama -docker run --rm -v ollama-webui_open-webui:/from -v open-webui:/to alpine ash -c "cd /from ; cp -av . /to" -docker run --rm -v ollama-webui_ollama:/from -v ollama:/to alpine ash -c "cd /from ; cp -av . /to" -``` - -Depending on whether you had ollama installed, or already had the same volume names in place, some of the commands **might throw errors**, but they can usually be safely ignored since we're overwriting. - -Then, start both containers as usual, as described in the Getting started guide. - -Once you verify that all the data has been migrated you can erase the old volume using the command `docker volume rm` mentioned above. diff --git a/docs/tutorials/integrations/tab-nginx/LetsEncrypt.md b/docs/tutorials/tab-nginx/LetsEncrypt.md similarity index 100% rename from docs/tutorials/integrations/tab-nginx/LetsEncrypt.md rename to docs/tutorials/tab-nginx/LetsEncrypt.md diff --git a/docs/tutorials/integrations/tab-nginx/SelfSigned.md b/docs/tutorials/tab-nginx/SelfSigned.md similarity index 100% rename from docs/tutorials/integrations/tab-nginx/SelfSigned.md rename to docs/tutorials/tab-nginx/SelfSigned.md diff --git a/docs/tutorials/text-to-speech/index.mdx b/docs/tutorials/text-to-speech/index.mdx index 7c5e5c4..3410dd8 100644 --- a/docs/tutorials/text-to-speech/index.mdx +++ b/docs/tutorials/text-to-speech/index.mdx @@ -1,4 +1,4 @@ --- -sidebar_position: 2 +sidebar_position: 4 title: "🗨️ Text-to-Speech" --- \ No newline at end of file diff --git a/docs/tutorials/tips/index.mdx b/docs/tutorials/tips/index.mdx index 8e9adcc..7394555 100644 --- a/docs/tutorials/tips/index.mdx +++ b/docs/tutorials/tips/index.mdx @@ -1,4 +1,4 @@ --- -sidebar_position: 4 +sidebar_position: 3 title: "💡 Tips & Tricks" --- \ No newline at end of file diff --git a/docs/tutorials/integrations/web_search.md b/docs/tutorials/web_search.md similarity index 99% rename from docs/tutorials/integrations/web_search.md rename to docs/tutorials/web_search.md index 54e67cd..931c8b4 100644 --- a/docs/tutorials/integrations/web_search.md +++ b/docs/tutorials/web_search.md @@ -7,7 +7,7 @@ title: "🌐 Web Search" This tutorial is a community contribution and is not supported by the Open WebUI team. It serves only as a demonstration on how to customize Open WebUI for your specific use case. Want to contribute? Check out the contributing tutorial. ::: -## 🌐 Web Search +# 🌐 Web Search This guide provides instructions on how to set up web search capabilities in Open WebUI using various search engines.