mirror of https://github.com/open-webui/docs
synced 2025-06-15 19:09:21 +00:00

refac

This commit is contained in: parent d973f40866, commit 831a48c89a
@@ -1,6 +1,6 @@
 ---
 sidebar_position: 5
-title: "Open WebUI API Endpoints"
+title: "🔗 Open WebUI API Endpoints"
 ---

 ## Swagger Docs
@@ -1,6 +1,6 @@
 ---
 sidebar_position: 4
-title: "Environment Variable Configuration"
+title: "🌍 Environment Variable Configuration"
 ---

@@ -1,6 +1,6 @@
 ---
 sidebar_position: 1
-title: "Alternative Installation"
+title: "🔧 Alternative Installation"
 ---

 ### Installing Both Ollama and Open WebUI Using Kustomize
@@ -1,6 +1,6 @@
 ---
 sidebar_position: 3
-title: "Open WebUI Logging"
+title: "📜 Open WebUI Logging"
 ---

 ## Browser Client Logging ##
@@ -1,6 +1,6 @@
 ---
 sidebar_position: 2
-title: "Updating Open WebUI"
+title: "🔄 Updating Open WebUI"
 ---

 ## Updating your Docker Installation
docs/troubleshooting/connection-error.mdx (new file, 72 lines)
@@ -0,0 +1,72 @@
---
sidebar_position: 0
title: "🚧 Server Connection Error"
---

We're here to help you get everything set up and running smoothly. Below, you'll find step-by-step instructions tailored to different scenarios to solve common connection issues with Ollama and external servers like Hugging Face.

## 🌟 Connection to Ollama Server

### 🚀 Accessing Ollama from Open WebUI

Struggling to connect to Ollama from Open WebUI? It could be because Ollama isn’t listening on a network interface that allows external connections. Let’s sort that out:

1. **Configure Ollama to Listen Broadly** 🎧:
   Set `OLLAMA_HOST` to `0.0.0.0` to make Ollama listen on all network interfaces.

2. **Update Environment Variables**:
   Ensure that `OLLAMA_HOST` is accurately set within your deployment environment.

3. **Restart Ollama** 🔄:
   A restart is needed for the changes to take effect.

💡 After setting up, verify that Ollama is accessible by visiting the WebUI interface.

For more detailed instructions on configuring Ollama, please refer to [Ollama's official documentation](https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-linux).
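On a Linux host where Ollama runs as a systemd service, the three steps above usually come down to a drop-in override. This is a minimal sketch following the FAQ linked above; the drop-in path shown in the comment is the conventional one, so adjust for your setup:

```shell
# Sketch: persist OLLAMA_HOST for a systemd-managed Ollama (Linux).
# On a real host this file belongs at /etc/systemd/system/ollama.service.d/override.conf
cat > override.conf <<'EOF'
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
EOF
cat override.conf
# Then apply it on the real host:
#   sudo systemctl daemon-reload && sudo systemctl restart ollama
```

After the restart, `ss -ltn` should show Ollama bound to `0.0.0.0:11434` rather than `127.0.0.1:11434`.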

### 🐳 Docker Connection Error

If you're seeing a connection error when trying to access Ollama, it might be because the WebUI Docker container can't talk to the Ollama server running on your host. Let’s fix that:

1. **Adjust the Network Settings** 🛠️:
   Use the `--network=host` flag in your Docker command. This links your container directly to your host’s network.

2. **Change the Port**:
   With host networking, the published port mapping no longer applies, so the WebUI is served on its internal port 8080 instead of 3000.

**Example Docker Command**:

```bash
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

🔗 After running the above, your WebUI should be available at `http://localhost:8080`.

## 🔒 SSL Connection Issue with Hugging Face

Encountered an SSL error? It could be an issue with the Hugging Face server. Here's what to do:

1. **Check Hugging Face Server Status**:
   Verify whether there's a known outage or issue on their end.

2. **Switch Endpoint**:
   If Hugging Face is down, switch the endpoint in your Docker command.

**Example Docker Command for Connection Issues**:

```bash
docker run -d -p 3000:8080 -e HF_ENDPOINT=https://hf-mirror.com/ --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

## 🍏 Podman on MacOS

Running on MacOS with Podman? Here’s how to ensure connectivity:

1. **Enable Host Loopback**:
   Use `--network slirp4netns:allow_host_loopback=true` in your command.

2. **Set OLLAMA_BASE_URL**:
   Ensure it points to `http://host.containers.internal:11434`.

**Example Podman Command**:

```bash
podman run -d --network slirp4netns:allow_host_loopback=true -p 3000:8080 -e OLLAMA_BASE_URL=http://host.containers.internal:11434 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
@@ -3,277 +3,19 @@ sidebar_position: 300
 title: "🛠️ Troubleshooting"
 ---

 import { TopBanners } from "@site/src/components/TopBanners";

 <TopBanners />
+## 🌟 General Troubleshooting Tips
+
-## Open WebUI: Server Connection Error
-
+Encountering issues? Don't worry, we're here to help! 😊 Start with this important step:
+
-If you're experiencing connection issues, it’s often due to the WebUI docker container not being able to reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) inside the container. Use the `--network=host` flag in your docker command to resolve this. Note that the port changes from 3000 to 8080, resulting in the link: `http://localhost:8080`.
-
+- 🔄 Make sure you're using the **latest version** of the software.
+
-**Example Docker Command**:
-
+With this project constantly evolving, updates and fixes are regularly added. Keeping your software up to date is crucial to take advantage of all the enhancements and fixes, ensuring the best possible experience. 🚀
+
-```bash
-docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
-```
+### 🤝 Community Support
+
-If you're experiencing connection issues with an SSL error from huggingface.co, please check the Hugging Face server; if it is down, you can set `HF_ENDPOINT` to `https://hf-mirror.com/` in the `docker run` command.
-
+This project thrives on community spirit and passion. If you still face problems after updating, we warmly invite you to join our vibrant community on [Discord](https://discord.com/invite/5rJgQTnV4s). There, you can share your experiences, find solutions, and connect with fellow enthusiasts who might be navigating similar challenges. Engaging with our community doesn't just help solve your problems; it strengthens the entire network of support, so we all grow together. 🌱
+
-```bash
-docker run -d -p 3000:8080 -e HF_ENDPOINT=https://hf-mirror.com/ --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
-```
+🌟 If your issues are pressing and you need a quicker resolution, consider [supporting our project](/sponsorships). Your sponsorship not only fast-tracks your queries in a dedicated sponsor-only channel, but also directly supports the [dedicated maintainer](/mission) who is passionately committed to refining and enhancing this tool for everyone.
-
-If you're using Podman on MacOS, to reach Ollama running on your computer you must enable the host loopback with `--network slirp4netns:allow_host_loopback=true` and override `OLLAMA_BASE_URL` to `http://host.containers.internal:11434`. The Open WebUI link remains the default: `http://localhost:3000`.
-
-**Example Podman Command**:
-
-```bash
-podman run -d --network slirp4netns:allow_host_loopback=true -p 3000:8080 -e OLLAMA_BASE_URL=http://host.containers.internal:11434 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
-```
-
-## Microphone access and other permission issues with non-HTTPS connections
-
-Chromium-based (Chrome, Brave, MS Edge, Opera, Vivaldi, ...) and Firefox-based browsers often restrict site-level permissions on non-HTTPS URLs. This is most obvious when running an instance on your local network and reaching it from another device (for example, a phone) on the same network: the microphone permission will most likely get denied right away without a clear way to whitelist the URL. Solutions for this include manually setting up HTTPS or adding an exception to flag the URL as secure. Use this at your own risk, and we strongly recommend thinking carefully about the security implications before jumping in. To flag a URL as secure:
-
-- On Chromium-based browsers (Chrome, Brave, MS Edge, Opera, Vivaldi, ...): open `chrome://flags/#unsafely-treat-insecure-origin-as-secure`, add your non-HTTPS address (for example: `http://192.168.1.35:3000`), then restart the browser.
-
-- On Firefox-based browsers: open `about:config` and modify or add the string property `dom.securecontext.allowlist`, where one or more origins can be defined, separated by commas (e.g. `http://127.0.0.1:8080` for an Open WebUI service served locally on port 8080).
-
-## Difficulty Accessing Ollama from Open WebUI
-
-If you're encountering difficulties accessing Ollama from the Open WebUI interface, it could be due to Ollama being configured to listen on a restricted network interface by default. To enable access from the Open WebUI, you need to configure Ollama to listen on a broader range of network interfaces.
-
-Follow these steps to adjust the Ollama configuration:
-
-1. **Configure Ollama Host**: Set the `OLLAMA_HOST` environment variable to `0.0.0.0`. This tells Ollama to listen on all available network interfaces, enabling connections from external sources, including the Open WebUI.
-
-2. **Modify Ollama Environment Variables**: Depending on how you're running Ollama, you may need to adjust the environment variables accordingly. If you're running Ollama in a Docker container, ensure that the `OLLAMA_HOST` variable is correctly set within the container environment. For other deployment methods, refer to the respective documentation for instructions on setting environment variables.
-
-3. **Restart Ollama**: After modifying the environment variables, restart the Ollama service to apply the changes. This ensures that Ollama begins listening on the specified network interfaces.
-
-Once Ollama is configured to listen on `0.0.0.0`, you should be able to access it from the Open WebUI without any issues.
-
-For detailed instructions on setting environment variables for Ollama, refer to the [official Ollama documentation](https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-linux).
-
-## General Connection Errors
-
-**Ensure Ollama Version is Up-to-Date**: Always start by checking that you have the latest version of Ollama. Visit [Ollama's official site](https://ollama.com/) for the latest updates.
-
-**Troubleshooting Steps**:
-
-1. **Verify Ollama URL Format**:
-   - When running the Web UI container, ensure the `OLLAMA_BASE_URL` is correctly set (e.g., `http://192.168.1.1:11434` for different-host setups).
-   - In the Open WebUI, navigate to "Settings" > "General".
-   - Confirm that the Ollama Server URL is correctly set to `[OLLAMA URL]` (e.g., `http://localhost:11434`).
-
-By following these enhanced troubleshooting steps, connection issues should be effectively resolved. For further assistance or queries, feel free to reach out to us on our community Discord.
-
-## Network Diagrams of different deployments
-
-#### Mac OS/Windows - Ollama on Host, Open WebUI in container
-
-```mermaid
-C4Context
-Boundary(b0, "Hosting Machine - Mac OS/Windows") {
-  Person(user, "User")
-  Boundary(b1, "Docker Desktop's Linux VM") {
-    Component(openwebui, "Open WebUI", "Listening on port 8080")
-  }
-  Component(ollama, "Ollama", "Listening on port 11434")
-}
-Rel(openwebui, ollama, "Makes API calls via Docker proxy", "http://host.docker.internal:11434")
-Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
-UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
-```
-
-#### Mac OS/Windows - Ollama and Open WebUI in the same Compose stack
-
-```mermaid
-C4Context
-Boundary(b0, "Hosting Machine - Mac OS/Windows") {
-  Person(user, "User")
-  Boundary(b1, "Docker Desktop's Linux VM") {
-    Boundary(b2, "Compose Stack") {
-      Component(openwebui, "Open WebUI", "Listening on port 8080")
-      Component(ollama, "Ollama", "Listening on port 11434")
-    }
-  }
-}
-Rel(openwebui, ollama, "Makes API calls via inter-container networking", "http://ollama:11434")
-UpdateRelStyle(openwebui, ollama, $offsetX="-100", $offsetY="-50")
-Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
-UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
-```
-
-#### Mac OS/Windows - Ollama and Open WebUI in containers, in different networks
-
-```mermaid
-C4Context
-Boundary(b0, "Hosting Machine - Mac OS/Windows") {
-  Person(user, "User")
-  Boundary(b1, "Docker Desktop's Linux VM") {
-    Boundary(b2, "Network A") {
-      Component(openwebui, "Open WebUI", "Listening on port 8080")
-    }
-    Boundary(b3, "Network B") {
-      Component(ollama, "Ollama", "Listening on port 11434")
-    }
-  }
-}
-Rel(openwebui, ollama, "Unable to connect")
-Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
-UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
-```
-
-#### Mac OS/Windows - Open WebUI in host network
-
-```mermaid
-C4Context
-Boundary(b0, "Hosting Machine - Mac OS/Windows") {
-  Person(user, "User")
-  Boundary(b1, "Docker Desktop's Linux VM") {
-    Component(openwebui, "Open WebUI", "Listening on port 8080")
-  }
-}
-Rel(user, openwebui, "Unable to connect, host network is the VM's network")
-UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
-```
-
-#### Linux - Ollama on Host, Open WebUI in container
-
-```mermaid
-C4Context
-Boundary(b0, "Hosting Machine - Linux") {
-  Person(user, "User")
-  Boundary(b1, "Container Network") {
-    Component(openwebui, "Open WebUI", "Listening on port 8080")
-  }
-  Component(ollama, "Ollama", "Listening on port 11434")
-}
-Rel(openwebui, ollama, "Makes API calls via Docker proxy", "http://host.docker.internal:11434")
-Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
-UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
-```
-
-#### Linux - Ollama and Open WebUI in the same Compose stack
-
-```mermaid
-C4Context
-Boundary(b0, "Hosting Machine - Linux") {
-  Person(user, "User")
-  Boundary(b1, "Container Network") {
-    Boundary(b2, "Compose Stack") {
-      Component(openwebui, "Open WebUI", "Listening on port 8080")
-      Component(ollama, "Ollama", "Listening on port 11434")
-    }
-  }
-}
-Rel(openwebui, ollama, "Makes API calls via inter-container networking", "http://ollama:11434")
-UpdateRelStyle(openwebui, ollama, $offsetX="-100", $offsetY="-50")
-Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
-UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
-```
-
-#### Linux - Ollama and Open WebUI in containers, in different networks
-
-```mermaid
-C4Context
-Boundary(b0, "Hosting Machine - Linux") {
-  Person(user, "User")
-  Boundary(b2, "Container Network A") {
-    Component(openwebui, "Open WebUI", "Listening on port 8080")
-  }
-  Boundary(b3, "Container Network B") {
-    Component(ollama, "Ollama", "Listening on port 11434")
-  }
-}
-Rel(openwebui, ollama, "Unable to connect")
-Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
-UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
-```
-
-#### Linux - Open WebUI in host network, Ollama on host
-
-```mermaid
-C4Context
-Boundary(b0, "Hosting Machine - Linux") {
-  Person(user, "User")
-  Component(openwebui, "Open WebUI", "Listening on port 8080")
-  Component(ollama, "Ollama", "Listening on port 11434")
-}
-Rel(openwebui, ollama, "Makes API calls via localhost", "http://localhost:11434")
-Rel(user, openwebui, "Makes requests via listening port", "http://localhost:8080")
-UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
-```
-
-## Reset Admin Password
-
-If you've forgotten your admin password, you can reset it by following these steps:
-
-### Reset Admin Password in Docker
-
-To reset the admin password for Open WebUI in a Docker deployment, generate a bcrypt hash of your new password and run a Docker command to update the database. Replace `your-new-password` with the desired password and execute:
-
-1. **Generate bcrypt hash** (local machine):
-
-```bash
-htpasswd -bnBC 10 "" your-new-password | tr -d ':\n'
-```
-
-2. **Update password in Docker** (replace `HASH` and `admin@example.com`):
-
-```bash
-docker run --rm -v open-webui:/data alpine/socat EXEC:"bash -c 'apk add sqlite && echo UPDATE auth SET password='\''HASH'\'' WHERE email='\''admin@example.com'\''; | sqlite3 /data/webui.db'", STDIO
-```
-
-### Reset Admin Password Locally
-
-For local installations of Open WebUI, navigate to the `open-webui` directory and update the password in the `backend/data/webui.db` database.
-
-1. **Generate bcrypt hash** (local machine):
-
-```bash
-htpasswd -bnBC 10 "" your-new-password | tr -d ':\n'
-```
-
-2. **Update password locally** (replace `HASH` and `admin@example.com`):
-
-```bash
-sqlite3 backend/data/webui.db "UPDATE auth SET password='HASH' WHERE email='admin@example.com';"
-```
-
-## Understanding the Open WebUI Architecture
-
-The Open WebUI system is designed to streamline interactions between the client (your browser) and the Ollama API. At the heart of this design is a backend reverse proxy, enhancing security and resolving CORS issues.
-
-- **How it Works**: The Open WebUI is designed to interact with the Ollama API through a specific route. When a request is made from the WebUI to Ollama, it is not directly sent to the Ollama API. Initially, the request is sent to the Open WebUI backend via the `/ollama` route. From there, the backend forwards the request to the Ollama API, using the URL specified in the `OLLAMA_BASE_URL` environment variable. Therefore, a request made to `/ollama` in the WebUI is effectively the same as making a request to `OLLAMA_BASE_URL` in the backend. For instance, a request to `/ollama/api/tags` in the WebUI is equivalent to `OLLAMA_BASE_URL/api/tags` in the backend.
-
-- **Security Benefits**: This design prevents direct exposure of the Ollama API to the frontend, safeguarding against potential CORS (Cross-Origin Resource Sharing) issues and unauthorized access. Requiring authentication to access the Ollama API further enhances this security layer.
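The route translation described above amounts to swapping the `/ollama` prefix for the base URL. A one-line shell sketch, using the default local `OLLAMA_BASE_URL` purely for illustration:

```shell
# Illustration of the /ollama reverse-proxy mapping (default base URL assumed).
OLLAMA_BASE_URL="http://127.0.0.1:11434"
webui_path="/ollama/api/tags"
# Strip the /ollama prefix and prepend the configured base URL:
upstream="${OLLAMA_BASE_URL}${webui_path#/ollama}"
echo "$upstream"   # http://127.0.0.1:11434/api/tags
```

So a WebUI request to `/ollama/api/tags` reaches Ollama as `http://127.0.0.1:11434/api/tags`.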
+
+Together, let's harness these opportunities to create the best environment and keep pushing the boundaries of what we can achieve with our project. Thank you from the bottom of our hearts for your understanding, cooperation, and belief in our mission! 🙏
docs/troubleshooting/microphone-error.mdx (new file, 38 lines)
@@ -0,0 +1,38 @@
---
sidebar_position: 1
title: "🎙️ Troubleshooting Microphone Access"
---

Ensuring your application has the proper microphone access is crucial for functionality that depends on audio input. This guide covers how to manage and troubleshoot microphone permissions, particularly under secure contexts.

## Understanding Secure Contexts 🔒

For security reasons, accessing the microphone is restricted to pages served over HTTPS or locally from `localhost`. This requirement is meant to safeguard your data by ensuring it is transmitted over secure channels.

## Common Permission Issues 🚫

Browsers like Chrome, Brave, Microsoft Edge, Opera, and Vivaldi, as well as Firefox, restrict microphone access on non-HTTPS URLs. This typically becomes an issue when accessing a site from another device within the same network (e.g., using a mobile phone to access a desktop server). Here's how you can manage these issues:

### Solutions for Non-HTTPS Connections

1. **Set Up HTTPS:**
   - It is highly recommended to configure your server to support HTTPS. This not only resolves permission issues but also enhances the security of your data transmissions.

2. **Temporary Browser Flags (Use with caution):**
   - These settings force your browser to treat certain insecure URLs as secure. This is useful for development purposes but poses significant security risks. Here's how to adjust these settings in major browsers:

#### Chromium-based Browsers (e.g., Chrome, Brave)
- Open `chrome://flags/#unsafely-treat-insecure-origin-as-secure`.
- Enter your non-HTTPS address (e.g., `http://192.168.1.35:3000`).
- Restart the browser to apply the changes.

#### Firefox-based Browsers
- Open `about:config`.
- Search for and modify (or create) the string value `dom.securecontext.allowlist`.
- Add your addresses separated by commas (e.g., `http://127.0.0.1:8080`).

### Considerations and Risks 🚨

While browser flags offer a quick fix, they bypass important security checks, which can expose your device and data to vulnerabilities. Always prioritize proper security measures, especially when planning for a production environment.

By following these best practices, you can ensure that your application properly accesses the microphone while maintaining the security and integrity of your data.
docs/troubleshooting/network-diagrams.mdx (new file, 176 lines)
@@ -0,0 +1,176 @@
---
sidebar_position: 4
title: "🕸️ Network Diagrams"
---

Here, we provide clear and structured diagrams to help you understand how various components of the network interact within different setups. This documentation is designed to assist both macOS/Windows and Linux users. Each scenario is illustrated using Mermaid diagrams to show how the interactions are set up depending on the different system configurations and deployment strategies.

## Table of Contents
1. [Mac OS/Windows Setup Options](#macoswindows-setup-options)
   - [Ollama on Host, Open WebUI in Container](#ollama-on-host-open-webui-in-container)
   - [Ollama and Open WebUI in Compose Stack](#ollama-and-open-webui-in-compose-stack)
   - [Ollama and Open WebUI, Separate Networks](#ollama-and-open-webui-separate-networks)
   - [Open WebUI in Host Network](#open-webui-in-host-network)
2. [Linux Setup Options](#linux-setup-options)
   - [Ollama on Host, Open WebUI in Container](#ollama-on-host-open-webui-in-container-linux)
   - [Ollama and Open WebUI in Compose Stack](#ollama-and-open-webui-in-compose-stack-linux)
   - [Ollama and Open WebUI, Separate Networks](#ollama-and-open-webui-separate-networks-linux)
   - [Open WebUI in Host Network, Ollama on Host](#open-webui-in-host-network-ollama-on-host)

## Mac OS/Windows Setup Options 🖥️

### Ollama on Host, Open WebUI in Container

In this scenario, `Ollama` runs directly on the host machine while `Open WebUI` operates within a Docker container.

```mermaid
C4Context
Boundary(b0, "Hosting Machine - Mac OS/Windows") {
  Person(user, "User")
  Boundary(b1, "Docker Desktop's Linux VM") {
    Component(openwebui, "Open WebUI", "Listening on port 8080")
  }
  Component(ollama, "Ollama", "Listening on port 11434")
}
Rel(openwebui, ollama, "Makes API calls via Docker proxy", "http://host.docker.internal:11434")
Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```

### Ollama and Open WebUI in Compose Stack

Both `Ollama` and `Open WebUI` are configured within the same Docker Compose stack, simplifying network communications.

```mermaid
C4Context
Boundary(b0, "Hosting Machine - Mac OS/Windows") {
  Person(user, "User")
  Boundary(b1, "Docker Desktop's Linux VM") {
    Boundary(b2, "Compose Stack") {
      Component(openwebui, "Open WebUI", "Listening on port 8080")
      Component(ollama, "Ollama", "Listening on port 11434")
    }
  }
}
Rel(openwebui, ollama, "Makes API calls via inter-container networking", "http://ollama:11434")
Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```
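A Compose file matching this stack might look like the sketch below. It is illustrative only: the `ollama/ollama` image and the volume names are assumptions, while the `ghcr.io/open-webui/open-webui:main` image, the `3000:8080` mapping, and `OLLAMA_BASE_URL=http://ollama:11434` come from the commands on this page:

```yaml
# Hypothetical docker-compose.yaml for the Compose-stack diagram above.
services:
  ollama:
    image: ollama/ollama            # assumption: official Ollama image
    volumes:
      - ollama:/root/.ollama
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                 # user reaches http://localhost:3000
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434   # inter-container DNS name
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama

volumes:
  ollama:
  open-webui:
```

The service name `ollama` is what makes the `http://ollama:11434` URL resolve inside the stack.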

### Ollama and Open WebUI, Separate Networks

Here, `Ollama` and `Open WebUI` are deployed in separate Docker networks, potentially leading to connectivity issues.

```mermaid
C4Context
Boundary(b0, "Hosting Machine - Mac OS/Windows") {
  Person(user, "User")
  Boundary(b1, "Docker Desktop's Linux VM") {
    Boundary(b2, "Network A") {
      Component(openwebui, "Open WebUI", "Listening on port 8080")
    }
    Boundary(b3, "Network B") {
      Component(ollama, "Ollama", "Listening on port 11434")
    }
  }
}
Rel(openwebui, ollama, "Unable to connect")
Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```

### Open WebUI in Host Network

In this configuration, `Open WebUI` utilizes the host network, which impacts its ability to connect in certain environments.

```mermaid
C4Context
Boundary(b0, "Hosting Machine - Mac OS/Windows") {
  Person(user, "User")
  Boundary(b1, "Docker Desktop's Linux VM") {
    Component(openwebui, "Open WebUI", "Listening on port 8080")
  }
}
Rel(user, openwebui, "Unable to connect, host network is the VM's network")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```

## Linux Setup Options 🐧

### Ollama on Host, Open WebUI in Container (Linux)

This diagram is specific to the Linux platform, with `Ollama` running on the host and `Open WebUI` deployed inside a Docker container.

```mermaid
C4Context
Boundary(b0, "Hosting Machine - Linux") {
  Person(user, "User")
  Boundary(b1, "Container Network") {
    Component(openwebui, "Open WebUI", "Listening on port 8080")
  }
  Component(ollama, "Ollama", "Listening on port 11434")
}
Rel(openwebui, ollama, "Makes API calls via Docker proxy", "http://host.docker.internal:11434")
Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```

### Ollama and Open WebUI in Compose Stack (Linux)

A setup where both `Ollama` and `Open WebUI` reside within the same Docker Compose stack, allowing for straightforward networking on Linux.

```mermaid
C4Context
Boundary(b0, "Hosting Machine - Linux") {
  Person(user, "User")
  Boundary(b1, "Container Network") {
    Boundary(b2, "Compose Stack") {
      Component(openwebui, "Open WebUI", "Listening on port 8080")
      Component(ollama, "Ollama", "Listening on port 11434")
    }
  }
}
Rel(openwebui, ollama, "Makes API calls via inter-container networking", "http://ollama:11434")
Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```

### Ollama and Open WebUI, Separate Networks (Linux)

A scenario in which `Ollama` and `Open WebUI` are in different Docker networks under a Linux environment, which could hinder connectivity.

```mermaid
C4Context
Boundary(b0, "Hosting Machine - Linux") {
  Person(user, "User")
  Boundary(b2, "Container Network A") {
    Component(openwebui, "Open WebUI", "Listening on port 8080")
  }
  Boundary(b3, "Container Network B") {
    Component(ollama, "Ollama", "Listening on port 11434")
  }
}
Rel(openwebui, ollama, "Unable to connect")
Rel(user, openwebui, "Makes requests via exposed port -p 3000:8080", "http://localhost:3000")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```

### Open WebUI in Host Network, Ollama on Host (Linux)

An optimal layout where both `Open WebUI` and `Ollama` use the host’s network, facilitating seamless interaction on Linux systems.

```mermaid
C4Context
Boundary(b0, "Hosting Machine - Linux") {
  Person(user, "User")
  Component(openwebui, "Open WebUI", "Listening on port 8080")
  Component(ollama, "Ollama", "Listening on port 11434")
}
Rel(openwebui, ollama, "Makes API calls via localhost", "http://localhost:11434")
Rel(user, openwebui, "Makes requests via listening port", "http://localhost:8080")
UpdateRelStyle(user, openwebui, $offsetX="-100", $offsetY="-50")
```

Each setup addresses different deployment strategies and networking configurations to help you choose the best layout for your requirements.
docs/troubleshooting/password-reset.mdx (new file, 52 lines)
@@ -0,0 +1,52 @@
---
sidebar_position: 0
title: "🔑 Reset Admin Password"
---

# Resetting Your Admin Password 🗝️

If you've forgotten your admin password, don't worry! Below you'll find step-by-step guides to reset your admin password for Docker 🐳 deployments and local installations of Open WebUI.

## For Docker Deployments 🐳

Follow these steps to reset the admin password for Open WebUI when deployed using Docker.

### Step 1: Generate a New Password Hash 🔐

First, you need to create a bcrypt hash of your new password. Run the following command on your local machine, replacing `your-new-password` with the password you wish to use:

```bash
htpasswd -bnBC 10 "" your-new-password | tr -d ':\n'
```

### Step 2: Update the Password in Docker 🔄

Next, you'll update the password in your Docker deployment. Replace `HASH` in the command below with the bcrypt hash generated in Step 1. Also, replace `admin@example.com` with the email address linked to your admin account.

```bash
docker run --rm -v open-webui:/data alpine/socat EXEC:"bash -c 'apk add sqlite && echo UPDATE auth SET password='\''HASH'\'' WHERE email='\''admin@example.com'\''; | sqlite3 /data/webui.db'", STDIO
```

## For Local Installations 💻

If you have a local installation of Open WebUI, here's how you can reset your admin password directly on your system.

### Step 1: Generate a New Password Hash 🔐

Just as with the Docker method, start by generating a bcrypt hash of your new password using the following command. Remember to replace `your-new-password` with your new password:

```bash
htpasswd -bnBC 10 "" your-new-password | tr -d ':\n'
```

### Step 2: Update the Password Locally 🔄

Now, navigate to the `open-webui` directory on your local machine. Update your password by replacing `HASH` with the bcrypt hash from Step 1 and `admin@example.com` with your admin account email, and execute:

```bash
sqlite3 backend/data/webui.db "UPDATE auth SET password='HASH' WHERE email='admin@example.com';"
```
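If you want to see exactly what this `UPDATE` does before touching the real `webui.db`, here is a throwaway sketch against a temporary database. The two-column `auth` schema is reduced for illustration; the real table has more columns:

```shell
# Demo of the password UPDATE against a disposable database (not the real webui.db).
DB="$(mktemp)"
sqlite3 "$DB" "CREATE TABLE auth (email TEXT PRIMARY KEY, password TEXT);"
sqlite3 "$DB" "INSERT INTO auth VALUES ('admin@example.com', 'old-bcrypt-hash');"
sqlite3 "$DB" "UPDATE auth SET password='HASH' WHERE email='admin@example.com';"
NEW="$(sqlite3 "$DB" "SELECT password FROM auth WHERE email='admin@example.com';")"
echo "$NEW"   # HASH
rm -f "$DB"
```

Only the row matching the `email` in the `WHERE` clause is changed, which is why getting the admin email right matters.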

---

📖 By following these straightforward steps, you'll regain access to your Open WebUI admin account in no time. If you encounter any issues during the process, please consider searching for your issue on forums or community platforms.
@@ -1,4 +1,4 @@
 ---
 sidebar_position: 1
-title: "Features"
+title: "✨ Features"
 ---
@@ -1,4 +1,4 @@
 ---
 sidebar_position: 3
-title: "Integrations"
+title: "🔗 Integrations"
 ---
@@ -1,4 +1,4 @@
 ---
 sidebar_position: 2
-title: "Tools & Fuctions"
+title: "🛠️ Tools & Functions"
 ---
@@ -1,4 +1,4 @@
 ---
 sidebar_position: 4
-title: "Tips & Tricks"
+title: "💡 Tips & Tricks"
 ---