Merge branch 'open-webui:main' into main

Commit 7adb914ecb by Justin Hayes, 2024-03-03 00:03:43 -05:00 (committed by GitHub)
6 changed files with 161 additions and 62 deletions


@@ -15,31 +15,31 @@ title: "🚀 Getting Started"
:::
<details>
<summary>Before You Begin</summary>
1. **Installing Docker:**
- **For Windows and Mac Users:**
- Download Docker Desktop from [Docker's official website](https://www.docker.com/products/docker-desktop).
- Follow the installation instructions provided on the website. After installation, open Docker Desktop to ensure it's running properly.
- **For Ubuntu and Other Linux Users:**
- Open your terminal.
- Set up your Docker apt repository according to the [Docker documentation](https://docs.docker.com/engine/install/ubuntu/#install-using-the-repository).
- Update your package index:
```bash
sudo apt-get update
```
- Install Docker using the following command:
```bash
sudo apt-get install docker-ce docker-ce-cli containerd.io
```
- Verify the Docker installation with:
```bash
sudo docker run hello-world
```
This command downloads a test image and runs it in a container, which prints an informational message.
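Optionally, to run Docker without `sudo`, you can add your user to the `docker` group. This is the standard post-install step from Docker's own documentation; it takes effect after you log out and back in:
```bash
# allow the current user to talk to the Docker daemon without sudo
sudo usermod -aG docker $USER
```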
2. **Ensure You Have the Latest Version of Ollama:**
@@ -48,41 +48,9 @@ title: "🚀 Getting Started"
3. **Verify Ollama Installation:**
- After installing Ollama, check if it's working by visiting [http://127.0.0.1:11434/](http://127.0.0.1:11434/) in your web browser. Remember, the port number might be different for you.
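You can run the same check from a terminal (adjust the port if yours differs):
```bash
# Ollama answers plain HTTP on its API port
curl http://127.0.0.1:11434/
# Expected response: "Ollama is running"
```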
</details>
## Installing with Docker 🐳
- **To build the container yourself**, follow these steps:
```bash
docker build -t open-webui .
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always open-webui
```
- **To connect a self-built container to Ollama on another server**, change the `OLLAMA_API_BASE_URL` to that server's URL:
```bash
docker build -t open-webui .
docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always open-webui
```
## One-line Command to Install Ollama and Open WebUI Together
#### Using Docker Compose
@@ -130,6 +98,88 @@ title: "🚀 Getting Started"
./run-compose.sh --enable-gpu --build
```
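The helper script wraps Docker Compose, so you can also invoke Compose directly from the repository root. A rough equivalent is sketched below; the GPU override filename is an assumption about the repository layout:
```bash
# CPU-only stack
docker compose up -d --build

# with GPU support, layering a GPU override file (filename assumed)
docker compose -f docker-compose.yaml -f docker-compose.gpu.yaml up -d --build
```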
## Quick Start with Docker 🐳
:::info
When using Docker to install Open WebUI, make sure to include the `-v open-webui:/app/backend/data` volume mapping in your Docker command. This step is crucial, as it ensures your database is properly mounted and prevents any loss of data.
:::
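You can confirm that Docker created and mounted the named volume after the first run:
```bash
# shows the volume's mountpoint on the host
docker volume inspect open-webui
```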
- **If Ollama is on your computer**, use this command:
```bash
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
- **If Ollama is on a different server**, change the `OLLAMA_API_BASE_URL` to that server's URL:
```bash
docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
- After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
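To update a running installation to the latest image later, one option is the third-party Watchtower tool; this is a sketch that assumes the container name `open-webui` used above:
```bash
# pull the newest image and recreate the container once
docker run --rm --volume /var/run/docker.sock:/var/run/docker.sock containrrr/watchtower --run-once open-webui
```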
#### Open WebUI: Server Connection Error
If you're experiencing connection issues, it's often because the WebUI Docker container can't reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) from inside the container. Use the `--network=host` flag in your Docker command to resolve this. Note that the port changes from 3000 to 8080, so the link becomes `http://localhost:8080`.
**Example Docker Command**:
```bash
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
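If the error persists, the container logs usually show whether the WebUI can reach Ollama, and the WebUI itself should now answer on port 8080:
```bash
# follow the logs and look for Ollama connection errors
docker logs -f open-webui

# check that the WebUI responds on the host network
curl -I http://localhost:8080
```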
## Installing with Podman
<details>
<summary>Rootless (Podman) local-only Open WebUI with Systemd service and auto-update</summary>
- **Important:** Consult the Docker documentation, because much of the configuration and syntax is interchangeable with [Podman](https://github.com/containers/podman). See also [rootless_tutorial](https://github.com/containers/podman/blob/main/docs/tutorials/rootless_tutorial.md). This example requires the [slirp4netns](https://github.com/rootless-containers/slirp4netns) network backend so that both the server's listener and the Ollama communication stay on localhost only.
1. Pull the latest image:
```bash
podman pull ghcr.io/open-webui/open-webui:main
```
2. Create a new container using the desired configuration:
**Note:** `-p 127.0.0.1:3000:8080` ensures that we listen only on localhost, and `--network slirp4netns:allow_host_loopback=true` permits the container to access Ollama when it, too, listens strictly on localhost. `--add-host=ollama.local:10.0.2.2 --env 'OLLAMA_API_BASE_URL=http://ollama.local:11434/api'` adds a hosts record to the container and configures Open WebUI to use that friendly hostname; `10.0.2.2` is the default slirp4netns address for the host's loopback. `--env 'ANONYMIZED_TELEMETRY=False'` isn't strictly necessary, since Chroma telemetry is already disabled in the code, but is included as an example.
```bash
podman create -p 127.0.0.1:3000:8080 --network slirp4netns:allow_host_loopback=true --add-host=ollama.local:10.0.2.2 --env 'OLLAMA_API_BASE_URL=http://ollama.local:11434/api' --env 'ANONYMIZED_TELEMETRY=False' -v open-webui:/app/backend/data --label io.containers.autoupdate=registry --name open-webui ghcr.io/open-webui/open-webui:main
```
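Before wiring up the service, you can optionally verify that a container on this network backend can reach Ollama on the host. A throwaway check, assuming the public `curlimages/curl` image:
```bash
# should print "Ollama is running" if Ollama listens on the host's localhost
podman run --rm --network slirp4netns:allow_host_loopback=true --add-host=ollama.local:10.0.2.2 curlimages/curl -s http://ollama.local:11434/
```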
3. Prepare for systemd user service:
```bash
mkdir -p ~/.config/systemd/user/
```
4. Generate user service with Podman:
```bash
podman generate systemd --new open-webui > ~/.config/systemd/user/open-webui.service
```
5. Reload systemd configuration:
```bash
systemctl --user daemon-reload
```
6. Enable and validate new service:
```bash
systemctl --user enable open-webui.service
systemctl --user start open-webui.service
systemctl --user status open-webui.service
```
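If the service fails to start, its output is available through the user journal:
```bash
# jump to the end of the service's recent log entries
journalctl --user -u open-webui.service -e
```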
7. Enable and validate Podman auto-update:
```bash
systemctl --user enable podman-auto-update.timer
systemctl --user enable podman-auto-update.service
systemctl --user status podman-auto-update.timer
```
Dry run with the following command (omit `--dry-run` to force an update):
```bash
podman auto-update --dry-run
```
</details>
### Alternative Installation Methods
For other ways to install, like using Kustomize or Helm, check out [INSTALLATION](/getting-started/installation). Join our [Open WebUI Discord community](https://discord.gg/5rJgQTnV4s) for more help and information.


@@ -57,7 +57,11 @@ hide_title: true
Don't forget to explore our sibling project, [Open WebUI Community](https://openwebui.com/), where you can discover, download, and explore customized Modelfiles. Open WebUI Community offers a wide range of exciting possibilities for enhancing your chat interactions with Ollama! 🚀
### Quick Start with Docker 🐳
:::info
When using Docker to install Open WebUI, make sure to include the `-v open-webui:/app/backend/data` volume mapping in your Docker command. This step is crucial, as it ensures your database is properly mounted and prevents any loss of data.
:::
- **If Ollama is on your computer**, use this command:
@@ -65,9 +69,7 @@ Don't forget to explore our sibling project, [Open WebUI Community](https://open
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
- **If Ollama is on a different server**, change the `OLLAMA_API_BASE_URL` to that server's URL:
@@ -75,6 +77,18 @@ Don't forget to explore our sibling project, [Open WebUI Community](https://open
docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
- After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
#### Open WebUI: Server Connection Error
If you're experiencing connection issues, it's often because the WebUI Docker container can't reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) from inside the container. Use the `--network=host` flag in your Docker command to resolve this. Note that the port changes from 3000 to 8080, so the link becomes `http://localhost:8080`.
**Example Docker Command**:
```bash
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
## Troubleshooting
If you're facing issues like "Open WebUI: Server Connection Error", see [TROUBLESHOOTING](/getting-started/troubleshooting) for guidance, or join our [Open WebUI Discord community](https://discord.gg/5rJgQTnV4s) for help.


@@ -39,7 +39,7 @@ docker run --rm -v ollama-webui:/from -v open-webui:/to alpine ash -c "cd /from
[insert the equivalent command that you used to install with the new Docker image name]
```
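One quick way to verify the migration is to list the contents of the new volume from a throwaway Alpine container:
```bash
# the migrated files should appear under /data
docker run --rm -v open-webui:/data alpine ls -la /data
```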
Once you verify that all the data has been migrated, you can erase the old volume using the following command:
```bash
docker volume rm ollama-webui
```

docs/research.md

@@ -0,0 +1,28 @@
---
sidebar_position: 7
title: "🧑‍🔬 Open WebUI for Research"
---
# 🧑‍🔬 Open WebUI for Research
## Interested in Using Open WebUI for Research?
🔍 **Are you interested in leveraging Open WebUI for your research?** We're excited about the prospect of collaborating with you! Alongside our continuous work on maintaining the Open WebUI repository, we're keen on developing a customized pipeline featuring a tailored UI crafted specifically to fulfill your research needs.
🧪 **Research-Centric Features:**
- Our goal is to provide a comprehensive web UI catering to the demands of conducting user studies, especially in the dynamic fields of AI and HCI.
- Features include surveys, analytics, and participant tracking, all meticulously crafted to streamline your research processes.
📈 **User Study Tools:**
- We empower researchers with precision and accuracy by offering specialized tools such as heat maps and behavior tracking modules.
- These tools are indispensable in capturing and analyzing intricate user behavior patterns.
Moreover, we are committed to supporting survey and analytics features. Our approach involves building custom pipelines, complete with an intuitive UI, tailored to the unique requirements of each project. We understand that one size does not fit all when it comes to research, and thus, our solutions are custom-made and exclusive to each case.
Please note that for custom projects, we adhere to our regular rate as listed on the sponsorship page. Additionally, we kindly request being listed as one of the co-authors. This ensures that our collaboration yields the best possible outcome, leveraging our expertise as core maintainers to deliver high-quality results. We provide continuous support throughout the project lifecycle to ensure smooth integration and satisfaction with the final deliverables.
However, we are open to exploring collaborative opportunities free of charge under certain circumstances, such as projects with significant potential for mutual benefit or those aligned with our research interests. So, don't hesitate to reach out if you're interested in collaborating!
**Contact:** [jaeryang_baek[at]sfu[dot]ca](mailto:jaeryang_baek@sfu.ca)


@@ -13,8 +13,13 @@ Here are some exciting tasks on our roadmap:
- ⚙️ **Custom Python Backend Actions**: Empower your Open WebUI by creating or downloading custom Python backend actions. Unleash the full potential of your web interface with tailored actions that suit your specific needs, enhancing functionality and versatility.
- 🔧 **Fine-tune Model (LoRA)**: Fine-tune your model directly from the user interface. This feature allows for precise customization and optimization of the chat experience to better suit your needs and preferences.
- 🧠 **Long-Term Memory**: Witness the power of persistent memory in our agents. Enjoy conversations that feel continuous as agents remember and reference past interactions, creating a more cohesive and personalized user experience.
- 📚 **Enhanced Documentation**: Elevate your setup and customization experience with improved, comprehensive documentation.
### 🧑‍🔬 Research Tools
- 🧪 **Research-Centric Features**: Empower researchers in the fields of LLM and HCI with a comprehensive web UI for conducting user studies. Stay tuned for ongoing feature enhancements (e.g., surveys, analytics, and participant tracking) to facilitate their research.
- 📈 **User Study Tools**: Providing specialized tools, like heat maps and behavior tracking modules, to empower researchers in capturing and analyzing user behavior patterns with precision and accuracy.
Read more about our [research offerings](/research).
Feel free to contribute and help us make Open WebUI even better! 🙌


@@ -17,3 +17,5 @@ title: "☁️ Deployment"
<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/jlvjipGNwSU?si=RrPk-tMRFU_badO8" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/PhCoRPY7hCE?si=flHuovmiwx7DwKZb" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/zc3ltJeMNpM?si=FJvfCccQYIntnAJR" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>