import { TopBanners } from "@site/src/components/TopBanners";

<TopBanners />

Page Under Construction 🚧✨

Please check back later for the latest updates. Thank you for your patience! 🙏

<p align="center">
  <a href="#"><img src="/img/pipelines/header.png" alt="Pipelines Logo"></a>
</p>

# Pipelines: UI-Agnostic OpenAI API Plugin Framework

Welcome to **Pipelines**, an [Open WebUI](https://github.com/open-webui) initiative. Pipelines bring modular, customizable workflows to any UI client that supports the OpenAI API spec – and much more! Easily extend functionality, integrate unique logic, and create dynamic workflows with just a few lines of code.

## 🚀 Why Choose Pipelines?

- **Limitless Possibilities:** Easily add custom logic and integrate Python libraries, from AI agents to home automation APIs.
- **Seamless Integration:** Compatible with any UI/client that supports the OpenAI API spec. (Only pipe-type pipelines are supported; filter types require clients with Pipelines support.)
- **Custom Hooks:** Build and integrate custom pipelines.

### Examples of What You Can Achieve:

- [**Function Calling Pipeline**](https://github.com/open-webui/pipelines/blob/main/examples/filters/function_calling_filter_pipeline.py): Easily handle function calls and enhance your applications with custom logic.
- [**Custom RAG Pipeline**](https://github.com/open-webui/pipelines/blob/main/examples/pipelines/rag/llamaindex_pipeline.py): Implement sophisticated Retrieval-Augmented Generation pipelines tailored to your needs.
- [**Message Monitoring Using Langfuse**](https://github.com/open-webui/pipelines/blob/main/examples/filters/langfuse_filter_pipeline.py): Monitor and analyze message interactions in real time using Langfuse.
- [**Rate Limit Filter**](https://github.com/open-webui/pipelines/blob/main/examples/filters/rate_limit_filter_pipeline.py): Control the flow of requests to prevent exceeding rate limits.
- [**Real-Time Translation Filter with LibreTranslate**](https://github.com/open-webui/pipelines/blob/main/examples/filters/libretranslate_filter_pipeline.py): Seamlessly integrate real-time translations into your LLM interactions.
- [**Toxic Message Filter**](https://github.com/open-webui/pipelines/blob/main/examples/filters/detoxify_filter_pipeline.py): Implement filters to detect and handle toxic messages effectively.
- **And Much More!** The sky is the limit for what you can accomplish with Pipelines and Python. [Check out our scaffolds](https://github.com/open-webui/pipelines/tree/main/examples/scaffolds) to get a head start on your projects and see how you can streamline your development process!

## 🔧 How It Works

<p align="center">
  <a href="/img/pipelines/workflow.png"><img src="/img/pipelines/workflow.png" alt="Pipelines Workflow"></a>
</p>

Integrating Pipelines with any OpenAI API-compatible UI client is simple. Launch your Pipelines instance and set the OpenAI URL on your client to the Pipelines URL. That's it! You're ready to leverage any Python library for your needs.
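
For example, using the official `openai` Python SDK against a locally running Pipelines instance might look like the sketch below. The port and API key are the defaults from the Docker quick start that follows; `my_pipeline` is a hypothetical id standing in for whichever pipeline your server has loaded.

```python
from openai import OpenAI

# Point the client at Pipelines instead of api.openai.com.
# Port 9099 and key "0p3n-w3bu!" are the quick-start defaults below.
client = OpenAI(base_url="http://localhost:9099", api_key="0p3n-w3bu!")

response = client.chat.completions.create(
    model="my_pipeline",  # hypothetical: use an id your instance exposes
    messages=[{"role": "user", "content": "Hello from Pipelines!"}],
)
print(response.choices[0].message.content)
```
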
## ⚡ Quick Start with Docker

> [!WARNING]
> Pipelines are a plugin system with arbitrary code execution — **don't fetch random pipelines from sources you don't trust**.

For a streamlined setup using Docker:

1. **Run the Pipelines container:**

   ```sh
   docker run -d -p 9099:9099 --add-host=host.docker.internal:host-gateway -v pipelines:/app/pipelines --name pipelines --restart always ghcr.io/open-webui/pipelines:main
   ```

2. **Connect to Open WebUI:**

   - Navigate to the **Settings > Connections > OpenAI API** section in Open WebUI.
   - Set the API URL to `http://localhost:9099` and the API key to `0p3n-w3bu!`. Your pipelines should now be active.

   > [!NOTE]
   > If your Open WebUI is running in a Docker container, replace `localhost` with `host.docker.internal` in the API URL.

3. **Manage Configurations:**

   - In the admin panel, go to **Admin Settings > Pipelines tab**.
   - Select your desired pipeline and modify the valve values directly from the WebUI.

> [!TIP]
> If you are unable to connect, it is most likely a Docker networking issue. We encourage you to troubleshoot on your own and share your methods and solutions in the discussions forum.
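
As a quick sanity check when debugging, you can query the Pipelines API directly from the host. This is a sketch assuming the default port and key from step 2; a healthy server should answer with a JSON list of the loaded pipelines:

```sh
curl -H "Authorization: Bearer 0p3n-w3bu!" http://localhost:9099/v1/models
```
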
If you need to install a custom pipeline with additional dependencies:

- **Run the following command:**

  ```sh
  docker run -d -p 9099:9099 --add-host=host.docker.internal:host-gateway -e PIPELINES_URLS="https://github.com/open-webui/pipelines/blob/main/examples/filters/detoxify_filter_pipeline.py" -v pipelines:/app/pipelines --name pipelines --restart always ghcr.io/open-webui/pipelines:main
  ```

Alternatively, you can install pipelines directly from the admin settings by copying and pasting the pipeline URL, provided the pipeline doesn't have additional dependencies.

That's it! You're now ready to build customizable AI integrations effortlessly with Pipelines. Enjoy!
## 📦 Installation and Setup

Get started with Pipelines in a few easy steps:

1. **Ensure Python 3.11 is installed.**
2. **Clone the Pipelines repository:**

   ```sh
   git clone https://github.com/open-webui/pipelines.git
   cd pipelines
   ```

3. **Install the required dependencies:**

   ```sh
   pip install -r requirements.txt
   ```

4. **Start the Pipelines server:**

   ```sh
   sh ./start.sh
   ```

Once the server is running, set the OpenAI URL on your client to the Pipelines URL. This unlocks the full capabilities of Pipelines, integrating any Python library and creating custom workflows tailored to your needs.
## 📂 Directory Structure and Examples

The `/pipelines` directory is the core of your setup. Add new modules, customize existing ones, and manage your workflows here. All the pipelines in the `/pipelines` directory are **automatically loaded** when the server launches.

You can change this directory from `/pipelines` to another location using the `PIPELINES_DIR` environment variable.
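
For orientation, a bare-bones pipe-type pipeline dropped into this directory might look roughly like the following. This is a minimal sketch modeled on the example scaffolds; check the scaffolds in the repository for the authoritative shape.

```python
from typing import Generator, Iterator, List, Union


class Pipeline:
    def __init__(self):
        # Name shown in the model list of connected clients.
        self.name = "My Example Pipeline"

    async def on_startup(self):
        # Called when the server starts; load models or resources here.
        pass

    async def on_shutdown(self):
        # Called when the server stops; release resources here.
        pass

    def pipe(
        self, user_message: str, model_id: str, messages: List[dict], body: dict
    ) -> Union[str, Generator, Iterator]:
        # Called for every request routed to this pipeline.
        return f"Echo: {user_message}"
```
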
### Integration Examples

Find various integration examples in the [`examples` directory](https://github.com/open-webui/pipelines/tree/main/examples) of the Pipelines repository. These examples show how to integrate different functionalities, providing a foundation for building your own custom pipelines.
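
If you cloned the repository as described in the setup steps above, trying an example can be as simple as copying it into the loaded directory and restarting the server (assuming any extra dependencies the example needs are installed):

```sh
cp examples/filters/detoxify_filter_pipeline.py pipelines/
sh ./start.sh
```
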
## 🎉 Work in Progress

We're continuously evolving! We'd love to hear your feedback and understand which hooks and features would best suit your use case. Feel free to reach out and become a part of our Open WebUI community!

Our vision is to push **Pipelines** to become the ultimate plugin framework for our AI interface, **Open WebUI**. Imagine **Open WebUI** as the WordPress of AI interfaces, with **Pipelines** being its diverse range of plugins. Join us on this exciting journey! 🌍

## Understanding the Open WebUI Architecture

The Open WebUI system is designed to streamline interactions between the client (your browser) and the Ollama API. At the heart of this design is a backend reverse proxy, enhancing security and resolving CORS issues.

- **How it Works**: Open WebUI interacts with the Ollama API through a specific route. When a request is made from the WebUI to Ollama, it is not sent directly to the Ollama API. Instead, the request first goes to the Open WebUI backend via the `/ollama` route. From there, the backend forwards the request to the Ollama API, using the base URL specified in the `OLLAMA_BASE_URL` environment variable. A request made to `/ollama` in the WebUI is therefore effectively the same as a request to `OLLAMA_BASE_URL` in the backend. For instance, a request to `/ollama/api/tags` in the WebUI is equivalent to `OLLAMA_BASE_URL/api/tags` in the backend. (A concrete request is sketched after this list.)
- **Security Benefits**: This design prevents direct exposure of the Ollama API to the frontend, safeguarding against potential CORS (Cross-Origin Resource Sharing) issues and unauthorized access. Requiring authentication to access the Ollama API further enhances this security layer.
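
A concrete version of that request, as a sketch: `localhost:3000` assumes a default Open WebUI install, and the API key is the one issued under your account settings.

```sh
# Open WebUI forwards this to OLLAMA_BASE_URL/api/tags on the backend.
curl -H "Authorization: Bearer YOUR_API_KEY" http://localhost:3000/ollama/api/tags
```
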
## Open WebUI: Server Connection Error

If you're experiencing connection issues, it's often because the WebUI docker container cannot reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) from inside the container. Use the `--network=host` flag in your docker command to resolve this. Note that the port changes from 3000 to 8080, resulting in the link `http://localhost:8080`.
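
A typical run command with host networking might look like the sketch below; the volume name and image tag are common defaults, so adjust them to your setup.

```sh
docker run -d --network=host \
  -v open-webui:/app/backend/data \
  -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```
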
For local installations of Open WebUI, navigate to the `open-webui` directory and run:

```bash
sqlite3 backend/data/webui.db "UPDATE auth SET password='HASH' WHERE email='admin@example.com';"
```
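
The `HASH` placeholder must be a bcrypt hash of the new password. One way to generate it, assuming `htpasswd` (from `apache2-utils`) is available:

```bash
htpasswd -bnBC 10 "" your-new-password | tr -d ':\n'
```
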