Timothy J. Baek 2024-11-05 21:36:45 -08:00
parent 40dca9eceb
commit a9bad6bfdc
4 changed files with 4 additions and 4 deletions

View File

@@ -106,7 +106,7 @@ After completing these steps, your ComfyUI setup should be integrated with Open
### Configuring with SwarmUI
-SwarmUI utilizes ComfyUI as its backend. To get Open WebUI to work with SwarmUI, you will have to append `ComfyBackendDirect` to the `ComfyUI Base URL`. Additionally, you will want to set up SwarmUI with LAN access. After the aforementioned adjustments, setting up SwarmUI to work with Open WebUI will be the same as [Step one: Configure Open WebUI Settings](https://github.com/open-webui/docs/edit/main/docs/tutorials/features/images.md#step-1-configure-open-webui-settings), as outlined above.
+SwarmUI utilizes ComfyUI as its backend. To get Open WebUI to work with SwarmUI, you will have to append `ComfyBackendDirect` to the `ComfyUI Base URL`. Additionally, you will want to set up SwarmUI with LAN access. After the aforementioned adjustments, setting up SwarmUI to work with Open WebUI will be the same as [Step one: Configure Open WebUI Settings](https://github.com/open-webui/docs/edit/main/docs/features/images.md#step-1-configure-open-webui-settings), as outlined above.
![Install SwarmUI with LAN Access](https://github.com/user-attachments/assets/a6567e13-1ced-4743-8d8e-be526207f9f6)
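With `ComfyBackendDirect` appended, the `ComfyUI Base URL` points at the SwarmUI host rather than a standalone ComfyUI instance. A hypothetical example value is shown below; the host is a placeholder, and the port assumes SwarmUI's commonly used default of 7801, which may differ in your setup:

```
http://<swarmui-host>:7801/ComfyBackendDirect
```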
#### SwarmUI API URL

View File

@@ -10,7 +10,7 @@ Enhance your understanding of OpenWebUI with key concepts and components to impr
---
## Explore the Workspace
-Begin by exploring the [Workspace](../../tutorials/features/workspace) to discover essential concepts such as Modelfiles, Knowledge, Prompts, Tools, and Functions.
+Begin by exploring the [Workspace](../../features/workspace) to discover essential concepts such as Modelfiles, Knowledge, Prompts, Tools, and Functions.
---

View File

@@ -12,7 +12,7 @@ import { SponsorList } from "@site/src/components/SponsorList";
<TopBanners />
-**Open WebUI is an [extensible](https://docs.openwebui.com/tutorials/plugin/), feature-rich, and user-friendly self-hosted AI interface designed to operate entirely offline.** It supports various LLM runners, including Ollama and OpenAI-compatible APIs.
+**Open WebUI is an [extensible](https://docs.openwebui.com/features/plugin/), feature-rich, and user-friendly self-hosted AI interface designed to operate entirely offline.** It supports various LLM runners, including Ollama and OpenAI-compatible APIs.
![GitHub stars](https://img.shields.io/github/stars/open-webui/open-webui?style=social)
![GitHub forks](https://img.shields.io/github/forks/open-webui/open-webui?style=social)

View File

@@ -12,7 +12,7 @@ title: "⚡ Pipelines"
# Pipelines: UI-Agnostic OpenAI API Plugin Framework
:::tip
-If your goal is simply to add support for additional providers like Anthropic or basic filters, you likely don't need Pipelines. For those cases, [Open WebUI Functions](/tutorials/plugin/functions) are a better fit—they're built-in, much more convenient, and easier to configure. Pipelines, however, comes into play when you're dealing with computationally heavy tasks (e.g., running large models or complex logic) that you want to offload from your main Open WebUI instance for better performance and scalability.
+If your goal is simply to add support for additional providers like Anthropic or basic filters, you likely don't need Pipelines. For those cases, [Open WebUI Functions](/features/plugin/functions) are a better fit—they're built-in, much more convenient, and easier to configure. Pipelines, however, comes into play when you're dealing with computationally heavy tasks (e.g., running large models or complex logic) that you want to offload from your main Open WebUI instance for better performance and scalability.
:::
Welcome to **Pipelines**, an [Open WebUI](https://github.com/open-webui) initiative. Pipelines bring modular, customizable workflows to any UI client supporting OpenAI API specs and much more! Easily extend functionalities, integrate unique logic, and create dynamic workflows with just a few lines of code.
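To make the idea concrete, here is a minimal sketch of what a pipeline module typically looks like, modeled on the scaffold used in the Pipelines example repository; the class layout and the `pipe` signature are assumptions based on that scaffold and may differ between versions:

```python
"""Minimal example pipeline (a sketch, not the official scaffold).

Dropped into the Pipelines server's pipelines directory, a module like this
is exposed to Open WebUI as a selectable model.
"""
from typing import Generator, Iterator, List, Union


class Pipeline:
    def __init__(self):
        # Name shown in the Open WebUI model selector.
        self.name = "Echo Pipeline"

    async def on_startup(self):
        # Runs when the Pipelines server starts; load heavy resources here.
        pass

    async def on_shutdown(self):
        # Runs when the Pipelines server stops; release resources here.
        pass

    def pipe(
        self, user_message: str, model_id: str, messages: List[dict], body: dict
    ) -> Union[str, Generator, Iterator]:
        # The offloaded logic lives here, outside the main Open WebUI instance.
        # Returning a string sends a single response; returning a generator
        # streams chunks back to the client.
        return f"Echo from pipeline: {user_message}"
```

Once the Pipelines server is added to Open WebUI as an OpenAI-compatible connection, a module like this would typically show up as a model named "Echo Pipeline".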