From 03d7ca774eb1b1432f1ece518f03eab0872540a2 Mon Sep 17 00:00:00 2001
From: ayana
Date: Tue, 10 Jun 2025 10:58:41 -0700
Subject: [PATCH] Update Pipes docs

---
 docs/pipelines/pipes.md | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/docs/pipelines/pipes.md b/docs/pipelines/pipes.md
index aa31e7c..a459048 100644
--- a/docs/pipelines/pipes.md
+++ b/docs/pipelines/pipes.md
@@ -4,7 +4,8 @@ title: "🔧 Pipes"
 ---
 # Pipes
-Pipes are functions that can be used to perform actions prior to returning LLM messages to the user. Examples of potential actions you can take with Pipes are Retrieval Augmented Generation (RAG), sending requests to non-OpenAI LLM providers (such as Anthropic, Azure OpenAI, or Google), or executing functions right in your web UI. Pipes can be hosted as a Function or on a Pipelines server. A list of examples is maintained in the [Pipelines repo](https://github.com/open-webui/pipelines/tree/main/examples/pipelines). The general workflow can be seen in the image below.
+
+Pipes are standalone functions that process inputs and generate responses, possibly by invoking one or more LLMs or external services before returning results to the user. Examples of potential actions you can take with Pipes are Retrieval Augmented Generation (RAG), sending requests to non-OpenAI LLM providers (such as Anthropic, Azure OpenAI, or Google), or executing functions right in your web UI. Pipes can be hosted as a Function or on a Pipelines server. A list of examples is maintained in the [Pipelines repo](https://github.com/open-webui/pipelines/tree/main/examples/pipelines). The general workflow can be seen in the image below.
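
The docs paragraph this patch rewrites describes Pipes abstractly. For a concrete sense of the scaffold it refers to, a minimal Pipelines-style pipe might look roughly like the following sketch. The class shape (`Pipeline` with a `pipe` method plus `on_startup`/`on_shutdown` hooks) follows the patterns in the Pipelines examples repo linked above; the echo body is a placeholder, not a real provider integration:

```python
from typing import Generator, Iterator, List, Union


class Pipeline:
    """Minimal sketch of a Pipelines-style pipe. Assumes the class/method
    shape used by the examples in the open-webui/pipelines repo; the body
    is a stand-in for real logic such as RAG or a provider call."""

    def __init__(self):
        # Name surfaced in the web UI's model selector.
        self.name = "Example Pipe"

    async def on_startup(self):
        # Runs when the Pipelines server starts (e.g. open API clients here).
        pass

    async def on_shutdown(self):
        # Runs when the server stops (e.g. close clients here).
        pass

    def pipe(
        self, user_message: str, model_id: str, messages: List[dict], body: dict
    ) -> Union[str, Generator, Iterator]:
        # This is where RAG, a request to a non-OpenAI provider, or any
        # custom action would run before results go back to the user.
        # Placeholder: echo the incoming message.
        return f"received: {user_message}"
```

When hosted on a Pipelines server, a class like this surfaces as a selectable model in the web UI, and every chat message sent to it flows through `pipe`.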