From 9bad6ca5e55ce974eaeef8cfe88be7271064bcb6 Mon Sep 17 00:00:00 2001
From: Timothy Jaeryang Baek
Date: Tue, 25 Feb 2025 01:14:17 -0800
Subject: [PATCH] Update filter.mdx

---
 docs/features/plugin/functions/filter.mdx | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/docs/features/plugin/functions/filter.mdx b/docs/features/plugin/functions/filter.mdx
index 43dfaf0..15e48a3 100644
--- a/docs/features/plugin/functions/filter.mdx
+++ b/docs/features/plugin/functions/filter.mdx
@@ -155,19 +155,19 @@ Note: The user feels the same, but the model processes a cleaner and easier-to-u
 ---
 
-## 🆕 3️⃣ **`stream` Hook (New in Open WebUI 0.5.17)**
+#### 🆕 3️⃣ **`stream` Hook (New in Open WebUI 0.5.17)**
 
-### 🔄 What is the `stream` Hook?
+##### 🔄 What is the `stream` Hook?
 
 The **`stream` function** is a new feature introduced in Open WebUI **0.5.17** that allows you to **intercept and modify streamed model responses** in real time.
 Unlike `outlet`, which processes an entire completed response, `stream` operates on **individual chunks** as they are received from the model.
 
-#### 🛠️ When to Use the Stream Hook?
+##### 🛠️ When to Use the Stream Hook?
 
 - Modify **streaming responses** before they are displayed to users.
 - Implement **real-time censorship or cleanup**.
 - **Monitor streamed data** for logging/debugging.
 
-### 📜 Example: Logging Streaming Chunks
+##### 📜 Example: Logging Streaming Chunks
 
 Here’s how you can inspect and modify streamed LLM responses:
 ```python
@@ -186,7 +186,7 @@ def stream(self, event: dict) -> dict:
 - Each line represents a **small fragment** of the model's streamed response.
 - The **`delta.content` field** contains the progressively generated text.
 
-### 🔄 Example: Filtering Out Emojis from Streamed Data
+##### 🔄 Example: Filtering Out Emojis from Streamed Data
 ```python
 def stream(self, event: dict) -> dict:
     for choice in event.get("choices", []):
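
The patch's second hunk ends right where the emoji-filtering `stream` example begins. For context, a runnable sketch of such a hook might look like the following; the `Filter` class name and the `EMOJI_PATTERN` ranges are illustrative (the ranges are a simplification and do not cover every emoji sequence), while the `choices`/`delta`/`content` chunk layout follows the OpenAI-style streaming format shown in the patched docs:

```python
import re

# Illustrative (incomplete) emoji ranges: common pictographs plus
# miscellaneous symbols and dingbats. Variation selectors and ZWJ
# sequences are not handled by this simplified pattern.
EMOJI_PATTERN = re.compile("[\U0001F300-\U0001FAFF\U00002600-\U000027BF]+")


class Filter:
    def stream(self, event: dict) -> dict:
        # Each event is one streamed chunk; strip emojis from the
        # partial text in delta.content before it reaches the user.
        for choice in event.get("choices", []):
            delta = choice.get("delta", {})
            content = delta.get("content")
            if content:
                delta["content"] = EMOJI_PATTERN.sub("", content)
        return event
```

Because `stream` runs once per chunk, per-chunk transformations like this are cheap, but anything that needs to see text spanning chunk boundaries would have to buffer state on the filter instance.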