commit 64311f61de
parent a4e1fa502e
Author: Timothy J. Baek
Date:   2024-09-25 02:04:05 +02:00

27 changed files with 18 additions and 17222 deletions

@@ -1,58 +0,0 @@
---
sidebar_position: 6
title: "Monitoring with Langfuse"
---
:::warning
Bundled LiteLLM support has been deprecated as of version 0.2.0.
:::
# Monitoring with Langfuse
Integrating [Langfuse](https://cloud.langfuse.com) with LiteLLM allows for detailed observation and recording of API calls.
This guide walks you through setting up Langfuse callbacks with LiteLLM.
Langfuse can also be self-hosted via its open-source distribution. For convenience, this tutorial uses the free tier of the Langfuse Cloud service; if data privacy is a concern, it is recommended to run the self-hosted version instead.
## Getting Started with Langfuse
Begin by setting up your Langfuse account and acquiring necessary keys:
1. Create an account at [Langfuse](https://cloud.langfuse.com/auth/sign-up).
2. Generate and copy your public and secret keys.
## Configuring OpenWebUI LiteLLM Proxy for Langfuse
To integrate Langfuse with LiteLLM, you'll need to modify the LiteLLM `config.yaml` file and set environment variables for your Docker container.
### Editing the LiteLLM Configuration File
Edit the LiteLLM `config.yaml` file, located in your host Docker mount point at `/data/litellm/config.yaml`.
Add the following to the configuration, as shown in the [LiteLLM official documentation](https://litellm.vercel.app/docs/observability/langfuse_integration):
```yaml
general_settings: {}
litellm_settings:
success_callback: ["langfuse"]
failure_callback: ["langfuse"]
```
### Setting Environment Variables in Docker
When launching the Docker container, pass the Langfuse API keys as environment variables:
- `LANGFUSE_PUBLIC_KEY`: replace `xxxxx` with your actual public key.
- `LANGFUSE_SECRET_KEY`: replace `xxxxx` with your actual secret key.

These variables can be set directly in the `docker run` command or through a Docker Compose file.
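As an illustration, a Docker Compose service definition passing these keys might look like the following sketch; the placeholder key values are not real and must be replaced with your own:

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    volumes:
      - open-webui:/app/backend/data
    environment:
      # Placeholder values — substitute your actual Langfuse keys
      LANGFUSE_PUBLIC_KEY: "pk-lf-xxxxx"
      LANGFUSE_SECRET_KEY: "sk-lf-xxxxx"

volumes:
  open-webui:
```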
## Testing the Integration
Once setup is complete, your Langfuse dashboard should start recording every API call made through the LiteLLM integration. This allows for efficient monitoring and troubleshooting of API interactions.
![Langfuse Dashboard](/img/tutorial_langfuse.png)
:::note
Ensure that all configurations are correctly set, and environment variables are properly passed to avoid integration issues.
:::

@@ -1,45 +0,0 @@
---
sidebar_position: 4
title: "LiteLLM Configuration"
---
:::warning
Bundled LiteLLM support has been deprecated as of version 0.2.0.
:::
# LiteLLM Configuration
[LiteLLM](https://litellm.vercel.app/docs/proxy/configs#quick-start) supports a variety of APIs, both OpenAI-compatible and others. To integrate a new API model, follow these instructions:
## Initial Setup
To allow editing of your `config.yaml` file, use `-v /path/to/litellm/config.yaml:/app/backend/data/litellm/config.yaml` to bind-mount it with your `docker run` command:
```bash
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data -v /path/to/litellm/config.yaml:/app/backend/data/litellm/config.yaml --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
_Note: `config.yaml` does not need to exist on the host before running for the first time._
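If you prefer Docker Compose over `docker run`, the same bind mount can be sketched as follows; the host path is illustrative and should be adjusted to your setup:

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - open-webui:/app/backend/data
      # Bind-mount the LiteLLM config; adjust the host path to your environment
      - /path/to/litellm/config.yaml:/app/backend/data/litellm/config.yaml
    restart: always

volumes:
  open-webui:
```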
## Configuring Open WebUI
1. Go to **Settings > Models > Manage LiteLLM Models**.
2. In 'Simple' mode, you will only see the option to enter a **Model**.
3. For additional configuration options, click on the 'Simple' toggle to switch to 'Advanced' mode. Here you can enter:
- **Model Name**: The name of the model as you want it to appear in the models list.
- **API Base URL**: The base URL for your API provider. This field can usually be left blank unless your provider specifies a custom endpoint URL.
- **API Key**: Your unique API key. Replace with the key provided by your API provider.
- **API RPM**: The allowed requests per minute for your API. Replace with the appropriate value for your API plan.
4. After entering all the required information, click the '+' button to add the new model to LiteLLM.
## Examples
_Ollama API (from inside Docker):_
![LiteLLM Config Ollama](/img/tutorial_litellm_ollama.png)
_Gemini API (MakerSuite/AI Studio):_
![LiteLLM Config Gemini](/img/tutorial_litellm_gemini.png)
Advanced configuration options not covered in the settings interface can be edited in the `config.yaml` file manually. For more information on the specific providers and advanced settings, consult the [LiteLLM Providers Documentation](https://litellm.vercel.app/docs/providers).
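As an illustration, a manually edited entry following the LiteLLM `model_list` format might look like this; the model name and API key below are placeholders, not values from this tutorial:

```yaml
model_list:
  - model_name: gemini-pro         # name as it should appear in the models list
    litellm_params:
      model: gemini/gemini-pro     # provider/model identifier from the LiteLLM providers docs
      api_key: "your-api-key"      # placeholder — use your provider's key
      rpm: 60                      # optional requests-per-minute limit
```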

@@ -1,7 +1,7 @@
 {
-  "label": "📝 Tutorial",
+  "label": "📝 Tutorials",
   "position": 300,
   "link": {
     "type": "generated-index"
   }
 }

@@ -0,0 +1,4 @@
---
sidebar_position: 1
title: "Features"
---

@@ -191,12 +191,12 @@ This is enabled on a per session basis eg. reloading the page, changing to anoth
 6. [Optional] Enter the `SearchApi engine` name you want to query. Example, `google`, `bing`, `baidu`, `google_news`, `bing_news`, `google_videos`, `google_scholar` and `google_patents.` By default, it is set to `google`.
 7. Click `Save`.
-![Open WebUI Admin panel](../../static/img/tutorial_searchapi_search.png)
+![Open WebUI Admin panel](/img/tutorial_searchapi_search.png)
 #### Note
 You have to enable `Web search` in the prompt field, using plus (`+`) button to search the web using [SearchApi](https://www.searchapi.io/) engines.
-![enable Web search](../../static/img/enable_web_search.png)
+![enable Web search](/img/enable_web_search.png)
 ## Google PSE API
@@ -211,14 +211,14 @@ You have to enable `Web search` in the prompt field, using plus (`+`) button to
 7. Fill `Google PSE API Key` with the `API key` and `Google PSE Engine Id` (# 4)
 8. Click `Save`
-![Open WebUI Admin panel](../../static/img/tutorial_google_pse1.png)
+![Open WebUI Admin panel](/img/tutorial_google_pse1.png)
 #### Note
 You have to enable `Web search` in the prompt field, using plus (`+`) button.
 Search the web ;-)
-![enable Web search](../../static/img/tutorial_google_pse2.png)
+![enable Web search](/img/tutorial_google_pse2.png)
 ## Brave API

@@ -0,0 +1,4 @@
---
sidebar_position: 2
title: "Integrations"
---

@@ -0,0 +1,4 @@
---
sidebar_position: 3
title: "Tips & Tricks"
---

package-lock.json (generated) — 17113 changes
File diff suppressed because it is too large