Merge pull request #495 from Classic298/main

chore: rename openwebui -> Open WebUI and small changes to OpenAI Quick Start Guide
Tim Jaeryang Baek 2025-04-12 11:38:49 -07:00 committed by GitHub
commit 9215594498
9 changed files with 34 additions and 34 deletions


@@ -312,9 +312,9 @@ async def test_function(
 In the Tools definition metadata you can specify custom packages. When you click `Save` the line will be parsed and `pip install` will be run on all requirements at once.
-Keep in mind that as pip is used in the same process as Open-WebUI, the UI will be completely unresponsive during the installation.
+Keep in mind that as pip is used in the same process as Open WebUI, the UI will be completely unresponsive during the installation.
-No measures are taken to handle package conflicts with Open-WebUI's requirements. That means that specifying requirements can break OpenWebUI if you're not careful. You might be able to work around this by specifying `open-webui` itself as a requirement.
+No measures are taken to handle package conflicts with Open WebUI's requirements. That means that specifying requirements can break Open WebUI if you're not careful. You might be able to work around this by specifying `open-webui` itself as a requirement.
 <details>
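For context on the requirements line touched in this hunk: it lives in the docstring-style metadata at the top of a Tool file. The sketch below is illustrative only; the field names (`title`, `requirements`) and the `Tools` class layout are assumed from typical Open WebUI tool examples rather than taken from this diff.

```python
"""
title: Web Scraper Tool
requirements: requests, beautifulsoup4
"""
# Everything listed after `requirements:` is parsed on Save and handed to
# `pip install`, so the UI blocks until installation finishes.

import requests  # available once the requirements above are installed
from bs4 import BeautifulSoup


class Tools:
    def fetch_title(self, url: str) -> str:
        """Return the <title> text of a web page (illustrative example)."""
        html = requests.get(url, timeout=10).text
        title = BeautifulSoup(html, "html.parser").title
        return title.string if title and title.string else ""
```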


@@ -13,20 +13,20 @@ Open WebUI makes it easy to connect and use OpenAI and other OpenAI-compatible A
 ## Step 1: Get Your OpenAI API Key
-To use OpenAI models (such as GPT-4 or GPT-3.5), you need an API key from a supported provider.
+To use OpenAI models (such as GPT-4 or o3-mini), you need an API key from a supported provider.
 You can use:
 - OpenAI directly (https://platform.openai.com/account/api-keys)
 - Azure OpenAI
-- An OpenAI-compatible service (e.g., LocalAI, FastChat, Helicone, etc.)
+- Any OpenAI-compatible service (e.g., LocalAI, FastChat, Helicone, LiteLLM, OpenRouter etc.)
 👉 Once you have the key, copy it and keep it handy.
 For most OpenAI usage, the default API base URL is:
 https://api.openai.com/v1
-Other providers may use different URLs — check your providers documentation.
+Other providers use different URLs — check your providers documentation.
 ---
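As a side note to the step above, you can sanity-check a key and base URL before entering them in Open WebUI. The snippet below is a minimal sketch and assumes your provider exposes the standard OpenAI-style `GET /models` endpoint; Azure and some other providers use a different URL scheme.

```python
# Quick sanity check of an API key and base URL before pasting them into
# Open WebUI. Adjust BASE_URL if you are not using OpenAI directly.
import os
import requests

BASE_URL = "https://api.openai.com/v1"   # or your provider's URL
API_KEY = os.environ["OPENAI_API_KEY"]   # keep keys out of source code

resp = requests.get(
    f"{BASE_URL}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=15,
)
resp.raise_for_status()
print([m["id"] for m in resp.json()["data"]][:10])  # first few model IDs
```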
@@ -38,7 +38,7 @@ Once Open WebUI is running:
 2. Navigate to **Connections > OpenAI > Manage** (look for the wrench icon).
 3. Click **Add New Connection**.
 4. Fill in the following:
-- API URL: https://api.openai.com/v1
+- API URL: https://api.openai.com/v1 (or the URL of your specific provider)
 - API Key: Paste your key here
 5. Click Save ✅.
@@ -61,7 +61,7 @@ Heres what model selection looks like:
 ![OpenAI Model Selector](/images/getting-started/quick-start/selector-openai.png)
-Simply choose GPT-4, GPT-3.5, or any compatible model offered by your provider.
+Simply choose GPT-4, o3-mini, or any compatible model offered by your provider.
 ---
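For readers who want to see what the selected model is used for under the hood, the sketch below shows a standard OpenAI-style chat completion against the same base URL and key configured earlier. The model name is only an example; use whatever your provider actually offers.

```python
# Minimal sketch of an OpenAI-style chat completion, roughly the kind of
# request Open WebUI issues on your behalf once a model is selected.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],
    base_url="https://api.openai.com/v1",  # same URL as the connection above
)

reply = client.chat.completions.create(
    model="gpt-4",  # or o3-mini, or any model exposed by your provider
    messages=[{"role": "user", "content": "Say hello from Open WebUI!"}],
)
print(reply.choices[0].message.content)
```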


@@ -18,7 +18,7 @@ title: "💥 Monitoring and Debugging with Langfuse"
 ![Langfuse Integration](https://langfuse.com/images/docs/openwebui-integration.gif)
 _Langfuse integration steps_
-[Pipelines](https://github.com/open-webui/pipelines/) in OpenWebUi is an UI-agnostic framework for OpenAI API plugins. It enables the injection of plugins that intercept, process, and forward user prompts to the final LLM, allowing for enhanced control and customization of prompt handling.
+[Pipelines](https://github.com/open-webui/pipelines/) in Open WebUI is an UI-agnostic framework for OpenAI API plugins. It enables the injection of plugins that intercept, process, and forward user prompts to the final LLM, allowing for enhanced control and customization of prompt handling.
 To trace your application data with Langfuse, you can use the [Langfuse pipeline](https://github.com/open-webui/pipelines/blob/d4fca4c37c4b8603be7797245e749e9086f35130/examples/filters/langfuse_filter_pipeline.py), which enables real-time monitoring and analysis of message interactions.
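To make the "filter" idea concrete, here is a rough structural sketch of a filter pipeline. The `type`, `inlet`, and `outlet` names follow the public examples in the pipelines repository; the actual Langfuse filter linked above adds its tracing calls inside these hooks and may differ in detail.

```python
# Rough sketch of a filter pipeline: it sits between the UI and the model,
# seeing requests on the way in and responses on the way out.
from typing import Optional


class Pipeline:
    def __init__(self):
        self.type = "filter"          # marks this pipeline as a filter
        self.name = "Example Filter"

    async def inlet(self, body: dict, user: Optional[dict] = None) -> dict:
        # Called before the prompt reaches the LLM (e.g. start a trace here).
        print("inlet:", body.get("model"))
        return body

    async def outlet(self, body: dict, user: Optional[dict] = None) -> dict:
        # Called after the LLM responds (e.g. record the completion here).
        print("outlet:", len(body.get("messages", [])))
        return body
```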


@@ -1,8 +1,8 @@
 ### Using a Self-Signed Certificate and Nginx on Windows without Docker
-For basic internal/development installations, you can use nginx and a self-signed certificate to proxy openwebui to https, allowing use of features such as microphone input over LAN. (By default, most browsers will not allow microphone input on insecure non-localhost urls)
+For basic internal/development installations, you can use nginx and a self-signed certificate to proxy Open WebUI to https, allowing use of features such as microphone input over LAN. (By default, most browsers will not allow microphone input on insecure non-localhost urls)
-This guide assumes you installed openwebui using pip and are running `open-webui serve`
+This guide assumes you installed Open WebUI using pip and are running `open-webui serve`
 #### Step 1: Installing openssl for certificate generation
@@ -46,7 +46,7 @@ Move the generated nginx.key and nginx.crt files to a folder of your choice, or
 Open C:\nginx\conf\nginx.conf in a text editor
-If you want openwebui to be accessible over your local LAN, be sure to note your LAN ip address using `ipconfig` e.g. 192.168.1.15
+If you want Open WebUI to be accessible over your local LAN, be sure to note your LAN ip address using `ipconfig` e.g. 192.168.1.15
 Set it up as follows:
@@ -145,4 +145,4 @@ Run nginx by running `nginx`. If an nginx service is already started, you can re
 ---
-You should now be able to access openwebui on https://192.168.1.15 (or your own LAN ip as appropriate). Be sure to allow windows firewall access as needed.
+You should now be able to access Open WebUI on https://192.168.1.15 (or your own LAN ip as appropriate). Be sure to allow windows firewall access as needed.
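A quick way to confirm the proxy works end to end is to request the HTTPS endpoint while trusting your own certificate. The snippet below is a minimal check; the certificate path and IP address are the example values from this guide and should be replaced with your own (if hostname verification complains about the self-signed certificate, `verify=False` is acceptable for a throwaway local test).

```python
# Confirm the nginx HTTPS proxy in front of Open WebUI is reachable.
import requests

resp = requests.get(
    "https://192.168.1.15",
    verify=r"C:\nginx\conf\nginx.crt",  # path to your self-signed certificate
    timeout=10,
)
print(resp.status_code)  # 200 means the proxy is serving Open WebUI
```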