From 0b0c908831ef51d409f466dd3327cf2010504a01 Mon Sep 17 00:00:00 2001
From: Yuwen Hu
Date: Fri, 31 May 2024 16:34:13 +0800
Subject: [PATCH] Update for configuration

---
 docs/tutorial/ipex_llm.md | 14 ++++++++++++++
 1 file changed, 14 insertions(+)

diff --git a/docs/tutorial/ipex_llm.md b/docs/tutorial/ipex_llm.md
index d3ee9a9..2b2e9fc 100644
--- a/docs/tutorial/ipex_llm.md
+++ b/docs/tutorial/ipex_llm.md
@@ -22,3 +22,17 @@ Refer to [this guide](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quicksta
 :::tip
 If you would like to reach the Ollama serve from another machine, make sure you set or export the environment variable `OLLAMA_HOST=0.0.0.0` before executing the command `ollama serve`.
 :::
+
+## Configure Open WebUI
+
+Open the Ollama settings through **Settings -> Connections** in the menu. By default, the **Ollama Base URL** is preset to http://localhost:11434, as illustrated in the screenshot below. To verify the status of the Ollama service connection, click the **Refresh** button next to the text box. If the WebUI is unable to establish a connection with the Ollama server, you will see an error message stating `WebUI could not connect to Ollama`.
+
+![Open WebUI Ollama Setting Failure](https://llm-assets.readthedocs.io/en/latest/_images/open_webui_settings_0.png)
+
+If the connection is successful, you will see the message `Service Connection Verified`, as illustrated below.
+
+![Open WebUI Ollama Setting Success](https://llm-assets.readthedocs.io/en/latest/_images/open_webui_settings.png)
+
+:::tip
+If you want to use an Ollama server hosted at a different URL, simply update the **Ollama Base URL** to the new URL and press the **Refresh** button to reconfirm the connection to Ollama.
+:::
\ No newline at end of file
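
The connection check described in the patched section (the **Refresh** button hitting the **Ollama Base URL**) can also be reproduced from a script, which is handy when debugging why the WebUI reports `WebUI could not connect to Ollama`. Below is a minimal sketch using only the Python standard library; `check_ollama` is a hypothetical helper written for illustration, not part of Open WebUI or Ollama, and it assumes the default base URL from the tutorial.

```python
import urllib.request
import urllib.error

# Default Ollama base URL from the tutorial; adjust if your server is hosted elsewhere.
OLLAMA_BASE_URL = "http://localhost:11434"

def check_ollama(base_url: str = OLLAMA_BASE_URL, timeout: float = 5.0) -> bool:
    """Return True if an HTTP GET to base_url succeeds, i.e. the server is reachable.

    This mirrors what the WebUI's Refresh button verifies: that something is
    answering at the configured base URL.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, timeout, etc. -> not reachable.
        return False

if __name__ == "__main__":
    if check_ollama():
        print("Service Connection Verified")
    else:
        print("Could not connect to Ollama at", OLLAMA_BASE_URL)
```

If this script fails while a local `ollama serve` is running, check that `OLLAMA_HOST=0.0.0.0` is set when connecting from another machine, as noted in the tip above.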