From b4191d93e3b8fd1b979fabc11f20a220b30cde2d Mon Sep 17 00:00:00 2001
From: Yuwen Hu
Date: Mon, 3 Jun 2024 18:04:43 +0800
Subject: [PATCH] Small typo fixes

---
 docs/tutorial/ipex_llm.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/tutorial/ipex_llm.md b/docs/tutorial/ipex_llm.md
index 7138293..ea1196b 100644
--- a/docs/tutorial/ipex_llm.md
+++ b/docs/tutorial/ipex_llm.md
@@ -20,7 +20,7 @@ This tutorial demonstrates how to setup Open WebUI with **IPEX-LLM accelerated O
 Refer to [this guide](https://ipex-llm.readthedocs.io/en/latest/doc/LLM/Quickstart/ollama_quickstart.html) from IPEX-LLM official documentation about how to install and run Ollama serve accelerated by IPEX-LLM on Intel GPU.
 
 :::tip
-If you would like to reach the Ollama serve from another machine, make sure you set or export the environment variable `OLLAMA_HOST=0.0.0.0` before executing the command `ollama serve`.
+If you would like to reach the Ollama service from another machine, make sure you set or export the environment variable `OLLAMA_HOST=0.0.0.0` before executing the command `ollama serve`.
 :::
 
 ## Configure Open WebUI
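
The tip edited by this patch can be sketched as the following shell session. `OLLAMA_HOST` and `ollama serve` come from the patched text itself; binding to `0.0.0.0` (all interfaces) rather than the default localhost is what lets other machines reach the service. This is an illustrative sketch, not part of the patch.

```shell
# Make the Ollama service reachable from other machines by binding it
# to all network interfaces instead of only localhost.
export OLLAMA_HOST=0.0.0.0

# Start the service; it will now listen on 0.0.0.0 (all interfaces).
ollama serve
```

Alternatively, the variable can be set for a single invocation: `OLLAMA_HOST=0.0.0.0 ollama serve`.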