---
sidebar_position: 0
title: "🚧 Server Connectivity Issues"
---
We're here to help you get everything set up and running smoothly. Below, you'll find step-by-step instructions tailored for different scenarios to solve common connection issues with Ollama and external servers like Hugging Face.
## 🌟 Connection to Ollama Server
### 🚀 Accessing Ollama from Open WebUI
Struggling to connect to Ollama from Open WebUI? It could be because Ollama isn't listening on a network interface that allows external connections. Let's sort that out:
1. **Configure Ollama to Listen Broadly** 🎧:
Set `OLLAMA_HOST` to `0.0.0.0` to make Ollama listen on all network interfaces.
2. **Update Environment Variables**:
Ensure that `OLLAMA_HOST` is accurately set within your deployment environment.
3. **Restart Ollama** 🔄:
A restart is needed for the changes to take effect (see the example after this list).
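For example, on a Linux host where Ollama runs as a systemd service (the default for the official installer), one way to apply all three steps looks like this:
```bash
# Open a drop-in override file for the Ollama service:
sudo systemctl edit ollama.service

# In the editor, add the following lines, then save and exit:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"

# Reload systemd and restart Ollama so the change takes effect:
sudo systemctl daemon-reload
sudo systemctl restart ollama
```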
💡 After setting up, verify that Ollama is accessible by visiting the WebUI interface.
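One quick check is to query Ollama's version endpoint from another machine on the network (replace `<server-ip>` with the address of the host running Ollama):
```bash
# A successful response is a small JSON payload, e.g. {"version":"..."}
curl http://<server-ip>:11434/api/version
```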
For more detailed instructions on configuring Ollama, please refer to [Ollama's Official Documentation](https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-linux).
### 🐳 Docker Connection Error
If you're seeing a connection error when trying to access Ollama, it might be because the WebUI Docker container can't talk to the Ollama server running on your host. Let's fix that:
1. **Adjust the Network Settings** 🛠️:
Use the `--network=host` flag in your Docker command. This links your container directly to your host's network.
2. **Change the Port**:
With host networking, the `-p 3000:8080` port mapping no longer applies, so access the WebUI directly on its internal port, 8080.
**Example Docker Command**:
```bash
docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_BASE_URL=http://127.0.0.1:11434 --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
🔗 After running the above, your WebUI should be available at `http://localhost:8080`.
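If the error persists, it helps to confirm that the container itself can reach Ollama. A minimal sketch, assuming the container is named `open-webui` as above and using the Python interpreter that ships in the image (Open WebUI is a Python application):
```bash
# Query Ollama's version endpoint from inside the container:
docker exec open-webui python3 -c "import urllib.request; print(urllib.request.urlopen('http://127.0.0.1:11434/api/version').read().decode())"
```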
## 🔒 SSL Connection Issue with Hugging Face
Encountered an SSL error? It could be an issue with the Hugging Face server. Here's what to do:
1. **Check Hugging Face Server Status**:
Verify whether there's a known outage or issue on their end (Hugging Face publishes service status at [status.huggingface.co](https://status.huggingface.co)).
2. **Switch Endpoint**:
If Hugging Face is unreachable, point the `HF_ENDPOINT` environment variable at a mirror in your Docker command, as shown below.
**Example Docker Command for Connection Issues**:
```bash
docker run -d -p 3000:8080 -e HF_ENDPOINT=https://hf-mirror.com/ --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
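🔗 With this command the WebUI is published on port 3000, so it should be available at `http://localhost:3000`.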
## 🍏 Podman on macOS
Running on macOS with Podman? Here's how to ensure connectivity:
1. **Enable Host Loopback**:
Use `--network slirp4netns:allow_host_loopback=true` in your command.
2. **Set OLLAMA_BASE_URL**:
Ensure it points to `http://host.containers.internal:11434`.
**Example Podman Command**:
```bash
podman run -d --network slirp4netns:allow_host_loopback=true -p 3000:8080 -e OLLAMA_BASE_URL=http://host.containers.internal:11434 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
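🔗 The `-p 3000:8080` mapping publishes the WebUI at `http://localhost:3000`, while the host-loopback setting lets the container reach the Ollama server running on your Mac at port 11434.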