mirror of https://github.com/open-webui/open-webui
synced 2024-11-21 23:57:51 +00:00

rename to open-webui

This commit is contained in:
parent 509d2a61eb
commit 90bcd1644a

README.md | 22
@@ -126,19 +126,19 @@ Don't forget to explore our sibling project, [Open WebUI Community](https://open

 #### Installing with Docker 🐳

-- **Important:** When using Docker to install Open WebUI, make sure to include the `-v ollama-webui:/app/backend/data` in your Docker command. This step is crucial as it ensures your database is properly mounted and prevents any loss of data.
+- **Important:** When using Docker to install Open WebUI, make sure to include the `-v open-webui:/app/backend/data` in your Docker command. This step is crucial as it ensures your database is properly mounted and prevents any loss of data.

 - **If Ollama is on your computer**, use this command:

 ```bash
-docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend/data --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
+docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
 ```

 - **To build the container yourself**, follow these steps:

 ```bash
-docker build -t ollama-webui .
-docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend/data --name ollama-webui --restart always ollama-webui
+docker build -t open-webui .
+docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always open-webui
 ```

 - After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000).
@@ -148,14 +148,14 @@ Don't forget to explore our sibling project, [Open WebUI Community](https://open

 - To connect to Ollama on another server, change the `OLLAMA_API_BASE_URL` to the server's URL:

 ```bash
-docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v ollama-webui:/app/backend/data --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
+docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
 ```

 Or for a self-built container:

 ```bash
-docker build -t ollama-webui .
-docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v ollama-webui:/app/backend/data --name ollama-webui --restart always ollama-webui
+docker build -t open-webui .
+docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always open-webui
 ```

 ### Installing Ollama and Open WebUI Together
@@ -215,8 +215,8 @@ For other ways to install, like using Kustomize or Helm, check out [INSTALLATION

 In case you want to update your local Docker installation to the latest version, you can do it by performing the following actions:

 ```bash
-docker rm -f ollama-webui
-docker pull ghcr.io/ollama-webui/ollama-webui:main
+docker rm -f open-webui
+docker pull ghcr.io/open-webui/open-webui:main
 [insert command you used to install]
 ```
@@ -243,8 +243,8 @@ The Open WebUI consists of two primary components: the frontend and the backend

 Run the following commands to install:

 ```sh
-git clone https://github.com/ollama-webui/ollama-webui.git
-cd ollama-webui/
+git clone https://github.com/open-webui/open-webui.git
+cd open-webui/

 # Copying required .env file
 cp -RPp example.env .env
@@ -15,7 +15,7 @@ If you're experiencing connection issues, it’s often due to the WebUI docker c

 **Example Docker Command**:

 ```bash
-docker run -d --network=host -v ollama-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
+docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main
 ```

 ### General Connection Errors
@@ -40,9 +40,7 @@ class UrlUpdateForm(BaseModel):


 @app.post("/url/update")
-async def update_ollama_api_url(
-    form_data: UrlUpdateForm, user=Depends(get_admin_user)
-):
+async def update_ollama_api_url(form_data: UrlUpdateForm, user=Depends(get_admin_user)):
     app.state.OLLAMA_API_BASE_URL = form_data.url
     return {"OLLAMA_API_BASE_URL": app.state.OLLAMA_API_BASE_URL}
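The route body in this hunk reduces to a small state update. A minimal standalone sketch, without FastAPI, where `AppState` is a hypothetical stand-in for `app.state`:

```python
class AppState:
    """Hypothetical stand-in for FastAPI's app.state."""
    OLLAMA_API_BASE_URL = "http://localhost:11434/api"

def update_ollama_api_url(state: AppState, url: str) -> dict:
    # Mirrors the route body: store the new URL, then echo it back
    state.OLLAMA_API_BASE_URL = url
    return {"OLLAMA_API_BASE_URL": state.OLLAMA_API_BASE_URL}

state = AppState()
result = update_ollama_api_url(state, "https://example.com/api")
```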
@@ -68,10 +66,14 @@ async def proxy(path: str, request: Request, user=Depends(get_current_user)):
     if path in ["pull", "delete", "push", "copy", "create"]:
         if user.role != "admin":
             raise HTTPException(
-                status_code=status.HTTP_401_UNAUTHORIZED, detail=ERROR_MESSAGES.ACCESS_PROHIBITED
+                status_code=status.HTTP_401_UNAUTHORIZED,
+                detail=ERROR_MESSAGES.ACCESS_PROHIBITED,
             )
     else:
-        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail=ERROR_MESSAGES.ACCESS_PROHIBITED)
+        raise HTTPException(
+            status_code=status.HTTP_401_UNAUTHORIZED,
+            detail=ERROR_MESSAGES.ACCESS_PROHIBITED,
+        )

     headers.pop("host", None)
     headers.pop("authorization", None)
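The admin gate in this hunk can be distilled to a predicate. A sketch of just the restricted-path check (function name is illustrative, not the project's API, and the surrounding else-branch details are omitted):

```python
# Model-management paths that the proxy restricts to admins
ADMIN_ONLY = {"pull", "delete", "push", "copy", "create"}

def check_access(path: str, role: str) -> bool:
    # True if `role` may proxy `path`; model management is admin-only
    if path in ADMIN_ONLY and role != "admin":
        return False
    return True
```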
@@ -126,7 +128,7 @@ async def proxy(path: str, request: Request, user=Depends(get_current_user)):
     try:
         return await run_in_threadpool(get_request)
     except Exception as e:
-        error_detail = "Ollama WebUI: Server Connection Error"
+        error_detail = "Open WebUI: Server Connection Error"
         if r is not None:
             try:
                 res = r.json()
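The `error_detail` pattern recurring in these hunks tries to surface the upstream JSON error body before falling back to a generic message. A standalone sketch with a stubbed response object (names and the `Ollama:` prefix are illustrative assumptions):

```python
def extract_error_detail(response, fallback="Open WebUI: Server Connection Error"):
    # Prefer the upstream JSON error body when one exists, else the fallback
    if response is not None:
        try:
            res = response.json()
            if "error" in res:
                return f"Ollama: {res['error']}"
        except Exception:
            pass
    return fallback

class StubResponse:
    """Hypothetical stand-in for an HTTP response object."""
    def __init__(self, payload):
        self._payload = payload

    def json(self):
        return self._payload
```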
@@ -61,7 +61,7 @@ async def update_ollama_api_url(
             #     yield line
             # except Exception as e:
             #     print(e)
-            #     error_detail = "Ollama WebUI: Server Connection Error"
+            #     error_detail = "Open WebUI: Server Connection Error"
             #     yield json.dumps({"error": error_detail, "message": str(e)}).encode()
@@ -110,7 +110,7 @@ async def proxy(path: str, request: Request, user=Depends(get_current_user)):

     except Exception as e:
         print(e)
-        error_detail = "Ollama WebUI: Server Connection Error"
+        error_detail = "Open WebUI: Server Connection Error"

         if response is not None:
             try:
@@ -9,7 +9,12 @@ from pydantic import BaseModel

 from apps.web.models.users import Users
 from constants import ERROR_MESSAGES
-from utils.utils import decode_token, get_current_user, get_verified_user, get_admin_user
+from utils.utils import (
+    decode_token,
+    get_current_user,
+    get_verified_user,
+    get_admin_user,
+)
 from config import OPENAI_API_BASE_URL, OPENAI_API_KEY, CACHE_DIR

 import hashlib
@@ -47,7 +52,6 @@ async def update_openai_url(form_data: UrlUpdateForm, user=Depends(get_admin_use
     return {"OPENAI_API_BASE_URL": app.state.OPENAI_API_BASE_URL}


-
 @app.get("/key")
 async def get_openai_key(user=Depends(get_admin_user)):
     return {"OPENAI_API_KEY": app.state.OPENAI_API_KEY}
@@ -107,7 +111,7 @@ async def speech(request: Request, user=Depends(get_verified_user)):

     except Exception as e:
         print(e)
-        error_detail = "Ollama WebUI: Server Connection Error"
+        error_detail = "Open WebUI: Server Connection Error"
         if r is not None:
             try:
                 res = r.json()
@@ -188,7 +192,7 @@ async def proxy(path: str, request: Request, user=Depends(get_verified_user)):
         return response_data
     except Exception as e:
         print(e)
-        error_detail = "Ollama WebUI: Server Connection Error"
+        error_detail = "Open WebUI: Server Connection Error"
         if r is not None:
             try:
                 res = r.json()
@@ -10,16 +10,16 @@ services:
     restart: unless-stopped
     image: ollama/ollama:latest

-  ollama-webui:
+  open-webui:
     build:
       context: .
       args:
         OLLAMA_API_BASE_URL: '/ollama/api'
       dockerfile: Dockerfile
-    image: ghcr.io/ollama-webui/ollama-webui:main
-    container_name: ollama-webui
+    image: ghcr.io/open-webui/open-webui:main
+    container_name: open-webui
     volumes:
-      - ollama-webui:/app/backend/data
+      - open-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
@@ -33,4 +33,4 @@ services:

 volumes:
   ollama: {}
-  ollama-webui: {}
+  open-webui: {}
@@ -1,36 +1,37 @@
-# Contributing to Ollama WebUI
+# Contributing to Open WebUI

 🚀 **Welcome, Contributors!** 🚀

-Your interest in contributing to Ollama WebUI is greatly appreciated. This document is here to guide you through the process, ensuring your contributions enhance the project effectively. Let's make Ollama WebUI even better, together!
+Your interest in contributing to Open WebUI is greatly appreciated. This document is here to guide you through the process, ensuring your contributions enhance the project effectively. Let's make Open WebUI even better, together!

 ## 📌 Key Points

-### 🦙 Ollama vs. Ollama WebUI
+### 🦙 Ollama vs. Open WebUI

-It's crucial to distinguish between Ollama and Ollama WebUI:
+It's crucial to distinguish between Ollama and Open WebUI:

-- **Ollama WebUI** focuses on providing an intuitive and responsive web interface for chat interactions.
+- **Open WebUI** focuses on providing an intuitive and responsive web interface for chat interactions.
 - **Ollama** is the underlying technology that powers these interactions.

-If your issue or contribution pertains directly to the core Ollama technology, please direct it to the appropriate [Ollama project repository](https://ollama.com/). Ollama WebUI's repository is dedicated to the web interface aspect only.
+If your issue or contribution pertains directly to the core Ollama technology, please direct it to the appropriate [Ollama project repository](https://ollama.com/). Open WebUI's repository is dedicated to the web interface aspect only.

 ### 🚨 Reporting Issues

-Noticed something off? Have an idea? Check our [Issues tab](https://github.com/ollama-webui/ollama-webui/issues) to see if it's already been reported or suggested. If not, feel free to open a new issue. When reporting an issue, please follow our issue templates. These templates are designed to ensure that all necessary details are provided from the start, enabling us to address your concerns more efficiently.
+Noticed something off? Have an idea? Check our [Issues tab](https://github.com/open-webui/open-webui/issues) to see if it's already been reported or suggested. If not, feel free to open a new issue. When reporting an issue, please follow our issue templates. These templates are designed to ensure that all necessary details are provided from the start, enabling us to address your concerns more efficiently.

 > [!IMPORTANT]
+>
 > - **Template Compliance:** Please be aware that failure to follow the provided issue template, or not providing the requested information at all, will likely result in your issue being closed without further consideration. This approach is critical for maintaining the manageability and integrity of issue tracking.
 >
 > - **Detail is Key:** To ensure your issue is understood and can be effectively addressed, it's imperative to include comprehensive details. Descriptions should be clear, including steps to reproduce, expected outcomes, and actual results. Lack of sufficient detail may hinder our ability to resolve your issue.

 ### 🧭 Scope of Support

-We've noticed an uptick in issues not directly related to Ollama WebUI but rather to the environment it's run in, especially Docker setups. While we strive to support Docker deployment, understanding Docker fundamentals is crucial for a smooth experience.
+We've noticed an uptick in issues not directly related to Open WebUI but rather to the environment it's run in, especially Docker setups. While we strive to support Docker deployment, understanding Docker fundamentals is crucial for a smooth experience.

-- **Docker Deployment Support**: Ollama WebUI supports Docker deployment. Familiarity with Docker is assumed. For Docker basics, please refer to the [official Docker documentation](https://docs.docker.com/get-started/overview/).
+- **Docker Deployment Support**: Open WebUI supports Docker deployment. Familiarity with Docker is assumed. For Docker basics, please refer to the [official Docker documentation](https://docs.docker.com/get-started/overview/).

-- **Advanced Configurations**: Setting up reverse proxies for HTTPS and managing Docker deployments requires foundational knowledge. There are numerous online resources available to learn these skills. Ensuring you have this knowledge will greatly enhance your experience with Ollama WebUI and similar projects.
+- **Advanced Configurations**: Setting up reverse proxies for HTTPS and managing Docker deployments requires foundational knowledge. There are numerous online resources available to learn these skills. Ensuring you have this knowledge will greatly enhance your experience with Open WebUI and similar projects.

 ## 💡 Contributing
@@ -40,14 +41,14 @@ Looking to contribute? Great! Here's how you can help:

 We welcome pull requests. Before submitting one, please:

-1. Discuss your idea or issue in the [issues section](https://github.com/ollama-webui/ollama-webui/issues).
+1. Discuss your idea or issue in the [issues section](https://github.com/open-webui/open-webui/issues).
 2. Follow the project's coding standards and include tests for new features.
 3. Update documentation as necessary.
 4. Write clear, descriptive commit messages.

 ### 📚 Documentation & Tutorials

-Help us make Ollama WebUI more accessible by improving documentation, writing tutorials, or creating guides on setting up and optimizing the web UI.
+Help us make Open WebUI more accessible by improving documentation, writing tutorials, or creating guides on setting up and optimizing the web UI.

 ### 🤔 Questions & Feedback
@@ -55,6 +56,6 @@ Got questions or feedback? Join our [Discord community](https://discord.gg/5rJgQ

 ## 🙏 Thank You!

-Your contributions, big or small, make a significant impact on Ollama WebUI. We're excited to see what you bring to the project!
+Your contributions, big or small, make a significant impact on Open WebUI. We're excited to see what you bring to the project!

 Together, let's create an even more powerful tool for the community. 🌟
@@ -1,20 +1,20 @@
 # Security Policy
-Our primary goal is to ensure the protection and confidentiality of sensitive data stored by users on ollama-webui.
-## Supported Versions
+
+Our primary goal is to ensure the protection and confidentiality of sensitive data stored by users on open-webui.
+
+## Supported Versions

 | Version | Supported          |
 | ------- | ------------------ |
 | main    | :white_check_mark: |
 | others  | :x:                |


 ## Reporting a Vulnerability

 If you discover a security issue within our system, please notify us immediately via a pull request or contact us on discord.

 ## Product Security

 We regularly audit our internal processes and system's architecture for vulnerabilities using a combination of automated and manual testing techniques.

 We are planning on implementing SAST and SCA scans in our project soon.
@@ -2,7 +2,7 @@

 Sometimes, it's beneficial to host Ollama separate from the UI, but retain the RAG and RBAC support features shared across users:

-# Ollama WebUI Configuration
+# Open WebUI Configuration

 ## UI Configuration
@@ -24,7 +24,6 @@ Enable the site first before you can request SSL:

 `a2ensite server.com.conf` # this will enable the site. a2ensite is short for "Apache 2 Enable Site"

-
 ```
 # For SSL
 <VirtualHost 192.168.1.100:443>
@@ -62,14 +61,12 @@ Create server.com.conf if it is not yet already created, containing the above `<

 Once it's created, run `certbot --apache -d server.com`; this will request and add/create SSL keys for you, as well as create the server.com.le-ssl.conf

-
-
 # Configuring Ollama Server

 On your latest installation of Ollama, make sure that you have set up your api server from the official Ollama reference:

 [Ollama FAQ](https://github.com/jmorganca/ollama/blob/main/docs/faq.md)


 ### TL;DR

 The guide doesn't seem to match the current updated service file on Linux, so we will address it here:
@@ -81,6 +78,7 @@ sudo nano /etc/systemd/system/ollama.service
 ```
+
 Add the following lines:

 ```
 Environment="OLLAMA_HOST=0.0.0.0:11434" # this line is mandatory. You can also specify
 ```
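The `OLLAMA_HOST` value above uses a plain `HOST:PORT` format. A small hypothetical helper (not part of Ollama itself) showing how such a value splits:

```python
def parse_ollama_host(value: str = "0.0.0.0:11434") -> tuple:
    # Split the HOST:PORT string used in the systemd Environment line
    host, _, port = value.rpartition(":")
    return host, int(port)
```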
@@ -106,15 +104,13 @@ Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/
 WantedBy=default.target
 ```

-
-
 Save the file by pressing CTRL+S, then press CTRL+X

 When your computer restarts, the Ollama server will now be listening on the IP:PORT you specified, in this case 0.0.0.0:11434, or 192.168.254.106:11434 (whatever your local IP address is). Make sure that your router is correctly configured to serve pages from that local IP by forwarding 11434 to your local IP server.

-
 # Ollama Model Configuration
+
 ## For the Ollama model configuration, use the following Apache VirtualHost setup:

 Navigate to the apache sites-available directory:
@@ -198,7 +194,6 @@ If you encounter any misconfiguration or errors, please file an issue or engage

 Let's make this UI much more user friendly for everyone!

-Thanks for making ollama-webui your UI Choice for AI!
+Thanks for making open-webui your UI Choice for AI!

-
-This doc is made by **Bob Reyes**, your **Ollama-Web-UI** fan from the Philippines.
+This doc is made by **Bob Reyes**, your **Open-WebUI** fan from the Philippines.