Mirror of https://github.com/open-webui/open-webui (synced 2024-11-16 05:24:02 +00:00)
Refactor docker-compose configuration for modularity
Split the original docker-compose.yml into three separate files for enhanced modularity and ease of use. Created docker-compose.api.yml for API exposure configuration and docker-compose.gpu.yml for GPU support. This change simplifies the management of different deployment environments and configurations, making it easier to enable or disable specific features such as GPU support and API access without modifying the main docker-compose file.
parent d2a290aa27
commit 9bbae0e25a
README.md

````diff
@@ -75,7 +75,14 @@ If you don't have Ollama installed yet, you can use the provided Docker Compose
 docker compose up -d --build
 ```
 
-This command will install both Ollama and Ollama Web UI on your system. Ensure to modify the `compose.yaml` file for GPU support and Exposing Ollama API outside the container stack if needed.
+This command will install both Ollama and Ollama Web UI on your system.
+Enable GPU support or expose the Ollama API outside the container stack with the following command:
+
+```bash
+docker compose -f docker-compose.yml \
+    -f docker-compose.gpu.yml \
+    -f docker-compose.api.yml \
+    up -d --build
+```
 
 ### Installing Ollama Web UI Only
````
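Since the files merge in the order given on the command line, a quick way to sanity-check a multi-file deployment before starting it is Compose's built-in `config` subcommand, which prints the fully merged configuration. A minimal sketch:

```bash
# Print the effective configuration after merging the base file with
# both overrides; later -f files override scalar keys from earlier
# ones, while multi-value options such as `ports` are concatenated.
docker compose -f docker-compose.yml \
    -f docker-compose.gpu.yml \
    -f docker-compose.api.yml \
    config
```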
docker-compose.api.yml (new file, 7 lines)

```yaml
version: '3.6'

services:
  ollama:
    # Expose Ollama API outside the container stack
    ports:
      - 11434:11434
```
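As a quick check (not part of the commit): once the stack is up with this override applied, the published port can be probed from the host. Ollama's `/api/tags` model-listing endpoint is a convenient target:

```bash
# With docker-compose.api.yml applied, the Ollama API is reachable on
# the host; /api/tags lists the models available locally.
curl http://localhost:11434/api/tags
```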
docker-compose.gpu.yml (new file, 13 lines)

```yaml
version: '3.6'

services:
  ollama:
    # GPU support
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities:
                - gpu
```
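One assumption worth making explicit: the `driver: nvidia` device reservation only works when the NVIDIA Container Toolkit is installed on the host. A verification sketch (the `ollama` container name comes from the base file below):

```bash
# Requires the NVIDIA Container Toolkit on the host. After the stack
# starts, confirm the container can see the GPU; the toolkit mounts
# nvidia-smi into GPU-enabled containers.
docker exec ollama nvidia-smi
```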
docker-compose.yml

````diff
@@ -2,20 +2,8 @@ version: '3.6'
 
 services:
   ollama:
-    # Uncomment below for GPU support
-    # deploy:
-    #   resources:
-    #     reservations:
-    #       devices:
-    #         - driver: nvidia
-    #           count: 1
-    #           capabilities:
-    #             - gpu
     volumes:
       - ollama:/root/.ollama
-    # Uncomment below to expose Ollama API outside the container stack
-    # ports:
-    #   - 11434:11434
     container_name: ollama
     pull_policy: always
     tty: true
````
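Because each override is a self-contained fragment, the files can also be applied individually rather than together; for example (a usage sketch, not from the commit):

```bash
# GPU support only, without exposing the API:
docker compose -f docker-compose.yml -f docker-compose.gpu.yml up -d --build

# API exposure only, without GPU support:
docker compose -f docker-compose.yml -f docker-compose.api.yml up -d --build
```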