diff --git a/README.md b/README.md
index 5af1a6c42..756ac30cb 100644
--- a/README.md
+++ b/README.md
@@ -73,13 +73,22 @@ Don't forget to explore our sibling project, [OllamaHub](https://ollamahub.com/)
 
 ### Installing Both Ollama and Ollama Web UI Using Docker Compose
 
-If you don't have Ollama installed yet, you can use the provided Docker Compose file for a hassle-free installation. Simply run the following command:
+If you don't have Ollama installed yet, you can use the provided bash script for a hassle-free installation. Simply run the following command:
 
+For a CPU-only container:
 ```bash
-docker compose up -d --build
+chmod +x run-compose.sh && ./run-compose.sh
 ```
 
-This command will install both Ollama and Ollama Web UI on your system. Ensure to modify the `compose.yaml` file for GPU support and Exposing Ollama API outside the container stack if needed.
+For a GPU-enabled container (this requires a GPU driver set up for Docker; NVIDIA hardware is the best supported, so follow the official install guide: [nvidia-container-toolkit](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html)):
+```bash
+chmod +x run-compose.sh && ./run-compose.sh --enable-gpu[count=1]
+```
+
+Note that both of the above commands pull the latest production Docker image from the repository. To build the latest local version instead, append the `--build` flag, for example:
+```bash
+./run-compose.sh --build --enable-gpu[count=1]
+```
 
 ### Installing Ollama Web UI Only
 
diff --git a/run-compose.sh b/run-compose.sh
index 25cc12db3..7c7ceb714 100755
--- a/run-compose.sh
+++ b/run-compose.sh
@@ -80,10 +80,12 @@ usage() {
     echo "  -h, --help                 Show this help message."
     echo ""
     echo "Examples:"
-    echo "  $0 --enable-gpu[count=1]"
-    echo "  $0 --enable-api[port=11435]"
-    echo "  $0 --enable-gpu[count=1] --enable-api[port=12345] --webui[port=3000]"
-    echo "  $0 --enable-gpu[count=1] --enable-api[port=12345] --webui[port=3000] --data[folder=./ollama-data]"
+    echo "  ./$0 --drop"
+    echo "  ./$0 --enable-gpu[count=1]"
+    echo "  ./$0 --enable-api[port=11435]"
+    echo "  ./$0 --enable-gpu[count=1] --enable-api[port=12345] --webui[port=3000]"
+    echo "  ./$0 --enable-gpu[count=1] --enable-api[port=12345] --webui[port=3000] --data[folder=./ollama-data]"
+    echo "  ./$0 --enable-gpu[count=1] --enable-api[port=12345] --webui[port=3000] --data[folder=./ollama-data] --build"
     echo ""
     echo "This script configures and runs a docker-compose setup with optional GPU support, API exposure, and web UI configuration."
     echo "About the gpu to use, the script automatically detects it using the "lspci" command."
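
Since the GPU path in the README hunk depends on the NVIDIA Container Toolkit, it can help to sanity-check the Docker GPU runtime before invoking the script. A minimal verification, assuming the toolkit is installed; the CUDA image tag here is only an example:

```bash
# Confirm Docker can access the GPU through the NVIDIA Container Toolkit.
# The image tag is an example; any CUDA base image with nvidia-smi works.
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
```

If `nvidia-smi` prints the GPU table, `./run-compose.sh --enable-gpu[count=1]` should be able to attach the device.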
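
The bracketed flag syntax in the usage examples (e.g. `--enable-gpu[count=1]`) is unusual for a CLI. A minimal sketch of how such a flag could be parsed in bash; the variable names are illustrative and this is not the actual run-compose.sh implementation, which lies outside this hunk:

```bash
#!/usr/bin/env bash
# Hypothetical sketch: parse a --enable-gpu[count=N] style flag.
# Not the real run-compose.sh logic; for illustration only.
enable_gpu=false
gpu_count=1

for arg in "$@"; do
    case "$arg" in
        --enable-gpu*)
            enable_gpu=true
            # Pull the number out of "[count=N]" if it is present.
            if [[ "$arg" =~ count=([0-9]+) ]]; then
                gpu_count="${BASH_REMATCH[1]}"
            fi
            ;;
    esac
done

echo "GPU enabled: $enable_gpu (count=$gpu_count)"
```

Note that `[` and `]` are glob characters in most shells, so quoting the argument (`'--enable-gpu[count=1]'`) is the safe way to pass it.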