diff --git a/docs/getting-started/installation.md b/docs/getting-started/installation.md
index 2a3389e..a56c12f 100644
--- a/docs/getting-started/installation.md
+++ b/docs/getting-started/installation.md
@@ -16,22 +16,22 @@ kubectl apply -k ./kubernetes/manifest
 
 ### Installing Both Ollama and Open WebUI Using Helm
 
-Package Helm file first
+:::info
+
+The Helm install method has been migrated to a new GitHub repository; refer to it for the latest installation instructions: [https://github.com/open-webui/helm-charts](https://github.com/open-webui/helm-charts)
+
+:::
+
+Confirm that Helm has been deployed in your execution environment.
+For installation instructions, please refer to [https://helm.sh/docs/intro/install/](https://helm.sh/docs/intro/install/)
 
 ```bash
-helm package ./kubernetes/helm/
+helm repo add open-webui https://helm.openwebui.com/
+helm repo update
+
+kubectl create namespace open-webui
+helm upgrade --install open-webui open-webui/open-webui --namespace open-webui
 ```
 
-For cpu-only pod
-
-```bash
-helm install open-webui ./open-webui-*.tgz
-```
-
-For gpu-enabled pod
-
-```bash
-helm install open-webui ./open-webui-*.tgz --set ollama.resources.limits.nvidia.com/gpu="1"
-```
-
-Check the `kubernetes/helm/values.yaml` file to know which parameters are available for customization
+Check the [kubernetes/helm/values.yaml](https://github.com/open-webui/helm-charts/tree/main/charts/open-webui) file to see which values are available for customization
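+
+A minimal sketch of overriding chart values is shown below; the file name `my-values.yaml` is just an example, and the keys you can set depend on the chart version, so inspect the chart's defaults first.
+
+```bash
+# Dump the chart's default values into a local file for editing
+# (my-values.yaml is an arbitrary example name).
+helm show values open-webui/open-webui > my-values.yaml
+
+# After editing my-values.yaml, apply the overrides on install or upgrade.
+helm upgrade --install open-webui open-webui/open-webui \
+  --namespace open-webui -f my-values.yaml
+
+# Verify that the release's pods come up.
+kubectl get pods --namespace open-webui
+```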