From ee1c37f70ed66e773acd058f065b684cb09b0e6a Mon Sep 17 00:00:00 2001
From: Arthur25 <98665507+Arthur2500@users.noreply.github.com>
Date: Wed, 14 Feb 2024 23:17:45 +0100
Subject: [PATCH] Replaced old .ai TLD with new .com TLD

---
 README.md                                       | 2 +-
 TROUBLESHOOTING.md                              | 2 +-
 docs/apache.md                                  | 2 +-
 src/lib/components/chat/Settings/Models.svelte  | 4 ++--
 src/routes/(app)/modelfiles/create/+page.svelte | 2 +-
 5 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/README.md b/README.md
index 624708041..6fa71b211 100644
--- a/README.md
+++ b/README.md
@@ -121,7 +121,7 @@ Don't forget to explore our sibling project, [OllamaHub](https://ollamahub.com/)
 
 2. **Ensure You Have the Latest Version of Ollama:**
 
-   - Download the latest version from [https://ollama.ai/](https://ollama.ai/).
+   - Download the latest version from [https://ollama.com/](https://ollama.com/).
 
 3. **Verify Ollama Installation:**
    - After installing Ollama, check if it's working by visiting [http://127.0.0.1:11434/](http://127.0.0.1:11434/) in your web browser. Remember, the port number might be different for you.

diff --git a/TROUBLESHOOTING.md b/TROUBLESHOOTING.md
index f7976550d..0ced1658a 100644
--- a/TROUBLESHOOTING.md
+++ b/TROUBLESHOOTING.md
@@ -20,7 +20,7 @@ docker run -d --network=host -v ollama-webui:/app/backend/data -e OLLAMA_API_BAS
 
 ### General Connection Errors
 
-**Ensure Ollama Version is Up-to-Date**: Always start by checking that you have the latest version of Ollama. Visit [Ollama's official site](https://ollama.ai/) for the latest updates.
+**Ensure Ollama Version is Up-to-Date**: Always start by checking that you have the latest version of Ollama. Visit [Ollama's official site](https://ollama.com/) for the latest updates.
 
 **Troubleshooting Steps**:

diff --git a/docs/apache.md b/docs/apache.md
index 481fc7468..6169cbe3f 100644
--- a/docs/apache.md
+++ b/docs/apache.md
@@ -74,7 +74,7 @@ On your latest installation of Ollama, make sure that you have setup your api se
 
 The guide doesn't seem to match the current updated service file on linux. So, we will address it here:
 
-Unless when you're compiling Ollama from source, installing with the standard install `curl https://ollama.ai/install.sh | sh` creates a file called `ollama.service` in /etc/systemd/system. You can use nano to edit the file:
+Unless when you're compiling Ollama from source, installing with the standard install `curl https://ollama.com/install.sh | sh` creates a file called `ollama.service` in /etc/systemd/system. You can use nano to edit the file:
 
 ```
 sudo nano /etc/systemd/system/ollama.service

diff --git a/src/lib/components/chat/Settings/Models.svelte b/src/lib/components/chat/Settings/Models.svelte
index 90655aa79..529b778ff 100644
--- a/src/lib/components/chat/Settings/Models.svelte
+++ b/src/lib/components/chat/Settings/Models.svelte
@@ -291,7 +291,7 @@
-			Pull a model from Ollama.ai
+			Pull a model from Ollama.com
 			To access the available model names for downloading, click here.

diff --git a/src/routes/(app)/modelfiles/create/+page.svelte b/src/routes/(app)/modelfiles/create/+page.svelte
index 8738128bd..fcead2ba4 100644
--- a/src/routes/(app)/modelfiles/create/+page.svelte
+++ b/src/routes/(app)/modelfiles/create/+page.svelte
@@ -497,7 +497,7 @@ SYSTEM """${system}"""`.replace(/^\s*\n/gm, '');
 			To access the available model names for downloading, click here.
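A quick way to review a TLD rename like the patch above is to grep the working tree for any leftover occurrences of the old domain after applying it. A minimal sketch — the directory name and file contents below are hypothetical stand-ins for the repository, not part of the patch:

```shell
# Hypothetical stand-in repo mirroring two of the files touched by the patch.
mkdir -p tld-check/docs
cd tld-check
printf 'Download the latest version from [https://ollama.com/](https://ollama.com/).\n' > README.md
printf 'curl https://ollama.com/install.sh | sh\n' > docs/apache.md

# -r: recurse, -l: print matching filenames only. The dot is escaped so
# "ollama.ai" is matched literally, not as "ollama" + any char + "ai".
# grep exits non-zero when nothing matches, so "|| true" keeps the check
# from aborting a script running under "set -e".
stale=$(grep -rl 'ollama\.ai' . || true)
echo "stale files: ${stale:-none}"
```

An empty result means every reference has been migrated; any filename printed points at a spot the patch missed.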