From bd70b8fe642e95a00033b6564ec1ddf732977447 Mon Sep 17 00:00:00 2001
From: Dustin Loring
Date: Mon, 9 Dec 2024 06:14:44 -0500
Subject: [PATCH] Update docs

Removed the ollama ModelFile section as it is not needed anymore
---
 docs/docs/index.md | 25 -------------------------
 1 file changed, 25 deletions(-)

diff --git a/docs/docs/index.md b/docs/docs/index.md
index 5c12a00..d9c953e 100644
--- a/docs/docs/index.md
+++ b/docs/docs/index.md
@@ -148,31 +148,6 @@ sudo npm install -g pnpm
 pnpm run dev
 ```
 
-## Super Important Note on Running Ollama Models
-
-Ollama models by default only have 2048 tokens for their context window. Even for large models that can easily handle way more.
-This is not a large enough window to handle the Bolt.new/oTToDev prompt! You have to create a version of any model you want
-to use where you specify a larger context window. Luckily it's super easy to do that.
-
-All you have to do is:
-
-- Create a file called "Modelfile" (no file extension) anywhere on your computer
-- Put in the two lines:
-
-```
-FROM [Ollama model ID such as qwen2.5-coder:7b]
-PARAMETER num_ctx 32768
-```
-
-- Run the command:
-
-```
-ollama create -f Modelfile [your new model ID, can be whatever you want (example: qwen2.5-coder-extra-ctx:7b)]
-```
-
-Now you have a new Ollama model that isn't heavily limited in the context length like Ollama models are by default for some reason.
-You'll see this new model in the list of Ollama models along with all the others you pulled!
-
 ## Adding New LLMs:
 
 To make new LLMs available to use in this version of Bolt.new, head on over to `app/utils/constants.ts` and find the constant MODEL_LIST. Each element in this array is an object that has the model ID for the name (get this from the provider's API documentation), a label for the frontend model dropdown, and the provider.
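
As a rough illustration of the MODEL_LIST entries described in the retained context lines above, a minimal sketch might look like the following. The interface name `ModelInfo` and the sample values are assumptions for illustration only, not code taken from `app/utils/constants.ts`; only the three fields (model ID as the name, a dropdown label, and the provider) come from the docs text itself.

```
// Illustrative sketch only: field names follow the docs' description of a
// MODEL_LIST element; the actual type in app/utils/constants.ts may differ.
interface ModelInfo {
  name: string;     // model ID, taken from the provider's API documentation
  label: string;    // text shown in the frontend model dropdown
  provider: string; // which provider/backend serves the model
}

// Hypothetical entry showing the shape; values are examples, not repo content.
const MODEL_LIST: ModelInfo[] = [
  { name: 'qwen2.5-coder:7b', label: 'Qwen 2.5 Coder 7B', provider: 'Ollama' },
];
```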