From f692b6f9f23c4ea9f872fb0445e490c91a191fac Mon Sep 17 00:00:00 2001
From: Dustin Loring
Date: Sun, 1 Dec 2024 14:47:44 -0500
Subject: [PATCH] Update README.md

---
 README.md | 9 ---------
 1 file changed, 9 deletions(-)

diff --git a/README.md b/README.md
index 9bbb5c3..6590ab2 100644
--- a/README.md
+++ b/README.md
@@ -198,15 +198,6 @@ sudo npm install -g pnpm
 ```bash
 pnpm run dev
 ```
-
-## Adding New LLMs:
-
-To make new LLMs available to use in this version of Bolt.new, head on over to `app/utils/constants.ts` and find the constant MODEL_LIST. Each element in this array is an object that has the model ID for the name (get this from the provider's API documentation), a label for the frontend model dropdown, and the provider.
-
-By default, Anthropic, OpenAI, Groq, and Ollama are implemented as providers, but the YouTube video for this repo covers how to extend this to work with more providers if you wish!
-
-When you add a new model to the MODEL_LIST array, it will immediately be available to use when you run the app locally or reload it. For Ollama models, make sure you have the model installed already before trying to use it here!
-
 ## Available Scripts
 
 - `pnpm run dev`: Starts the development server.
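
For context on the section this patch removes: below is a minimal sketch of what a `MODEL_LIST` entry in `app/utils/constants.ts` might look like, assuming field names `name`, `label`, and `provider` inferred from the removed prose. The actual interface and the example model IDs are illustrative, not taken from the repository.

```typescript
// Hypothetical shape of a MODEL_LIST entry, inferred from the removed README text:
// the model ID (from the provider's API docs) serves as the name, plus a label
// for the frontend dropdown and the provider it belongs to. Field names are assumptions.
interface ModelInfo {
  name: string;     // model ID as documented by the provider's API
  label: string;    // text shown in the frontend model dropdown
  provider: string; // e.g. 'Anthropic', 'OpenAI', 'Groq', or 'Ollama'
}

const MODEL_LIST: ModelInfo[] = [
  { name: 'claude-3-5-sonnet-20241022', label: 'Claude 3.5 Sonnet', provider: 'Anthropic' },
  { name: 'gpt-4o', label: 'GPT-4o', provider: 'OpenAI' },
  // For Ollama entries, the model must already be pulled locally before it can be used.
  { name: 'qwen2.5-coder:7b', label: 'Qwen 2.5 Coder 7B (Ollama)', provider: 'Ollama' },
];
```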