From 91aa47902f0a0d695a1dca24651d99e83b4acc71 Mon Sep 17 00:00:00 2001
From: JR <42428778+jamesrothera@users.noreply.github.com>
Date: Fri, 27 Jun 2025 03:35:16 +1000
Subject: [PATCH] Update variables

---
 README.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/README.md b/README.md
index 074c1f24..10edf53a 100644
--- a/README.md
+++ b/README.md
@@ -1,5 +1,7 @@
 # bolt.diy JR
 
+Updated environment variables
+
 [![bolt.diy: AI-Powered Full-Stack Web Development in the Browser](./public/social_preview_index.jpg)](https://bolt.diy)
 
 Welcome to bolt.diy, the official open source version of Bolt.new, which allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
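
For context on the environment variables the added README note refers to: bolt.diy reads provider credentials and base URLs from a `.env.local` file in the project root. The sketch below is illustrative only; the variable names follow common provider conventions and are assumptions rather than the repository's authoritative list, so check the project's `.env.example` before relying on them.

```bash
# .env.local — illustrative sketch; variable names are assumptions, see .env.example in the repo.
# Set values only for the providers you actually plan to use.

# Hosted providers (API keys)
OPENAI_API_KEY=...
ANTHROPIC_API_KEY=...
GROQ_API_KEY=...
OPEN_ROUTER_API_KEY=...
GOOGLE_GENERATIVE_AI_API_KEY=...
MISTRAL_API_KEY=...
XAI_API_KEY=...
HUGGINGFACE_API_KEY=...
DEEPSEEK_API_KEY=...

# Local providers (base URLs of locally running servers)
OLLAMA_API_BASE_URL=http://127.0.0.1:11434
LMSTUDIO_API_BASE_URL=http://127.0.0.1:1234
```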