Merge branch 'main' into FEAT_BoltDYI_PREVIEW_V3

Leex 2025-01-19 00:14:09 +01:00 committed by GitHub
commit 031e679aff
No known key found for this signature in database
GPG Key ID: B5690EEEBB952194
4 changed files with 11 additions and 9 deletions

View File

@@ -144,7 +144,7 @@ docker build . --target bolt-ai-development
**Option 3: Docker Compose Profile**
```bash
docker-compose --profile development up
docker compose --profile development up
```
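The command above assumes the Compose v2 plugin (`docker compose`). If you're unsure which variant your Docker installation ships, a quick check like this sketch works; the fallback to the legacy `docker-compose` binary is only needed on older setups:

```bash
# Detect the Compose v2 plugin and fall back to the legacy standalone binary if it is missing.
if docker compose version >/dev/null 2>&1; then
  docker compose --profile development up
else
  docker-compose --profile development up
fi
```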
#### Running the Development Container
@@ -171,7 +171,7 @@ docker build . --target bolt-ai-production
**Option 3: Docker Compose Profile**
```bash
docker-compose --profile production up
docker compose --profile production up
```
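For a longer-running production container you may prefer to start it detached and watch the logs separately. This is generic Docker Compose usage, not anything specific to this project:

```bash
# Start the production profile in the background.
docker compose --profile production up -d

# Follow the logs; Ctrl+C stops following, the container keeps running.
docker compose logs -f

# Stop and remove the containers when finished.
docker compose --profile production down
```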
#### Running the Production Container

View File

@@ -4,8 +4,10 @@
Welcome to bolt.diy, the official open source version of Bolt.new (previously known as oTToDev and bolt.new ANY LLM), which allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
Check the [bolt.diy Docs](https://stackblitz-labs.github.io/bolt.diy/) for more information.
-----
Check the [bolt.diy Docs](https://stackblitz-labs.github.io/bolt.diy/) for official installation instructions and more information.
-----
Also [this pinned post in our community](https://thinktank.ottomator.ai/t/videos-tutorial-helpful-content/3243) has a bunch of incredible resources for running and deploying bolt.diy yourself!
We have also launched an experimental agent called the "bolt.diy Expert" that can answer common questions about bolt.diy. Find it here on the [oTTomator Live Agent Studio](https://studio.ottomator.ai/).
@@ -93,7 +95,7 @@ project, please check the [project management guide](./PROJECT.md) to get starte
## Features
- **AI-powered full-stack web development** directly in your browser.
- **AI-powered full-stack web development** for **NodeJS-based applications** directly in your browser.
- **Support for multiple LLMs** with an extensible architecture to integrate additional models.
- **Attach images to prompts** for better contextual understanding.
- **Integrated terminal** to view output of LLM-run commands.
@@ -186,7 +188,7 @@ This option requires some familiarity with Docker but provides a more isolated e
2. **Run the Container**:
```bash
docker-compose --profile development up
docker compose --profile development up
```
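Once the development profile is running, standard Compose commands can be used to inspect it. A small sketch (nothing project-specific assumed):

```bash
# List the services started by the development profile and their status.
docker compose --profile development ps

# Tail the output of the running services.
docker compose --profile development logs -f
```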
## Configuring API Keys and Providers

View File

@@ -144,7 +144,7 @@ docker build . --target bolt-ai-development
**Option 3: Docker Compose Profile**
```bash
docker-compose --profile development up
docker compose --profile development up
```
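To see which services a profile would start before actually running it, `docker compose config` resolves the configuration. This is a general Compose feature, shown here as a sketch:

```bash
# Print the service names the development profile would start.
docker compose --profile development config --services

# Render the fully resolved configuration for review.
docker compose --profile development config
```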
#### Running the Development Container
@@ -171,7 +171,7 @@ docker build . --target bolt-ai-production
**Option 3: Docker Compose Profile**
```bash
docker-compose --profile production up
docker compose --profile production up
```
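If you prefer to separate building the image from starting the container, the profile can also be built explicitly first; a generic Compose workflow, assuming the same profile names as above:

```bash
# Build the image(s) for the production profile without starting anything.
docker compose --profile production build

# Start the pre-built production container.
docker compose --profile production up
```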
#### Running the Production Container

View File

@@ -156,7 +156,7 @@ Once you've configured your keys, the application will be ready to use the selec
2. **Run the Container**:
Use Docker Compose profiles to manage environments:
```bash
docker-compose --profile development up
docker compose --profile development up
```
- With the development profile, changes to your code will automatically reflect in the running container (hot reloading).
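Hot reloading covers source-file edits; changes to the Compose file or to environment variables usually require the container to be recreated. A hedged example of the standard Compose way to do that:

```bash
# Recreate the development container so compose-level changes (env vars, compose file) take effect.
docker compose --profile development up --force-recreate
```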
@@ -188,7 +188,7 @@ To keep your local version of bolt.diy up to date with the latest changes, follo
- **If using Docker**, ensure you rebuild the Docker image to avoid using a cached version:
```bash
docker-compose --profile development up --build
docker compose --profile development up --build
```
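If a stale layer still slips through, the build cache can be bypassed entirely; this is a standard Compose option rather than anything project-specific:

```bash
# Rebuild from scratch, ignoring cached layers, then start the development profile.
docker compose --profile development build --no-cache
docker compose --profile development up
```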
- **If not using Docker**, you can start the application as usual with: