mirror of https://github.com/coleam00/bolt.new-any-llm
synced 2024-12-27 22:33:03 +00:00

docs: updated style in faq

Updated the style in the FAQ docs to an accordion-like style; added a table of contents to the index page in the docs.

This commit is contained in:
parent 01599caf38
commit 0733813af8
# Frequently Asked Questions (FAQ)

<details>
<summary><strong>What are the best models for bolt.diy?</strong></summary>

For the best experience with bolt.diy, we recommend using the following models:

- **Qwen 2.5 Coder 32b**: Best model for self-hosting with reasonable hardware requirements

**Note**: Models with fewer than 7b parameters typically lack the capability to properly interact with bolt!

</details>

---

<details>
<summary><strong>How do I get the best results with bolt.diy?</strong></summary>

- **Be specific about your stack**:
  Mention the frameworks or libraries you want to use (e.g., Astro, Tailwind, ShadCN) in your initial prompt. This ensures that bolt.diy scaffolds the project according to your preferences.

- **Batch simple instructions**:
  Combine simple tasks into a single prompt to save time and reduce API credit consumption. For example:
  *"Change the color scheme, add mobile responsiveness, and restart the dev server."*
</details>

---

<details>
<summary><strong>How do I contribute to bolt.diy?</strong></summary>

Check out our [Contribution Guide](CONTRIBUTING.md) for more details on how to get involved!
</details>

---

<details>
<summary><strong>What are the future plans for bolt.diy?</strong></summary>

Visit our [Roadmap](https://roadmap.sh/r/ottodev-roadmap-2ovzo) for the latest updates.
New features and improvements are on the way!
</details>

---

<details>
<summary><strong>Why are there so many open issues/pull requests?</strong></summary>

bolt.diy began as a small showcase project on @ColeMedin's YouTube channel to explore editing open-source projects with local LLMs. However, it quickly grew into a massive community effort!

We're forming a team of maintainers to manage demand and streamline issue resolution. The maintainers are rockstars, and we're also exploring partnerships to help the project thrive.

</details>

---

<details>
<summary><strong>How do local LLMs compare to larger models like Claude 3.5 Sonnet for bolt.diy?</strong></summary>

While local LLMs are improving rapidly, larger models like GPT-4o, Claude 3.5 Sonnet, and DeepSeek Coder V2 236b still offer the best results for complex applications. Our ongoing focus is to improve prompts, agents, and the platform to better support smaller local LLMs.

</details>

---

<details>
<summary><strong>Common Errors and Troubleshooting</strong></summary>

### **"There was an error processing this request"**
This generic error message means something went wrong. Check both:
- The terminal (if you started the app with Docker or `pnpm`).
- The developer console in your browser (press `F12` or right-click > *Inspect*, then go to the *Console* tab).
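As a quick sketch of where those messages surface, these are typical commands (assuming the project's standard `pnpm` scripts and Docker Compose development profile; adjust to your setup):

```shell
# Run the dev server in the foreground so errors print straight to the terminal
pnpm run dev

# When using Docker Compose, start the stack and stream the container logs instead
docker compose --profile development up
docker compose logs -f
```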

---

### **"x-api-key header missing"**
This error is sometimes resolved by restarting the Docker container.
If that doesn't work, try switching from Docker to `pnpm` or vice versa. We're actively investigating this issue.

---

### **Blank preview when running the app**
A blank preview often occurs due to hallucinated bad code or incorrect commands.
To troubleshoot:
- Check the developer console for errors.
- Remember, previews are core functionality, so the app isn't broken! We're working on making these errors more transparent.

---

### **"Everything works, but the results are bad"**
|
||||
Local LLMs like Qwen-2.5-Coder are powerful for small applications but still experimental for larger projects. For better results, consider using larger models like GPT-4o, Claude 3.5 Sonnet, or DeepSeek Coder V2 236b.
|
||||
|
||||
---
|
||||
|
||||
### **"Received structured exception #0xc0000005: access violation"**
|
||||
|
||||
If you are getting this, you are probably on Windows. The fix is generally to update the [Visual C++ Redistributable](https://learn.microsoft.com/en-us/cpp/windows/latest-supported-vc-redist?view=msvc-170)
|
||||
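If you use the `winget` package manager, an install command along these lines should fetch the redistributable (the package id is an assumption; verify it with `winget search vcredist` first):

```shell
# Hypothetical winget id; confirm with `winget search vcredist` before running
winget install --id Microsoft.VCRedist.2015+.x64 -e
```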
---
### **"Miniflare or Wrangler errors in Windows"**
|
||||
You will need to make sure you have the latest version of Visual Studio C++ installed (14.40.33816), more information here https://github.com/stackblitz-labs/bolt.diy/issues/19.
|
||||
</details>
|
||||
|
||||
---
|
||||
|
||||
|
# Welcome to bolt.diy
bolt.diy allows you to choose the LLM that you use for each prompt! Currently, you can use OpenAI, Anthropic, Ollama, OpenRouter, Gemini, LMStudio, Mistral, xAI, HuggingFace, DeepSeek, or Groq models - and it is easily extended to use any other model supported by the Vercel AI SDK! See the instructions below for running this locally and extending it to include more models.
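As an illustrative sketch of what extending the model list typically involves (the interface and entries below are hypothetical stand-ins, not the exact bolt.diy source), adding a model for an already-configured provider usually amounts to appending a registry entry:

```typescript
// Hypothetical shape of a model registry entry; field names are illustrative,
// not the exact bolt.diy source.
interface ModelInfo {
  name: string;     // model id sent to the provider's API
  label: string;    // display name shown in the UI model dropdown
  provider: string; // must match one of the configured providers
}

const staticModels: ModelInfo[] = [
  { name: 'claude-3-5-sonnet-20240620', label: 'Claude 3.5 Sonnet', provider: 'Anthropic' },
  // A hypothetical new model entry for an existing provider:
  { name: 'my-new-model', label: 'My New Model', provider: 'OpenAI' },
];

console.log(staticModels.map((m) => m.label).join(', '));
```

Models from providers not yet wired up additionally need a provider configuration backed by the corresponding Vercel AI SDK package.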

## Table of Contents
- [Join the community!](#join-the-community)
- [What's bolt.diy](#whats-boltdiy)
- [What Makes bolt.diy Different](#what-makes-boltdiy-different)
- [Setup](#setup)
- [Run with Docker](#run-with-docker)
  - [Using Helper Scripts](#1a-using-helper-scripts)
  - [Direct Docker Build Commands](#1b-direct-docker-build-commands-alternative-to-using-npm-scripts)
  - [Docker Compose with Profiles](#2-docker-compose-with-profiles-to-run-the-container)
- [Run Without Docker](#run-without-docker)
- [Adding New LLMs](#adding-new-llms)
- [Available Scripts](#available-scripts)
- [Development](#development)
- [Tips and Tricks](#tips-and-tricks)

---

## Join the community!