mirror of
https://github.com/stackblitz/bolt.new
synced 2025-02-06 04:48:04 +00:00
Update README.md
Fixed Readme to be up to date with prompting fixes
This commit is contained in:
parent cd4ddfd0a2 · commit 1766dd5aa8
@@ -11,6 +11,7 @@ This fork of bolt.new allows you to choose the LLM that you use for each prompt!
  - ✅ Autogenerate Ollama models from what is downloaded (@mosquet)
  - ✅ Filter models by provider (@jasonm23)
  - ✅ Download project as ZIP (@fabwaseem)
+ - ✅ Improvements to the main Bolt.new prompt in `app\lib\.server\llm\prompts.ts` (@kofi-bhr)
  - ⬜ LM Studio Integration
  - ⬜ DeepSeek API Integration
  - ⬜ Together Integration
@@ -19,7 +20,6 @@ This fork of bolt.new allows you to choose the LLM that you use for each prompt!
  - ⬜ Run agents in the backend instead of a single model call
  - ⬜ Publish projects directly to GitHub
  - ⬜ Load local projects into the app
- - ⬜ Improvements to the main Bolt.new prompt in `app\lib\.server\llm\prompts.ts` (there is definitely opportunity there)
  # Bolt.new: AI-Powered Full-Stack Web Development in the Browser