diff --git a/.env.example b/.env.example index 968e934..d5cfc5c 100644 --- a/.env.example +++ b/.env.example @@ -29,5 +29,16 @@ GOOGLE_GENERATIVE_AI_API_KEY= # EXAMPLE http://localhost:11434 OLLAMA_API_BASE_URL= +# You only need this environment variable set if you want to use OpenAI Like models +OPENAI_LIKE_API_BASE_URL= + +# Get your OpenAI Like API Key +OPENAI_LIKE_API_KEY= + +# Get your Mistral API Key by following these instructions - +# https://console.mistral.ai/api-keys/ +# You only need this environment variable set if you want to use Mistral models +MISTRAL_API_KEY= + # Include this environment variable if you want more logging for debugging locally VITE_LOG_LEVEL=debug diff --git a/.github/workflows/github-build-push.yml b/.github/workflows/github-build-push.yml new file mode 100644 index 0000000..4d4db05 --- /dev/null +++ b/.github/workflows/github-build-push.yml @@ -0,0 +1,39 @@ +name: Build and Push Container + +on: + push: + branches: + - main + # paths: + # - 'Dockerfile' + workflow_dispatch: +jobs: + build-and-push: + runs-on: [ubuntu-latest] + steps: + - name: Checkout code + uses: actions/checkout@v4 + + - name: Set up QEMU + uses: docker/setup-qemu-action@v1 + + - name: Set up Docker Buildx + uses: docker/setup-buildx-action@v1 + + - name: Login to GitHub Container Registry + uses: docker/login-action@v1 + with: + registry: ghcr.io + username: ${{ github.actor }} + password: ${{ secrets.GITHUB_TOKEN }} + + - name: Build and Push Containers + uses: docker/build-push-action@v2 + with: + context: . + file: Dockerfile + platforms: linux/amd64,linux/arm64 + push: true + tags: | + ghcr.io/${{ github.repository }}:latest + ghcr.io/${{ github.repository }}:${{ github.sha }} diff --git a/.gitignore b/.gitignore index 20f5d15..f141cc0 100644 --- a/.gitignore +++ b/.gitignore @@ -12,7 +12,7 @@ dist-ssr *.local .vscode/* -!.vscode/launch.json +.vscode/launch.json !.vscode/extensions.json .idea .DS_Store diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index b5a963f..c09eae8 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -1,95 +1,92 @@ -[![Bolt Open Source Codebase](./public/social_preview_index.jpg)](https://bolt.new) +# Contributing to Bolt.new Fork -> Welcome to the **Bolt** open-source codebase! This repo contains a simple example app using the core components from bolt.new to help you get started building **AI-powered software development tools** powered by StackBlitz’s **WebContainer API**. +First off, thank you for considering contributing to Bolt.new! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make Bolt.new a better tool for developers worldwide. -### Why Build with Bolt + WebContainer API +## 📋 Table of Contents +- [Code of Conduct](#code-of-conduct) +- [How Can I Contribute?](#how-can-i-contribute) +- [Pull Request Guidelines](#pull-request-guidelines) +- [Coding Standards](#coding-standards) +- [Development Setup](#development-setup) +- [Project Structure](#project-structure) -By building with the Bolt + WebContainer API you can create browser-based applications that let users **prompt, run, edit, and deploy** full-stack web apps directly in the browser, without the need for virtual machines. With WebContainer API, you can build apps that give AI direct access and full control over a **Node.js server**, **filesystem**, **package manager** and **dev terminal** inside your users browser tab. 
This powerful combination allows you to create a new class of development tools that support all major JavaScript libraries and Node packages right out of the box, all without remote environments or local installs. +## Code of Conduct -### What’s the Difference Between Bolt (This Repo) and [Bolt.new](https://bolt.new)? +This project and everyone participating in it is governed by our Code of Conduct. By participating, you are expected to uphold this code. Please report unacceptable behavior to the project maintainers. -- **Bolt.new**: This is the **commercial product** from StackBlitz—a hosted, browser-based AI development tool that enables users to prompt, run, edit, and deploy full-stack web applications directly in the browser. Built on top of the [Bolt open-source repo](https://github.com/stackblitz/bolt.new) and powered by the StackBlitz **WebContainer API**. +## How Can I Contribute? -- **Bolt (This Repo)**: This open-source repository provides the core components used to make **Bolt.new**. This repo contains the UI interface for Bolt as well as the server components, built using [Remix Run](https://remix.run/). By leveraging this repo and StackBlitz’s **WebContainer API**, you can create your own AI-powered development tools and full-stack applications that run entirely in the browser. +### 🐞 Reporting Bugs and Feature Requests +- Check the issue tracker to avoid duplicates +- Use the issue templates when available +- Include as much relevant information as possible +- For bugs, add steps to reproduce the issue -# Get Started Building with Bolt +### 🔧 Code Contributions +1. Fork the repository +2. Create a new branch for your feature/fix +3. Write your code +4. Submit a pull request -Bolt combines the capabilities of AI with sandboxed development environments to create a collaborative experience where code can be developed by the assistant and the programmer together. Bolt combines [WebContainer API](https://webcontainers.io/api) with [Claude Sonnet 3.5](https://www.anthropic.com/news/claude-3-5-sonnet) using [Remix](https://remix.run/) and the [AI SDK](https://sdk.vercel.ai/). +### ✨ Becoming a Core Contributor +We're looking for dedicated contributors to help maintain and grow this project. If you're interested in becoming a core contributor, please fill out our [Contributor Application Form](https://forms.gle/TBSteXSDCtBDwr5m7). -### WebContainer API +## Pull Request Guidelines -Bolt uses [WebContainers](https://webcontainers.io/) to run generated code in the browser. WebContainers provide Bolt with a full-stack sandbox environment using [WebContainer API](https://webcontainers.io/api). WebContainers run full-stack applications directly in the browser without the cost and security concerns of cloud hosted AI agents. WebContainers are interactive and editable, and enables Bolt's AI to run code and understand any changes from the user. +### 📝 PR Checklist +- [ ] Branch from the main branch +- [ ] Update documentation if needed +- [ ] Manually verify all new functionality works as expected +- [ ] Keep PRs focused and atomic -The [WebContainer API](https://webcontainers.io) is free for personal and open source usage. If you're building an application for commercial usage, you can learn more about our [WebContainer API commercial usage pricing here](https://stackblitz.com/pricing#webcontainer-api). +### 👀 Review Process +1. Manually test the changes +2. At least one maintainer review required +3. Address all review comments +4. 
Maintain clean commit history -### Remix App +## Coding Standards -Bolt is built with [Remix](https://remix.run/) and -deployed using [CloudFlare Pages](https://pages.cloudflare.com/) and -[CloudFlare Workers](https://workers.cloudflare.com/). +### 💻 General Guidelines +- Follow existing code style +- Comment complex logic +- Keep functions focused and small +- Use meaningful variable names -### AI SDK Integration - -Bolt uses the [AI SDK](https://github.com/vercel/ai) to integrate with AI -models. At this time, Bolt supports using Anthropic's Claude Sonnet 3.5. -You can get an API key from the [Anthropic API Console](https://console.anthropic.com/) to use with Bolt. -Take a look at how [Bolt uses the AI SDK](https://github.com/stackblitz/bolt.new/tree/main/app/lib/.server/llm) - -## Prerequisites - -Before you begin, ensure you have the following installed: - -- Node.js (v20.15.1) -- pnpm (v9.4.0) - -## Setup - -1. Clone the repository (if you haven't already): +## Development Setup +### 🔄 Initial Setup +1. Clone the repository: ```bash -git clone https://github.com/stackblitz/bolt.new.git +git clone https://github.com/coleam00/bolt.new-any-llm.git ``` 2. Install dependencies: - ```bash pnpm install ``` -3. Create a `.env.local` file in the root directory and add your Anthropic API key: - -``` +3. Set up environment variables: + - Rename `.env.example` to `.env.local` + - Add your LLM API keys (only set the ones you plan to use): +```bash +GROQ_API_KEY=XXX +OPENAI_API_KEY=XXX ANTHROPIC_API_KEY=XXX +... ``` - -Optionally, you can set the debug level: - -``` + - Optionally set debug level: +```bash VITE_LOG_LEVEL=debug ``` - **Important**: Never commit your `.env.local` file to version control. It's already included in .gitignore. -## Available Scripts - -- `pnpm run dev`: Starts the development server. -- `pnpm run build`: Builds the project. -- `pnpm run start`: Runs the built application locally using Wrangler Pages. This script uses `bindings.sh` to set up necessary bindings so you don't have to duplicate environment variables. -- `pnpm run preview`: Builds the project and then starts it locally, useful for testing the production build. Note, HTTP streaming currently doesn't work as expected with `wrangler pages dev`. -- `pnpm test`: Runs the test suite using Vitest. -- `pnpm run typecheck`: Runs TypeScript type checking. -- `pnpm run typegen`: Generates TypeScript types using Wrangler. -- `pnpm run deploy`: Builds the project and deploys it to Cloudflare Pages. - -## Development - -To start the development server: - +### 🚀 Running the Development Server ```bash pnpm run dev ``` -This will start the Remix Vite development server. +**Note**: You will need Google Chrome Canary to run this locally if you use Chrome! It's an easy install and a good browser for web development anyway. ## Testing diff --git a/Dockerfile b/Dockerfile index 6e4fc54..67f6089 100644 --- a/Dockerfile +++ b/Dockerfile @@ -28,5 +28,6 @@ RUN npm run build CMD [ "pnpm", "run", "dockerstart"] # Development image -FROM base AS bolt-ai-dev -ENTRYPOINT ["pnpm", "run", "dev", "--host"] \ No newline at end of file +FROM base AS bolt-ai-development +RUN mkdir -p ${WORKDIR}/run +CMD pnpm run dev --host \ No newline at end of file diff --git a/README.md b/README.md index 50d8f6b..116963f 100644 --- a/README.md +++ b/README.md @@ -11,6 +11,7 @@ This fork of Bolt.new allows you to choose the LLM that you use for each prompt! 
- ✅ Autogenerate Ollama models from what is downloaded (@yunatamos) - ✅ Filter models by provider (@jasonm23) - ✅ Download project as ZIP (@fabwaseem) +- ✅ Improvements to the main Bolt.new prompt in `app\lib\.server\llm\prompts.ts` (@kofi-bhr) - ⬜ LM Studio Integration - ⬜ DeepSeek API Integration - ⬜ Together Integration @@ -28,6 +29,7 @@ This fork of Bolt.new allows you to choose the LLM that you use for each prompt! - ⬜ Prompt caching - ⬜ Ability to enter API keys in the UI - ⬜ Prevent Bolt from rewriting files as often +- ⬜ Have LLM plan the project in a MD file for better results/transparency # Bolt.new: AI-Powered Full-Stack Web Development in the Browser @@ -114,7 +116,7 @@ To start the development server: pnpm run dev ``` -This will start the Remix Vite development server. You will need Google Chrome Canary to run this locally! It's an easy install and a good browser for web development anyway. +This will start the Remix Vite development server. You will need Google Chrome Canary to run this locally if you use Chrome! It's an easy install and a good browser for web development anyway. ## Tips and Tricks diff --git a/app/components/chat/BaseChat.tsx b/app/components/chat/BaseChat.tsx index b742134..c1175f7 100644 --- a/app/components/chat/BaseChat.tsx +++ b/app/components/chat/BaseChat.tsx @@ -28,7 +28,7 @@ const ModelSelector = ({ model, setModel, modelList, providerList }) => { const [provider, setProvider] = useState(DEFAULT_PROVIDER); return (
-
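As a usage note on the `.env.example` additions above: a minimal `.env.local` for this fork might look like the sketch below. It assumes you follow the setup steps in the updated CONTRIBUTING.md (rename `.env.example` to `.env.local`); every value is a placeholder, and only the providers you actually plan to use need keys.

```bash
# Placeholders only; set just the providers you plan to use
ANTHROPIC_API_KEY=XXX
GROQ_API_KEY=XXX
OPENAI_API_KEY=XXX

# Mistral key from https://console.mistral.ai/api-keys/ (variable added in this diff)
MISTRAL_API_KEY=XXX

# "OpenAI Like" providers need both a base URL and a key (variables added in this diff)
OPENAI_LIKE_API_BASE_URL=https://your-openai-compatible-endpoint/v1   # hypothetical endpoint
OPENAI_LIKE_API_KEY=XXX

# Local Ollama only needs the base URL
OLLAMA_API_BASE_URL=http://localhost:11434

# Optional: more verbose logging while developing
VITE_LOG_LEVEL=debug
```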
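The new `github-build-push.yml` workflow builds multi-arch images (linux/amd64 and linux/arm64) on pushes to `main` and publishes them to GitHub Container Registry tagged with both `latest` and the commit SHA. A sketch of pulling those images, assuming the image name resolves to `ghcr.io/<owner>/<repo>` as written in the workflow's `tags` block (substitute your own fork's owner and repository if they differ):

```bash
# Most recent image built from main
docker pull ghcr.io/coleam00/bolt.new-any-llm:latest

# Or pin to the image built from a specific commit
docker pull ghcr.io/coleam00/bolt.new-any-llm:<commit-sha>
```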
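Finally, the Dockerfile's development stage is renamed from `bolt-ai-dev` to `bolt-ai-development` and now starts the dev server with `CMD pnpm run dev --host`. A hedged sketch of building and running only that stage locally follows; the `5173` port mapping and the `--env-file` flag are assumptions based on Vite's default dev-server port, not something this diff specifies:

```bash
# Build only the development stage (target name taken from the Dockerfile diff above)
docker build --target bolt-ai-development -t bolt-ai:development .

# Run it with your local env vars; 5173 is Vite's default dev-server port (assumption)
docker run --rm -p 5173:5173 --env-file .env.local bolt-ai:development
```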