merge with upstream/main

Commit 7cdb56a847 by Andrew Trokhymenko, 2024-11-29 22:02:35 -05:00
38 changed files with 1463 additions and 952 deletions

.husky/pre-commit (new file)

@@ -0,0 +1,17 @@
#!/bin/sh
echo "🔍 Running pre-commit hook to check the code looks good... 🔍"
if ! pnpm typecheck; then
echo "❌ Type checking failed! Please review TypeScript types."
echo "Once you're done, don't forget to add your changes to the commit! 🚀"
exit 1
fi
if ! pnpm lint; then
echo "❌ Linting failed! 'pnpm lint:check' will help you fix the easy ones."
echo "Once you're done, don't forget to add your beautification to the commit! 🤩"
exit 1
fi
echo "👍 All good! Committing changes..."


@@ -1,9 +1,6 @@
-# Contributing to Bolt.new Fork
+# Contributing to oTToDev
-## DEFAULT_NUM_CTX
-The `DEFAULT_NUM_CTX` environment variable can be used to limit the maximum number of context values used by the qwen2.5-coder model. For example, to limit the context to 24576 values (which uses 32GB of VRAM), set `DEFAULT_NUM_CTX=24576` in your `.env.local` file.
-First off, thank you for considering contributing to Bolt.new! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make Bolt.new a better tool for developers worldwide.
+First off, thank you for considering contributing to oTToDev! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make oTToDev a better tool for developers worldwide.
## 📋 Table of Contents
- [Code of Conduct](#code-of-conduct)
@@ -56,6 +53,8 @@ We're looking for dedicated contributors to help maintain and grow this project.
- Comment complex logic
- Keep functions focused and small
- Use meaningful variable names
+- Lint your code. This repo contains a pre-commit hook that will verify your code is linted properly,
+  so set up your IDE to do that for you!
## Development Setup


@@ -29,6 +29,8 @@ https://thinktank.ottomator.ai
- ✅ Bolt terminal to see the output of LLM run commands (@thecodacus)
- ✅ Streaming of code output (@thecodacus)
- ✅ Ability to revert code to earlier version (@wonderwhy-er)
+- ✅ Cohere Integration (@hasanraiyan)
+- ✅ Dynamic model max token length (@hasanraiyan)
- ⬜ **HIGH PRIORITY** - Prevent Bolt from rewriting files as often (file locking and diffs)
- ⬜ **HIGH PRIORITY** - Better prompting for smaller LLMs (code window sometimes doesn't start)
- ⬜ **HIGH PRIORITY** - Load local projects into the app
@@ -39,8 +41,6 @@ https://thinktank.ottomator.ai
- ⬜ Azure Open AI API Integration
- ⬜ Perplexity Integration
- ⬜ Vertex AI Integration
-- ✅ Cohere Integration (@hasanraiyan)
-- ✅ Dynamic model max token length (@hasanraiyan)
- ⬜ Deploy directly to Vercel/Netlify/other similar platforms
- ⬜ Prompt caching
- ⬜ Better prompt enhancing
@@ -246,14 +246,55 @@ pnpm run dev
This will start the Remix Vite development server. You will need Google Chrome Canary to run this locally if you use Chrome! It's an easy install and a good browser for web development anyway.
-## Tips and Tricks
+## FAQ
-Here are some tips to get the most out of Bolt.new:
+### How do I get the best results with oTToDev?
- **Be specific about your stack**: If you want to use specific frameworks or libraries (like Astro, Tailwind, ShadCN, or any other popular JavaScript framework), mention them in your initial prompt to ensure Bolt scaffolds the project accordingly.
- **Use the enhance prompt icon**: Before sending your prompt, try clicking the 'enhance' icon to have the AI model help you refine your prompt, then edit the results before submitting.
-- **Scaffold the basics first, then add features**: Make sure the basic structure of your application is in place before diving into more advanced functionality. This helps Bolt understand the foundation of your project and ensure everything is wired up right before building out more advanced functionality.
+- **Scaffold the basics first, then add features**: Make sure the basic structure of your application is in place before diving into more advanced functionality. This helps oTToDev understand the foundation of your project and ensures everything is wired up right before building out more advanced functionality.
-- **Batch simple instructions**: Save time by combining simple instructions into one message. For example, you can ask Bolt to change the color scheme, add mobile responsiveness, and restart the dev server, all in one go saving you time and reducing API credit consumption significantly.
+- **Batch simple instructions**: Save time by combining simple instructions into one message. For example, you can ask oTToDev to change the color scheme, add mobile responsiveness, and restart the dev server, all in one go, saving you time and reducing API credit consumption significantly.
+### How do I contribute to oTToDev?
+[Please check out our dedicated page for contributing to oTToDev here!](CONTRIBUTING.md)
+### Do you plan on merging oTToDev back into the official Bolt.new repo?
+More news on this coming early next month - stay tuned!
+### What are the future plans for oTToDev?
+[Check out our Roadmap here!](https://roadmap.sh/r/ottodev-roadmap-2ovzo)
+Lots more updates to this roadmap are coming soon!
+### Why are there so many open issues/pull requests?
+oTToDev was started simply to showcase how to edit an open-source project and to do something cool with local LLMs on my (@ColeMedin) YouTube channel! However, it quickly grew into a massive community project that I am working hard to keep up with by forming a team of maintainers and getting as many people involved as I can. That effort is going well, and all of our maintainers are ABSOLUTE rockstars, but it still takes time to organize everything so we can efficiently get through all the issues and PRs. But rest assured, we are working hard and even working on some partnerships behind the scenes to really help this project take off!
+### How do local LLMs fare compared to larger models like Claude 3.5 Sonnet for oTToDev/Bolt.new?
+As much as the gap is quickly closing between open-source and massive closed-source models, you're still going to get the best results with the very large models like GPT-4o, Claude 3.5 Sonnet, and DeepSeek Coder V2 236b. This is one of the big tasks we have at hand - figuring out how to prompt better, use agents, and improve the platform as a whole to make it work better for even the smaller local LLMs!
+### I'm getting the error: "There was an error processing this request"
+If you see this error within oTToDev, that is just the application telling you there is a problem at a high level, and this could mean a number of different things. To find the actual error, please check BOTH the terminal where you started the application (with Docker or pnpm) and the developer console in the browser. For most browsers, you can access the developer console by pressing F12 or right-clicking anywhere in the browser and selecting "Inspect", then going to the "Console" tab in the top right.
+### I'm getting the error: "x-api-key header missing"
+We have seen this error a couple of times, and for some reason just restarting the Docker container has fixed it. This seems to be Ollama-specific. Another thing to try is running oTToDev with Docker or pnpm, whichever you didn't run first. We are still on the hunt for why this happens once in a while!
+### I'm getting a blank preview when oTToDev runs my app!
+We promise you that we are constantly testing new PRs coming into oTToDev, and the preview is core functionality, so the application is not broken! When you get a blank preview or don't get a preview at all, this is generally because the LLM hallucinated bad code or incorrect commands. We are working on making this more transparent so it is obvious. Sometimes the error will appear in the developer console too, so check that as well.
+### Everything works, but the results are bad
+This goes back to the point above about how local LLMs are getting very powerful, but you are still going to see better (sometimes much better) results with the largest LLMs like GPT-4o, Claude 3.5 Sonnet, and DeepSeek Coder V2 236b. If you are using smaller LLMs like Qwen-2.5-Coder, consider them more experimental and educational at this point. They can build smaller applications really well, which is super impressive for a local LLM, but for larger-scale applications you still want to use the larger LLMs!


@@ -10,6 +10,7 @@ interface APIKeyManagerProps {
  labelForGetApiKey?: string;
}

+// eslint-disable-next-line @typescript-eslint/naming-convention
export const APIKeyManager: React.FC<APIKeyManagerProps> = ({ provider, apiKey, setApiKey }) => {
  const [isEditing, setIsEditing] = useState(false);
  const [tempKey, setTempKey] = useState(apiKey);


@@ -1,47 +1,45 @@
-// @ts-nocheck
-// Preventing TS checks with files presented in the video for a better presentation.
+/*
+ * @ts-nocheck
+ * Preventing TS checks with files presented in the video for a better presentation.
+ */
import type { Message } from 'ai';
-import React, { type RefCallback, useEffect } from 'react';
+import React, { type RefCallback, useEffect, useState } from 'react';
import { ClientOnly } from 'remix-utils/client-only';
import { Menu } from '~/components/sidebar/Menu.client';
import { IconButton } from '~/components/ui/IconButton';
import { Workbench } from '~/components/workbench/Workbench.client';
import { classNames } from '~/utils/classNames';
-import { MODEL_LIST, DEFAULT_PROVIDER, PROVIDER_LIST, initializeModelList } from '~/utils/constants';
+import { MODEL_LIST, PROVIDER_LIST, initializeModelList } from '~/utils/constants';
import { Messages } from './Messages.client';
import { SendButton } from './SendButton.client';
-import { useState } from 'react';
import { APIKeyManager } from './APIKeyManager';
import Cookies from 'js-cookie';
+import * as Tooltip from '@radix-ui/react-tooltip';
import styles from './BaseChat.module.scss';
import type { ProviderInfo } from '~/utils/types';
+import { ExportChatButton } from '~/components/chat/chatExportAndImport/ExportChatButton';
+import { ImportButtons } from '~/components/chat/chatExportAndImport/ImportButtons';
+import { ExamplePrompts } from '~/components/chat/ExamplePrompts';
import FilePreview from './FilePreview';

-const EXAMPLE_PROMPTS = [
-  { text: 'Build a todo app in React using Tailwind' },
-  { text: 'Build a simple blog using Astro' },
-  { text: 'Create a cookie consent form using Material UI' },
-  { text: 'Make a space invaders game' },
-  { text: 'How do I center a div?' },
-];
-const providerList = PROVIDER_LIST;
+// @ts-ignore TODO: Introduce proper types
+// eslint-disable-next-line @typescript-eslint/no-unused-vars
const ModelSelector = ({ model, setModel, provider, setProvider, modelList, providerList, apiKeys }) => {
  return (
    <div className="mb-2 flex gap-2 flex-col sm:flex-row">
      <select
        value={provider?.name}
        onChange={(e) => {
-          setProvider(providerList.find((p) => p.name === e.target.value));
+          setProvider(providerList.find((p: ProviderInfo) => p.name === e.target.value));
          const firstModel = [...modelList].find((m) => m.provider == e.target.value);
          setModel(firstModel ? firstModel.name : '');
        }}
        className="flex-1 p-2 rounded-lg border border-bolt-elements-borderColor bg-bolt-elements-prompt-background text-bolt-elements-textPrimary focus:outline-none focus:ring-2 focus:ring-bolt-elements-focus transition-all"
      >
-        {providerList.map((provider) => (
+        {providerList.map((provider: ProviderInfo) => (
          <option key={provider.name} value={provider.name}>
            {provider.name}
          </option>
@@ -75,6 +73,7 @@ interface BaseChatProps {
  chatStarted?: boolean;
  isStreaming?: boolean;
  messages?: Message[];
+  description?: string;
  enhancingPrompt?: boolean;
  promptEnhanced?: boolean;
  input?: string;
@@ -86,6 +85,8 @@ interface BaseChatProps {
  sendMessage?: (event: React.UIEvent, messageInput?: string) => void;
  handleInputChange?: (event: React.ChangeEvent<HTMLTextAreaElement>) => void;
  enhancePrompt?: () => void;
+  importChat?: (description: string, messages: Message[]) => Promise<void>;
+  exportChat?: () => void;
  uploadedFiles?: File[];
  setUploadedFiles?: (files: File[]) => void;
  imageDataList?: string[];
@@ -111,12 +112,13 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
      enhancePrompt,
      sendMessage,
      handleStop,
-      uploadedFiles,
+      importChat,
+      exportChat,
+      uploadedFiles = [],
      setUploadedFiles,
-      imageDataList,
+      imageDataList = [],
      setImageDataList,
      messages,
-      children, // Add this
    },
    ref,
  ) => {
@@ -128,14 +130,17 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
      // Load API keys from cookies on component mount
      try {
        const storedApiKeys = Cookies.get('apiKeys');
        if (storedApiKeys) {
          const parsedKeys = JSON.parse(storedApiKeys);
          if (typeof parsedKeys === 'object' && parsedKeys !== null) {
            setApiKeys(parsedKeys);
          }
        }
      } catch (error) {
        console.error('Error loading API keys from cookies:', error);
        // Clear invalid cookie data
        Cookies.remove('apiKeys');
      }
@@ -149,6 +154,7 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
      try {
        const updatedApiKeys = { ...apiKeys, [provider]: key };
        setApiKeys(updatedApiKeys);
        // Save updated API keys to cookies with 30 day expiry and secure settings
        Cookies.set('apiKeys', JSON.stringify(updatedApiKeys), {
          expires: 30, // 30 days
@@ -161,11 +167,6 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
      }
    };

-    const handleRemoveFile = () => {
-      setUploadedFiles([]);
-      setImageDataList([]);
-    };

    const handleFileUpload = () => {
      const input = document.createElement('input');
      input.type = 'file';
@@ -173,8 +174,10 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
      input.onchange = async (e) => {
        const file = (e.target as HTMLInputElement).files?.[0];
        if (file) {
          const reader = new FileReader();
          reader.onload = (e) => {
            const base64Image = e.target?.result as string;
            setUploadedFiles?.([...uploadedFiles, file]);
@@ -187,7 +190,7 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
      input.click();
    };

-    return (
+    const baseChat = (
      <div
        ref={ref}
        className={classNames(
@@ -300,6 +303,7 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
                  handleStop?.();
                  return;
                }
                if (input.length > 0 || uploadedFiles.length > 0) {
                  sendMessage?.(event);
                }
@@ -309,22 +313,19 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
              </ClientOnly>
              <div className="flex justify-between items-center text-sm p-4 pt-2">
                <div className="flex gap-1 items-center">
-                  <IconButton
-                    title="Upload file"
-                    className="transition-all"
-                    onClick={() => handleFileUpload()}
-                  >
+                  <IconButton title="Upload file" className="transition-all" onClick={() => handleFileUpload()}>
                    <div className="i-ph:paperclip text-xl"></div>
                  </IconButton>
                  <IconButton
                    title="Enhance prompt"
                    disabled={input.length === 0 || enhancingPrompt}
-                    className={classNames('transition-all', {
-                      'opacity-100!': enhancingPrompt,
-                      'text-bolt-elements-item-contentAccent! pr-1.5 enabled:hover:bg-bolt-elements-item-backgroundAccent!':
-                        promptEnhanced,
-                    })}
+                    className={classNames(
+                      'transition-all',
+                      enhancingPrompt ? 'opacity-100' : '',
+                      promptEnhanced ? 'text-bolt-elements-item-contentAccent' : '',
+                      promptEnhanced ? 'pr-1.5' : '',
+                      promptEnhanced ? 'enabled:hover:bg-bolt-elements-item-backgroundAccent' : '',
+                    )}
                    onClick={() => enhancePrompt?.()}
                  >
                    {enhancingPrompt ? (
@@ -339,6 +340,7 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
                      </>
                    )}
                  </IconButton>
+                  {chatStarted && <ClientOnly>{() => <ExportChatButton exportChat={exportChat} />}</ClientOnly>}
                </div>
                {input.length > 3 ? (
                  <div className="text-xs text-bolt-elements-textTertiary">
@@ -351,30 +353,14 @@ export const BaseChat = React.forwardRef<HTMLDivElement, BaseChatProps>(
                  </div>
                </div>
              </div>
-              {!chatStarted && (
-                <div id="examples" className="relative w-full max-w-xl mx-auto mt-8 flex justify-center">
-                  <div className="flex flex-col space-y-2 [mask-image:linear-gradient(to_bottom,black_0%,transparent_180%)] hover:[mask-image:none]">
-                    {EXAMPLE_PROMPTS.map((examplePrompt, index) => {
-                      return (
-                        <button
-                          key={index}
-                          onClick={(event) => {
-                            sendMessage?.(event, examplePrompt.text);
-                          }}
-                          className="group flex items-center w-full gap-2 justify-center bg-transparent text-bolt-elements-textTertiary hover:text-bolt-elements-textPrimary transition-theme"
-                        >
-                          {examplePrompt.text}
-                          <div className="i-ph:arrow-bend-down-left" />
-                        </button>
-                      );
-                    })}
-                  </div>
-                </div>
-              )}
+              {!chatStarted && ImportButtons(importChat)}
+              {!chatStarted && ExamplePrompts(sendMessage)}
            </div>
            <ClientOnly>{() => <Workbench chatStarted={chatStarted} isStreaming={isStreaming} />}</ClientOnly>
          </div>
        </div>
      );
+      return <Tooltip.Provider delayDuration={200}>{baseChat}</Tooltip.Provider>;
    },
  );
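The API-key handling scattered across the hunks above reduces to a small cookie round trip with js-cookie. The sketch below only condenses that pattern for readability; the cookie options after `expires: 30` are truncated in this view, so the `secure`/`sameSite` values here are assumptions rather than what the commit necessarily sets.

import Cookies from 'js-cookie';

type ApiKeys = Record<string, string>;

// Read previously saved keys; invalid JSON clears the cookie, mirroring the catch block above.
function loadApiKeys(): ApiKeys {
  try {
    const stored = Cookies.get('apiKeys');
    const parsed = stored ? JSON.parse(stored) : null;
    return parsed && typeof parsed === 'object' ? (parsed as ApiKeys) : {};
  } catch (error) {
    console.error('Error loading API keys from cookies:', error);
    Cookies.remove('apiKeys'); // clear invalid cookie data
    return {};
  }
}

// Persist one provider's key for 30 days; secure/sameSite are assumed values here.
function saveApiKey(apiKeys: ApiKeys, provider: string, key: string): ApiKeys {
  const updated = { ...apiKeys, [provider]: key };
  Cookies.set('apiKeys', JSON.stringify(updated), { expires: 30, secure: true, sameSite: 'strict' });
  return updated;
}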


@@ -1,5 +1,7 @@
-// @ts-nocheck
-// Preventing TS checks with files presented in the video for a better presentation.
+/*
+ * @ts-nocheck
+ * Preventing TS checks with files presented in the video for a better presentation.
+ */
import { useStore } from '@nanostores/react';
import type { Message } from 'ai';
import { useChat } from 'ai/react';
@@ -7,10 +9,9 @@ import { useAnimate } from 'framer-motion';
import { memo, useEffect, useRef, useState } from 'react';
import { cssTransition, toast, ToastContainer } from 'react-toastify';
import { useMessageParser, usePromptEnhancer, useShortcuts, useSnapScroll } from '~/lib/hooks';
-import { useChatHistory } from '~/lib/persistence';
+import { description, useChatHistory } from '~/lib/persistence';
import { chatStore } from '~/lib/stores/chat';
import { workbenchStore } from '~/lib/stores/workbench';
-import { fileModificationsToHTML } from '~/utils/diff';
import { DEFAULT_MODEL, DEFAULT_PROVIDER, PROVIDER_LIST } from '~/utils/constants';
import { cubicEasingFn } from '~/utils/easings';
import { createScopedLogger, renderLogger } from '~/utils/logger';
@@ -28,11 +29,20 @@ const logger = createScopedLogger('Chat');
export function Chat() {
  renderLogger.trace('Chat');
-  const { ready, initialMessages, storeMessageHistory } = useChatHistory();
+  const { ready, initialMessages, storeMessageHistory, importChat, exportChat } = useChatHistory();
+  const title = useStore(description);

  return (
    <>
-      {ready && <ChatImpl initialMessages={initialMessages} storeMessageHistory={storeMessageHistory} />}
+      {ready && (
+        <ChatImpl
+          description={title}
+          initialMessages={initialMessages}
+          exportChat={exportChat}
+          storeMessageHistory={storeMessageHistory}
+          importChat={importChat}
+        />
+      )}
      <ToastContainer
        closeButton={({ closeToast }) => {
          return (
@@ -67,247 +77,255 @@ export function Chat() {
interface ChatProps {
  initialMessages: Message[];
  storeMessageHistory: (messages: Message[]) => Promise<void>;
+  importChat: (description: string, messages: Message[]) => Promise<void>;
+  exportChat: () => void;
+  description?: string;
}
export const ChatImpl = memo(({ initialMessages, storeMessageHistory }: ChatProps) => { export const ChatImpl = memo(
useShortcuts(); ({ description, initialMessages, storeMessageHistory, importChat, exportChat }: ChatProps) => {
useShortcuts();
const textareaRef = useRef<HTMLTextAreaElement>(null); const textareaRef = useRef<HTMLTextAreaElement>(null);
const [chatStarted, setChatStarted] = useState(initialMessages.length > 0); const [chatStarted, setChatStarted] = useState(initialMessages.length > 0);
const [uploadedFiles, setUploadedFiles] = useState<File[]>([]); // Move here const [uploadedFiles, setUploadedFiles] = useState<File[]>([]); // Move here
const [imageDataList, setImageDataList] = useState<string[]>([]); // Move here const [imageDataList, setImageDataList] = useState<string[]>([]); // Move here
const [model, setModel] = useState(() => {
const savedModel = Cookies.get('selectedModel');
return savedModel || DEFAULT_MODEL;
});
const [provider, setProvider] = useState(() => {
const savedProvider = Cookies.get('selectedProvider');
return PROVIDER_LIST.find((p) => p.name === savedProvider) || DEFAULT_PROVIDER;
});
const [model, setModel] = useState(() => { const { showChat } = useStore(chatStore);
const savedModel = Cookies.get('selectedModel');
return savedModel || DEFAULT_MODEL;
});
const [provider, setProvider] = useState(() => {
const savedProvider = Cookies.get('selectedProvider');
return PROVIDER_LIST.find(p => p.name === savedProvider) || DEFAULT_PROVIDER;
});
const { showChat } = useStore(chatStore); const [animationScope, animate] = useAnimate();
const [animationScope, animate] = useAnimate(); const [apiKeys, setApiKeys] = useState<Record<string, string>>({});
const [apiKeys, setApiKeys] = useState<Record<string, string>>({}); const { messages, isLoading, input, handleInputChange, setInput, stop, append } = useChat({
api: '/api/chat',
const { messages, isLoading, input, handleInputChange, setInput, stop, append } = useChat({ body: {
api: '/api/chat', apiKeys,
body: { },
apiKeys onError: (error) => {
}, logger.error('Request failed\n\n', error);
onError: (error) => { toast.error(
logger.error('Request failed\n\n', error); 'There was an error processing your request: ' + (error.message ? error.message : 'No details were returned'),
toast.error('There was an error processing your request: ' + (error.message ? error.message : "No details were returned"));
},
onFinish: () => {
logger.debug('Finished streaming');
},
initialMessages,
});
const { enhancingPrompt, promptEnhanced, enhancePrompt, resetEnhancer } = usePromptEnhancer();
const { parsedMessages, parseMessages } = useMessageParser();
const TEXTAREA_MAX_HEIGHT = chatStarted ? 400 : 200;
useEffect(() => {
chatStore.setKey('started', initialMessages.length > 0);
}, []);
useEffect(() => {
parseMessages(messages, isLoading);
if (messages.length > initialMessages.length) {
storeMessageHistory(messages).catch((error) => toast.error(error.message));
}
}, [messages, isLoading, parseMessages]);
const scrollTextArea = () => {
const textarea = textareaRef.current;
if (textarea) {
textarea.scrollTop = textarea.scrollHeight;
}
};
const abort = () => {
stop();
chatStore.setKey('aborted', true);
workbenchStore.abortAllActions();
};
useEffect(() => {
const textarea = textareaRef.current;
if (textarea) {
textarea.style.height = 'auto';
const scrollHeight = textarea.scrollHeight;
textarea.style.height = `${Math.min(scrollHeight, TEXTAREA_MAX_HEIGHT)}px`;
textarea.style.overflowY = scrollHeight > TEXTAREA_MAX_HEIGHT ? 'auto' : 'hidden';
}
}, [input, textareaRef]);
const runAnimation = async () => {
if (chatStarted) {
return;
}
await Promise.all([
animate('#examples', { opacity: 0, display: 'none' }, { duration: 0.1 }),
animate('#intro', { opacity: 0, flex: 1 }, { duration: 0.2, ease: cubicEasingFn }),
]);
chatStore.setKey('started', true);
setChatStarted(true);
};
const sendMessage = async (_event: React.UIEvent, messageInput?: string) => {
const _input = messageInput || input;
if (_input.length === 0 || isLoading) {
return;
}
/**
* @note (delm) Usually saving files shouldn't take long but it may take longer if there
* many unsaved files. In that case we need to block user input and show an indicator
* of some kind so the user is aware that something is happening. But I consider the
* happy case to be no unsaved files and I would expect users to save their changes
* before they send another message.
*/
await workbenchStore.saveAllFiles();
const fileModifications = workbenchStore.getFileModifcations();
chatStore.setKey('aborted', false);
runAnimation();
if (fileModifications !== undefined) {
const diff = fileModificationsToHTML(fileModifications);
/**
* If we have file modifications we append a new user message manually since we have to prefix
* the user input with the file modifications and we don't want the new user input to appear
* in the prompt. Using `append` is almost the same as `handleSubmit` except that we have to
* manually reset the input and we'd have to manually pass in file attachments. However, those
* aren't relevant here.
*/
append({
role: 'user',
content: [
{
type: 'text',
text: `[Model: ${model}]\n\n[Provider: ${provider.name}]\n\n${diff}\n\n${_input}`
},
...(imageDataList.map(imageData => ({
type: 'image',
image: imageData
})))
]
});
/**
* After sending a new message we reset all modifications since the model
* should now be aware of all the changes.
*/
workbenchStore.resetAllFileModifications();
} else {
append({
role: 'user',
content: [
{
type: 'text',
text: `[Model: ${model}]\n\n[Provider: ${provider.name}]\n\n${_input}`
},
...(imageDataList.map(imageData => ({
type: 'image',
image: imageData
})))
]
});
}
setInput('');
// Add file cleanup here
setUploadedFiles([]);
setImageDataList([]);
resetEnhancer();
textareaRef.current?.blur();
};
const [messageRef, scrollRef] = useSnapScroll();
useEffect(() => {
const storedApiKeys = Cookies.get('apiKeys');
if (storedApiKeys) {
setApiKeys(JSON.parse(storedApiKeys));
}
}, []);
const handleModelChange = (newModel: string) => {
setModel(newModel);
Cookies.set('selectedModel', newModel, { expires: 30 });
};
const handleProviderChange = (newProvider: ProviderInfo) => {
setProvider(newProvider);
Cookies.set('selectedProvider', newProvider.name, { expires: 30 });
};
return (
<BaseChat
ref={animationScope}
textareaRef={textareaRef}
input={input}
showChat={showChat}
chatStarted={chatStarted}
isStreaming={isLoading}
enhancingPrompt={enhancingPrompt}
promptEnhanced={promptEnhanced}
sendMessage={sendMessage}
model={model}
setModel={handleModelChange}
provider={provider}
setProvider={handleProviderChange}
messageRef={messageRef}
scrollRef={scrollRef}
handleInputChange={handleInputChange}
handleStop={abort}
messages={messages.map((message, i) => {
if (message.role === 'user') {
return message;
}
return {
...message,
content: parsedMessages[i] || '',
};
})}
enhancePrompt={() => {
enhancePrompt(
input,
(input) => {
setInput(input);
scrollTextArea();
},
model,
provider,
apiKeys
); );
}} },
uploadedFiles={uploadedFiles} onFinish: () => {
setUploadedFiles={setUploadedFiles} logger.debug('Finished streaming');
imageDataList={imageDataList} },
setImageDataList={setImageDataList} initialMessages,
/> });
);
});
const { enhancingPrompt, promptEnhanced, enhancePrompt, resetEnhancer } = usePromptEnhancer();
const { parsedMessages, parseMessages } = useMessageParser();
const TEXTAREA_MAX_HEIGHT = chatStarted ? 400 : 200;
useEffect(() => {
chatStore.setKey('started', initialMessages.length > 0);
}, []);
useEffect(() => {
parseMessages(messages, isLoading);
if (messages.length > initialMessages.length) {
storeMessageHistory(messages).catch((error) => toast.error(error.message));
}
}, [messages, isLoading, parseMessages]);
const scrollTextArea = () => {
const textarea = textareaRef.current;
if (textarea) {
textarea.scrollTop = textarea.scrollHeight;
}
};
const abort = () => {
stop();
chatStore.setKey('aborted', true);
workbenchStore.abortAllActions();
};
useEffect(() => {
const textarea = textareaRef.current;
if (textarea) {
textarea.style.height = 'auto';
const scrollHeight = textarea.scrollHeight;
textarea.style.height = `${Math.min(scrollHeight, TEXTAREA_MAX_HEIGHT)}px`;
textarea.style.overflowY = scrollHeight > TEXTAREA_MAX_HEIGHT ? 'auto' : 'hidden';
}
}, [input, textareaRef]);
const runAnimation = async () => {
if (chatStarted) {
return;
}
await Promise.all([
animate('#examples', { opacity: 0, display: 'none' }, { duration: 0.1 }),
animate('#intro', { opacity: 0, flex: 1 }, { duration: 0.2, ease: cubicEasingFn }),
]);
chatStore.setKey('started', true);
setChatStarted(true);
};
const sendMessage = async (_event: React.UIEvent, messageInput?: string) => {
const _input = messageInput || input;
if (_input.length === 0 || isLoading) {
return;
}
/**
* @note (delm) Usually saving files shouldn't take long but it may take longer if there
* many unsaved files. In that case we need to block user input and show an indicator
* of some kind so the user is aware that something is happening. But I consider the
* happy case to be no unsaved files and I would expect users to save their changes
* before they send another message.
*/
await workbenchStore.saveAllFiles();
const fileModifications = workbenchStore.getFileModifcations();
chatStore.setKey('aborted', false);
runAnimation();
if (fileModifications !== undefined) {
/**
* If we have file modifications we append a new user message manually since we have to prefix
* the user input with the file modifications and we don't want the new user input to appear
* in the prompt. Using `append` is almost the same as `handleSubmit` except that we have to
* manually reset the input and we'd have to manually pass in file attachments. However, those
* aren't relevant here.
*/
append({
role: 'user',
content: [
{
type: 'text',
text: `[Model: ${model}]\n\n[Provider: ${provider.name}]\n\n${_input}`,
},
...imageDataList.map((imageData) => ({
type: 'image',
image: imageData,
})),
] as any, // Type assertion to bypass compiler check
});
/**
* After sending a new message we reset all modifications since the model
* should now be aware of all the changes.
*/
workbenchStore.resetAllFileModifications();
} else {
append({
role: 'user',
content: [
{
type: 'text',
text: `[Model: ${model}]\n\n[Provider: ${provider.name}]\n\n${_input}`,
},
...imageDataList.map((imageData) => ({
type: 'image',
image: imageData,
})),
] as any, // Type assertion to bypass compiler check
});
}
setInput('');
// Add file cleanup here
setUploadedFiles([]);
setImageDataList([]);
resetEnhancer();
textareaRef.current?.blur();
};
const [messageRef, scrollRef] = useSnapScroll();
useEffect(() => {
const storedApiKeys = Cookies.get('apiKeys');
if (storedApiKeys) {
setApiKeys(JSON.parse(storedApiKeys));
}
}, []);
const handleModelChange = (newModel: string) => {
setModel(newModel);
Cookies.set('selectedModel', newModel, { expires: 30 });
};
const handleProviderChange = (newProvider: ProviderInfo) => {
setProvider(newProvider);
Cookies.set('selectedProvider', newProvider.name, { expires: 30 });
};
return (
<BaseChat
ref={animationScope}
textareaRef={textareaRef}
input={input}
showChat={showChat}
chatStarted={chatStarted}
isStreaming={isLoading}
enhancingPrompt={enhancingPrompt}
promptEnhanced={promptEnhanced}
sendMessage={sendMessage}
model={model}
setModel={handleModelChange}
provider={provider}
setProvider={handleProviderChange}
messageRef={messageRef}
scrollRef={scrollRef}
handleInputChange={handleInputChange}
handleStop={abort}
description={description}
importChat={importChat}
exportChat={exportChat}
messages={messages.map((message, i) => {
if (message.role === 'user') {
return message;
}
return {
...message,
content: parsedMessages[i] || '',
};
})}
enhancePrompt={() => {
enhancePrompt(
input,
(input) => {
setInput(input);
scrollTextArea();
},
model,
provider,
apiKeys,
);
}}
uploadedFiles={uploadedFiles}
setUploadedFiles={setUploadedFiles}
imageDataList={imageDataList}
setImageDataList={setImageDataList}
/>
);
},
);
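The Chat wiring above pulls importChat, exportChat, and friends from useChatHistory in ~/lib/persistence, whose implementation is not part of this diff. The shape below is only inferred from the call sites visible here (Chat.client, Menu.client, HistoryItem) and may differ from the actual module.

import type { Message } from 'ai';

// Inferred surface of useChatHistory, reconstructed only from how this diff consumes it.
interface UseChatHistory {
  ready: boolean; // gates rendering of <ChatImpl />
  initialMessages: Message[]; // restored messages for the current chat
  storeMessageHistory: (messages: Message[]) => Promise<void>;
  importChat: (description: string, messages: Message[]) => Promise<void>; // used by ImportButtons / ImportFolderButton
  exportChat: (id?: string) => void; // HistoryItem calls exportChat(item.id)
  duplicateCurrentChat: (id: string) => void; // destructured in Menu.client
}

// The same module also exposes a `description` nanostores atom, read above via useStore(description).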


@@ -0,0 +1,32 @@
import React from 'react';
const EXAMPLE_PROMPTS = [
{ text: 'Build a todo app in React using Tailwind' },
{ text: 'Build a simple blog using Astro' },
{ text: 'Create a cookie consent form using Material UI' },
{ text: 'Make a space invaders game' },
{ text: 'How do I center a div?' },
];
export function ExamplePrompts(sendMessage?: { (event: React.UIEvent, messageInput?: string): void | undefined }) {
return (
<div id="examples" className="relative w-full max-w-xl mx-auto mt-8 flex justify-center">
<div className="flex flex-col space-y-2 [mask-image:linear-gradient(to_bottom,black_0%,transparent_180%)] hover:[mask-image:none]">
{EXAMPLE_PROMPTS.map((examplePrompt, index: number) => {
return (
<button
key={index}
onClick={(event) => {
sendMessage?.(event, examplePrompt.text);
}}
className="group flex items-center w-full gap-2 justify-center bg-transparent text-bolt-elements-textTertiary hover:text-bolt-elements-textPrimary transition-theme"
>
{examplePrompt.text}
<div className="i-ph:arrow-bend-down-left" />
</button>
);
})}
</div>
</div>
);
}


@@ -3,37 +3,37 @@ import React from 'react';
// Rest of the interface remains the same
interface FilePreviewProps {
  files: File[];
  imageDataList: string[];
  onRemove: (index: number) => void;
}

const FilePreview: React.FC<FilePreviewProps> = ({ files, imageDataList, onRemove }) => {
  if (!files || files.length === 0) {
    return null;
  }

  return (
    <div className="flex flex-row overflow-x-auto">
      {files.map((file, index) => (
        <div key={file.name + file.size} className="mr-2 relative">
          {imageDataList[index] && (
            <div className="relative">
              <img src={imageDataList[index]} alt={file.name} className="max-h-20" />
              <button
                onClick={() => onRemove(index)}
                className="absolute -top-2 -right-2 z-10 bg-white rounded-full p-1 shadow-md hover:bg-gray-100"
              >
                <div className="bg-black rounded-full p-1">
                  <div className="i-ph:x w-3 h-3 text-gray-400" />
                </div>
              </button>
            </div>
          )}
        </div>
      ))}
    </div>
  );
};

export default FilePreview;


@@ -0,0 +1,164 @@
import React from 'react';
import type { Message } from 'ai';
import { toast } from 'react-toastify';
import ignore from 'ignore';
interface ImportFolderButtonProps {
className?: string;
importChat?: (description: string, messages: Message[]) => Promise<void>;
}
// Common patterns to ignore, similar to .gitignore
const IGNORE_PATTERNS = [
'node_modules/**',
'.git/**',
'dist/**',
'build/**',
'.next/**',
'coverage/**',
'.cache/**',
'.vscode/**',
'.idea/**',
'**/*.log',
'**/.DS_Store',
'**/npm-debug.log*',
'**/yarn-debug.log*',
'**/yarn-error.log*',
];
const ig = ignore().add(IGNORE_PATTERNS);
const generateId = () => Math.random().toString(36).substring(2, 15);
const isBinaryFile = async (file: File): Promise<boolean> => {
const chunkSize = 1024; // Read the first 1 KB of the file
const buffer = new Uint8Array(await file.slice(0, chunkSize).arrayBuffer());
for (let i = 0; i < buffer.length; i++) {
const byte = buffer[i];
if (byte === 0 || (byte < 32 && byte !== 9 && byte !== 10 && byte !== 13)) {
return true; // Found a binary character
}
}
return false;
};
export const ImportFolderButton: React.FC<ImportFolderButtonProps> = ({ className, importChat }) => {
const shouldIncludeFile = (path: string): boolean => {
return !ig.ignores(path);
};
const createChatFromFolder = async (files: File[], binaryFiles: string[]) => {
const fileArtifacts = await Promise.all(
files.map(async (file) => {
return new Promise<string>((resolve, reject) => {
const reader = new FileReader();
reader.onload = () => {
const content = reader.result as string;
const relativePath = file.webkitRelativePath.split('/').slice(1).join('/');
resolve(
`<boltAction type="file" filePath="${relativePath}">
${content}
</boltAction>`,
);
};
reader.onerror = reject;
reader.readAsText(file);
});
}),
);
const binaryFilesMessage =
binaryFiles.length > 0
? `\n\nSkipped ${binaryFiles.length} binary files:\n${binaryFiles.map((f) => `- ${f}`).join('\n')}`
: '';
const message: Message = {
role: 'assistant',
content: `I'll help you set up these files.${binaryFilesMessage}
<boltArtifact id="imported-files" title="Imported Files">
${fileArtifacts.join('\n\n')}
</boltArtifact>`,
id: generateId(),
createdAt: new Date(),
};
const userMessage: Message = {
role: 'user',
id: generateId(),
content: 'Import my files',
createdAt: new Date(),
};
const description = `Folder Import: ${files[0].webkitRelativePath.split('/')[0]}`;
if (importChat) {
await importChat(description, [userMessage, message]);
}
};
return (
<>
<input
type="file"
id="folder-import"
className="hidden"
webkitdirectory=""
directory=""
onChange={async (e) => {
const allFiles = Array.from(e.target.files || []);
const filteredFiles = allFiles.filter((file) => shouldIncludeFile(file.webkitRelativePath));
if (filteredFiles.length === 0) {
toast.error('No files found in the selected folder');
return;
}
try {
const fileChecks = await Promise.all(
filteredFiles.map(async (file) => ({
file,
isBinary: await isBinaryFile(file),
})),
);
const textFiles = fileChecks.filter((f) => !f.isBinary).map((f) => f.file);
const binaryFilePaths = fileChecks
.filter((f) => f.isBinary)
.map((f) => f.file.webkitRelativePath.split('/').slice(1).join('/'));
if (textFiles.length === 0) {
toast.error('No text files found in the selected folder');
return;
}
if (binaryFilePaths.length > 0) {
toast.info(`Skipping ${binaryFilePaths.length} binary files`);
}
await createChatFromFolder(textFiles, binaryFilePaths);
} catch (error) {
console.error('Failed to import folder:', error);
toast.error('Failed to import folder');
}
e.target.value = ''; // Reset file input
}}
{...({} as any)} // if removed, webkitdirectory will throw errors as an unknown attribute
/>
<button
onClick={() => {
const input = document.getElementById('folder-import');
input?.click();
}}
className={className}
>
<div className="i-ph:folder-simple-upload" />
Import Folder
</button>
</>
);
};
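To make the folder-import flow concrete: for a hypothetical folder my-app/ containing index.html and README.md, createChatFromFolder above would hand importChat a description of "Folder Import: my-app" plus a message pair shaped roughly like the following. Ids, dates, and file contents are illustrative only.

import type { Message } from 'ai';

// Illustrative result of createChatFromFolder for "my-app/{index.html, README.md}".
const description = 'Folder Import: my-app';

const importedMessages: Message[] = [
  { id: 'u1', role: 'user', content: 'Import my files' },
  {
    id: 'a1',
    role: 'assistant',
    content: `I'll help you set up these files.

<boltArtifact id="imported-files" title="Imported Files">
<boltAction type="file" filePath="index.html">
<!doctype html><html>...</html>
</boltAction>

<boltAction type="file" filePath="README.md">
# My App
</boltAction>
</boltArtifact>`,
  },
];

// The pair is then handed off via importChat(description, importedMessages).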


@@ -3,11 +3,11 @@ import React from 'react';
import { classNames } from '~/utils/classNames';
import { AssistantMessage } from './AssistantMessage';
import { UserMessage } from './UserMessage';
-import * as Tooltip from '@radix-ui/react-tooltip';
import { useLocation } from '@remix-run/react';
import { db, chatId } from '~/lib/persistence/useChatHistory';
import { forkChat } from '~/lib/persistence/db';
import { toast } from 'react-toastify';
+import WithTooltip from '~/components/ui/Tooltip';

interface MessagesProps {
  id?: string;
@@ -41,92 +41,66 @@ export const Messages = React.forwardRef<HTMLDivElement, MessagesProps>((props:
}; };
return ( return (
<Tooltip.Provider delayDuration={200}> <div id={id} ref={ref} className={props.className}>
<div id={id} ref={ref} className={props.className}> {messages.length > 0
{messages.length > 0 ? messages.map((message, index) => {
? messages.map((message, index) => { const { role, content, id: messageId } = message;
const { role, content, id: messageId } = message; const isUserMessage = role === 'user';
const isUserMessage = role === 'user'; const isFirst = index === 0;
const isFirst = index === 0; const isLast = index === messages.length - 1;
const isLast = index === messages.length - 1;
return ( return (
<div <div
key={index} key={index}
className={classNames('flex gap-4 p-6 w-full rounded-[calc(0.75rem-1px)]', { className={classNames('flex gap-4 p-6 w-full rounded-[calc(0.75rem-1px)]', {
'bg-bolt-elements-messages-background': isUserMessage || !isStreaming || (isStreaming && !isLast), 'bg-bolt-elements-messages-background': isUserMessage || !isStreaming || (isStreaming && !isLast),
'bg-gradient-to-b from-bolt-elements-messages-background from-30% to-transparent': 'bg-gradient-to-b from-bolt-elements-messages-background from-30% to-transparent':
isStreaming && isLast, isStreaming && isLast,
'mt-4': !isFirst, 'mt-4': !isFirst,
})} })}
> >
{isUserMessage && ( {isUserMessage && (
<div className="flex items-center justify-center w-[34px] h-[34px] overflow-hidden bg-white text-gray-600 rounded-full shrink-0 self-start"> <div className="flex items-center justify-center w-[34px] h-[34px] overflow-hidden bg-white text-gray-600 rounded-full shrink-0 self-start">
<div className="i-ph:user-fill text-xl"></div> <div className="i-ph:user-fill text-xl"></div>
</div>
)}
<div className="grid grid-col-1 w-full">
{isUserMessage ? <UserMessage content={content} /> : <AssistantMessage content={content} />}
</div> </div>
{!isUserMessage && ( )}
<div className="flex gap-2 flex-col lg:flex-row"> <div className="grid grid-col-1 w-full">
<Tooltip.Root> {isUserMessage ? <UserMessage content={content} /> : <AssistantMessage content={content} />}
<Tooltip.Trigger asChild>
{messageId && (
<button
onClick={() => handleRewind(messageId)}
key="i-ph:arrow-u-up-left"
className={classNames(
'i-ph:arrow-u-up-left',
'text-xl text-bolt-elements-textSecondary hover:text-bolt-elements-textPrimary transition-colors',
)}
/>
)}
</Tooltip.Trigger>
<Tooltip.Portal>
<Tooltip.Content
className="bg-bolt-elements-tooltip-background text-bolt-elements-textPrimary px-3 py-2 rounded-lg text-sm shadow-lg"
sideOffset={5}
style={{ zIndex: 1000 }}
>
Revert to this message
<Tooltip.Arrow className="fill-bolt-elements-tooltip-background" />
</Tooltip.Content>
</Tooltip.Portal>
</Tooltip.Root>
<Tooltip.Root>
<Tooltip.Trigger asChild>
<button
onClick={() => handleFork(messageId)}
key="i-ph:git-fork"
className={classNames(
'i-ph:git-fork',
'text-xl text-bolt-elements-textSecondary hover:text-bolt-elements-textPrimary transition-colors',
)}
/>
</Tooltip.Trigger>
<Tooltip.Portal>
<Tooltip.Content
className="bg-bolt-elements-tooltip-background text-bolt-elements-textPrimary px-3 py-2 rounded-lg text-sm shadow-lg"
sideOffset={5}
style={{ zIndex: 1000 }}
>
Fork chat from this message
<Tooltip.Arrow className="fill-bolt-elements-tooltip-background" />
</Tooltip.Content>
</Tooltip.Portal>
</Tooltip.Root>
</div>
)}
</div> </div>
); {!isUserMessage && (
}) <div className="flex gap-2 flex-col lg:flex-row">
: null} <WithTooltip tooltip="Revert to this message">
{isStreaming && ( {messageId && (
<div className="text-center w-full text-bolt-elements-textSecondary i-svg-spinners:3-dots-fade text-4xl mt-4"></div> <button
)} onClick={() => handleRewind(messageId)}
</div> key="i-ph:arrow-u-up-left"
</Tooltip.Provider> className={classNames(
'i-ph:arrow-u-up-left',
'text-xl text-bolt-elements-textSecondary hover:text-bolt-elements-textPrimary transition-colors',
)}
/>
)}
</WithTooltip>
<WithTooltip tooltip="Fork chat from this message">
<button
onClick={() => handleFork(messageId)}
key="i-ph:git-fork"
className={classNames(
'i-ph:git-fork',
'text-xl text-bolt-elements-textSecondary hover:text-bolt-elements-textPrimary transition-colors',
)}
/>
</WithTooltip>
</div>
)}
</div>
);
})
: null}
{isStreaming && (
<div className="text-center w-full text-bolt-elements-textSecondary i-svg-spinners:3-dots-fade text-4xl mt-4"></div>
)}
</div>
); );
}); });


@@ -1,6 +1,7 @@
-// @ts-nocheck
-// Preventing TS checks with files presented in the video for a better presentation.
-import { modificationsRegex } from '~/utils/diff';
+/*
+ * @ts-nocheck
+ * Preventing TS checks with files presented in the video for a better presentation.
+ */
import { MODEL_REGEX, PROVIDER_REGEX } from '~/utils/constants';
import { Markdown } from './Markdown';
@@ -18,11 +19,11 @@ export function UserMessage({ content }: UserMessageProps) {
  );
}

-function sanitizeUserMessage(content: string | Array<{type: string, text?: string, image_url?: {url: string}}>) {
+function sanitizeUserMessage(content: string | Array<{ type: string; text?: string; image_url?: { url: string } }>) {
  if (Array.isArray(content)) {
-    const textItem = content.find(item => item.type === 'text');
+    const textItem = content.find((item) => item.type === 'text');
    return textItem?.text?.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, '') || '';
  }

  return content.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, '');
}


@@ -0,0 +1,13 @@
import WithTooltip from '~/components/ui/Tooltip';
import { IconButton } from '~/components/ui/IconButton';
import React from 'react';
export const ExportChatButton = ({ exportChat }: { exportChat?: () => void }) => {
return (
<WithTooltip tooltip="Export Chat">
<IconButton title="Export Chat" onClick={() => exportChat?.()}>
<div className="i-ph:download-simple text-xl"></div>
</IconButton>
</WithTooltip>
);
};


@@ -0,0 +1,71 @@
import type { Message } from 'ai';
import { toast } from 'react-toastify';
import React from 'react';
import { ImportFolderButton } from '~/components/chat/ImportFolderButton';
export function ImportButtons(importChat: ((description: string, messages: Message[]) => Promise<void>) | undefined) {
return (
<div className="flex flex-col items-center justify-center flex-1 p-4">
<input
type="file"
id="chat-import"
className="hidden"
accept=".json"
onChange={async (e) => {
const file = e.target.files?.[0];
if (file && importChat) {
try {
const reader = new FileReader();
reader.onload = async (e) => {
try {
const content = e.target?.result as string;
const data = JSON.parse(content);
if (!Array.isArray(data.messages)) {
toast.error('Invalid chat file format');
return; // bail out so malformed data is not passed to importChat
}
await importChat(data.description, data.messages);
toast.success('Chat imported successfully');
} catch (error: unknown) {
if (error instanceof Error) {
toast.error('Failed to parse chat file: ' + error.message);
} else {
toast.error('Failed to parse chat file');
}
}
};
reader.onerror = () => toast.error('Failed to read chat file');
reader.readAsText(file);
} catch (error) {
toast.error(error instanceof Error ? error.message : 'Failed to import chat');
}
e.target.value = ''; // Reset file input
} else {
toast.error('Something went wrong');
}
}}
/>
<div className="flex flex-col items-center gap-4 max-w-2xl text-center">
<div className="flex gap-2">
<button
onClick={() => {
const input = document.getElementById('chat-import');
input?.click();
}}
className="px-4 py-2 rounded-lg border border-bolt-elements-borderColor bg-bolt-elements-prompt-background text-bolt-elements-textPrimary hover:bg-bolt-elements-background-depth-3 transition-all flex items-center gap-2"
>
<div className="i-ph:upload-simple" />
Import Chat
</button>
<ImportFolderButton
importChat={importChat}
className="px-4 py-2 rounded-lg border border-bolt-elements-borderColor bg-bolt-elements-prompt-background text-bolt-elements-textPrimary hover:bg-bolt-elements-background-depth-3 transition-all flex items-center gap-2"
/>
</div>
</div>
</div>
);
}
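For reference, the file input above only reads two fields from the uploaded JSON: description and messages. A minimal example of a chat export that would import cleanly is sketched below; the values are illustrative, and the real export format produced by exportChat is not shown in this diff.

import type { Message } from 'ai';

// The handler above checks Array.isArray(data.messages) and passes data.description straight through.
interface ChatExport {
  description: string;
  messages: Message[];
}

// Hand-written sample; ids and content are made up for illustration.
const exampleExport: ChatExport = {
  description: 'Todo app in React',
  messages: [
    { id: '1', role: 'user', content: 'Build a todo app in React using Tailwind' },
    { id: '2', role: 'assistant', content: 'Sure - scaffolding the project now.' },
  ],
};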


@@ -1,70 +1,55 @@
import * as Dialog from '@radix-ui/react-dialog';
-import { useEffect, useRef, useState } from 'react';
import { type ChatHistoryItem } from '~/lib/persistence';
+import WithTooltip from '~/components/ui/Tooltip';

interface HistoryItemProps {
  item: ChatHistoryItem;
  onDelete?: (event: React.UIEvent) => void;
  onDuplicate?: (id: string) => void;
+  exportChat: (id?: string) => void;
}

-export function HistoryItem({ item, onDelete, onDuplicate }: HistoryItemProps) {
+export function HistoryItem({ item, onDelete, onDuplicate, exportChat }: HistoryItemProps) {
const [hovering, setHovering] = useState(false);
const hoverRef = useRef<HTMLDivElement>(null);
useEffect(() => {
let timeout: NodeJS.Timeout | undefined;
function mouseEnter() {
setHovering(true);
if (timeout) {
clearTimeout(timeout);
}
}
function mouseLeave() {
setHovering(false);
}
hoverRef.current?.addEventListener('mouseenter', mouseEnter);
hoverRef.current?.addEventListener('mouseleave', mouseLeave);
return () => {
hoverRef.current?.removeEventListener('mouseenter', mouseEnter);
hoverRef.current?.removeEventListener('mouseleave', mouseLeave);
};
}, []);
return ( return (
<div <div className="group rounded-md text-bolt-elements-textSecondary hover:text-bolt-elements-textPrimary hover:bg-bolt-elements-background-depth-3 overflow-hidden flex justify-between items-center px-2 py-1">
ref={hoverRef}
className="group rounded-md text-bolt-elements-textSecondary hover:text-bolt-elements-textPrimary hover:bg-bolt-elements-background-depth-3 overflow-hidden flex justify-between items-center px-2 py-1"
>
<a href={`/chat/${item.urlId}`} className="flex w-full relative truncate block"> <a href={`/chat/${item.urlId}`} className="flex w-full relative truncate block">
{item.description} {item.description}
<div className="absolute right-0 z-1 top-0 bottom-0 bg-gradient-to-l from-bolt-elements-background-depth-2 group-hover:from-bolt-elements-background-depth-3 to-transparent w-10 flex justify-end group-hover:w-15 group-hover:from-45%"> <div className="absolute right-0 z-1 top-0 bottom-0 bg-gradient-to-l from-bolt-elements-background-depth-2 group-hover:from-bolt-elements-background-depth-3 box-content pl-3 to-transparent w-10 flex justify-end group-hover:w-15 group-hover:from-99%">
          {hovering && (
            <div className="flex items-center p-1 text-bolt-elements-textSecondary">
              {onDuplicate && (
                <button
                  className="i-ph:copy scale-110 mr-2"
                  onClick={() => onDuplicate?.(item.id)}
                  title="Duplicate chat"
                />
              )}
              <Dialog.Trigger asChild>
                <button
                  className="i-ph:trash scale-110"
                  onClick={(event) => {
                    // we prevent the default so we don't trigger the anchor above
                    event.preventDefault();
                    onDelete?.(event);
                  }}
                />
              </Dialog.Trigger>
            </div>
          )}
          <div className="flex items-center p-1 text-bolt-elements-textSecondary opacity-0 group-hover:opacity-100 transition-opacity">
            <WithTooltip tooltip="Export chat">
              <button
                type="button"
                className="i-ph:download-simple scale-110 mr-2 hover:text-bolt-elements-item-contentAccent"
                onClick={(event) => {
                  event.preventDefault();
                  exportChat(item.id);
                }}
                title="Export chat"
              />
            </WithTooltip>
            {onDuplicate && (
              <WithTooltip tooltip="Duplicate chat">
                <button
                  type="button"
                  className="i-ph:copy scale-110 mr-2 hover:text-bolt-elements-item-contentAccent"
                  onClick={() => onDuplicate?.(item.id)}
                  title="Duplicate chat"
                />
              </WithTooltip>
            )}
            <Dialog.Trigger asChild>
              <WithTooltip tooltip="Delete chat">
                <button
                  type="button"
                  className="i-ph:trash scale-110 hover:text-bolt-elements-button-danger-text"
                  onClick={(event) => {
                    event.preventDefault();
                    onDelete?.(event);
                  }}
                />
              </WithTooltip>
            </Dialog.Trigger>
          </div>
</div> </div>
</a> </a>
</div> </div>
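The rewritten row above also swaps the old useState/useEffect hover tracking for a pure CSS reveal: the container keeps the `group` utility class and the action buttons fade in with `opacity-0 group-hover:opacity-100`. A self-contained sketch of the same pattern, using generic utility classes rather than the project's design tokens:

// HoverActions.tsx — minimal sketch of the CSS-only hover reveal used by HistoryItem.
// The class names below are plain Tailwind-style utilities chosen for illustration.
export function HoverActions({ label, onExport }: { label: string; onExport: () => void }) {
  return (
    <div className="group flex items-center justify-between rounded px-2 py-1 hover:bg-gray-100">
      <span className="truncate">{label}</span>
      {/* hidden until the parent .group is hovered — no React state or listeners needed */}
      <button
        type="button"
        className="opacity-0 transition-opacity group-hover:opacity-100"
        onClick={onExport}
      >
        Export
      </button>
    </div>
  );
}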

View File

@ -2,7 +2,6 @@ import { motion, type Variants } from 'framer-motion';
import { useCallback, useEffect, useRef, useState } from 'react'; import { useCallback, useEffect, useRef, useState } from 'react';
import { toast } from 'react-toastify'; import { toast } from 'react-toastify';
import { Dialog, DialogButton, DialogDescription, DialogRoot, DialogTitle } from '~/components/ui/Dialog'; import { Dialog, DialogButton, DialogDescription, DialogRoot, DialogTitle } from '~/components/ui/Dialog';
import { IconButton } from '~/components/ui/IconButton';
import { ThemeSwitch } from '~/components/ui/ThemeSwitch'; import { ThemeSwitch } from '~/components/ui/ThemeSwitch';
import { db, deleteById, getAll, chatId, type ChatHistoryItem, useChatHistory } from '~/lib/persistence'; import { db, deleteById, getAll, chatId, type ChatHistoryItem, useChatHistory } from '~/lib/persistence';
import { cubicEasingFn } from '~/utils/easings'; import { cubicEasingFn } from '~/utils/easings';
@ -34,7 +33,7 @@ const menuVariants = {
type DialogContent = { type: 'delete'; item: ChatHistoryItem } | null; type DialogContent = { type: 'delete'; item: ChatHistoryItem } | null;
export const Menu = () => { export const Menu = () => {
const { duplicateCurrentChat } = useChatHistory(); const { duplicateCurrentChat, exportChat } = useChatHistory();
const menuRef = useRef<HTMLDivElement>(null); const menuRef = useRef<HTMLDivElement>(null);
const [list, setList] = useState<ChatHistoryItem[]>([]); const [list, setList] = useState<ChatHistoryItem[]>([]);
const [open, setOpen] = useState(false); const [open, setOpen] = useState(false);
@ -102,7 +101,6 @@ export const Menu = () => {
const handleDeleteClick = (event: React.UIEvent, item: ChatHistoryItem) => { const handleDeleteClick = (event: React.UIEvent, item: ChatHistoryItem) => {
event.preventDefault(); event.preventDefault();
setDialogContent({ type: 'delete', item }); setDialogContent({ type: 'delete', item });
}; };
@ -131,7 +129,7 @@ export const Menu = () => {
</a> </a>
</div> </div>
<div className="text-bolt-elements-textPrimary font-medium pl-6 pr-5 my-2">Your Chats</div> <div className="text-bolt-elements-textPrimary font-medium pl-6 pr-5 my-2">Your Chats</div>
<div className="flex-1 overflow-scroll pl-4 pr-5 pb-5"> <div className="flex-1 overflow-auto pl-4 pr-5 pb-5">
{list.length === 0 && <div className="pl-2 text-bolt-elements-textTertiary">No previous conversations</div>} {list.length === 0 && <div className="pl-2 text-bolt-elements-textTertiary">No previous conversations</div>}
<DialogRoot open={dialogContent !== null}> <DialogRoot open={dialogContent !== null}>
{binDates(list).map(({ category, items }) => ( {binDates(list).map(({ category, items }) => (
@ -143,6 +141,7 @@ export const Menu = () => {
<HistoryItem <HistoryItem
key={item.id} key={item.id}
item={item} item={item}
exportChat={exportChat}
onDelete={(event) => handleDeleteClick(event, item)} onDelete={(event) => handleDeleteClick(event, item)}
onDuplicate={() => handleDuplicate(item.id)} onDuplicate={() => handleDuplicate(item.id)}
/> />
@ -186,4 +185,4 @@ export const Menu = () => {
</div> </div>
</motion.div> </motion.div>
); );
} };

View File

@ -0,0 +1,73 @@
import * as Tooltip from '@radix-ui/react-tooltip';
interface TooltipProps {
tooltip: React.ReactNode;
children: React.ReactNode;
sideOffset?: number;
className?: string;
arrowClassName?: string;
tooltipStyle?: React.CSSProperties;
position?: 'top' | 'bottom' | 'left' | 'right';
maxWidth?: number;
delay?: number;
}
const WithTooltip = ({
tooltip,
children,
sideOffset = 5,
className = '',
arrowClassName = '',
tooltipStyle = {},
position = 'top',
maxWidth = 250,
delay = 0,
}: TooltipProps) => {
return (
<Tooltip.Root delayDuration={delay}>
<Tooltip.Trigger asChild>{children}</Tooltip.Trigger>
<Tooltip.Portal>
<Tooltip.Content
side={position}
className={`
z-[2000]
px-2.5
py-1.5
max-h-[300px]
select-none
rounded-md
bg-bolt-elements-background-depth-3
text-bolt-elements-textPrimary
text-sm
leading-tight
shadow-lg
animate-in
fade-in-0
zoom-in-95
data-[state=closed]:animate-out
data-[state=closed]:fade-out-0
data-[state=closed]:zoom-out-95
${className}
`}
sideOffset={sideOffset}
style={{
maxWidth,
...tooltipStyle,
}}
>
<div className="break-words">{tooltip}</div>
<Tooltip.Arrow
className={`
fill-bolt-elements-background-depth-3
${arrowClassName}
`}
width={12}
height={6}
/>
</Tooltip.Content>
</Tooltip.Portal>
</Tooltip.Root>
);
};
export default WithTooltip;
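WithTooltip only renders Radix's Root/Trigger/Content, so a Tooltip.Provider still has to be mounted somewhere above it (the placement below is an assumption — the app may already provide one near its root). A minimal usage sketch:

import * as Tooltip from '@radix-ui/react-tooltip';
import WithTooltip from '~/components/ui/Tooltip';

// Sketch: decorate any focusable element with a tooltip. The icon class and handler
// are illustrative; the Trigger uses asChild, so the child must accept a ref (a button does).
export function ExportButtonWithTooltip({ onExport }: { onExport: () => void }) {
  return (
    <Tooltip.Provider delayDuration={200}>
      <WithTooltip tooltip="Export chat" position="bottom" maxWidth={200}>
        <button type="button" className="i-ph:download-simple" onClick={onExport} />
      </WithTooltip>
    </Tooltip.Provider>
  );
}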

View File

@ -239,7 +239,7 @@ export const EditorPanel = memo(
<div className="i-ph:terminal-window-duotone text-lg" /> <div className="i-ph:terminal-window-duotone text-lg" />
Terminal {terminalCount > 1 && index} Terminal {terminalCount > 1 && index}
</button> </button>
</React.Fragment> </React.Fragment>
)} )}
</React.Fragment> </React.Fragment>
); );
@ -255,6 +255,7 @@ export const EditorPanel = memo(
</div> </div>
{Array.from({ length: terminalCount + 1 }, (_, index) => { {Array.from({ length: terminalCount + 1 }, (_, index) => {
const isActive = activeTerminal === index; const isActive = activeTerminal === index;
if (index == 0) { if (index == 0) {
logger.info('Starting bolt terminal'); logger.info('Starting bolt terminal');
@ -273,6 +274,7 @@ export const EditorPanel = memo(
/> />
); );
} }
return ( return (
<Terminal <Terminal
key={index} key={index}

View File

@ -111,7 +111,7 @@ export const FileTree = memo(
}; };
return ( return (
<div className={classNames('text-sm', className)}> <div className={classNames('text-sm', className, 'overflow-y-auto')}>
{filteredFileList.map((fileOrFolder) => { {filteredFileList.map((fileOrFolder) => {
switch (fileOrFolder.kind) { switch (fileOrFolder.kind) {
case 'file': { case 'file': {

View File

@ -57,7 +57,6 @@ export const Workbench = memo(({ chatStarted, isStreaming }: WorkspaceProps) =>
renderLogger.trace('Workbench'); renderLogger.trace('Workbench');
const [isSyncing, setIsSyncing] = useState(false); const [isSyncing, setIsSyncing] = useState(false);
const [isUploading, setIsUploading] = useState(false);
const hasPreview = useStore(computed(workbenchStore.previews, (previews) => previews.length > 0)); const hasPreview = useStore(computed(workbenchStore.previews, (previews) => previews.length > 0));
const showWorkbench = useStore(workbenchStore.showWorkbench); const showWorkbench = useStore(workbenchStore.showWorkbench);
@ -120,60 +119,6 @@ export const Workbench = memo(({ chatStarted, isStreaming }: WorkspaceProps) =>
} }
}, []); }, []);
const handleUploadFiles = useCallback(async () => {
setIsUploading(true);
try {
// const directoryHandle = await window.showDirectoryPicker();
// // First upload new files
// await workbenchStore.uploadFilesFromDisk(directoryHandle);
// // Get current files state
// const currentFiles = workbenchStore.files.get();
// // Create new modifications map with all files as "new"
// const newModifications = new Map();
// Object.entries(currentFiles).forEach(([path, file]) => {
// if (file.type === 'file') {
// newModifications.set(path, file.content);
// }
// });
// // Update workbench state
// await workbenchStore.refreshFiles();
// workbenchStore.resetAllFileModifications();
// toast.success('Files uploaded successfully');
// } catch (error) {
// toast.error('Failed to upload files');
// }
await handleUploadFilesFunc();
}
finally {
setIsUploading(false);
}
}, []);
async function handleUploadFilesFunc() {
try {
// First clean all statuses
await workbenchStore.saveAllFiles();
await workbenchStore.resetAllFileModifications();
await workbenchStore.refreshFiles();
// Now upload new files
const directoryHandle = await window.showDirectoryPicker();
await workbenchStore.uploadFilesFromDisk(directoryHandle);
toast.success('Files uploaded successfully');
} catch (error) {
console.error('Upload files error:', error);
toast.error('Failed to upload files');
}
}
return ( return (
chatStarted && ( chatStarted && (
<motion.div <motion.div
@ -213,10 +158,6 @@ export const Workbench = memo(({ chatStarted, isStreaming }: WorkspaceProps) =>
{isSyncing ? <div className="i-ph:spinner" /> : <div className="i-ph:cloud-arrow-down" />} {isSyncing ? <div className="i-ph:spinner" /> : <div className="i-ph:cloud-arrow-down" />}
{isSyncing ? 'Syncing...' : 'Sync Files'} {isSyncing ? 'Syncing...' : 'Sync Files'}
</PanelHeaderButton> </PanelHeaderButton>
<PanelHeaderButton className="mr-1 text-sm" onClick={handleUploadFiles} disabled={isSyncing}>
{isSyncing ? <div className="i-ph:spinner" /> : <div className="i-ph:cloud-arrow-up" />}
{isSyncing ? 'Uploading...' : 'Upload Files'}
</PanelHeaderButton>
<PanelHeaderButton <PanelHeaderButton
className="mr-1 text-sm" className="mr-1 text-sm"
onClick={() => { onClick={() => {
@ -233,16 +174,21 @@ export const Workbench = memo(({ chatStarted, isStreaming }: WorkspaceProps) =>
'Please enter a name for your new GitHub repository:', 'Please enter a name for your new GitHub repository:',
'bolt-generated-project', 'bolt-generated-project',
); );
if (!repoName) { if (!repoName) {
alert('Repository name is required. Push to GitHub cancelled.'); alert('Repository name is required. Push to GitHub cancelled.');
return; return;
} }
const githubUsername = prompt('Please enter your GitHub username:'); const githubUsername = prompt('Please enter your GitHub username:');
if (!githubUsername) { if (!githubUsername) {
alert('GitHub username is required. Push to GitHub cancelled.'); alert('GitHub username is required. Push to GitHub cancelled.');
return; return;
} }
const githubToken = prompt('Please enter your GitHub personal access token:'); const githubToken = prompt('Please enter your GitHub personal access token:');
if (!githubToken) { if (!githubToken) {
alert('GitHub token is required. Push to GitHub cancelled.'); alert('GitHub token is required. Push to GitHub cancelled.');
return; return;

View File

@ -1,5 +1,7 @@
// @ts-nocheck /*
// Preventing TS checks with files presented in the video for a better presentation. * @ts-nocheck
* Preventing TS checks with files presented in the video for a better presentation.
*/
import { env } from 'node:process'; import { env } from 'node:process';
export function getAPIKey(cloudflareEnv: Env, provider: string, userApiKeys?: Record<string, string>) { export function getAPIKey(cloudflareEnv: Env, provider: string, userApiKeys?: Record<string, string>) {
@ -28,17 +30,19 @@ export function getAPIKey(cloudflareEnv: Env, provider: string, userApiKeys?: Re
case 'OpenRouter': case 'OpenRouter':
return env.OPEN_ROUTER_API_KEY || cloudflareEnv.OPEN_ROUTER_API_KEY; return env.OPEN_ROUTER_API_KEY || cloudflareEnv.OPEN_ROUTER_API_KEY;
case 'Deepseek': case 'Deepseek':
return env.DEEPSEEK_API_KEY || cloudflareEnv.DEEPSEEK_API_KEY return env.DEEPSEEK_API_KEY || cloudflareEnv.DEEPSEEK_API_KEY;
case 'Mistral': case 'Mistral':
return env.MISTRAL_API_KEY || cloudflareEnv.MISTRAL_API_KEY; return env.MISTRAL_API_KEY || cloudflareEnv.MISTRAL_API_KEY;
case "OpenAILike": case 'OpenAILike':
return env.OPENAI_LIKE_API_KEY || cloudflareEnv.OPENAI_LIKE_API_KEY; return env.OPENAI_LIKE_API_KEY || cloudflareEnv.OPENAI_LIKE_API_KEY;
case "xAI": case 'xAI':
return env.XAI_API_KEY || cloudflareEnv.XAI_API_KEY; return env.XAI_API_KEY || cloudflareEnv.XAI_API_KEY;
case "Cohere": case 'Cohere':
return env.COHERE_API_KEY; return env.COHERE_API_KEY;
case 'AzureOpenAI':
return env.AZURE_OPENAI_API_KEY;
default: default:
return ""; return '';
} }
} }
@ -47,14 +51,17 @@ export function getBaseURL(cloudflareEnv: Env, provider: string) {
case 'OpenAILike': case 'OpenAILike':
return env.OPENAI_LIKE_API_BASE_URL || cloudflareEnv.OPENAI_LIKE_API_BASE_URL; return env.OPENAI_LIKE_API_BASE_URL || cloudflareEnv.OPENAI_LIKE_API_BASE_URL;
case 'LMStudio': case 'LMStudio':
return env.LMSTUDIO_API_BASE_URL || cloudflareEnv.LMSTUDIO_API_BASE_URL || "http://localhost:1234"; return env.LMSTUDIO_API_BASE_URL || cloudflareEnv.LMSTUDIO_API_BASE_URL || 'http://localhost:1234';
case 'Ollama': case 'Ollama': {
let baseUrl = env.OLLAMA_API_BASE_URL || cloudflareEnv.OLLAMA_API_BASE_URL || "http://localhost:11434"; let baseUrl = env.OLLAMA_API_BASE_URL || cloudflareEnv.OLLAMA_API_BASE_URL || 'http://localhost:11434';
if (env.RUNNING_IN_DOCKER === 'true') {
baseUrl = baseUrl.replace("localhost", "host.docker.internal"); if (env.RUNNING_IN_DOCKER === 'true') {
} baseUrl = baseUrl.replace('localhost', 'host.docker.internal');
return baseUrl; }
return baseUrl;
}
default: default:
return ""; return '';
} }
} }
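A sketch of how a server route might combine these two helpers. The per-user key map is an assumption here (in this app it comes from the parsed apiKeys cookie), and both helpers fall back to an empty string when nothing is configured:

import { getAPIKey, getBaseURL } from '~/lib/.server/llm/api-key';

// Resolve credentials for a single request. Throwing on a completely empty result
// is illustrative; the real callers simply pass the values through to the model factory.
export function resolveProviderConfig(env: Env, provider: string, userApiKeys?: Record<string, string>) {
  const apiKey = getAPIKey(env, provider, userApiKeys); // '' when no key is configured
  const baseURL = getBaseURL(env, provider); // only set for OpenAILike, LMStudio and Ollama

  if (!apiKey && !baseURL) {
    throw new Error(`No API key or base URL configured for provider "${provider}"`);
  }

  return { apiKey, baseURL };
}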

View File

@ -1,27 +1,29 @@
// @ts-nocheck /*
// Preventing TS checks with files presented in the video for a better presentation. * @ts-nocheck
* Preventing TS checks with files presented in the video for a better presentation.
*/
import { getAPIKey, getBaseURL } from '~/lib/.server/llm/api-key'; import { getAPIKey, getBaseURL } from '~/lib/.server/llm/api-key';
import { createAnthropic } from '@ai-sdk/anthropic'; import { createAnthropic } from '@ai-sdk/anthropic';
import { createOpenAI } from '@ai-sdk/openai'; import { createOpenAI } from '@ai-sdk/openai';
import { createGoogleGenerativeAI } from '@ai-sdk/google'; import { createGoogleGenerativeAI } from '@ai-sdk/google';
import { ollama } from 'ollama-ai-provider'; import { ollama } from 'ollama-ai-provider';
import { createOpenRouter } from "@openrouter/ai-sdk-provider"; import { createOpenRouter } from '@openrouter/ai-sdk-provider';
import { createMistral } from '@ai-sdk/mistral'; import { createMistral } from '@ai-sdk/mistral';
import { createCohere } from '@ai-sdk/cohere' import { createCohere } from '@ai-sdk/cohere';
import type { LanguageModelV1 } from 'ai';
export const DEFAULT_NUM_CTX = process.env.DEFAULT_NUM_CTX ? export const DEFAULT_NUM_CTX = process.env.DEFAULT_NUM_CTX ? parseInt(process.env.DEFAULT_NUM_CTX, 10) : 32768;
parseInt(process.env.DEFAULT_NUM_CTX, 10) :
32768;
export function getAnthropicModel(apiKey: string, model: string) { type OptionalApiKey = string | undefined;
export function getAnthropicModel(apiKey: OptionalApiKey, model: string) {
const anthropic = createAnthropic({ const anthropic = createAnthropic({
apiKey, apiKey,
}); });
return anthropic(model); return anthropic(model);
} }
export function getOpenAILikeModel(baseURL: string, apiKey: OptionalApiKey, model: string) {
export function getOpenAILikeModel(baseURL: string, apiKey: string, model: string) {
const openai = createOpenAI({ const openai = createOpenAI({
baseURL, baseURL,
apiKey, apiKey,
@ -30,7 +32,7 @@ export function getOpenAILikeModel(baseURL: string, apiKey: string, model: strin
return openai(model); return openai(model);
} }
export function getCohereAIModel(apiKey:string, model: string){ export function getCohereAIModel(apiKey: OptionalApiKey, model: string) {
const cohere = createCohere({ const cohere = createCohere({
apiKey, apiKey,
}); });
@ -38,7 +40,7 @@ export function getCohereAIModel(apiKey:string, model: string){
return cohere(model); return cohere(model);
} }
export function getOpenAIModel(apiKey: string, model: string) { export function getOpenAIModel(apiKey: OptionalApiKey, model: string) {
const openai = createOpenAI({ const openai = createOpenAI({
apiKey, apiKey,
}); });
@ -46,15 +48,15 @@ export function getOpenAIModel(apiKey: string, model: string) {
return openai(model); return openai(model);
} }
export function getMistralModel(apiKey: string, model: string) { export function getMistralModel(apiKey: OptionalApiKey, model: string) {
const mistral = createMistral({ const mistral = createMistral({
apiKey apiKey,
}); });
return mistral(model); return mistral(model);
} }
export function getGoogleModel(apiKey: string, model: string) { export function getGoogleModel(apiKey: OptionalApiKey, model: string) {
const google = createGoogleGenerativeAI({ const google = createGoogleGenerativeAI({
apiKey, apiKey,
}); });
@ -62,7 +64,7 @@ export function getGoogleModel(apiKey: string, model: string) {
return google(model); return google(model);
} }
export function getGroqModel(apiKey: string, model: string) { export function getGroqModel(apiKey: OptionalApiKey, model: string) {
const openai = createOpenAI({ const openai = createOpenAI({
baseURL: 'https://api.groq.com/openai/v1', baseURL: 'https://api.groq.com/openai/v1',
apiKey, apiKey,
@ -71,7 +73,7 @@ export function getGroqModel(apiKey: string, model: string) {
return openai(model); return openai(model);
} }
export function getHuggingFaceModel(apiKey: string, model: string) { export function getHuggingFaceModel(apiKey: OptionalApiKey, model: string) {
const openai = createOpenAI({ const openai = createOpenAI({
baseURL: 'https://api-inference.huggingface.co/v1/', baseURL: 'https://api-inference.huggingface.co/v1/',
apiKey, apiKey,
@ -81,15 +83,16 @@ export function getHuggingFaceModel(apiKey: string, model: string) {
} }
export function getOllamaModel(baseURL: string, model: string) { export function getOllamaModel(baseURL: string, model: string) {
let Ollama = ollama(model, { const ollamaInstance = ollama(model, {
numCtx: DEFAULT_NUM_CTX, numCtx: DEFAULT_NUM_CTX,
}); }) as LanguageModelV1 & { config: any };
Ollama.config.baseURL = `${baseURL}/api`; ollamaInstance.config.baseURL = `${baseURL}/api`;
return Ollama;
return ollamaInstance;
} }
export function getDeepseekModel(apiKey: string, model: string) { export function getDeepseekModel(apiKey: OptionalApiKey, model: string) {
const openai = createOpenAI({ const openai = createOpenAI({
baseURL: 'https://api.deepseek.com/beta', baseURL: 'https://api.deepseek.com/beta',
apiKey, apiKey,
@ -98,9 +101,9 @@ export function getDeepseekModel(apiKey: string, model: string) {
return openai(model); return openai(model);
} }
export function getOpenRouterModel(apiKey: string, model: string) { export function getOpenRouterModel(apiKey: OptionalApiKey, model: string) {
const openRouter = createOpenRouter({ const openRouter = createOpenRouter({
apiKey apiKey,
}); });
return openRouter.chat(model); return openRouter.chat(model);
@ -109,13 +112,13 @@ export function getOpenRouterModel(apiKey: string, model: string) {
export function getLMStudioModel(baseURL: string, model: string) { export function getLMStudioModel(baseURL: string, model: string) {
const lmstudio = createOpenAI({ const lmstudio = createOpenAI({
baseUrl: `${baseURL}/v1`, baseUrl: `${baseURL}/v1`,
apiKey: "", apiKey: '',
}); });
return lmstudio(model); return lmstudio(model);
} }
export function getXAIModel(apiKey: string, model: string) { export function getXAIModel(apiKey: OptionalApiKey, model: string) {
const openai = createOpenAI({ const openai = createOpenAI({
baseURL: 'https://api.x.ai/v1', baseURL: 'https://api.x.ai/v1',
apiKey, apiKey,
@ -125,11 +128,13 @@ export function getXAIModel(apiKey: string, model: string) {
} }
export function getModel(provider: string, model: string, env: Env, apiKeys?: Record<string, string>) { export function getModel(provider: string, model: string, env: Env, apiKeys?: Record<string, string>) {
let apiKey; // Declare first /*
let baseURL; * let apiKey; // Declare first
* let baseURL;
*/
apiKey = getAPIKey(env, provider, apiKeys); // Then assign const apiKey = getAPIKey(env, provider, apiKeys); // Then assign
baseURL = getBaseURL(env, provider); const baseURL = getBaseURL(env, provider);
switch (provider) { switch (provider) {
case 'Anthropic': case 'Anthropic':
@ -159,4 +164,4 @@ export function getModel(provider: string, model: string, env: Env, apiKeys?: Re
default: default:
return getOllamaModel(baseURL, model); return getOllamaModel(baseURL, model);
} }
} }
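A hedged usage sketch for the factory above: getModel resolves the key and base URL itself, and DEFAULT_NUM_CTX only affects Ollama requests. The provider/model pair and the prompt are illustrative:

import { streamText } from 'ai';
import { getModel, DEFAULT_NUM_CTX } from '~/lib/.server/llm/model';
import { getSystemPrompt } from '~/lib/.server/llm/prompts';

// Sketch: pick any provider/model pair known to the app and stream one completion with it.
export async function demoCompletion(env: Env, apiKeys?: Record<string, string>) {
  console.log(`Ollama calls will request numCtx=${DEFAULT_NUM_CTX}`); // 32768 unless DEFAULT_NUM_CTX is set

  const model = getModel('Anthropic', 'claude-3-5-sonnet-latest', env, apiKeys);

  return streamText({
    model,
    system: getSystemPrompt(),
    messages: [{ role: 'user', content: 'Scaffold a Vite + React counter app.' }],
  });
}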

View File

@ -1,10 +1,11 @@
// @ts-nocheck // eslint-disable-next-line @typescript-eslint/ban-ts-comment
// Preventing TS checks with files presented in the video for a better presentation. // @ts-nocheck TODO: Provider proper types
import { streamText as _streamText, convertToCoreMessages } from 'ai';
import { convertToCoreMessages, streamText as _streamText } from 'ai';
import { getModel } from '~/lib/.server/llm/model'; import { getModel } from '~/lib/.server/llm/model';
import { MAX_TOKENS } from './constants'; import { MAX_TOKENS } from './constants';
import { getSystemPrompt } from './prompts'; import { getSystemPrompt } from './prompts';
import { MODEL_LIST, DEFAULT_MODEL, DEFAULT_PROVIDER, MODEL_REGEX, PROVIDER_REGEX } from '~/utils/constants'; import { DEFAULT_MODEL, DEFAULT_PROVIDER, MODEL_LIST, MODEL_REGEX, PROVIDER_REGEX } from '~/utils/constants';
interface ToolResult<Name extends string, Args, Result> { interface ToolResult<Name extends string, Args, Result> {
toolCallId: string; toolCallId: string;
@ -26,41 +27,41 @@ export type StreamingOptions = Omit<Parameters<typeof _streamText>[0], 'model'>;
function extractPropertiesFromMessage(message: Message): { model: string; provider: string; content: string } { function extractPropertiesFromMessage(message: Message): { model: string; provider: string; content: string } {
const textContent = Array.isArray(message.content) const textContent = Array.isArray(message.content)
? message.content.find(item => item.type === 'text')?.text || '' ? message.content.find((item) => item.type === 'text')?.text || ''
: message.content; : message.content;
const modelMatch = textContent.match(MODEL_REGEX); const modelMatch = textContent.match(MODEL_REGEX);
const providerMatch = textContent.match(PROVIDER_REGEX); const providerMatch = textContent.match(PROVIDER_REGEX);
// Extract model /*
// const modelMatch = message.content.match(MODEL_REGEX); * Extract model
* const modelMatch = message.content.match(MODEL_REGEX);
*/
const model = modelMatch ? modelMatch[1] : DEFAULT_MODEL; const model = modelMatch ? modelMatch[1] : DEFAULT_MODEL;
// Extract provider /*
// const providerMatch = message.content.match(PROVIDER_REGEX); * Extract provider
* const providerMatch = message.content.match(PROVIDER_REGEX);
*/
const provider = providerMatch ? providerMatch[1] : DEFAULT_PROVIDER; const provider = providerMatch ? providerMatch[1] : DEFAULT_PROVIDER;
const cleanedContent = Array.isArray(message.content) const cleanedContent = Array.isArray(message.content)
? message.content.map(item => { ? message.content.map((item) => {
if (item.type === 'text') { if (item.type === 'text') {
return { return {
type: 'text', type: 'text',
text: item.text?.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, '') text: item.text?.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, ''),
}; };
} }
return item; // Preserve image_url and other types as is
}) return item; // Preserve image_url and other types as is
})
: textContent.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, ''); : textContent.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, '');
return { model, provider, content: cleanedContent }; return { model, provider, content: cleanedContent };
} }
export function streamText( export function streamText(messages: Messages, env: Env, options?: StreamingOptions, apiKeys?: Record<string, string>) {
messages: Messages,
env: Env,
options?: StreamingOptions,
apiKeys?: Record<string, string>
) {
let currentModel = DEFAULT_MODEL; let currentModel = DEFAULT_MODEL;
let currentProvider = DEFAULT_PROVIDER; let currentProvider = DEFAULT_PROVIDER;
@ -76,15 +77,13 @@ export function streamText(
return { ...message, content }; return { ...message, content };
} }
return message; return message;
}); });
const modelDetails = MODEL_LIST.find((m) => m.name === currentModel); const modelDetails = MODEL_LIST.find((m) => m.name === currentModel);
const dynamicMaxTokens = const dynamicMaxTokens = modelDetails && modelDetails.maxTokenAllowed ? modelDetails.maxTokenAllowed : MAX_TOKENS;
modelDetails && modelDetails.maxTokenAllowed
? modelDetails.maxTokenAllowed
: MAX_TOKENS;
return _streamText({ return _streamText({
...options, ...options,

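For context, a sketch of calling the wrapper above directly. The client prefixes each user message with model/provider tags that MODEL_REGEX and PROVIDER_REGEX strip back out; the exact tag strings shown here are an assumption, as is the Messages type being exported next to streamText:

import { streamText, type Messages } from '~/lib/.server/llm/stream-text';

// Sketch: one tagged user message selects Anthropic's Claude 3.5 Sonnet; maxTokens
// falls back to MAX_TOKENS when MODEL_LIST has no maxTokenAllowed entry for the model.
export function streamTaggedChat(env: Env, apiKeys: Record<string, string>) {
  const messages: Messages = [
    {
      id: 'user-1',
      role: 'user',
      content: '[Model: claude-3-5-sonnet-latest]\n\n[Provider: Anthropic]\n\nBuild a small todo app.',
    },
  ];

  return streamText(messages, env, undefined, apiKeys);
}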
View File

@ -161,46 +161,48 @@ async function getUrlIds(db: IDBDatabase): Promise<string[]> {
export async function forkChat(db: IDBDatabase, chatId: string, messageId: string): Promise<string> { export async function forkChat(db: IDBDatabase, chatId: string, messageId: string): Promise<string> {
const chat = await getMessages(db, chatId); const chat = await getMessages(db, chatId);
if (!chat) throw new Error('Chat not found');
// Find the index of the message to fork at
const messageIndex = chat.messages.findIndex(msg => msg.id === messageId);
if (messageIndex === -1) throw new Error('Message not found');
// Get messages up to and including the selected message
const messages = chat.messages.slice(0, messageIndex + 1);
// Generate new IDs
const newId = await getNextId(db);
const urlId = await getUrlId(db, newId);
// Create the forked chat
await setMessages(
db,
newId,
messages,
urlId,
chat.description ? `${chat.description} (fork)` : 'Forked chat'
);
return urlId;
}
export async function duplicateChat(db: IDBDatabase, id: string): Promise<string> {
const chat = await getMessages(db, id);
if (!chat) { if (!chat) {
throw new Error('Chat not found'); throw new Error('Chat not found');
} }
// Find the index of the message to fork at
const messageIndex = chat.messages.findIndex((msg) => msg.id === messageId);
if (messageIndex === -1) {
throw new Error('Message not found');
}
// Get messages up to and including the selected message
const messages = chat.messages.slice(0, messageIndex + 1);
return createChatFromMessages(db, chat.description ? `${chat.description} (fork)` : 'Forked chat', messages);
}
export async function duplicateChat(db: IDBDatabase, id: string): Promise<string> {
const chat = await getMessages(db, id);
if (!chat) {
throw new Error('Chat not found');
}
return createChatFromMessages(db, `${chat.description || 'Chat'} (copy)`, chat.messages);
}
export async function createChatFromMessages(
db: IDBDatabase,
description: string,
messages: Message[],
): Promise<string> {
const newId = await getNextId(db); const newId = await getNextId(db);
const newUrlId = await getUrlId(db, newId); // Get a new urlId for the duplicated chat const newUrlId = await getUrlId(db, newId); // Get a new urlId for the duplicated chat
await setMessages( await setMessages(
db, db,
newId, newId,
chat.messages, messages,
newUrlId, // Use the new urlId newUrlId, // Use the new urlId
`${chat.description || 'Chat'} (copy)` description,
); );
return newUrlId; // Return the urlId instead of id for navigation return newUrlId; // Return the urlId instead of id for navigation

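Both helpers above return the urlId of the newly created chat, which is what the /chat/:id routes navigate to. A small sketch of using them outside the React hook:

import { openDatabase, forkChat, duplicateChat } from '~/lib/persistence/db';

// Sketch: branch an existing conversation at a specific message, or copy it wholesale.
export async function branchConversation(chatId: string, messageId: string) {
  const db = await openDatabase();

  if (!db) {
    throw new Error('Persistence unavailable (IndexedDB could not be opened)');
  }

  const forkedUrlId = await forkChat(db, chatId, messageId); // "<description> (fork)"
  const copiedUrlId = await duplicateChat(db, chatId); // "<description> (copy)"

  return { forkedUrlId, copiedUrlId };
}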
View File

@ -4,7 +4,15 @@ import { atom } from 'nanostores';
import type { Message } from 'ai'; import type { Message } from 'ai';
import { toast } from 'react-toastify'; import { toast } from 'react-toastify';
import { workbenchStore } from '~/lib/stores/workbench'; import { workbenchStore } from '~/lib/stores/workbench';
import { getMessages, getNextId, getUrlId, openDatabase, setMessages, duplicateChat } from './db'; import {
getMessages,
getNextId,
getUrlId,
openDatabase,
setMessages,
duplicateChat,
createChatFromMessages,
} from './db';
export interface ChatHistoryItem { export interface ChatHistoryItem {
id: string; id: string;
@ -99,7 +107,7 @@ export function useChatHistory() {
await setMessages(db, chatId.get() as string, messages, urlId, description.get()); await setMessages(db, chatId.get() as string, messages, urlId, description.get());
}, },
duplicateCurrentChat: async (listItemId:string) => { duplicateCurrentChat: async (listItemId: string) => {
if (!db || (!mixedId && !listItemId)) { if (!db || (!mixedId && !listItemId)) {
return; return;
} }
@ -110,8 +118,48 @@ export function useChatHistory() {
toast.success('Chat duplicated successfully'); toast.success('Chat duplicated successfully');
} catch (error) { } catch (error) {
toast.error('Failed to duplicate chat'); toast.error('Failed to duplicate chat');
console.log(error);
} }
} },
importChat: async (description: string, messages: Message[]) => {
if (!db) {
return;
}
try {
const newId = await createChatFromMessages(db, description, messages);
window.location.href = `/chat/${newId}`;
toast.success('Chat imported successfully');
} catch (error) {
if (error instanceof Error) {
toast.error('Failed to import chat: ' + error.message);
} else {
toast.error('Failed to import chat');
}
}
},
exportChat: async (id = urlId) => {
if (!db || !id) {
return;
}
const chat = await getMessages(db, id);
const chatData = {
messages: chat.messages,
description: chat.description,
exportDate: new Date().toISOString(),
};
const blob = new Blob([JSON.stringify(chatData, null, 2)], { type: 'application/json' });
const url = URL.createObjectURL(blob);
const a = document.createElement('a');
a.href = url;
a.download = `chat-${new Date().toISOString()}.json`;
document.body.appendChild(a);
a.click();
document.body.removeChild(a);
URL.revokeObjectURL(url);
},
}; };
} }
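A sketch of wiring the new importChat/exportChat actions into a small toolbar. The file-input handling is illustrative; the parsed shape mirrors what exportChat writes ({ messages, description, exportDate }):

import type { ChangeEvent } from 'react';
import { useChatHistory } from '~/lib/persistence';

export function ChatTransferButtons() {
  const { exportChat, importChat } = useChatHistory();

  const handleImport = async (event: ChangeEvent<HTMLInputElement>) => {
    const file = event.target.files?.[0];

    if (!file) {
      return;
    }

    // Read a previously exported chat-<date>.json file back in.
    const data = JSON.parse(await file.text());
    await importChat(data.description ?? 'Imported chat', data.messages ?? []);
  };

  return (
    <div className="flex gap-2">
      {/* no argument exports the currently open chat (id defaults to urlId) */}
      <button type="button" onClick={() => exportChat()}>
        Export current chat
      </button>
      <input type="file" accept="application/json" onChange={handleImport} />
    </div>
  );
}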

View File

@ -1,11 +1,10 @@
import { WebContainer, type WebContainerProcess } from '@webcontainer/api'; import { WebContainer } from '@webcontainer/api';
import { atom, map, type MapStore } from 'nanostores'; import { atom, map, type MapStore } from 'nanostores';
import * as nodePath from 'node:path'; import * as nodePath from 'node:path';
import type { BoltAction } from '~/types/actions'; import type { BoltAction } from '~/types/actions';
import { createScopedLogger } from '~/utils/logger'; import { createScopedLogger } from '~/utils/logger';
import { unreachable } from '~/utils/unreachable'; import { unreachable } from '~/utils/unreachable';
import type { ActionCallbackData } from './message-parser'; import type { ActionCallbackData } from './message-parser';
import type { ITerminal } from '~/types/terminal';
import type { BoltShell } from '~/utils/shell'; import type { BoltShell } from '~/utils/shell';
const logger = createScopedLogger('ActionRunner'); const logger = createScopedLogger('ActionRunner');
@ -45,7 +44,6 @@ export class ActionRunner {
constructor(webcontainerPromise: Promise<WebContainer>, getShellTerminal: () => BoltShell) { constructor(webcontainerPromise: Promise<WebContainer>, getShellTerminal: () => BoltShell) {
this.#webcontainer = webcontainerPromise; this.#webcontainer = webcontainerPromise;
this.#shellTerminal = getShellTerminal; this.#shellTerminal = getShellTerminal;
} }
addAction(data: ActionCallbackData) { addAction(data: ActionCallbackData) {
@ -88,15 +86,16 @@ export class ActionRunner {
if (action.executed) { if (action.executed) {
return; return;
} }
if (isStreaming && action.type !== 'file') { if (isStreaming && action.type !== 'file') {
return; return;
} }
this.#updateAction(actionId, { ...action, ...data.action, executed: !isStreaming }); this.#updateAction(actionId, { ...action, ...data.action, executed: !isStreaming });
return this.#currentExecutionPromise = this.#currentExecutionPromise this.#currentExecutionPromise = this.#currentExecutionPromise
.then(() => { .then(() => {
return this.#executeAction(actionId, isStreaming); this.#executeAction(actionId, isStreaming);
}) })
.catch((error) => { .catch((error) => {
console.error('Action failed:', error); console.error('Action failed:', error);
@ -121,17 +120,23 @@ export class ActionRunner {
case 'start': { case 'start': {
// making the start app non blocking // making the start app non blocking
          this.#runStartAction(action).then(()=>this.#updateAction(actionId, { status: 'complete' }))
            .catch(()=>this.#updateAction(actionId, { status: 'failed', error: 'Action failed' }))
          // adding a delay to avoid any race condition between 2 start actions
          // i am up for a better approch
          await new Promise(resolve=>setTimeout(resolve,2000))
          return
          break;
          this.#runStartAction(action)
            .then(() => this.#updateAction(actionId, { status: 'complete' }))
            .catch(() => this.#updateAction(actionId, { status: 'failed', error: 'Action failed' }));
          /*
           * adding a delay to avoid any race condition between 2 start actions
           * i am up for a better approach
           */
          await new Promise((resolve) => setTimeout(resolve, 2000));
          return;
} }
} }
this.#updateAction(actionId, { status: isStreaming ? 'running' : action.abortSignal.aborted ? 'aborted' : 'complete' }); this.#updateAction(actionId, {
status: isStreaming ? 'running' : action.abortSignal.aborted ? 'aborted' : 'complete',
});
} catch (error) { } catch (error) {
this.#updateAction(actionId, { status: 'failed', error: 'Action failed' }); this.#updateAction(actionId, { status: 'failed', error: 'Action failed' });
logger.error(`[${action.type}]:Action failed\n\n`, error); logger.error(`[${action.type}]:Action failed\n\n`, error);
@ -145,16 +150,19 @@ export class ActionRunner {
if (action.type !== 'shell') { if (action.type !== 'shell') {
unreachable('Expected shell action'); unreachable('Expected shell action');
} }
const shell = this.#shellTerminal()
await shell.ready() const shell = this.#shellTerminal();
await shell.ready();
if (!shell || !shell.terminal || !shell.process) { if (!shell || !shell.terminal || !shell.process) {
unreachable('Shell terminal not found'); unreachable('Shell terminal not found');
} }
const resp = await shell.executeCommand(this.runnerId.get(), action.content)
logger.debug(`${action.type} Shell Response: [exit code:${resp?.exitCode}]`)
if (resp?.exitCode != 0) {
throw new Error("Failed To Execute Shell Command");
const resp = await shell.executeCommand(this.runnerId.get(), action.content);
logger.debug(`${action.type} Shell Response: [exit code:${resp?.exitCode}]`);
if (resp?.exitCode != 0) {
throw new Error('Failed To Execute Shell Command');
} }
} }
@ -162,21 +170,26 @@ export class ActionRunner {
if (action.type !== 'start') { if (action.type !== 'start') {
unreachable('Expected shell action'); unreachable('Expected shell action');
} }
if (!this.#shellTerminal) { if (!this.#shellTerminal) {
unreachable('Shell terminal not found'); unreachable('Shell terminal not found');
} }
const shell = this.#shellTerminal()
await shell.ready() const shell = this.#shellTerminal();
await shell.ready();
if (!shell || !shell.terminal || !shell.process) { if (!shell || !shell.terminal || !shell.process) {
unreachable('Shell terminal not found'); unreachable('Shell terminal not found');
} }
const resp = await shell.executeCommand(this.runnerId.get(), action.content)
logger.debug(`${action.type} Shell Response: [exit code:${resp?.exitCode}]`) const resp = await shell.executeCommand(this.runnerId.get(), action.content);
logger.debug(`${action.type} Shell Response: [exit code:${resp?.exitCode}]`);
if (resp?.exitCode != 0) { if (resp?.exitCode != 0) {
throw new Error("Failed To Start Application"); throw new Error('Failed To Start Application');
} }
return resp
return resp;
} }
async #runFileAction(action: ActionState) { async #runFileAction(action: ActionState) {

View File

@ -55,7 +55,7 @@ interface MessageState {
export class StreamingMessageParser { export class StreamingMessageParser {
#messages = new Map<string, MessageState>(); #messages = new Map<string, MessageState>();
constructor(private _options: StreamingMessageParserOptions = {}) { } constructor(private _options: StreamingMessageParserOptions = {}) {}
parse(messageId: string, input: string) { parse(messageId: string, input: string) {
let state = this.#messages.get(messageId); let state = this.#messages.get(messageId);
@ -120,20 +120,20 @@ export class StreamingMessageParser {
i = closeIndex + ARTIFACT_ACTION_TAG_CLOSE.length; i = closeIndex + ARTIFACT_ACTION_TAG_CLOSE.length;
} else { } else {
if ('type' in currentAction && currentAction.type === 'file') { if ('type' in currentAction && currentAction.type === 'file') {
let content = input.slice(i); const content = input.slice(i);
this._options.callbacks?.onActionStream?.({ this._options.callbacks?.onActionStream?.({
artifactId: currentArtifact.id, artifactId: currentArtifact.id,
messageId, messageId,
actionId: String(state.actionId - 1), actionId: String(state.actionId - 1),
action: { action: {
...currentAction as FileAction, ...(currentAction as FileAction),
content, content,
filePath: currentAction.filePath, filePath: currentAction.filePath,
}, },
}); });
} }
break; break;
} }
} else { } else {
@ -272,7 +272,7 @@ export class StreamingMessageParser {
} }
(actionAttributes as FileAction).filePath = filePath; (actionAttributes as FileAction).filePath = filePath;
} else if (!(['shell', 'start'].includes(actionType))) { } else if (!['shell', 'start'].includes(actionType)) {
logger.warn(`Unknown action type '${actionType}'`); logger.warn(`Unknown action type '${actionType}'`);
} }

View File

@ -80,10 +80,6 @@ export class FilesStore {
this.#modifiedFiles.clear(); this.#modifiedFiles.clear();
} }
markFileAsNew(filePath: string) {
this.#modifiedFiles.set(filePath, '');
}
async saveFile(filePath: string, content: string) { async saveFile(filePath: string, content: string) {
const webcontainer = await this.#webcontainer; const webcontainer = await this.#webcontainer;
@ -216,9 +212,5 @@ function isBinaryFile(buffer: Uint8Array | undefined) {
* array buffer. * array buffer.
*/ */
function convertToBuffer(view: Uint8Array): Buffer { function convertToBuffer(view: Uint8Array): Buffer {
const buffer = new Uint8Array(view.buffer, view.byteOffset, view.byteLength); return Buffer.from(view.buffer, view.byteOffset, view.byteLength);
Object.setPrototypeOf(buffer, Buffer.prototype);
return buffer as Buffer;
} }
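The simplified convertToBuffer relies on Buffer.from(arrayBuffer, byteOffset, length) wrapping the same memory as the incoming view instead of copying it. A tiny standalone demonstration:

// Sketch: the Buffer shares memory with the Uint8Array view, so no copy is made.
function toBuffer(view: Uint8Array): Buffer {
  return Buffer.from(view.buffer, view.byteOffset, view.byteLength);
}

const bytes = new Uint8Array([104, 105]); // "hi"
const buf = toBuffer(bytes);

console.log(buf.toString('utf8')); // "hi"

bytes[0] = 72; // mutate the view...
console.log(buf.toString('utf8')); // "Hi" — the change is visible through the Buffer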

View File

@ -7,7 +7,7 @@ import { coloredText } from '~/utils/terminal';
export class TerminalStore { export class TerminalStore {
#webcontainer: Promise<WebContainer>; #webcontainer: Promise<WebContainer>;
#terminals: Array<{ terminal: ITerminal; process: WebContainerProcess }> = []; #terminals: Array<{ terminal: ITerminal; process: WebContainerProcess }> = [];
#boltTerminal = newBoltShellProcess() #boltTerminal = newBoltShellProcess();
showTerminal: WritableAtom<boolean> = import.meta.hot?.data.showTerminal ?? atom(true); showTerminal: WritableAtom<boolean> = import.meta.hot?.data.showTerminal ?? atom(true);
@ -27,8 +27,8 @@ export class TerminalStore {
} }
async attachBoltTerminal(terminal: ITerminal) { async attachBoltTerminal(terminal: ITerminal) {
try { try {
let wc = await this.#webcontainer const wc = await this.#webcontainer;
await this.#boltTerminal.init(wc, terminal) await this.#boltTerminal.init(wc, terminal);
} catch (error: any) { } catch (error: any) {
terminal.write(coloredText.red('Failed to spawn bolt shell\n\n') + error.message); terminal.write(coloredText.red('Failed to spawn bolt shell\n\n') + error.message);
return; return;

View File

@ -11,9 +11,8 @@ import { PreviewsStore } from './previews';
import { TerminalStore } from './terminal'; import { TerminalStore } from './terminal';
import JSZip from 'jszip'; import JSZip from 'jszip';
import { saveAs } from 'file-saver'; import { saveAs } from 'file-saver';
import { Octokit, type RestEndpointMethodTypes } from "@octokit/rest"; import { Octokit, type RestEndpointMethodTypes } from '@octokit/rest';
import * as nodePath from 'node:path'; import * as nodePath from 'node:path';
import type { WebContainerProcess } from '@webcontainer/api';
import { extractRelativePath } from '~/utils/diff'; import { extractRelativePath } from '~/utils/diff';
export interface ArtifactState { export interface ArtifactState {
@ -32,7 +31,6 @@ export type WorkbenchViewType = 'code' | 'preview';
export class WorkbenchStore { export class WorkbenchStore {
#previewsStore = new PreviewsStore(webcontainer); #previewsStore = new PreviewsStore(webcontainer);
#filesStore = new FilesStore(webcontainer); #filesStore = new FilesStore(webcontainer);
#editorStore = new EditorStore(this.#filesStore); #editorStore = new EditorStore(this.#filesStore);
#terminalStore = new TerminalStore(webcontainer); #terminalStore = new TerminalStore(webcontainer);
@ -43,7 +41,6 @@ export class WorkbenchStore {
unsavedFiles: WritableAtom<Set<string>> = import.meta.hot?.data.unsavedFiles ?? atom(new Set<string>()); unsavedFiles: WritableAtom<Set<string>> = import.meta.hot?.data.unsavedFiles ?? atom(new Set<string>());
modifiedFiles = new Set<string>(); modifiedFiles = new Set<string>();
artifactIdList: string[] = []; artifactIdList: string[] = [];
#boltTerminal: { terminal: ITerminal; process: WebContainerProcess } | undefined;
#globalExecutionQueue = Promise.resolve(); #globalExecutionQueue = Promise.resolve();
constructor() { constructor() {
if (import.meta.hot) { if (import.meta.hot) {
@ -55,7 +52,7 @@ export class WorkbenchStore {
} }
addToExecutionQueue(callback: () => Promise<void>) { addToExecutionQueue(callback: () => Promise<void>) {
this.#globalExecutionQueue = this.#globalExecutionQueue.then(() => callback()) this.#globalExecutionQueue = this.#globalExecutionQueue.then(() => callback());
} }
get previews() { get previews() {
@ -97,7 +94,6 @@ export class WorkbenchStore {
this.#terminalStore.attachTerminal(terminal); this.#terminalStore.attachTerminal(terminal);
} }
attachBoltTerminal(terminal: ITerminal) { attachBoltTerminal(terminal: ITerminal) {
this.#terminalStore.attachBoltTerminal(terminal); this.#terminalStore.attachBoltTerminal(terminal);
} }
@ -262,7 +258,8 @@ export class WorkbenchStore {
this.artifacts.setKey(messageId, { ...artifact, ...state }); this.artifacts.setKey(messageId, { ...artifact, ...state });
} }
addAction(data: ActionCallbackData) { addAction(data: ActionCallbackData) {
this._addAction(data) this._addAction(data);
// this.addToExecutionQueue(()=>this._addAction(data)) // this.addToExecutionQueue(()=>this._addAction(data))
} }
async _addAction(data: ActionCallbackData) { async _addAction(data: ActionCallbackData) {
@ -279,10 +276,9 @@ export class WorkbenchStore {
runAction(data: ActionCallbackData, isStreaming: boolean = false) { runAction(data: ActionCallbackData, isStreaming: boolean = false) {
if (isStreaming) { if (isStreaming) {
this._runAction(data, isStreaming) this._runAction(data, isStreaming);
} } else {
else { this.addToExecutionQueue(() => this._runAction(data, isStreaming));
this.addToExecutionQueue(() => this._runAction(data, isStreaming))
} }
} }
async _runAction(data: ActionCallbackData, isStreaming: boolean = false) { async _runAction(data: ActionCallbackData, isStreaming: boolean = false) {
@ -293,16 +289,21 @@ export class WorkbenchStore {
if (!artifact) { if (!artifact) {
unreachable('Artifact not found'); unreachable('Artifact not found');
} }
if (data.action.type === 'file') { if (data.action.type === 'file') {
let wc = await webcontainer const wc = await webcontainer;
const fullPath = nodePath.join(wc.workdir, data.action.filePath); const fullPath = nodePath.join(wc.workdir, data.action.filePath);
if (this.selectedFile.value !== fullPath) { if (this.selectedFile.value !== fullPath) {
this.setSelectedFile(fullPath); this.setSelectedFile(fullPath);
} }
if (this.currentView.value !== 'code') { if (this.currentView.value !== 'code') {
this.currentView.set('code'); this.currentView.set('code');
} }
const doc = this.#editorStore.documents.get()[fullPath]; const doc = this.#editorStore.documents.get()[fullPath];
if (!doc) { if (!doc) {
await artifact.runner.runAction(data, isStreaming); await artifact.runner.runAction(data, isStreaming);
} }
@ -382,63 +383,7 @@ export class WorkbenchStore {
return syncedFiles; return syncedFiles;
} }
async uploadFilesFromDisk(sourceHandle: FileSystemDirectoryHandle) {
const loadedFiles = [];
const wc = await webcontainer;
const newFiles = {};
const processDirectory = async (handle: FileSystemDirectoryHandle, currentPath: string = '') => {
const entries = await Array.fromAsync(handle.values());
for (const entry of entries) {
const entryPath = currentPath ? `${currentPath}/${entry.name}` : entry.name;
const fullPath = `/${entryPath}`;
if (entry.kind === 'directory') {
await wc.fs.mkdir(fullPath, { recursive: true });
const subDirHandle = await handle.getDirectoryHandle(entry.name);
await processDirectory(subDirHandle, entryPath);
} else {
const file = await entry.getFile();
const content = await file.text();
// Write to WebContainer
await wc.fs.writeFile(fullPath, content);
// Mark file as new
this.#filesStore.markFileAsNew(fullPath);
// Update the files store with the current content
this.files.setKey(fullPath, { type: 'file', content, isBinary: false });
// Collect for editor store with actual content
newFiles[fullPath] = { type: 'file', content, isBinary: false };
loadedFiles.push(entryPath);
}
}
}
await processDirectory(sourceHandle);
return loadedFiles;
}
async refreshFiles() {
// Clear old state
this.modifiedFiles = new Set<string>();
this.artifactIdList = [];
// Reset stores
this.#filesStore = new FilesStore(webcontainer);
this.#editorStore = new EditorStore(this.#filesStore);
// Update UI state
this.currentView.set('code');
this.unsavedFiles.set(new Set<string>());
}
async pushToGitHub(repoName: string, githubUsername: string, ghToken: string) { async pushToGitHub(repoName: string, githubUsername: string, ghToken: string) {
try { try {
// Get the GitHub auth token from environment variables // Get the GitHub auth token from environment variables
const githubToken = ghToken; const githubToken = ghToken;
@ -453,10 +398,11 @@ export class WorkbenchStore {
const octokit = new Octokit({ auth: githubToken }); const octokit = new Octokit({ auth: githubToken });
// Check if the repository already exists before creating it // Check if the repository already exists before creating it
let repo: RestEndpointMethodTypes["repos"]["get"]["response"]['data'] let repo: RestEndpointMethodTypes['repos']['get']['response']['data'];
try { try {
let resp = await octokit.repos.get({ owner: owner, repo: repoName }); const resp = await octokit.repos.get({ owner, repo: repoName });
repo = resp.data repo = resp.data;
} catch (error) { } catch (error) {
if (error instanceof Error && 'status' in error && error.status === 404) { if (error instanceof Error && 'status' in error && error.status === 404) {
// Repository doesn't exist, so create a new one // Repository doesn't exist, so create a new one
@ -474,6 +420,7 @@ export class WorkbenchStore {
// Get all files // Get all files
const files = this.files.get(); const files = this.files.get();
if (!files || Object.keys(files).length === 0) { if (!files || Object.keys(files).length === 0) {
throw new Error('No files found to push'); throw new Error('No files found to push');
} }
@ -490,7 +437,9 @@ export class WorkbenchStore {
}); });
return { path: extractRelativePath(filePath), sha: blob.sha }; return { path: extractRelativePath(filePath), sha: blob.sha };
} }
})
return null;
}),
); );
const validBlobs = blobs.filter(Boolean); // Filter out any undefined blobs const validBlobs = blobs.filter(Boolean); // Filter out any undefined blobs
@ -542,21 +491,6 @@ export class WorkbenchStore {
console.error('Error pushing to GitHub:', error instanceof Error ? error.message : String(error)); console.error('Error pushing to GitHub:', error instanceof Error ? error.message : String(error));
} }
} }
async markFileAsModified(filePath: string) {
const file = this.#filesStore.getFile(filePath);
if (file?.type === 'file') {
// First collect all original content
const originalContent = file.content;
console.log(`Processing ${filePath}:`, originalContent);
// Then save modifications
await this.saveFile(filePath, originalContent);
}
}
} }
export const workbenchStore = new WorkbenchStore(); export const workbenchStore = new WorkbenchStore();
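The interesting part of pushToGitHub is the "probe, then create on 404" step. A hedged standalone sketch of that pattern with Octokit — the createForAuthenticatedUser options here are illustrative, not necessarily the ones the store passes:

import { Octokit } from '@octokit/rest';

// Sketch: return the repo if it exists, otherwise create it for the authenticated user.
export async function getOrCreateRepo(token: string, owner: string, repoName: string) {
  const octokit = new Octokit({ auth: token });

  try {
    const { data } = await octokit.repos.get({ owner, repo: repoName });
    return data;
  } catch (error) {
    const status = (error as { status?: number }).status;

    if (error instanceof Error && status === 404) {
      // Repository doesn't exist yet — create it, then push blobs/trees/commits as the store does.
      const { data } = await octokit.repos.createForAuthenticatedUser({ name: repoName, private: false });
      return data;
    }

    throw error;
  }
}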

View File

@ -1,5 +1,6 @@
// @ts-nocheck // eslint-disable-next-line @typescript-eslint/ban-ts-comment
// Preventing TS checks with files presented in the video for a better presentation. // @ts-nocheck TODO: Provider proper types
import { type ActionFunctionArgs } from '@remix-run/cloudflare'; import { type ActionFunctionArgs } from '@remix-run/cloudflare';
import { MAX_RESPONSE_SEGMENTS, MAX_TOKENS } from '~/lib/.server/llm/constants'; import { MAX_RESPONSE_SEGMENTS, MAX_TOKENS } from '~/lib/.server/llm/constants';
import { CONTINUE_PROMPT } from '~/lib/.server/llm/prompts'; import { CONTINUE_PROMPT } from '~/lib/.server/llm/prompts';
@ -14,14 +15,15 @@ function parseCookies(cookieHeader) {
const cookies = {}; const cookies = {};
// Split the cookie string by semicolons and spaces // Split the cookie string by semicolons and spaces
const items = cookieHeader.split(";").map(cookie => cookie.trim()); const items = cookieHeader.split(';').map((cookie) => cookie.trim());
items.forEach((item) => {
const [name, ...rest] = item.split('=');
items.forEach(item => {
const [name, ...rest] = item.split("=");
if (name && rest) { if (name && rest) {
// Decode the name and value, and join value parts in case it contains '=' // Decode the name and value, and join value parts in case it contains '='
const decodedName = decodeURIComponent(name.trim()); const decodedName = decodeURIComponent(name.trim());
const decodedValue = decodeURIComponent(rest.join("=").trim()); const decodedValue = decodeURIComponent(rest.join('=').trim());
cookies[decodedName] = decodedValue; cookies[decodedName] = decodedValue;
} }
}); });
@ -30,17 +32,15 @@ function parseCookies(cookieHeader) {
} }
async function chatAction({ context, request }: ActionFunctionArgs) { async function chatAction({ context, request }: ActionFunctionArgs) {
  const { messages, imageData, model } = await request.json<{
    messages: Messages,
    imageData?: string[],
    model: string
  }>();
  const { messages, model } = await request.json<{
    messages: Messages;
    model: string;
  }>();
const cookieHeader = request.headers.get("Cookie"); const cookieHeader = request.headers.get('Cookie');
// Parse the cookie's value (returns an object or null if no cookie exists) // Parse the cookie's value (returns an object or null if no cookie exists)
const apiKeys = JSON.parse(parseCookies(cookieHeader).apiKeys || "{}"); const apiKeys = JSON.parse(parseCookies(cookieHeader).apiKeys || '{}');
const stream = new SwitchableStream(); const stream = new SwitchableStream();
@ -87,7 +87,7 @@ async function chatAction({ context, request }: ActionFunctionArgs) {
if (error.message?.includes('API key')) { if (error.message?.includes('API key')) {
throw new Response('Invalid or missing API key', { throw new Response('Invalid or missing API key', {
status: 401, status: 401,
statusText: 'Unauthorized' statusText: 'Unauthorized',
}); });
} }
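A condensed, standalone equivalent of the cookie handling above: the apiKeys cookie carries a JSON object keyed by provider name, and the defensive try/catch is an addition for the sketch:

// Sketch: read per-user API keys from the request's Cookie header,
// e.g. apiKeys={"Anthropic":"sk-ant-..."} — the same shape chatAction expects.
function readApiKeys(request: Request): Record<string, string> {
  const cookieHeader = request.headers.get('Cookie');

  if (!cookieHeader) {
    return {};
  }

  const cookies = Object.fromEntries(
    cookieHeader.split(';').map((cookie) => {
      const [name, ...rest] = cookie.trim().split('=');
      return [decodeURIComponent(name), decodeURIComponent(rest.join('='))];
    }),
  );

  try {
    return JSON.parse(cookies.apiKeys || '{}');
  } catch {
    return {};
  }
}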

View File

@ -1,10 +1,10 @@
import type { ModelInfo } from '~/utils/types'; import type { ModelInfo } from '~/utils/types';
export type ProviderInfo = { export type ProviderInfo = {
staticModels: ModelInfo[], staticModels: ModelInfo[];
name: string, name: string;
getDynamicModels?: () => Promise<ModelInfo[]>, getDynamicModels?: () => Promise<ModelInfo[]>;
getApiKeyLink?: string, getApiKeyLink?: string;
labelForGetApiKey?: string, labelForGetApiKey?: string;
icon?:string, icon?: string;
}; };
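For reference, a hypothetical entry that satisfies the tightened ProviderInfo type — the provider, model and import path below are all made up for illustration:

import type { ProviderInfo } from '~/types/model'; // import path assumed

const exampleProvider: ProviderInfo = {
  name: 'ExampleCloud',
  staticModels: [
    { name: 'example-large', label: 'Example Large', provider: 'ExampleCloud', maxTokenAllowed: 8000 },
  ],
  getApiKeyLink: 'https://example.com/api-keys',
  labelForGetApiKey: 'Get Example API Key',
  icon: 'i-ph:cloud',
};

export default exampleProvider;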

View File

@ -12,29 +12,42 @@ const PROVIDER_LIST: ProviderInfo[] = [
{ {
name: 'Anthropic', name: 'Anthropic',
staticModels: [ staticModels: [
{ name: 'claude-3-5-sonnet-latest', label: 'Claude 3.5 Sonnet (new)', provider: 'Anthropic', maxTokenAllowed: 8000 }, {
{ name: 'claude-3-5-sonnet-20240620', label: 'Claude 3.5 Sonnet (old)', provider: 'Anthropic', maxTokenAllowed: 8000 }, name: 'claude-3-5-sonnet-latest',
{ name: 'claude-3-5-haiku-latest', label: 'Claude 3.5 Haiku (new)', provider: 'Anthropic', maxTokenAllowed: 8000 }, label: 'Claude 3.5 Sonnet (new)',
provider: 'Anthropic',
maxTokenAllowed: 8000,
},
{
name: 'claude-3-5-sonnet-20240620',
label: 'Claude 3.5 Sonnet (old)',
provider: 'Anthropic',
maxTokenAllowed: 8000,
},
{
name: 'claude-3-5-haiku-latest',
label: 'Claude 3.5 Haiku (new)',
provider: 'Anthropic',
maxTokenAllowed: 8000,
},
{ name: 'claude-3-opus-latest', label: 'Claude 3 Opus', provider: 'Anthropic', maxTokenAllowed: 8000 }, { name: 'claude-3-opus-latest', label: 'Claude 3 Opus', provider: 'Anthropic', maxTokenAllowed: 8000 },
{ name: 'claude-3-sonnet-20240229', label: 'Claude 3 Sonnet', provider: 'Anthropic', maxTokenAllowed: 8000 }, { name: 'claude-3-sonnet-20240229', label: 'Claude 3 Sonnet', provider: 'Anthropic', maxTokenAllowed: 8000 },
{ name: 'claude-3-haiku-20240307', label: 'Claude 3 Haiku', provider: 'Anthropic', maxTokenAllowed: 8000 } { name: 'claude-3-haiku-20240307', label: 'Claude 3 Haiku', provider: 'Anthropic', maxTokenAllowed: 8000 },
], ],
getApiKeyLink: "https://console.anthropic.com/settings/keys", getApiKeyLink: 'https://console.anthropic.com/settings/keys',
}, },
{ {
name: 'Ollama', name: 'Ollama',
staticModels: [], staticModels: [],
getDynamicModels: getOllamaModels, getDynamicModels: getOllamaModels,
getApiKeyLink: "https://ollama.com/download", getApiKeyLink: 'https://ollama.com/download',
labelForGetApiKey: "Download Ollama", labelForGetApiKey: 'Download Ollama',
icon: "i-ph:cloud-arrow-down", icon: 'i-ph:cloud-arrow-down',
}, { },
{
name: 'OpenAILike', name: 'OpenAILike',
staticModels: [ staticModels: [],
{ name: 'o1-mini', label: 'o1-mini', provider: 'OpenAILike' }, getDynamicModels: getOpenAILikeModels,
{ name: 'gpt-4o-mini', label: 'GPT-4o Mini', provider: 'OpenAILike' },
],
getDynamicModels: getOpenAILikeModels
}, },
{ {
name: 'Cohere', name: 'Cohere',
@ -50,7 +63,7 @@ const PROVIDER_LIST: ProviderInfo[] = [
{ name: 'c4ai-aya-expanse-8b', label: 'c4AI Aya Expanse 8b', provider: 'Cohere', maxTokenAllowed: 4096 }, { name: 'c4ai-aya-expanse-8b', label: 'c4AI Aya Expanse 8b', provider: 'Cohere', maxTokenAllowed: 4096 },
{ name: 'c4ai-aya-expanse-32b', label: 'c4AI Aya Expanse 32b', provider: 'Cohere', maxTokenAllowed: 4096 }, { name: 'c4ai-aya-expanse-32b', label: 'c4AI Aya Expanse 32b', provider: 'Cohere', maxTokenAllowed: 4096 },
], ],
getApiKeyLink: 'https://dashboard.cohere.com/api-keys' getApiKeyLink: 'https://dashboard.cohere.com/api-keys',
}, },
{ {
name: 'OpenRouter', name: 'OpenRouter',
@ -59,50 +72,145 @@ const PROVIDER_LIST: ProviderInfo[] = [
{ {
name: 'anthropic/claude-3.5-sonnet', name: 'anthropic/claude-3.5-sonnet',
label: 'Anthropic: Claude 3.5 Sonnet (OpenRouter)', label: 'Anthropic: Claude 3.5 Sonnet (OpenRouter)',
provider: 'OpenRouter' provider: 'OpenRouter',
, maxTokenAllowed: 8000 maxTokenAllowed: 8000,
},
{
name: 'anthropic/claude-3-haiku',
label: 'Anthropic: Claude 3 Haiku (OpenRouter)',
provider: 'OpenRouter',
maxTokenAllowed: 8000,
},
{
name: 'deepseek/deepseek-coder',
label: 'Deepseek-Coder V2 236B (OpenRouter)',
provider: 'OpenRouter',
maxTokenAllowed: 8000,
},
{
name: 'google/gemini-flash-1.5',
label: 'Google Gemini Flash 1.5 (OpenRouter)',
provider: 'OpenRouter',
maxTokenAllowed: 8000,
},
{
name: 'google/gemini-pro-1.5',
label: 'Google Gemini Pro 1.5 (OpenRouter)',
provider: 'OpenRouter',
maxTokenAllowed: 8000,
}, },
{ name: 'anthropic/claude-3-haiku', label: 'Anthropic: Claude 3 Haiku (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
{ name: 'deepseek/deepseek-coder', label: 'Deepseek-Coder V2 236B (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
{ name: 'google/gemini-flash-1.5', label: 'Google Gemini Flash 1.5 (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
{ name: 'google/gemini-pro-1.5', label: 'Google Gemini Pro 1.5 (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
{ name: 'x-ai/grok-beta', label: 'xAI Grok Beta (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 }, { name: 'x-ai/grok-beta', label: 'xAI Grok Beta (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 },
{ name: 'mistralai/mistral-nemo', label: 'OpenRouter Mistral Nemo (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 }, {
{ name: 'qwen/qwen-110b-chat', label: 'OpenRouter Qwen 110b Chat (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 }, name: 'mistralai/mistral-nemo',
{ name: 'cohere/command', label: 'Cohere Command (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 4096 } label: 'OpenRouter Mistral Nemo (OpenRouter)',
provider: 'OpenRouter',
maxTokenAllowed: 8000,
},
{
name: 'qwen/qwen-110b-chat',
label: 'OpenRouter Qwen 110b Chat (OpenRouter)',
provider: 'OpenRouter',
maxTokenAllowed: 8000,
},
{ name: 'cohere/command', label: 'Cohere Command (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 4096 },
], ],
getDynamicModels: getOpenRouterModels, getDynamicModels: getOpenRouterModels,
getApiKeyLink: 'https://openrouter.ai/settings/keys', getApiKeyLink: 'https://openrouter.ai/settings/keys',
},
}, { {
name: 'Google', name: 'Google',
staticModels: [ staticModels: [
{ name: 'gemini-exp-1121', label: 'Gemini Experimental 1121', provider: 'Google' }, { name: 'gemini-1.5-flash-latest', label: 'Gemini 1.5 Flash', provider: 'Google', maxTokenAllowed: 8192 },
{ name: 'gemini-1.5-pro-002', label: 'Gemini 1.5 Pro 002', provider: 'Google' }, { name: 'gemini-1.5-flash-002', label: 'Gemini 1.5 Flash-002', provider: 'Google', maxTokenAllowed: 8192 },
{ name: 'gemini-1.5-flash-latest', label: 'Gemini 1.5 Flash', provider: 'Google' }, { name: 'gemini-1.5-flash-8b', label: 'Gemini 1.5 Flash-8b', provider: 'Google', maxTokenAllowed: 8192 },
{ name: 'gemini-1.5-pro-latest', label: 'Gemini 1.5 Pro', provider: 'Google' } { name: 'gemini-1.5-pro-latest', label: 'Gemini 1.5 Pro', provider: 'Google', maxTokenAllowed: 8192 },
{ name: 'gemini-1.5-pro-002', label: 'Gemini 1.5 Pro-002', provider: 'Google', maxTokenAllowed: 8192 },
{ name: 'gemini-exp-1121', label: 'Gemini exp-1121', provider: 'Google', maxTokenAllowed: 8192 },
], ],
getApiKeyLink: 'https://aistudio.google.com/app/apikey' getApiKeyLink: 'https://aistudio.google.com/app/apikey',
}, { },
{
name: 'Groq', name: 'Groq',
staticModels: [ staticModels: [
{ name: 'llama-3.1-70b-versatile', label: 'Llama 3.1 70b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 }, { name: 'llama-3.1-70b-versatile', label: 'Llama 3.1 70b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
{ name: 'llama-3.1-8b-instant', label: 'Llama 3.1 8b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 }, { name: 'llama-3.1-8b-instant', label: 'Llama 3.1 8b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
{ name: 'llama-3.2-11b-vision-preview', label: 'Llama 3.2 11b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 }, { name: 'llama-3.2-11b-vision-preview', label: 'Llama 3.2 11b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
{ name: 'llama-3.2-3b-preview', label: 'Llama 3.2 3b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 }, { name: 'llama-3.2-3b-preview', label: 'Llama 3.2 3b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
{ name: 'llama-3.2-1b-preview', label: 'Llama 3.2 1b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 } { name: 'llama-3.2-1b-preview', label: 'Llama 3.2 1b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
], ],
getApiKeyLink: 'https://console.groq.com/keys' getApiKeyLink: 'https://console.groq.com/keys',
}, },
{ {
name: 'HuggingFace', name: 'HuggingFace',
staticModels: [ staticModels: [
{ name: 'Qwen/Qwen2.5-Coder-32B-Instruct', label: 'Qwen2.5-Coder-32B-Instruct (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 }, {
{ name: '01-ai/Yi-1.5-34B-Chat', label: 'Yi-1.5-34B-Chat (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 }, name: 'Qwen/Qwen2.5-Coder-32B-Instruct',
{ name: 'codellama/CodeLlama-34b-Instruct-hf', label: 'CodeLlama-34b-Instruct (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 }, label: 'Qwen2.5-Coder-32B-Instruct (HuggingFace)',
{ name: 'NousResearch/Hermes-3-Llama-3.1-8B', label: 'Hermes-3-Llama-3.1-8B (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 } provider: 'HuggingFace',
maxTokenAllowed: 8000,
},
{
name: '01-ai/Yi-1.5-34B-Chat',
label: 'Yi-1.5-34B-Chat (HuggingFace)',
provider: 'HuggingFace',
maxTokenAllowed: 8000,
},
{
name: 'codellama/CodeLlama-34b-Instruct-hf',
label: 'CodeLlama-34b-Instruct (HuggingFace)',
provider: 'HuggingFace',
maxTokenAllowed: 8000,
},
{
name: 'NousResearch/Hermes-3-Llama-3.1-8B',
label: 'Hermes-3-Llama-3.1-8B (HuggingFace)',
provider: 'HuggingFace',
maxTokenAllowed: 8000,
},
{
name: 'Qwen/Qwen2.5-Coder-32B-Instruct',
label: 'Qwen2.5-Coder-32B-Instruct (HuggingFace)',
provider: 'HuggingFace',
maxTokenAllowed: 8000,
},
{
name: 'Qwen/Qwen2.5-72B-Instruct',
label: 'Qwen2.5-72B-Instruct (HuggingFace)',
provider: 'HuggingFace',
maxTokenAllowed: 8000,
},
{
name: 'meta-llama/Llama-3.1-70B-Instruct',
label: 'Llama-3.1-70B-Instruct (HuggingFace)',
provider: 'HuggingFace',
maxTokenAllowed: 8000,
},
{
name: 'meta-llama/Llama-3.1-405B',
label: 'Llama-3.1-405B (HuggingFace)',
provider: 'HuggingFace',
maxTokenAllowed: 8000,
},
{
name: '01-ai/Yi-1.5-34B-Chat',
label: 'Yi-1.5-34B-Chat (HuggingFace)',
provider: 'HuggingFace',
maxTokenAllowed: 8000,
},
{
name: 'codellama/CodeLlama-34b-Instruct-hf',
label: 'CodeLlama-34b-Instruct (HuggingFace)',
provider: 'HuggingFace',
maxTokenAllowed: 8000,
},
{
name: 'NousResearch/Hermes-3-Llama-3.1-8B',
label: 'Hermes-3-Llama-3.1-8B (HuggingFace)',
provider: 'HuggingFace',
maxTokenAllowed: 8000,
},
], ],
getApiKeyLink: 'https://huggingface.co/settings/tokens' getApiKeyLink: 'https://huggingface.co/settings/tokens',
}, },
{ {
@ -111,23 +219,24 @@ const PROVIDER_LIST: ProviderInfo[] = [
{ name: 'gpt-4o-mini', label: 'GPT-4o Mini', provider: 'OpenAI', maxTokenAllowed: 8000 }, { name: 'gpt-4o-mini', label: 'GPT-4o Mini', provider: 'OpenAI', maxTokenAllowed: 8000 },
{ name: 'gpt-4-turbo', label: 'GPT-4 Turbo', provider: 'OpenAI', maxTokenAllowed: 8000 }, { name: 'gpt-4-turbo', label: 'GPT-4 Turbo', provider: 'OpenAI', maxTokenAllowed: 8000 },
{ name: 'gpt-4', label: 'GPT-4', provider: 'OpenAI', maxTokenAllowed: 8000 }, { name: 'gpt-4', label: 'GPT-4', provider: 'OpenAI', maxTokenAllowed: 8000 },
{ name: 'gpt-3.5-turbo', label: 'GPT-3.5 Turbo', provider: 'OpenAI', maxTokenAllowed: 8000 } { name: 'gpt-3.5-turbo', label: 'GPT-3.5 Turbo', provider: 'OpenAI', maxTokenAllowed: 8000 },
], ],
getApiKeyLink: "https://platform.openai.com/api-keys", getApiKeyLink: 'https://platform.openai.com/api-keys',
}, { },
{
name: 'xAI', name: 'xAI',
staticModels: [ staticModels: [{ name: 'grok-beta', label: 'xAI Grok Beta', provider: 'xAI', maxTokenAllowed: 8000 }],
{ name: 'grok-beta', label: 'xAI Grok Beta', provider: 'xAI', maxTokenAllowed: 8000 } getApiKeyLink: 'https://docs.x.ai/docs/quickstart#creating-an-api-key',
], },
getApiKeyLink: 'https://docs.x.ai/docs/quickstart#creating-an-api-key' {
}, {
name: 'Deepseek', name: 'Deepseek',
staticModels: [ staticModels: [
{ name: 'deepseek-coder', label: 'Deepseek-Coder', provider: 'Deepseek', maxTokenAllowed: 8000 }, { name: 'deepseek-coder', label: 'Deepseek-Coder', provider: 'Deepseek', maxTokenAllowed: 8000 },
{ name: 'deepseek-chat', label: 'Deepseek-Chat', provider: 'Deepseek', maxTokenAllowed: 8000 } { name: 'deepseek-chat', label: 'Deepseek-Chat', provider: 'Deepseek', maxTokenAllowed: 8000 },
], ],
getApiKeyLink: 'https://platform.deepseek.com/api_keys' getApiKeyLink: 'https://platform.deepseek.com/apiKeys',
}, { },
{
name: 'Mistral', name: 'Mistral',
staticModels: [ staticModels: [
{ name: 'open-mistral-7b', label: 'Mistral 7B', provider: 'Mistral', maxTokenAllowed: 8000 }, { name: 'open-mistral-7b', label: 'Mistral 7B', provider: 'Mistral', maxTokenAllowed: 8000 },
@ -138,27 +247,29 @@ const PROVIDER_LIST: ProviderInfo[] = [
{ name: 'ministral-8b-latest', label: 'Mistral 8B', provider: 'Mistral', maxTokenAllowed: 8000 }, { name: 'ministral-8b-latest', label: 'Mistral 8B', provider: 'Mistral', maxTokenAllowed: 8000 },
{ name: 'mistral-small-latest', label: 'Mistral Small', provider: 'Mistral', maxTokenAllowed: 8000 }, { name: 'mistral-small-latest', label: 'Mistral Small', provider: 'Mistral', maxTokenAllowed: 8000 },
{ name: 'codestral-latest', label: 'Codestral', provider: 'Mistral', maxTokenAllowed: 8000 }, { name: 'codestral-latest', label: 'Codestral', provider: 'Mistral', maxTokenAllowed: 8000 },
{ name: 'mistral-large-latest', label: 'Mistral Large Latest', provider: 'Mistral', maxTokenAllowed: 8000 } { name: 'mistral-large-latest', label: 'Mistral Large Latest', provider: 'Mistral', maxTokenAllowed: 8000 },
], ],
getApiKeyLink: 'https://console.mistral.ai/api-keys/' getApiKeyLink: 'https://console.mistral.ai/api-keys/',
}, { },
{
name: 'LMStudio', name: 'LMStudio',
staticModels: [], staticModels: [],
getDynamicModels: getLMStudioModels, getDynamicModels: getLMStudioModels,
getApiKeyLink: 'https://lmstudio.ai/', getApiKeyLink: 'https://lmstudio.ai/',
labelForGetApiKey: 'Get LMStudio', labelForGetApiKey: 'Get LMStudio',
icon: "i-ph:cloud-arrow-down", icon: 'i-ph:cloud-arrow-down',
} },
]; ];
export const DEFAULT_PROVIDER = PROVIDER_LIST[0]; export const DEFAULT_PROVIDER = PROVIDER_LIST[0];
const staticModels: ModelInfo[] = PROVIDER_LIST.map(p => p.staticModels).flat(); const staticModels: ModelInfo[] = PROVIDER_LIST.map((p) => p.staticModels).flat();
export let MODEL_LIST: ModelInfo[] = [...staticModels]; export let MODEL_LIST: ModelInfo[] = [...staticModels];
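
For context on the structure above: each entry in PROVIDER_LIST satisfies the ProviderInfo interface that appears later in this diff (staticModels and name are required; getDynamicModels, getApiKeyLink, labelForGetApiKey and icon are optional). A minimal sketch of what a new entry could look like, using an invented "ExampleAI" provider and placeholder model names:

// Hypothetical entry, not part of this commit. Every static model repeats its
// provider name and declares the maxTokenAllowed cap used when building requests.
const exampleProvider: ProviderInfo = {
  name: 'ExampleAI',
  staticModels: [
    { name: 'example-large', label: 'Example Large', provider: 'ExampleAI', maxTokenAllowed: 8000 },
  ],
  getApiKeyLink: 'https://example.ai/keys', // placeholder URL
};
// Adding such an entry to the PROVIDER_LIST literal is enough for its static models
// to be flattened into MODEL_LIST above.
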
const getOllamaBaseUrl = () => { const getOllamaBaseUrl = () => {
const defaultBaseUrl = import.meta.env.OLLAMA_API_BASE_URL || 'http://localhost:11434'; const defaultBaseUrl = import.meta.env.OLLAMA_API_BASE_URL || 'http://localhost:11434';
// Check if we're in the browser // Check if we're in the browser
if (typeof window !== 'undefined') { if (typeof window !== 'undefined') {
// Frontend always uses localhost // Frontend always uses localhost
@ -168,47 +279,54 @@ const getOllamaBaseUrl = () => {
// Backend: Check if we're running in Docker // Backend: Check if we're running in Docker
const isDocker = process.env.RUNNING_IN_DOCKER === 'true'; const isDocker = process.env.RUNNING_IN_DOCKER === 'true';
return isDocker return isDocker ? defaultBaseUrl.replace('localhost', 'host.docker.internal') : defaultBaseUrl;
? defaultBaseUrl.replace('localhost', 'host.docker.internal')
: defaultBaseUrl;
}; };
async function getOllamaModels(): Promise<ModelInfo[]> { async function getOllamaModels(): Promise<ModelInfo[]> {
if (typeof window === 'undefined') {
return [];
}
try { try {
const base_url = getOllamaBaseUrl(); const baseUrl = getOllamaBaseUrl();
const response = await fetch(`${base_url}/api/tags`); const response = await fetch(`${baseUrl}/api/tags`);
const data = await response.json() as OllamaApiResponse; const data = (await response.json()) as OllamaApiResponse;
return data.models.map((model: OllamaModel) => ({ return data.models.map((model: OllamaModel) => ({
name: model.name, name: model.name,
label: `${model.name} (${model.details.parameter_size})`, label: `${model.name} (${model.details.parameter_size})`,
provider: 'Ollama', provider: 'Ollama',
maxTokenAllowed:8000, maxTokenAllowed: 8000,
})); }));
} catch (e) { } catch (e) {
console.error('Error getting Ollama models:', e);
return []; return [];
} }
} }
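
getOllamaModels maps the local Ollama /api/tags listing into ModelInfo entries, reading only the model name and its parameter size. A sketch of the payload shape the mapping assumes (the concrete model below is invented):

// Invented sample response; only `name` and `details.parameter_size` are read above.
const sampleTagsResponse = {
  models: [{ name: 'qwen2.5-coder:7b', details: { parameter_size: '7B' } }],
};
// The mapping above would yield:
// { name: 'qwen2.5-coder:7b', label: 'qwen2.5-coder:7b (7B)', provider: 'Ollama', maxTokenAllowed: 8000 }
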
async function getOpenAILikeModels(): Promise<ModelInfo[]> { async function getOpenAILikeModels(): Promise<ModelInfo[]> {
try { try {
const base_url = import.meta.env.OPENAI_LIKE_API_BASE_URL || ''; const baseUrl = import.meta.env.OPENAI_LIKE_API_BASE_URL || '';
if (!base_url) {
if (!baseUrl) {
return []; return [];
} }
const api_key = import.meta.env.OPENAI_LIKE_API_KEY ?? '';
const response = await fetch(`${base_url}/models`, { const apiKey = import.meta.env.OPENAI_LIKE_API_KEY ?? '';
const response = await fetch(`${baseUrl}/models`, {
headers: { headers: {
Authorization: `Bearer ${api_key}` Authorization: `Bearer ${apiKey}`,
} },
}); });
const res = await response.json() as any; const res = (await response.json()) as any;
return res.data.map((model: any) => ({ return res.data.map((model: any) => ({
name: model.id, name: model.id,
label: model.id, label: model.id,
provider: 'OpenAILike' provider: 'OpenAILike',
})); }));
} catch (e) { } catch (e) {
console.error('Error getting OpenAILike models:', e);
return []; return [];
} }
} }
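
getOpenAILikeModels targets any OpenAI-compatible server and only reads the id field from its /models listing. A hedged sketch of the expected configuration and response shape (URL and key below are placeholders):

// .env.local (placeholder values):
//   OPENAI_LIKE_API_BASE_URL=http://localhost:8080/v1
//   OPENAI_LIKE_API_KEY=sk-placeholder
//
// Typical OpenAI-style body the mapping above can consume:
const sampleModelsResponse = { data: [{ id: 'my-local-model' }] };
// -> [{ name: 'my-local-model', label: 'my-local-model', provider: 'OpenAILike' }]
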
@ -221,51 +339,71 @@ type OpenRouterModelsResponse = {
pricing: { pricing: {
prompt: number; prompt: number;
completion: number; completion: number;
} };
}[] }[];
}; };
async function getOpenRouterModels(): Promise<ModelInfo[]> { async function getOpenRouterModels(): Promise<ModelInfo[]> {
const data: OpenRouterModelsResponse = await (await fetch('https://openrouter.ai/api/v1/models', { const data: OpenRouterModelsResponse = await (
headers: { await fetch('https://openrouter.ai/api/v1/models', {
'Content-Type': 'application/json' headers: {
} 'Content-Type': 'application/json',
})).json(); },
})
).json();
return data.data.sort((a, b) => a.name.localeCompare(b.name)).map(m => ({ return data.data
name: m.id, .sort((a, b) => a.name.localeCompare(b.name))
label: `${m.name} - in:$${(m.pricing.prompt * 1_000_000).toFixed( .map((m) => ({
2)} out:$${(m.pricing.completion * 1_000_000).toFixed(2)} - context ${Math.floor( name: m.id,
m.context_length / 1000)}k`, label: `${m.name} - in:$${(m.pricing.prompt * 1_000_000).toFixed(
provider: 'OpenRouter', 2,
maxTokenAllowed:8000, )} out:$${(m.pricing.completion * 1_000_000).toFixed(2)} - context ${Math.floor(m.context_length / 1000)}k`,
})); provider: 'OpenRouter',
maxTokenAllowed: 8000,
}));
} }
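
OpenRouter reports per-token prices, so the label math above multiplies pricing by 1,000,000 to show dollars per million tokens and floors context_length / 1000 for the "k" suffix. Worked through with invented numbers:

// Invented figures, only to illustrate the label arithmetic:
const m = { id: 'vendor/model', name: 'Vendor: Model', context_length: 200000, pricing: { prompt: 0.000003, completion: 0.000015 } };
// 0.000003 * 1_000_000 = 3.00, 0.000015 * 1_000_000 = 15.00, Math.floor(200000 / 1000) = 200
// -> label: 'Vendor: Model - in:$3.00 out:$15.00 - context 200k'
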
async function getLMStudioModels(): Promise<ModelInfo[]> { async function getLMStudioModels(): Promise<ModelInfo[]> {
if (typeof window === 'undefined') {
return [];
}
try { try {
const base_url = import.meta.env.LMSTUDIO_API_BASE_URL || 'http://localhost:1234'; const baseUrl = import.meta.env.LMSTUDIO_API_BASE_URL || 'http://localhost:1234';
const response = await fetch(`${base_url}/v1/models`); const response = await fetch(`${baseUrl}/v1/models`);
const data = await response.json() as any; const data = (await response.json()) as any;
return data.data.map((model: any) => ({ return data.data.map((model: any) => ({
name: model.id, name: model.id,
label: model.id, label: model.id,
provider: 'LMStudio' provider: 'LMStudio',
})); }));
} catch (e) { } catch (e) {
console.error('Error getting LMStudio models:', e);
return []; return [];
} }
} }
async function initializeModelList(): Promise<ModelInfo[]> { async function initializeModelList(): Promise<ModelInfo[]> {
MODEL_LIST = [...(await Promise.all( MODEL_LIST = [
PROVIDER_LIST ...(
.filter((p): p is ProviderInfo & { getDynamicModels: () => Promise<ModelInfo[]> } => !!p.getDynamicModels) await Promise.all(
.map(p => p.getDynamicModels()))) PROVIDER_LIST.filter(
.flat(), ...staticModels]; (p): p is ProviderInfo & { getDynamicModels: () => Promise<ModelInfo[]> } => !!p.getDynamicModels,
).map((p) => p.getDynamicModels()),
)
).flat(),
...staticModels,
];
return MODEL_LIST; return MODEL_LIST;
} }
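
initializeModelList awaits every provider that defines getDynamicModels, flattens the results, and reassigns MODEL_LIST with the dynamic entries ahead of the static ones. A hedged sketch of a call site (not part of this diff):

// Hypothetical startup call. Note that the Ollama, OpenAILike and LMStudio helpers
// above return [] on failure, while a failing OpenRouter fetch would reject this promise.
const models = await initializeModelList();
console.log(`model list ready: ${models.length} entries`);
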
export { getOllamaModels, getOpenAILikeModels, getLMStudioModels, initializeModelList, getOpenRouterModels, PROVIDER_LIST }; export {
getOllamaModels,
getOpenAILikeModels,
getLMStudioModels,
initializeModelList,
getOpenRouterModels,
PROVIDER_LIST,
};

View File

@ -52,67 +52,77 @@ export async function newShellProcess(webcontainer: WebContainer, terminal: ITer
return process; return process;
} }
export type ExecutionResult = { output: string; exitCode: number } | undefined;
export class BoltShell { export class BoltShell {
#initialized: (() => void) | undefined #initialized: (() => void) | undefined;
#readyPromise: Promise<void> #readyPromise: Promise<void>;
#webcontainer: WebContainer | undefined #webcontainer: WebContainer | undefined;
#terminal: ITerminal | undefined #terminal: ITerminal | undefined;
#process: WebContainerProcess | undefined #process: WebContainerProcess | undefined;
executionState = atom<{ sessionId: string, active: boolean, executionPrms?: Promise<any> } | undefined>() executionState = atom<{ sessionId: string; active: boolean; executionPrms?: Promise<any> } | undefined>();
#outputStream: ReadableStreamDefaultReader<string> | undefined #outputStream: ReadableStreamDefaultReader<string> | undefined;
#shellInputStream: WritableStreamDefaultWriter<string> | undefined #shellInputStream: WritableStreamDefaultWriter<string> | undefined;
constructor() { constructor() {
this.#readyPromise = new Promise((resolve) => { this.#readyPromise = new Promise((resolve) => {
this.#initialized = resolve this.#initialized = resolve;
}) });
} }
ready() { ready() {
return this.#readyPromise; return this.#readyPromise;
} }
async init(webcontainer: WebContainer, terminal: ITerminal) {
this.#webcontainer = webcontainer
this.#terminal = terminal
let callback = (data: string) => {
console.log(data)
}
let { process, output } = await this.newBoltShellProcess(webcontainer, terminal)
this.#process = process
this.#outputStream = output.getReader()
await this.waitTillOscCode('interactive')
this.#initialized?.()
}
get terminal() {
return this.#terminal
}
get process() {
return this.#process
}
async executeCommand(sessionId: string, command: string) {
if (!this.process || !this.terminal) {
return
}
let state = this.executionState.get()
//interrupt the current execution async init(webcontainer: WebContainer, terminal: ITerminal) {
// this.#shellInputStream?.write('\x03'); this.#webcontainer = webcontainer;
this.terminal.input('\x03'); this.#terminal = terminal;
if (state && state.executionPrms) {
await state.executionPrms const { process, output } = await this.newBoltShellProcess(webcontainer, terminal);
this.#process = process;
this.#outputStream = output.getReader();
await this.waitTillOscCode('interactive');
this.#initialized?.();
}
get terminal() {
return this.#terminal;
}
get process() {
return this.#process;
}
async executeCommand(sessionId: string, command: string): Promise<ExecutionResult> {
if (!this.process || !this.terminal) {
return undefined;
} }
const state = this.executionState.get();
/*
* interrupt the current execution
* this.#shellInputStream?.write('\x03');
*/
this.terminal.input('\x03');
if (state && state.executionPrms) {
await state.executionPrms;
}
//start a new execution //start a new execution
this.terminal.input(command.trim() + '\n'); this.terminal.input(command.trim() + '\n');
//wait for the execution to finish //wait for the execution to finish
let executionPrms = this.getCurrentExecutionResult() const executionPromise = this.getCurrentExecutionResult();
this.executionState.set({ sessionId, active: true, executionPrms }) this.executionState.set({ sessionId, active: true, executionPrms: executionPromise });
let resp = await executionPrms const resp = await executionPromise;
this.executionState.set({ sessionId, active: false }) this.executionState.set({ sessionId, active: false });
return resp
return resp;
} }
async newBoltShellProcess(webcontainer: WebContainer, terminal: ITerminal) { async newBoltShellProcess(webcontainer: WebContainer, terminal: ITerminal) {
const args: string[] = []; const args: string[] = [];
@ -126,6 +136,7 @@ export class BoltShell {
const input = process.input.getWriter(); const input = process.input.getWriter();
this.#shellInputStream = input; this.#shellInputStream = input;
const [internalOutput, terminalOutput] = process.output.tee(); const [internalOutput, terminalOutput] = process.output.tee();
const jshReady = withResolvers<void>(); const jshReady = withResolvers<void>();
@ -162,34 +173,48 @@ export class BoltShell {
return { process, output: internalOutput }; return { process, output: internalOutput };
} }
async getCurrentExecutionResult() {
let { output, exitCode } = await this.waitTillOscCode('exit') async getCurrentExecutionResult(): Promise<ExecutionResult> {
const { output, exitCode } = await this.waitTillOscCode('exit');
return { output, exitCode }; return { output, exitCode };
} }
async waitTillOscCode(waitCode: string) { async waitTillOscCode(waitCode: string) {
let fullOutput = ''; let fullOutput = '';
let exitCode: number = 0; let exitCode: number = 0;
if (!this.#outputStream) return { output: fullOutput, exitCode };
let tappedStream = this.#outputStream if (!this.#outputStream) {
return { output: fullOutput, exitCode };
}
const tappedStream = this.#outputStream;
while (true) { while (true) {
const { value, done } = await tappedStream.read(); const { value, done } = await tappedStream.read();
if (done) break;
if (done) {
break;
}
const text = value || ''; const text = value || '';
fullOutput += text; fullOutput += text;
// Check if command completion signal with exit code // Check if command completion signal with exit code
const [, osc, , pid, code] = text.match(/\x1b\]654;([^\x07=]+)=?((-?\d+):(\d+))?\x07/) || []; const [, osc, , , code] = text.match(/\x1b\]654;([^\x07=]+)=?((-?\d+):(\d+))?\x07/) || [];
if (osc === 'exit') { if (osc === 'exit') {
exitCode = parseInt(code, 10); exitCode = parseInt(code, 10);
} }
if (osc === waitCode) { if (osc === waitCode) {
break; break;
} }
} }
return { output: fullOutput, exitCode }; return { output: fullOutput, exitCode };
} }
} }
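
waitTillOscCode scans the teed shell output for OSC 654 markers; the regex above captures the marker name and, for 'exit', the trailing exit status. Worked through against an invented chunk of output (the exact bytes emitted by the embedded shell may differ):

// Invented output chunk carrying an "exit" marker with status 0:
const chunk = 'npm install done\x1b]654;exit=-1:0\x07';
const [, osc, , , code] = chunk.match(/\x1b\]654;([^\x07=]+)=?((-?\d+):(\d+))?\x07/) || [];
// osc === 'exit' and code === '0'; parseInt(code, 10) gives the exit status,
// and the read loop above breaks once osc matches the awaited waitCode.
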
export function newBoltShellProcess() { export function newBoltShellProcess() {
return new BoltShell(); return new BoltShell();
} }
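
Taken together, BoltShell exposes a small lifecycle: init() attaches the WebContainer and terminal and waits for the shell's 'interactive' marker, while executeCommand() interrupts whatever is running, types the command, and resolves with an ExecutionResult once the 'exit' marker arrives. A hedged call-site sketch; `webcontainer`, `terminal` and the session id are placeholders supplied by the surrounding app:

// Placeholder wiring, for illustration only.
const shell = newBoltShellProcess();
await shell.init(webcontainer, terminal);

const result = await shell.executeCommand('session-1', 'npm run build');

if (result && result.exitCode !== 0) {
  console.error('command failed:', result.output);
}
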

View File

@ -1,4 +1,3 @@
interface OllamaModelDetails { interface OllamaModelDetails {
parent_model: string; parent_model: string;
format: string; format: string;
@ -29,10 +28,10 @@ export interface ModelInfo {
} }
} }
export interface ProviderInfo { export interface ProviderInfo {
staticModels: ModelInfo[], staticModels: ModelInfo[];
name: string, name: string;
getDynamicModels?: () => Promise<ModelInfo[]>, getDynamicModels?: () => Promise<ModelInfo[]>;
getApiKeyLink?: string, getApiKeyLink?: string;
labelForGetApiKey?: string, labelForGetApiKey?: string;
icon?:string, icon?: string;
}; }

View File

@ -12,6 +12,8 @@ export default [
'@blitz/catch-error-name': 'off', '@blitz/catch-error-name': 'off',
'@typescript-eslint/no-this-alias': 'off', '@typescript-eslint/no-this-alias': 'off',
'@typescript-eslint/no-empty-object-type': 'off', '@typescript-eslint/no-empty-object-type': 'off',
'@blitz/comment-syntax': 'off',
'@blitz/block-scope-case': 'off',
}, },
}, },
{ {

View File

@ -11,8 +11,8 @@
"dev": "remix vite:dev", "dev": "remix vite:dev",
"test": "vitest --run", "test": "vitest --run",
"test:watch": "vitest", "test:watch": "vitest",
"lint": "eslint --cache --cache-location ./node_modules/.cache/eslint .", "lint": "eslint --cache --cache-location ./node_modules/.cache/eslint app",
"lint:fix": "npm run lint -- --fix", "lint:fix": "npm run lint -- --fix && prettier app --write",
"start": "bindings=$(./bindings.sh) && wrangler pages dev ./build/client $bindings", "start": "bindings=$(./bindings.sh) && wrangler pages dev ./build/client $bindings",
"dockerstart": "bindings=$(./bindings.sh) && wrangler pages dev ./build/client $bindings --ip 0.0.0.0 --port 5173 --no-show-interactive-dev-session", "dockerstart": "bindings=$(./bindings.sh) && wrangler pages dev ./build/client $bindings --ip 0.0.0.0 --port 5173 --no-show-interactive-dev-session",
"dockerrun": "docker run -it -d --name bolt-ai-live -p 5173:5173 --env-file .env.local bolt-ai", "dockerrun": "docker run -it -d --name bolt-ai-live -p 5173:5173 --env-file .env.local bolt-ai",
@ -20,7 +20,8 @@
"dockerbuild": "docker build -t bolt-ai:development -t bolt-ai:latest --target bolt-ai-development .", "dockerbuild": "docker build -t bolt-ai:development -t bolt-ai:latest --target bolt-ai-development .",
"typecheck": "tsc", "typecheck": "tsc",
"typegen": "wrangler types", "typegen": "wrangler types",
"preview": "pnpm run build && pnpm run start" "preview": "pnpm run build && pnpm run start",
"prepare": "husky"
}, },
"engines": { "engines": {
"node": ">=18.18.0" "node": ">=18.18.0"
@ -70,6 +71,7 @@
"diff": "^5.2.0", "diff": "^5.2.0",
"file-saver": "^2.0.5", "file-saver": "^2.0.5",
"framer-motion": "^11.2.12", "framer-motion": "^11.2.12",
"ignore": "^6.0.2",
"isbot": "^4.1.0", "isbot": "^4.1.0",
"istextorbinary": "^9.5.0", "istextorbinary": "^9.5.0",
"jose": "^5.6.3", "jose": "^5.6.3",
@ -101,6 +103,7 @@
"@types/react": "^18.2.20", "@types/react": "^18.2.20",
"@types/react-dom": "^18.2.7", "@types/react-dom": "^18.2.7",
"fast-glob": "^3.3.2", "fast-glob": "^3.3.2",
"husky": "9.1.7",
"is-ci": "^3.0.1", "is-ci": "^3.0.1",
"node-fetch": "^3.3.2", "node-fetch": "^3.3.2",
"prettier": "^3.3.2", "prettier": "^3.3.2",

View File

@ -143,6 +143,9 @@ importers:
framer-motion: framer-motion:
specifier: ^11.2.12 specifier: ^11.2.12
version: 11.2.12(react-dom@18.3.1(react@18.3.1))(react@18.3.1) version: 11.2.12(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
ignore:
specifier: ^6.0.2
version: 6.0.2
isbot: isbot:
specifier: ^4.1.0 specifier: ^4.1.0
version: 4.4.0 version: 4.4.0
@ -231,6 +234,9 @@ importers:
fast-glob: fast-glob:
specifier: ^3.3.2 specifier: ^3.3.2
version: 3.3.2 version: 3.3.2
husky:
specifier: 9.1.7
version: 9.1.7
is-ci: is-ci:
specifier: ^3.0.1 specifier: ^3.0.1
version: 3.0.1 version: 3.0.1
@ -2482,7 +2488,7 @@ packages:
resolution: {integrity: sha512-HpGFw18DgFWlncDfjTa2rcQ4W88O1mC8e8yZ2AvQY5KDaktSTwo+KRf6nHK6FRI5FyRyb/5T6+TSxfP7QyGsmQ==} resolution: {integrity: sha512-HpGFw18DgFWlncDfjTa2rcQ4W88O1mC8e8yZ2AvQY5KDaktSTwo+KRf6nHK6FRI5FyRyb/5T6+TSxfP7QyGsmQ==}
bytes@3.0.0: bytes@3.0.0:
resolution: {integrity: sha512-pMhOfFDPiv9t5jjIXkHosWmkSyQbvsgEVNkz0ERHbuLh2T/7j4Mqqpz523Fe8MVY89KC6Sh/QfS2sM+SjgFDcw==} resolution: {integrity: sha1-0ygVQE1olpn4Wk6k+odV3ROpYEg=}
engines: {node: '>= 0.8'} engines: {node: '>= 0.8'}
bytes@3.1.2: bytes@3.1.2:
@ -3382,6 +3388,11 @@ packages:
resolution: {integrity: sha512-AXcZb6vzzrFAUE61HnN4mpLqd/cSIwNQjtNWR0euPm6y0iqx3G4gOXaIDdtdDwZmhwe82LA6+zinmW4UBWVePQ==} resolution: {integrity: sha512-AXcZb6vzzrFAUE61HnN4mpLqd/cSIwNQjtNWR0euPm6y0iqx3G4gOXaIDdtdDwZmhwe82LA6+zinmW4UBWVePQ==}
engines: {node: '>=16.17.0'} engines: {node: '>=16.17.0'}
husky@9.1.7:
resolution: {integrity: sha512-5gs5ytaNjBrh5Ow3zrvdUUY+0VxIuWVL4i9irt6friV+BqdCfmV11CQTWMiBYWHbXhco+J1kHfTOUkePhCDvMA==}
engines: {node: '>=18'}
hasBin: true
iconv-lite@0.4.24: iconv-lite@0.4.24:
resolution: {integrity: sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA==} resolution: {integrity: sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA==}
engines: {node: '>=0.10.0'} engines: {node: '>=0.10.0'}
@ -3399,6 +3410,10 @@ packages:
resolution: {integrity: sha512-5Fytz/IraMjqpwfd34ke28PTVMjZjJG2MPn5t7OE4eUCUNf8BAa7b5WUS9/Qvr6mwOQS7Mk6vdsMno5he+T8Xw==} resolution: {integrity: sha512-5Fytz/IraMjqpwfd34ke28PTVMjZjJG2MPn5t7OE4eUCUNf8BAa7b5WUS9/Qvr6mwOQS7Mk6vdsMno5he+T8Xw==}
engines: {node: '>= 4'} engines: {node: '>= 4'}
ignore@6.0.2:
resolution: {integrity: sha512-InwqeHHN2XpumIkMvpl/DCJVrAHgCsG5+cn1XlnLWGwtZBm8QJfSusItfrwx81CTp5agNZqpKU2J/ccC5nGT4A==}
engines: {node: '>= 4'}
immediate@3.0.6: immediate@3.0.6:
resolution: {integrity: sha512-XXOFtyqDjNDAQxVfYxuF7g9Il/IbWmmlQg2MYKOH8ExIT1qg6xc4zyS3HaEEATgs1btfzxq15ciUiY7gjSXRGQ==} resolution: {integrity: sha512-XXOFtyqDjNDAQxVfYxuF7g9Il/IbWmmlQg2MYKOH8ExIT1qg6xc4zyS3HaEEATgs1btfzxq15ciUiY7gjSXRGQ==}
@ -9278,6 +9293,8 @@ snapshots:
human-signals@5.0.0: {} human-signals@5.0.0: {}
husky@9.1.7: {}
iconv-lite@0.4.24: iconv-lite@0.4.24:
dependencies: dependencies:
safer-buffer: 2.1.2 safer-buffer: 2.1.2
@ -9290,6 +9307,8 @@ snapshots:
ignore@5.3.1: {} ignore@5.3.1: {}
ignore@6.0.2: {}
immediate@3.0.6: {} immediate@3.0.6: {}
immutable@4.3.7: {} immutable@4.3.7: {}

View File

@ -9,4 +9,7 @@ interface Env {
OPENAI_LIKE_API_BASE_URL: string; OPENAI_LIKE_API_BASE_URL: string;
DEEPSEEK_API_KEY: string; DEEPSEEK_API_KEY: string;
LMSTUDIO_API_BASE_URL: string; LMSTUDIO_API_BASE_URL: string;
GOOGLE_GENERATIVE_AI_API_KEY: string;
MISTRAL_API_KEY: string;
XAI_API_KEY: string;
} }
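
The three new bindings line up with the Google, Mistral and xAI providers added earlier in this commit. How they are consumed is not shown in this hunk; a hedged sketch of one way server code could resolve a key per provider (the helper name and fallback order are illustrative, not the repo's actual implementation):

// Illustrative only: prefer a key the user supplied in the UI, then fall back
// to the corresponding Cloudflare environment binding.
function resolveApiKey(env: Env, provider: string, userKeys: Record<string, string> = {}): string | undefined {
  const byProvider: Record<string, string | undefined> = {
    Google: env.GOOGLE_GENERATIVE_AI_API_KEY,
    Mistral: env.MISTRAL_API_KEY,
    xAI: env.XAI_API_KEY,
    Deepseek: env.DEEPSEEK_API_KEY,
  };

  return userKeys[provider] || byProvider[provider];
}
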