diff --git a/.husky/pre-commit b/.husky/pre-commit new file mode 100644 index 0000000..966a4ad --- /dev/null +++ b/.husky/pre-commit @@ -0,0 +1,17 @@ +#!/bin/sh + +echo "🔍 Running pre-commit hook to check the code looks good... 🔍" + +if ! pnpm typecheck; then + echo "❌ Type checking failed! Please review TypeScript types." + echo "Once you're done, don't forget to add your changes to the commit! 🚀" + exit 1 +fi + +if ! pnpm lint; then + echo "❌ Linting failed! 'pnpm lint:check' will help you fix the easy ones." + echo "Once you're done, don't forget to add your beautification to the commit! 🤩" + exit 1 +fi + +echo "👍 All good! Committing changes..." diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 0f7744a..68215a2 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -1,9 +1,6 @@ -# Contributing to Bolt.new Fork -## DEFAULT_NUM_CTX +# Contributing to oTToDev -The `DEFAULT_NUM_CTX` environment variable can be used to limit the maximum number of context values used by the qwen2.5-coder model. For example, to limit the context to 24576 values (which uses 32GB of VRAM), set `DEFAULT_NUM_CTX=24576` in your `.env.local` file. - -First off, thank you for considering contributing to Bolt.new! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make Bolt.new a better tool for developers worldwide. +First off, thank you for considering contributing to oTToDev! This fork aims to expand the capabilities of the original project by integrating multiple LLM providers and enhancing functionality. Every contribution helps make oTToDev a better tool for developers worldwide. ## 📋 Table of Contents - [Code of Conduct](#code-of-conduct) @@ -56,6 +53,8 @@ We're looking for dedicated contributors to help maintain and grow this project. - Comment complex logic - Keep functions focused and small - Use meaningful variable names +- Lint your code. 
This repo contains a pre-commit-hook that will verify your code is linted properly, +so set up your IDE to do that for you! ## Development Setup diff --git a/README.md b/README.md index f952e0a..0127dd0 100644 --- a/README.md +++ b/README.md @@ -29,6 +29,8 @@ https://thinktank.ottomator.ai - ✅ Bolt terminal to see the output of LLM run commands (@thecodacus) - ✅ Streaming of code output (@thecodacus) - ✅ Ability to revert code to earlier version (@wonderwhy-er) +- ✅ Cohere Integration (@hasanraiyan) +- ✅ Dynamic model max token length (@hasanraiyan) - ⬜ **HIGH PRIORITY** - Prevent Bolt from rewriting files as often (file locking and diffs) - ⬜ **HIGH PRIORITY** - Better prompting for smaller LLMs (code window sometimes doesn't start) - ⬜ **HIGH PRIORITY** - Load local projects into the app @@ -39,8 +41,6 @@ https://thinktank.ottomator.ai - ⬜ Azure Open AI API Integration - ⬜ Perplexity Integration - ⬜ Vertex AI Integration -- ✅ Cohere Integration (@hasanraiyan) -- ✅ Dynamic model max token length (@hasanraiyan) - ⬜ Deploy directly to Vercel/Netlify/other similar platforms - ⬜ Prompt caching - ⬜ Better prompt enhancing @@ -246,14 +246,55 @@ pnpm run dev This will start the Remix Vite development server. You will need Google Chrome Canary to run this locally if you use Chrome! It's an easy install and a good browser for web development anyway. -## Tips and Tricks +## FAQ -Here are some tips to get the most out of Bolt.new: +### How do I get the best results with oTToDev? - **Be specific about your stack**: If you want to use specific frameworks or libraries (like Astro, Tailwind, ShadCN, or any other popular JavaScript framework), mention them in your initial prompt to ensure Bolt scaffolds the project accordingly. - **Use the enhance prompt icon**: Before sending your prompt, try clicking the 'enhance' icon to have the AI model help you refine your prompt, then edit the results before submitting. 
-- **Scaffold the basics first, then add features**: Make sure the basic structure of your application is in place before diving into more advanced functionality. This helps Bolt understand the foundation of your project and ensure everything is wired up right before building out more advanced functionality.
+- **Scaffold the basics first, then add features**: Make sure the basic structure of your application is in place before diving into more advanced functionality. This helps oTToDev understand the foundation of your project and ensures everything is wired up correctly before you build out more advanced features.

-- **Batch simple instructions**: Save time by combining simple instructions into one message. For example, you can ask Bolt to change the color scheme, add mobile responsiveness, and restart the dev server, all in one go saving you time and reducing API credit consumption significantly.
+- **Batch simple instructions**: Save time by combining simple instructions into one message. For example, you can ask oTToDev to change the color scheme, add mobile responsiveness, and restart the dev server, all in one go, saving you time and reducing API credit consumption significantly.
+
+### How do I contribute to oTToDev?
+
+[Please check out our dedicated page for contributing to oTToDev here!](CONTRIBUTING.md)
+
+### Do you plan on merging oTToDev back into the official Bolt.new repo?
+
+More news on this coming early next month - stay tuned!
+
+### What are the future plans for oTToDev?
+
+[Check out our Roadmap here!](https://roadmap.sh/r/ottodev-roadmap-2ovzo)
+
+Lots more updates to this roadmap coming soon!
+
+### Why are there so many open issues/pull requests?
+
+oTToDev was started simply to showcase how to edit an open source project and to do something cool with local LLMs on my (@ColeMedin) YouTube channel!
However, it quickly grew into a massive community project that I am working hard to keep up with by forming a team of maintainers and getting as many people involved as I can. That effort is going well, and all of our maintainers are ABSOLUTE rockstars, but it still takes time to organize everything so we can efficiently get through all the issues and PRs. But rest assured, we are working hard and even working on some partnerships behind the scenes to really help this project take off!
+
+### How do local LLMs fare compared to larger models like Claude 3.5 Sonnet for oTToDev/Bolt.new?
+
+As much as the gap is quickly closing between open source and massive closed source models, you’re still going to get the best results with the very large models like GPT-4o, Claude 3.5 Sonnet, and DeepSeek Coder V2 236b. This is one of the big tasks we have at hand - figuring out how to prompt better, use agents, and improve the platform as a whole to make it work better for even the smaller local LLMs!
+
+### I'm getting the error: "There was an error processing this request"
+
+If you see this error within oTToDev, that is just the application telling you there is a problem at a high level, and this could mean a number of different things. To find the actual error, please check BOTH the terminal where you started the application (with Docker or pnpm) and the developer console in the browser. For most browsers, you can access the developer console by pressing F12 or right-clicking anywhere in the browser and selecting “Inspect”. Then go to the “console” tab in the top right.
+
+### I'm getting the error: "x-api-key header missing"
+
+We have seen this error a couple of times, and for some reason just restarting the Docker container has fixed it. This seems to be Ollama specific. Another thing to try is running oTToDev with Docker or pnpm, whichever you didn’t run first. We are still on the hunt for why this happens once in a while!
+
+### I'm getting a blank preview when oTToDev runs my app!
+
+We promise you that we are constantly testing new PRs coming into oTToDev, and the preview is core functionality, so the application is not broken! When you get a blank preview or don’t get a preview at all, this is generally because the LLM hallucinated bad code or incorrect commands. We are working on making this more transparent so it is obvious. Sometimes the error will appear in the developer console too, so check that as well.
+
+### Everything works but the results are bad
+
+This goes to the point above about how local LLMs are getting very powerful, but you are still going to see better (sometimes much better) results with the largest LLMs like GPT-4o, Claude 3.5 Sonnet, and DeepSeek Coder V2 236b. If you are using smaller LLMs like Qwen-2.5-Coder, consider it more experimental and educational at this point. It can build smaller applications really well, which is super impressive for a local LLM, but for larger scale applications you still want to use the larger LLMs!
diff --git a/app/components/chat/APIKeyManager.tsx b/app/components/chat/APIKeyManager.tsx index c61e466..28847bc 100644 --- a/app/components/chat/APIKeyManager.tsx +++ b/app/components/chat/APIKeyManager.tsx @@ -10,6 +10,7 @@ interface APIKeyManagerProps { labelForGetApiKey?: string; } +// eslint-disable-next-line @typescript-eslint/naming-convention export const APIKeyManager: React.FC = ({ provider, apiKey, setApiKey }) => { const [isEditing, setIsEditing] = useState(false); const [tempKey, setTempKey] = useState(apiKey); diff --git a/app/components/chat/BaseChat.tsx b/app/components/chat/BaseChat.tsx index adfd1bf..f004b3d 100644 --- a/app/components/chat/BaseChat.tsx +++ b/app/components/chat/BaseChat.tsx @@ -1,47 +1,45 @@ -// @ts-nocheck -// Preventing TS checks with files presented in the video for a better presentation. +/* + * @ts-nocheck + * Preventing TS checks with files presented in the video for a better presentation. 
+ */ import type { Message } from 'ai'; -import React, { type RefCallback, useEffect } from 'react'; +import React, { type RefCallback, useEffect, useState } from 'react'; import { ClientOnly } from 'remix-utils/client-only'; import { Menu } from '~/components/sidebar/Menu.client'; import { IconButton } from '~/components/ui/IconButton'; import { Workbench } from '~/components/workbench/Workbench.client'; import { classNames } from '~/utils/classNames'; -import { MODEL_LIST, DEFAULT_PROVIDER, PROVIDER_LIST, initializeModelList } from '~/utils/constants'; +import { MODEL_LIST, PROVIDER_LIST, initializeModelList } from '~/utils/constants'; import { Messages } from './Messages.client'; import { SendButton } from './SendButton.client'; -import { useState } from 'react'; import { APIKeyManager } from './APIKeyManager'; import Cookies from 'js-cookie'; +import * as Tooltip from '@radix-ui/react-tooltip'; import styles from './BaseChat.module.scss'; import type { ProviderInfo } from '~/utils/types'; +import { ExportChatButton } from '~/components/chat/chatExportAndImport/ExportChatButton'; +import { ImportButtons } from '~/components/chat/chatExportAndImport/ImportButtons'; +import { ExamplePrompts } from '~/components/chat/ExamplePrompts'; import FilePreview from './FilePreview'; -const EXAMPLE_PROMPTS = [ - { text: 'Build a todo app in React using Tailwind' }, - { text: 'Build a simple blog using Astro' }, - { text: 'Create a cookie consent form using Material UI' }, - { text: 'Make a space invaders game' }, - { text: 'How do I center a div?' }, -]; - -const providerList = PROVIDER_LIST; - +// @ts-ignore TODO: Introduce proper types +// eslint-disable-next-line @typescript-eslint/no-unused-vars const ModelSelector = ({ model, setModel, provider, setProvider, modelList, providerList, apiKeys }) => { return (
{ + const allFiles = Array.from(e.target.files || []); + const filteredFiles = allFiles.filter((file) => shouldIncludeFile(file.webkitRelativePath)); + + if (filteredFiles.length === 0) { + toast.error('No files found in the selected folder'); + return; + } + + try { + const fileChecks = await Promise.all( + filteredFiles.map(async (file) => ({ + file, + isBinary: await isBinaryFile(file), + })), + ); + + const textFiles = fileChecks.filter((f) => !f.isBinary).map((f) => f.file); + const binaryFilePaths = fileChecks + .filter((f) => f.isBinary) + .map((f) => f.file.webkitRelativePath.split('/').slice(1).join('/')); + + if (textFiles.length === 0) { + toast.error('No text files found in the selected folder'); + return; + } + + if (binaryFilePaths.length > 0) { + toast.info(`Skipping ${binaryFilePaths.length} binary files`); + } + + await createChatFromFolder(textFiles, binaryFilePaths); + } catch (error) { + console.error('Failed to import folder:', error); + toast.error('Failed to import folder'); + } + + e.target.value = ''; // Reset file input + }} + {...({} as any)} // if removed, webkitdirectory will throw errors as an unknown attribute + /> + + + ); +}; diff --git a/app/components/chat/Messages.client.tsx b/app/components/chat/Messages.client.tsx index a67104c..4a2ac6a 100644 --- a/app/components/chat/Messages.client.tsx +++ b/app/components/chat/Messages.client.tsx @@ -3,11 +3,11 @@ import React from 'react'; import { classNames } from '~/utils/classNames'; import { AssistantMessage } from './AssistantMessage'; import { UserMessage } from './UserMessage'; -import * as Tooltip from '@radix-ui/react-tooltip'; import { useLocation } from '@remix-run/react'; import { db, chatId } from '~/lib/persistence/useChatHistory'; import { forkChat } from '~/lib/persistence/db'; import { toast } from 'react-toastify'; +import WithTooltip from '~/components/ui/Tooltip'; interface MessagesProps { id?: string; @@ -41,92 +41,66 @@ export const Messages = React.forwardRef((props: }; 
return ( - -
- {messages.length > 0 - ? messages.map((message, index) => { - const { role, content, id: messageId } = message; - const isUserMessage = role === 'user'; - const isFirst = index === 0; - const isLast = index === messages.length - 1; +
+ {messages.length > 0 + ? messages.map((message, index) => { + const { role, content, id: messageId } = message; + const isUserMessage = role === 'user'; + const isFirst = index === 0; + const isLast = index === messages.length - 1; - return ( -
- {isUserMessage && ( -
-
-
- )} -
- {isUserMessage ? : } + return ( +
+ {isUserMessage && ( +
+
- {!isUserMessage && ( -
- - - {messageId && ( -
- )} + )} +
+ {isUserMessage ? : }
- ); - }) - : null} - {isStreaming && ( -
- )} -
- + {!isUserMessage && ( +
+ + {messageId && ( +
+ )} +
+ ); + }) + : null} + {isStreaming && ( +
+ )} +
); }); diff --git a/app/components/chat/UserMessage.tsx b/app/components/chat/UserMessage.tsx index c5d9c9b..167ce87 100644 --- a/app/components/chat/UserMessage.tsx +++ b/app/components/chat/UserMessage.tsx @@ -1,6 +1,7 @@ -// @ts-nocheck -// Preventing TS checks with files presented in the video for a better presentation. -import { modificationsRegex } from '~/utils/diff'; +/* + * @ts-nocheck + * Preventing TS checks with files presented in the video for a better presentation. + */ import { MODEL_REGEX, PROVIDER_REGEX } from '~/utils/constants'; import { Markdown } from './Markdown'; @@ -18,11 +19,11 @@ export function UserMessage({ content }: UserMessageProps) { ); } -function sanitizeUserMessage(content: string | Array<{type: string, text?: string, image_url?: {url: string}}>) { +function sanitizeUserMessage(content: string | Array<{ type: string; text?: string; image_url?: { url: string } }>) { if (Array.isArray(content)) { - const textItem = content.find(item => item.type === 'text'); + const textItem = content.find((item) => item.type === 'text'); return textItem?.text?.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, '') || ''; } - + return content.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, ''); -} \ No newline at end of file +} diff --git a/app/components/chat/chatExportAndImport/ExportChatButton.tsx b/app/components/chat/chatExportAndImport/ExportChatButton.tsx new file mode 100644 index 0000000..6ab294b --- /dev/null +++ b/app/components/chat/chatExportAndImport/ExportChatButton.tsx @@ -0,0 +1,13 @@ +import WithTooltip from '~/components/ui/Tooltip'; +import { IconButton } from '~/components/ui/IconButton'; +import React from 'react'; + +export const ExportChatButton = ({ exportChat }: { exportChat?: () => void }) => { + return ( + + exportChat?.()}> +
+
+
+ ); +}; diff --git a/app/components/chat/chatExportAndImport/ImportButtons.tsx b/app/components/chat/chatExportAndImport/ImportButtons.tsx new file mode 100644 index 0000000..2b59574 --- /dev/null +++ b/app/components/chat/chatExportAndImport/ImportButtons.tsx @@ -0,0 +1,71 @@ +import type { Message } from 'ai'; +import { toast } from 'react-toastify'; +import React from 'react'; +import { ImportFolderButton } from '~/components/chat/ImportFolderButton'; + +export function ImportButtons(importChat: ((description: string, messages: Message[]) => Promise) | undefined) { + return ( +
+ { + const file = e.target.files?.[0]; + + if (file && importChat) { + try { + const reader = new FileReader(); + + reader.onload = async (e) => { + try { + const content = e.target?.result as string; + const data = JSON.parse(content); + + if (!Array.isArray(data.messages)) { + toast.error('Invalid chat file format'); + return; + } + + await importChat(data.description, data.messages); + toast.success('Chat imported successfully'); + } catch (error: unknown) { + if (error instanceof Error) { + toast.error('Failed to parse chat file: ' + error.message); + } else { + toast.error('Failed to parse chat file'); + } + } + }; + reader.onerror = () => toast.error('Failed to read chat file'); + reader.readAsText(file); + } catch (error) { + toast.error(error instanceof Error ? error.message : 'Failed to import chat'); + } + e.target.value = ''; // Reset file input + } else { + toast.error('Something went wrong'); + } + }} + /> +
+
+ + +
+
+
+ ); +} diff --git a/app/components/sidebar/HistoryItem.tsx b/app/components/sidebar/HistoryItem.tsx index df270c8..4c28435 100644 --- a/app/components/sidebar/HistoryItem.tsx +++ b/app/components/sidebar/HistoryItem.tsx @@ -1,70 +1,55 @@ import * as Dialog from '@radix-ui/react-dialog'; -import { useEffect, useRef, useState } from 'react'; import { type ChatHistoryItem } from '~/lib/persistence'; +import WithTooltip from '~/components/ui/Tooltip'; interface HistoryItemProps { item: ChatHistoryItem; onDelete?: (event: React.UIEvent) => void; onDuplicate?: (id: string) => void; + exportChat: (id?: string) => void; } -export function HistoryItem({ item, onDelete, onDuplicate }: HistoryItemProps) { - const [hovering, setHovering] = useState(false); - const hoverRef = useRef(null); - - useEffect(() => { - let timeout: NodeJS.Timeout | undefined; - - function mouseEnter() { - setHovering(true); - - if (timeout) { - clearTimeout(timeout); - } - } - - function mouseLeave() { - setHovering(false); - } - - hoverRef.current?.addEventListener('mouseenter', mouseEnter); - hoverRef.current?.addEventListener('mouseleave', mouseLeave); - - return () => { - hoverRef.current?.removeEventListener('mouseenter', mouseEnter); - hoverRef.current?.removeEventListener('mouseleave', mouseLeave); - }; - }, []); - +export function HistoryItem({ item, onDelete, onDuplicate, exportChat }: HistoryItemProps) { return ( -
+
{item.description} - diff --git a/app/components/sidebar/Menu.client.tsx b/app/components/sidebar/Menu.client.tsx index 99e10b7..1fef3a8 100644 --- a/app/components/sidebar/Menu.client.tsx +++ b/app/components/sidebar/Menu.client.tsx @@ -2,7 +2,6 @@ import { motion, type Variants } from 'framer-motion'; import { useCallback, useEffect, useRef, useState } from 'react'; import { toast } from 'react-toastify'; import { Dialog, DialogButton, DialogDescription, DialogRoot, DialogTitle } from '~/components/ui/Dialog'; -import { IconButton } from '~/components/ui/IconButton'; import { ThemeSwitch } from '~/components/ui/ThemeSwitch'; import { db, deleteById, getAll, chatId, type ChatHistoryItem, useChatHistory } from '~/lib/persistence'; import { cubicEasingFn } from '~/utils/easings'; @@ -34,7 +33,7 @@ const menuVariants = { type DialogContent = { type: 'delete'; item: ChatHistoryItem } | null; export const Menu = () => { - const { duplicateCurrentChat } = useChatHistory(); + const { duplicateCurrentChat, exportChat } = useChatHistory(); const menuRef = useRef(null); const [list, setList] = useState([]); const [open, setOpen] = useState(false); @@ -102,7 +101,6 @@ export const Menu = () => { const handleDeleteClick = (event: React.UIEvent, item: ChatHistoryItem) => { event.preventDefault(); - setDialogContent({ type: 'delete', item }); }; @@ -131,7 +129,7 @@ export const Menu = () => {
Your Chats
-
+
{list.length === 0 &&
No previous conversations
} {binDates(list).map(({ category, items }) => ( @@ -143,6 +141,7 @@ export const Menu = () => { handleDeleteClick(event, item)} onDuplicate={() => handleDuplicate(item.id)} /> @@ -186,4 +185,4 @@ export const Menu = () => {
); -} +}; diff --git a/app/components/ui/Tooltip.tsx b/app/components/ui/Tooltip.tsx new file mode 100644 index 0000000..4e22f54 --- /dev/null +++ b/app/components/ui/Tooltip.tsx @@ -0,0 +1,73 @@ +import * as Tooltip from '@radix-ui/react-tooltip'; + +interface TooltipProps { + tooltip: React.ReactNode; + children: React.ReactNode; + sideOffset?: number; + className?: string; + arrowClassName?: string; + tooltipStyle?: React.CSSProperties; + position?: 'top' | 'bottom' | 'left' | 'right'; + maxWidth?: number; + delay?: number; +} + +const WithTooltip = ({ + tooltip, + children, + sideOffset = 5, + className = '', + arrowClassName = '', + tooltipStyle = {}, + position = 'top', + maxWidth = 250, + delay = 0, +}: TooltipProps) => { + return ( + + {children} + + +
{tooltip}
+ +
+
+
+ ); +}; + +export default WithTooltip; diff --git a/app/components/workbench/EditorPanel.tsx b/app/components/workbench/EditorPanel.tsx index a9c9d33..0a18658 100644 --- a/app/components/workbench/EditorPanel.tsx +++ b/app/components/workbench/EditorPanel.tsx @@ -239,7 +239,7 @@ export const EditorPanel = memo(
Terminal {terminalCount > 1 && index} - + )} ); @@ -255,6 +255,7 @@ export const EditorPanel = memo(
{Array.from({ length: terminalCount + 1 }, (_, index) => { const isActive = activeTerminal === index; + if (index == 0) { logger.info('Starting bolt terminal'); @@ -273,6 +274,7 @@ export const EditorPanel = memo( /> ); } + return ( +
{filteredFileList.map((fileOrFolder) => { switch (fileOrFolder.kind) { case 'file': { diff --git a/app/components/workbench/Workbench.client.tsx b/app/components/workbench/Workbench.client.tsx index da5b60d..fb2f49e 100644 --- a/app/components/workbench/Workbench.client.tsx +++ b/app/components/workbench/Workbench.client.tsx @@ -57,7 +57,6 @@ export const Workbench = memo(({ chatStarted, isStreaming }: WorkspaceProps) => renderLogger.trace('Workbench'); const [isSyncing, setIsSyncing] = useState(false); - const [isUploading, setIsUploading] = useState(false); const hasPreview = useStore(computed(workbenchStore.previews, (previews) => previews.length > 0)); const showWorkbench = useStore(workbenchStore.showWorkbench); @@ -120,60 +119,6 @@ export const Workbench = memo(({ chatStarted, isStreaming }: WorkspaceProps) => } }, []); - const handleUploadFiles = useCallback(async () => { - setIsUploading(true); - - try { - // const directoryHandle = await window.showDirectoryPicker(); - - // // First upload new files - // await workbenchStore.uploadFilesFromDisk(directoryHandle); - - // // Get current files state - // const currentFiles = workbenchStore.files.get(); - - // // Create new modifications map with all files as "new" - // const newModifications = new Map(); - // Object.entries(currentFiles).forEach(([path, file]) => { - // if (file.type === 'file') { - // newModifications.set(path, file.content); - // } - // }); - - // // Update workbench state - // await workbenchStore.refreshFiles(); - // workbenchStore.resetAllFileModifications(); - - // toast.success('Files uploaded successfully'); - // } catch (error) { - // toast.error('Failed to upload files'); - // } - await handleUploadFilesFunc(); - } - - finally { - setIsUploading(false); - } - }, []); - - async function handleUploadFilesFunc() { - try { - // First clean all statuses - await workbenchStore.saveAllFiles(); - await workbenchStore.resetAllFileModifications(); - await workbenchStore.refreshFiles(); - - // 
Now upload new files - const directoryHandle = await window.showDirectoryPicker(); - await workbenchStore.uploadFilesFromDisk(directoryHandle); - - toast.success('Files uploaded successfully'); - } catch (error) { - console.error('Upload files error:', error); - toast.error('Failed to upload files'); - } - } - return ( chatStarted && ( {isSyncing ?
:
} {isSyncing ? 'Syncing...' : 'Sync Files'} - - {isSyncing ?
:
} - {isSyncing ? 'Uploading...' : 'Upload Files'} - { @@ -233,16 +174,21 @@ export const Workbench = memo(({ chatStarted, isStreaming }: WorkspaceProps) => 'Please enter a name for your new GitHub repository:', 'bolt-generated-project', ); + if (!repoName) { alert('Repository name is required. Push to GitHub cancelled.'); return; } + const githubUsername = prompt('Please enter your GitHub username:'); + if (!githubUsername) { alert('GitHub username is required. Push to GitHub cancelled.'); return; } + const githubToken = prompt('Please enter your GitHub personal access token:'); + if (!githubToken) { alert('GitHub token is required. Push to GitHub cancelled.'); return; diff --git a/app/lib/.server/llm/api-key.ts b/app/lib/.server/llm/api-key.ts index 7d8d2f9..2956181 100644 --- a/app/lib/.server/llm/api-key.ts +++ b/app/lib/.server/llm/api-key.ts @@ -1,5 +1,7 @@ -// @ts-nocheck -// Preventing TS checks with files presented in the video for a better presentation. +/* + * @ts-nocheck + * Preventing TS checks with files presented in the video for a better presentation. 
+ */ import { env } from 'node:process'; export function getAPIKey(cloudflareEnv: Env, provider: string, userApiKeys?: Record) { @@ -28,17 +30,19 @@ export function getAPIKey(cloudflareEnv: Env, provider: string, userApiKeys?: Re case 'OpenRouter': return env.OPEN_ROUTER_API_KEY || cloudflareEnv.OPEN_ROUTER_API_KEY; case 'Deepseek': - return env.DEEPSEEK_API_KEY || cloudflareEnv.DEEPSEEK_API_KEY + return env.DEEPSEEK_API_KEY || cloudflareEnv.DEEPSEEK_API_KEY; case 'Mistral': - return env.MISTRAL_API_KEY || cloudflareEnv.MISTRAL_API_KEY; - case "OpenAILike": + return env.MISTRAL_API_KEY || cloudflareEnv.MISTRAL_API_KEY; + case 'OpenAILike': return env.OPENAI_LIKE_API_KEY || cloudflareEnv.OPENAI_LIKE_API_KEY; - case "xAI": + case 'xAI': return env.XAI_API_KEY || cloudflareEnv.XAI_API_KEY; - case "Cohere": + case 'Cohere': return env.COHERE_API_KEY; + case 'AzureOpenAI': + return env.AZURE_OPENAI_API_KEY; default: - return ""; + return ''; } } @@ -47,14 +51,17 @@ export function getBaseURL(cloudflareEnv: Env, provider: string) { case 'OpenAILike': return env.OPENAI_LIKE_API_BASE_URL || cloudflareEnv.OPENAI_LIKE_API_BASE_URL; case 'LMStudio': - return env.LMSTUDIO_API_BASE_URL || cloudflareEnv.LMSTUDIO_API_BASE_URL || "http://localhost:1234"; - case 'Ollama': - let baseUrl = env.OLLAMA_API_BASE_URL || cloudflareEnv.OLLAMA_API_BASE_URL || "http://localhost:11434"; - if (env.RUNNING_IN_DOCKER === 'true') { - baseUrl = baseUrl.replace("localhost", "host.docker.internal"); - } - return baseUrl; + return env.LMSTUDIO_API_BASE_URL || cloudflareEnv.LMSTUDIO_API_BASE_URL || 'http://localhost:1234'; + case 'Ollama': { + let baseUrl = env.OLLAMA_API_BASE_URL || cloudflareEnv.OLLAMA_API_BASE_URL || 'http://localhost:11434'; + + if (env.RUNNING_IN_DOCKER === 'true') { + baseUrl = baseUrl.replace('localhost', 'host.docker.internal'); + } + + return baseUrl; + } default: - return ""; + return ''; } } diff --git a/app/lib/.server/llm/model.ts b/app/lib/.server/llm/model.ts index 
307c817..65d65cd 100644 --- a/app/lib/.server/llm/model.ts +++ b/app/lib/.server/llm/model.ts @@ -1,27 +1,29 @@ -// @ts-nocheck -// Preventing TS checks with files presented in the video for a better presentation. +/* + * @ts-nocheck + * Preventing TS checks with files presented in the video for a better presentation. + */ import { getAPIKey, getBaseURL } from '~/lib/.server/llm/api-key'; import { createAnthropic } from '@ai-sdk/anthropic'; import { createOpenAI } from '@ai-sdk/openai'; import { createGoogleGenerativeAI } from '@ai-sdk/google'; import { ollama } from 'ollama-ai-provider'; -import { createOpenRouter } from "@openrouter/ai-sdk-provider"; +import { createOpenRouter } from '@openrouter/ai-sdk-provider'; import { createMistral } from '@ai-sdk/mistral'; -import { createCohere } from '@ai-sdk/cohere' +import { createCohere } from '@ai-sdk/cohere'; +import type { LanguageModelV1 } from 'ai'; -export const DEFAULT_NUM_CTX = process.env.DEFAULT_NUM_CTX ? - parseInt(process.env.DEFAULT_NUM_CTX, 10) : - 32768; +export const DEFAULT_NUM_CTX = process.env.DEFAULT_NUM_CTX ? 
parseInt(process.env.DEFAULT_NUM_CTX, 10) : 32768; -export function getAnthropicModel(apiKey: string, model: string) { +type OptionalApiKey = string | undefined; + +export function getAnthropicModel(apiKey: OptionalApiKey, model: string) { const anthropic = createAnthropic({ apiKey, }); return anthropic(model); } - -export function getOpenAILikeModel(baseURL: string, apiKey: string, model: string) { +export function getOpenAILikeModel(baseURL: string, apiKey: OptionalApiKey, model: string) { const openai = createOpenAI({ baseURL, apiKey, @@ -30,7 +32,7 @@ export function getOpenAILikeModel(baseURL: string, apiKey: string, model: strin return openai(model); } -export function getCohereAIModel(apiKey:string, model: string){ +export function getCohereAIModel(apiKey: OptionalApiKey, model: string) { const cohere = createCohere({ apiKey, }); @@ -38,7 +40,7 @@ export function getCohereAIModel(apiKey:string, model: string){ return cohere(model); } -export function getOpenAIModel(apiKey: string, model: string) { +export function getOpenAIModel(apiKey: OptionalApiKey, model: string) { const openai = createOpenAI({ apiKey, }); @@ -46,15 +48,15 @@ export function getOpenAIModel(apiKey: string, model: string) { return openai(model); } -export function getMistralModel(apiKey: string, model: string) { +export function getMistralModel(apiKey: OptionalApiKey, model: string) { const mistral = createMistral({ - apiKey + apiKey, }); return mistral(model); } -export function getGoogleModel(apiKey: string, model: string) { +export function getGoogleModel(apiKey: OptionalApiKey, model: string) { const google = createGoogleGenerativeAI({ apiKey, }); @@ -62,7 +64,7 @@ export function getGoogleModel(apiKey: string, model: string) { return google(model); } -export function getGroqModel(apiKey: string, model: string) { +export function getGroqModel(apiKey: OptionalApiKey, model: string) { const openai = createOpenAI({ baseURL: 'https://api.groq.com/openai/v1', apiKey, @@ -71,7 +73,7 @@ 
export function getGroqModel(apiKey: string, model: string) { return openai(model); } -export function getHuggingFaceModel(apiKey: string, model: string) { +export function getHuggingFaceModel(apiKey: OptionalApiKey, model: string) { const openai = createOpenAI({ baseURL: 'https://api-inference.huggingface.co/v1/', apiKey, @@ -81,15 +83,16 @@ export function getHuggingFaceModel(apiKey: string, model: string) { } export function getOllamaModel(baseURL: string, model: string) { - let Ollama = ollama(model, { + const ollamaInstance = ollama(model, { numCtx: DEFAULT_NUM_CTX, - }); + }) as LanguageModelV1 & { config: any }; - Ollama.config.baseURL = `${baseURL}/api`; - return Ollama; + ollamaInstance.config.baseURL = `${baseURL}/api`; + + return ollamaInstance; } -export function getDeepseekModel(apiKey: string, model: string) { +export function getDeepseekModel(apiKey: OptionalApiKey, model: string) { const openai = createOpenAI({ baseURL: 'https://api.deepseek.com/beta', apiKey, @@ -98,9 +101,9 @@ export function getDeepseekModel(apiKey: string, model: string) { return openai(model); } -export function getOpenRouterModel(apiKey: string, model: string) { +export function getOpenRouterModel(apiKey: OptionalApiKey, model: string) { const openRouter = createOpenRouter({ - apiKey + apiKey, }); return openRouter.chat(model); @@ -109,13 +112,13 @@ export function getOpenRouterModel(apiKey: string, model: string) { export function getLMStudioModel(baseURL: string, model: string) { const lmstudio = createOpenAI({ baseUrl: `${baseURL}/v1`, - apiKey: "", + apiKey: '', }); return lmstudio(model); } -export function getXAIModel(apiKey: string, model: string) { +export function getXAIModel(apiKey: OptionalApiKey, model: string) { const openai = createOpenAI({ baseURL: 'https://api.x.ai/v1', apiKey, @@ -125,11 +128,13 @@ export function getXAIModel(apiKey: string, model: string) { } export function getModel(provider: string, model: string, env: Env, apiKeys?: Record) { - let apiKey; 
// Declare first - let baseURL; + /* + * let apiKey; // Declare first + * let baseURL; + */ - apiKey = getAPIKey(env, provider, apiKeys); // Then assign - baseURL = getBaseURL(env, provider); + const apiKey = getAPIKey(env, provider, apiKeys); // Then assign + const baseURL = getBaseURL(env, provider); switch (provider) { case 'Anthropic': @@ -159,4 +164,4 @@ export function getModel(provider: string, model: string, env: Env, apiKeys?: Re default: return getOllamaModel(baseURL, model); } -} \ No newline at end of file +} diff --git a/app/lib/.server/llm/stream-text.ts b/app/lib/.server/llm/stream-text.ts index 965ec95..073a378 100644 --- a/app/lib/.server/llm/stream-text.ts +++ b/app/lib/.server/llm/stream-text.ts @@ -1,10 +1,11 @@ -// @ts-nocheck -// Preventing TS checks with files presented in the video for a better presentation. -import { streamText as _streamText, convertToCoreMessages } from 'ai'; +// eslint-disable-next-line @typescript-eslint/ban-ts-comment +// @ts-nocheck – TODO: Provider proper types + +import { convertToCoreMessages, streamText as _streamText } from 'ai'; import { getModel } from '~/lib/.server/llm/model'; import { MAX_TOKENS } from './constants'; import { getSystemPrompt } from './prompts'; -import { MODEL_LIST, DEFAULT_MODEL, DEFAULT_PROVIDER, MODEL_REGEX, PROVIDER_REGEX } from '~/utils/constants'; +import { DEFAULT_MODEL, DEFAULT_PROVIDER, MODEL_LIST, MODEL_REGEX, PROVIDER_REGEX } from '~/utils/constants'; interface ToolResult { toolCallId: string; @@ -26,41 +27,41 @@ export type StreamingOptions = Omit<Parameters<typeof _streamText>[0], 'model'>; function extractPropertiesFromMessage(message: Message): { model: string; provider: string; content: string } { const textContent = Array.isArray(message.content) - ? message.content.find(item => item.type === 'text')?.text || '' + ?
message.content.find((item) => item.type === 'text')?.text || '' : message.content; const modelMatch = textContent.match(MODEL_REGEX); const providerMatch = textContent.match(PROVIDER_REGEX); - // Extract model - // const modelMatch = message.content.match(MODEL_REGEX); + /* + * Extract model + * const modelMatch = message.content.match(MODEL_REGEX); + */ const model = modelMatch ? modelMatch[1] : DEFAULT_MODEL; - // Extract provider - // const providerMatch = message.content.match(PROVIDER_REGEX); + /* + * Extract provider + * const providerMatch = message.content.match(PROVIDER_REGEX); + */ const provider = providerMatch ? providerMatch[1] : DEFAULT_PROVIDER; const cleanedContent = Array.isArray(message.content) - ? message.content.map(item => { - if (item.type === 'text') { - return { - type: 'text', - text: item.text?.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, '') - }; - } - return item; // Preserve image_url and other types as is - }) + ? message.content.map((item) => { + if (item.type === 'text') { + return { + type: 'text', + text: item.text?.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, ''), + }; + } + + return item; // Preserve image_url and other types as is + }) : textContent.replace(MODEL_REGEX, '').replace(PROVIDER_REGEX, ''); return { model, provider, content: cleanedContent }; } -export function streamText( - messages: Messages, - env: Env, - options?: StreamingOptions, - apiKeys?: Record<string, string> -) { +export function streamText(messages: Messages, env: Env, options?: StreamingOptions, apiKeys?: Record<string, string>) { let currentModel = DEFAULT_MODEL; let currentProvider = DEFAULT_PROVIDER; @@ -76,15 +77,13 @@ export function streamText( return { ...message, content }; } + return message; }); const modelDetails = MODEL_LIST.find((m) => m.name === currentModel); - const dynamicMaxTokens = - modelDetails && modelDetails.maxTokenAllowed - ? modelDetails.maxTokenAllowed - : MAX_TOKENS; + const dynamicMaxTokens = modelDetails && modelDetails.maxTokenAllowed ?
modelDetails.maxTokenAllowed : MAX_TOKENS; return _streamText({ ...options, diff --git a/app/lib/persistence/db.ts b/app/lib/persistence/db.ts index 3aa2004..6ce604d 100644 --- a/app/lib/persistence/db.ts +++ b/app/lib/persistence/db.ts @@ -161,46 +161,48 @@ async function getUrlIds(db: IDBDatabase): Promise<string[]> { export async function forkChat(db: IDBDatabase, chatId: string, messageId: string): Promise<string> { const chat = await getMessages(db, chatId); - if (!chat) throw new Error('Chat not found'); - // Find the index of the message to fork at - const messageIndex = chat.messages.findIndex(msg => msg.id === messageId); - if (messageIndex === -1) throw new Error('Message not found'); - - // Get messages up to and including the selected message - const messages = chat.messages.slice(0, messageIndex + 1); - - // Generate new IDs - const newId = await getNextId(db); - const urlId = await getUrlId(db, newId); - - // Create the forked chat - await setMessages( - db, - newId, - messages, - urlId, - chat.description ? `${chat.description} (fork)` : 'Forked chat' - ); - - return urlId; -} - -export async function duplicateChat(db: IDBDatabase, id: string): Promise<string> { - const chat = await getMessages(db, id); if (!chat) { throw new Error('Chat not found'); } + // Find the index of the message to fork at + const messageIndex = chat.messages.findIndex((msg) => msg.id === messageId); + + if (messageIndex === -1) { + throw new Error('Message not found'); + } + + // Get messages up to and including the selected message + const messages = chat.messages.slice(0, messageIndex + 1); + + return createChatFromMessages(db, chat.description ?
`${chat.description} (fork)` : 'Forked chat', messages); +} + +export async function duplicateChat(db: IDBDatabase, id: string): Promise<string> { + const chat = await getMessages(db, id); + + if (!chat) { + throw new Error('Chat not found'); + } + + return createChatFromMessages(db, `${chat.description || 'Chat'} (copy)`, chat.messages); +} + +export async function createChatFromMessages( + db: IDBDatabase, + description: string, + messages: Message[], +): Promise<string> { const newId = await getNextId(db); const newUrlId = await getUrlId(db, newId); // Get a new urlId for the duplicated chat await setMessages( db, newId, - chat.messages, + messages, newUrlId, // Use the new urlId - `${chat.description || 'Chat'} (copy)` + description, ); return newUrlId; // Return the urlId instead of id for navigation diff --git a/app/lib/persistence/useChatHistory.ts b/app/lib/persistence/useChatHistory.ts index f5e8138..9daa61f 100644 --- a/app/lib/persistence/useChatHistory.ts +++ b/app/lib/persistence/useChatHistory.ts @@ -4,7 +4,15 @@ import { atom } from 'nanostores'; import type { Message } from 'ai'; import { toast } from 'react-toastify'; import { workbenchStore } from '~/lib/stores/workbench'; -import { getMessages, getNextId, getUrlId, openDatabase, setMessages, duplicateChat } from './db'; +import { + getMessages, + getNextId, + getUrlId, + openDatabase, + setMessages, + duplicateChat, + createChatFromMessages, +} from './db'; export interface ChatHistoryItem { id: string; @@ -99,7 +107,7 @@ export function useChatHistory() { await setMessages(db, chatId.get() as string, messages, urlId, description.get()); }, - duplicateCurrentChat: async (listItemId:string) => { + duplicateCurrentChat: async (listItemId: string) => { if (!db || (!mixedId && !listItemId)) { return; } @@ -110,8 +118,48 @@ export function useChatHistory() { toast.success('Chat duplicated successfully'); } catch (error) { toast.error('Failed to duplicate chat'); + console.log(error); } - } + }, + importChat: async
(description: string, messages: Message[]) => { + if (!db) { + return; + } + + try { + const newId = await createChatFromMessages(db, description, messages); + window.location.href = `/chat/${newId}`; + toast.success('Chat imported successfully'); + } catch (error) { + if (error instanceof Error) { + toast.error('Failed to import chat: ' + error.message); + } else { + toast.error('Failed to import chat'); + } + } + }, + exportChat: async (id = urlId) => { + if (!db || !id) { + return; + } + + const chat = await getMessages(db, id); + const chatData = { + messages: chat.messages, + description: chat.description, + exportDate: new Date().toISOString(), + }; + + const blob = new Blob([JSON.stringify(chatData, null, 2)], { type: 'application/json' }); + const url = URL.createObjectURL(blob); + const a = document.createElement('a'); + a.href = url; + a.download = `chat-${new Date().toISOString()}.json`; + document.body.appendChild(a); + a.click(); + document.body.removeChild(a); + URL.revokeObjectURL(url); + }, }; } diff --git a/app/lib/runtime/action-runner.ts b/app/lib/runtime/action-runner.ts index e38a8ce..13b17ef 100644 --- a/app/lib/runtime/action-runner.ts +++ b/app/lib/runtime/action-runner.ts @@ -1,11 +1,10 @@ -import { WebContainer, type WebContainerProcess } from '@webcontainer/api'; +import { WebContainer } from '@webcontainer/api'; import { atom, map, type MapStore } from 'nanostores'; import * as nodePath from 'node:path'; import type { BoltAction } from '~/types/actions'; import { createScopedLogger } from '~/utils/logger'; import { unreachable } from '~/utils/unreachable'; import type { ActionCallbackData } from './message-parser'; -import type { ITerminal } from '~/types/terminal'; import type { BoltShell } from '~/utils/shell'; const logger = createScopedLogger('ActionRunner'); @@ -45,7 +44,6 @@ export class ActionRunner { constructor(webcontainerPromise: Promise<WebContainer>, getShellTerminal: () => BoltShell) { this.#webcontainer = webcontainerPromise;
this.#shellTerminal = getShellTerminal; - } addAction(data: ActionCallbackData) { @@ -88,15 +86,16 @@ export class ActionRunner { if (action.executed) { return; } + if (isStreaming && action.type !== 'file') { return; } this.#updateAction(actionId, { ...action, ...data.action, executed: !isStreaming }); - return this.#currentExecutionPromise = this.#currentExecutionPromise + this.#currentExecutionPromise = this.#currentExecutionPromise .then(() => { - return this.#executeAction(actionId, isStreaming); + this.#executeAction(actionId, isStreaming); }) .catch((error) => { console.error('Action failed:', error); @@ -121,17 +120,23 @@ export class ActionRunner { case 'start': { // making the start app non blocking - this.#runStartAction(action).then(()=>this.#updateAction(actionId, { status: 'complete' })) - .catch(()=>this.#updateAction(actionId, { status: 'failed', error: 'Action failed' })) - // adding a delay to avoid any race condition between 2 start actions - // i am up for a better approch - await new Promise(resolve=>setTimeout(resolve,2000)) - return - break; + this.#runStartAction(action) + .then(() => this.#updateAction(actionId, { status: 'complete' })) + .catch(() => this.#updateAction(actionId, { status: 'failed', error: 'Action failed' })); + + /* + * adding a delay to avoid any race condition between 2 start actions + * i am up for a better approach + */ + await new Promise((resolve) => setTimeout(resolve, 2000)); + + return; } } - this.#updateAction(actionId, { status: isStreaming ? 'running' : action.abortSignal.aborted ? 'aborted' : 'complete' }); + this.#updateAction(actionId, { + status: isStreaming ? 'running' : action.abortSignal.aborted ? 
'aborted' : 'complete', + }); } catch (error) { this.#updateAction(actionId, { status: 'failed', error: 'Action failed' }); logger.error(`[${action.type}]:Action failed\n\n`, error); @@ -145,16 +150,19 @@ export class ActionRunner { if (action.type !== 'shell') { unreachable('Expected shell action'); } - const shell = this.#shellTerminal() - await shell.ready() + + const shell = this.#shellTerminal(); + await shell.ready(); + if (!shell || !shell.terminal || !shell.process) { unreachable('Shell terminal not found'); } - const resp = await shell.executeCommand(this.runnerId.get(), action.content) - logger.debug(`${action.type} Shell Response: [exit code:${resp?.exitCode}]`) - if (resp?.exitCode != 0) { - throw new Error("Failed To Execute Shell Command"); + const resp = await shell.executeCommand(this.runnerId.get(), action.content); + logger.debug(`${action.type} Shell Response: [exit code:${resp?.exitCode}]`); + + if (resp?.exitCode != 0) { + throw new Error('Failed To Execute Shell Command'); } } @@ -162,21 +170,26 @@ export class ActionRunner { if (action.type !== 'start') { unreachable('Expected shell action'); } + if (!this.#shellTerminal) { unreachable('Shell terminal not found'); } - const shell = this.#shellTerminal() - await shell.ready() + + const shell = this.#shellTerminal(); + await shell.ready(); + if (!shell || !shell.terminal || !shell.process) { unreachable('Shell terminal not found'); } - const resp = await shell.executeCommand(this.runnerId.get(), action.content) - logger.debug(`${action.type} Shell Response: [exit code:${resp?.exitCode}]`) + + const resp = await shell.executeCommand(this.runnerId.get(), action.content); + logger.debug(`${action.type} Shell Response: [exit code:${resp?.exitCode}]`); if (resp?.exitCode != 0) { - throw new Error("Failed To Start Application"); + throw new Error('Failed To Start Application'); } - return resp + + return resp; } async #runFileAction(action: ActionState) { diff --git 
a/app/lib/runtime/message-parser.ts b/app/lib/runtime/message-parser.ts index 4b564da..48f3f52 100644 --- a/app/lib/runtime/message-parser.ts +++ b/app/lib/runtime/message-parser.ts @@ -55,7 +55,7 @@ interface MessageState { export class StreamingMessageParser { #messages = new Map<string, MessageState>(); - constructor(private _options: StreamingMessageParserOptions = {}) { } + constructor(private _options: StreamingMessageParserOptions = {}) {} parse(messageId: string, input: string) { let state = this.#messages.get(messageId); @@ -120,20 +120,20 @@ export class StreamingMessageParser { i = closeIndex + ARTIFACT_ACTION_TAG_CLOSE.length; } else { if ('type' in currentAction && currentAction.type === 'file') { - let content = input.slice(i); + const content = input.slice(i); this._options.callbacks?.onActionStream?.({ artifactId: currentArtifact.id, messageId, actionId: String(state.actionId - 1), action: { - ...currentAction as FileAction, + ...(currentAction as FileAction), content, filePath: currentAction.filePath, }, - }); } + break; } } else { @@ -272,7 +272,7 @@ export class StreamingMessageParser { } (actionAttributes as FileAction).filePath = filePath; - } else if (!(['shell', 'start'].includes(actionType))) { + } else if (!['shell', 'start'].includes(actionType)) { logger.warn(`Unknown action type '${actionType}'`); } diff --git a/app/lib/stores/files.ts b/app/lib/stores/files.ts index b0f726e..ca9e57b 100644 --- a/app/lib/stores/files.ts +++ b/app/lib/stores/files.ts @@ -80,10 +80,6 @@ export class FilesStore { this.#modifiedFiles.clear(); } - markFileAsNew(filePath: string) { - this.#modifiedFiles.set(filePath, ''); - } - async saveFile(filePath: string, content: string) { const webcontainer = await this.#webcontainer; @@ -216,9 +212,5 @@ function isBinaryFile(buffer: Uint8Array | undefined) { * array buffer.
*/ function convertToBuffer(view: Uint8Array): Buffer { - const buffer = new Uint8Array(view.buffer, view.byteOffset, view.byteLength); - - Object.setPrototypeOf(buffer, Buffer.prototype); - - return buffer as Buffer; + return Buffer.from(view.buffer, view.byteOffset, view.byteLength); } diff --git a/app/lib/stores/terminal.ts b/app/lib/stores/terminal.ts index b2537cc..9de9f4e 100644 --- a/app/lib/stores/terminal.ts +++ b/app/lib/stores/terminal.ts @@ -7,7 +7,7 @@ import { coloredText } from '~/utils/terminal'; export class TerminalStore { #webcontainer: Promise<WebContainer>; #terminals: Array<{ terminal: ITerminal; process: WebContainerProcess }> = []; - #boltTerminal = newBoltShellProcess() + #boltTerminal = newBoltShellProcess(); showTerminal: WritableAtom<boolean> = import.meta.hot?.data.showTerminal ?? atom(true); @@ -27,8 +27,8 @@ export class TerminalStore { } async attachBoltTerminal(terminal: ITerminal) { try { - let wc = await this.#webcontainer - await this.#boltTerminal.init(wc, terminal) + const wc = await this.#webcontainer; + await this.#boltTerminal.init(wc, terminal); } catch (error: any) { terminal.write(coloredText.red('Failed to spawn bolt shell\n\n') + error.message); return; diff --git a/app/lib/stores/workbench.ts b/app/lib/stores/workbench.ts index 6378ba7..cbb3f8a 100644 --- a/app/lib/stores/workbench.ts +++ b/app/lib/stores/workbench.ts @@ -11,9 +11,8 @@ import { PreviewsStore } from './previews'; import { TerminalStore } from './terminal'; import JSZip from 'jszip'; import { saveAs } from 'file-saver'; -import { Octokit, type RestEndpointMethodTypes } from "@octokit/rest"; +import { Octokit, type RestEndpointMethodTypes } from '@octokit/rest'; import * as nodePath from 'node:path'; -import type { WebContainerProcess } from '@webcontainer/api'; import { extractRelativePath } from '~/utils/diff'; export interface ArtifactState { @@ -32,7 +31,6 @@ export type WorkbenchViewType = 'code' | 'preview'; export class WorkbenchStore { #previewsStore = new
PreviewsStore(webcontainer); #filesStore = new FilesStore(webcontainer); - #editorStore = new EditorStore(this.#filesStore); #terminalStore = new TerminalStore(webcontainer); @@ -43,7 +41,6 @@ export class WorkbenchStore { unsavedFiles: WritableAtom<Set<string>> = import.meta.hot?.data.unsavedFiles ?? atom(new Set()); modifiedFiles = new Set(); artifactIdList: string[] = []; - #boltTerminal: { terminal: ITerminal; process: WebContainerProcess } | undefined; #globalExecutionQueue = Promise.resolve(); constructor() { if (import.meta.hot) { @@ -55,7 +52,7 @@ export class WorkbenchStore { } addToExecutionQueue(callback: () => Promise<void>) { - this.#globalExecutionQueue = this.#globalExecutionQueue.then(() => callback()) + this.#globalExecutionQueue = this.#globalExecutionQueue.then(() => callback()); } get previews() { @@ -97,7 +94,6 @@ export class WorkbenchStore { this.#terminalStore.attachTerminal(terminal); } attachBoltTerminal(terminal: ITerminal) { - this.#terminalStore.attachBoltTerminal(terminal); } @@ -262,7 +258,8 @@ export class WorkbenchStore { this.artifacts.setKey(messageId, { ...artifact, ...state }); } addAction(data: ActionCallbackData) { - this._addAction(data) + this._addAction(data); + // this.addToExecutionQueue(()=>this._addAction(data)) } async _addAction(data: ActionCallbackData) { @@ -279,10 +276,9 @@ export class WorkbenchStore { runAction(data: ActionCallbackData, isStreaming: boolean = false) { if (isStreaming) { - this._runAction(data, isStreaming) - } - else { - this.addToExecutionQueue(() => this._runAction(data, isStreaming)) + this._runAction(data, isStreaming); + } else { + this.addToExecutionQueue(() => this._runAction(data, isStreaming)); } } async _runAction(data: ActionCallbackData, isStreaming: boolean = false) { @@ -293,16 +289,21 @@ export class WorkbenchStore { if (!artifact) { unreachable('Artifact not found'); } + if (data.action.type === 'file') { - let wc = await webcontainer + const wc = await webcontainer; const fullPath =
nodePath.join(wc.workdir, data.action.filePath); + if (this.selectedFile.value !== fullPath) { this.setSelectedFile(fullPath); } + if (this.currentView.value !== 'code') { this.currentView.set('code'); } + const doc = this.#editorStore.documents.get()[fullPath]; + if (!doc) { await artifact.runner.runAction(data, isStreaming); } @@ -382,63 +383,7 @@ export class WorkbenchStore { return syncedFiles; } - async uploadFilesFromDisk(sourceHandle: FileSystemDirectoryHandle) { - const loadedFiles = []; - const wc = await webcontainer; - const newFiles = {}; - - const processDirectory = async (handle: FileSystemDirectoryHandle, currentPath: string = '') => { - const entries = await Array.fromAsync(handle.values()); - - for (const entry of entries) { - const entryPath = currentPath ? `${currentPath}/${entry.name}` : entry.name; - const fullPath = `/${entryPath}`; - - if (entry.kind === 'directory') { - await wc.fs.mkdir(fullPath, { recursive: true }); - const subDirHandle = await handle.getDirectoryHandle(entry.name); - await processDirectory(subDirHandle, entryPath); - } else { - const file = await entry.getFile(); - const content = await file.text(); - - // Write to WebContainer - await wc.fs.writeFile(fullPath, content); - - // Mark file as new - this.#filesStore.markFileAsNew(fullPath); - - // Update the files store with the current content - this.files.setKey(fullPath, { type: 'file', content, isBinary: false }); - - // Collect for editor store with actual content - newFiles[fullPath] = { type: 'file', content, isBinary: false }; - loadedFiles.push(entryPath); - } - } - } - - await processDirectory(sourceHandle); - - return loadedFiles; - } - - async refreshFiles() { - // Clear old state - this.modifiedFiles = new Set(); - this.artifactIdList = []; - - // Reset stores - this.#filesStore = new FilesStore(webcontainer); - this.#editorStore = new EditorStore(this.#filesStore); - - // Update UI state - this.currentView.set('code'); - this.unsavedFiles.set(new Set()); - } - 
async pushToGitHub(repoName: string, githubUsername: string, ghToken: string) { - try { // Get the GitHub auth token from environment variables const githubToken = ghToken; @@ -453,10 +398,11 @@ export class WorkbenchStore { const octokit = new Octokit({ auth: githubToken }); // Check if the repository already exists before creating it - let repo: RestEndpointMethodTypes["repos"]["get"]["response"]['data'] + let repo: RestEndpointMethodTypes['repos']['get']['response']['data']; + try { - let resp = await octokit.repos.get({ owner: owner, repo: repoName }); - repo = resp.data + const resp = await octokit.repos.get({ owner, repo: repoName }); + repo = resp.data; } catch (error) { if (error instanceof Error && 'status' in error && error.status === 404) { // Repository doesn't exist, so create a new one @@ -474,6 +420,7 @@ export class WorkbenchStore { // Get all files const files = this.files.get(); + if (!files || Object.keys(files).length === 0) { throw new Error('No files found to push'); } @@ -490,7 +437,9 @@ export class WorkbenchStore { }); return { path: extractRelativePath(filePath), sha: blob.sha }; } - }) + + return null; + }), ); const validBlobs = blobs.filter(Boolean); // Filter out any undefined blobs @@ -542,21 +491,6 @@ export class WorkbenchStore { console.error('Error pushing to GitHub:', error instanceof Error ? 
error.message : String(error)); } } - - async markFileAsModified(filePath: string) { - const file = this.#filesStore.getFile(filePath); - if (file?.type === 'file') { - // First collect all original content - const originalContent = file.content; - console.log(`Processing ${filePath}:`, originalContent); - - // Then save modifications - await this.saveFile(filePath, originalContent); - } - } - - - } export const workbenchStore = new WorkbenchStore(); diff --git a/app/routes/api.chat.ts b/app/routes/api.chat.ts index 789beb0..4961b9e 100644 --- a/app/routes/api.chat.ts +++ b/app/routes/api.chat.ts @@ -1,5 +1,6 @@ -// @ts-nocheck -// Preventing TS checks with files presented in the video for a better presentation. +// eslint-disable-next-line @typescript-eslint/ban-ts-comment +// @ts-nocheck – TODO: Provider proper types + import { type ActionFunctionArgs } from '@remix-run/cloudflare'; import { MAX_RESPONSE_SEGMENTS, MAX_TOKENS } from '~/lib/.server/llm/constants'; import { CONTINUE_PROMPT } from '~/lib/.server/llm/prompts'; @@ -14,14 +15,15 @@ function parseCookies(cookieHeader) { const cookies = {}; // Split the cookie string by semicolons and spaces - const items = cookieHeader.split(";").map(cookie => cookie.trim()); + const items = cookieHeader.split(';').map((cookie) => cookie.trim()); + + items.forEach((item) => { + const [name, ...rest] = item.split('='); - items.forEach(item => { - const [name, ...rest] = item.split("="); if (name && rest) { // Decode the name and value, and join value parts in case it contains '=' const decodedName = decodeURIComponent(name.trim()); - const decodedValue = decodeURIComponent(rest.join("=").trim()); + const decodedValue = decodeURIComponent(rest.join('=').trim()); cookies[decodedName] = decodedValue; } }); @@ -30,17 +32,15 @@ function parseCookies(cookieHeader) { } async function chatAction({ context, request }: ActionFunctionArgs) { - - const { messages, imageData, model } = await request.json<{ - messages: Messages, - 
imageData?: string[], - model: string + const { messages, model } = await request.json<{ + messages: Messages; + model: string; + }>(); - const cookieHeader = request.headers.get("Cookie"); + const cookieHeader = request.headers.get('Cookie'); // Parse the cookie's value (returns an object or null if no cookie exists) - const apiKeys = JSON.parse(parseCookies(cookieHeader).apiKeys || "{}"); + const apiKeys = JSON.parse(parseCookies(cookieHeader).apiKeys || '{}'); const stream = new SwitchableStream(); @@ -87,7 +87,7 @@ async function chatAction({ context, request }: ActionFunctionArgs) { if (error.message?.includes('API key')) { throw new Response('Invalid or missing API key', { status: 401, - statusText: 'Unauthorized' + statusText: 'Unauthorized', }); } diff --git a/app/types/model.ts b/app/types/model.ts index 12b6929..32522c6 100644 --- a/app/types/model.ts +++ b/app/types/model.ts @@ -1,10 +1,10 @@ import type { ModelInfo } from '~/utils/types'; export type ProviderInfo = { - staticModels: ModelInfo[], - name: string, - getDynamicModels?: () => Promise<ModelInfo[]>, - getApiKeyLink?: string, - labelForGetApiKey?: string, - icon?:string, + staticModels: ModelInfo[]; + name: string; + getDynamicModels?: () => Promise<ModelInfo[]>; + getApiKeyLink?: string; + labelForGetApiKey?: string; + icon?: string; }; diff --git a/app/utils/constants.ts b/app/utils/constants.ts index d781353..17fe9d8 100644 --- a/app/utils/constants.ts +++ b/app/utils/constants.ts @@ -12,29 +12,42 @@ const PROVIDER_LIST: ProviderInfo[] = [ { name: 'Anthropic', staticModels: [ - { name: 'claude-3-5-sonnet-latest', label: 'Claude 3.5 Sonnet (new)', provider: 'Anthropic', maxTokenAllowed: 8000 }, - { name: 'claude-3-5-sonnet-20240620', label: 'Claude 3.5 Sonnet (old)', provider: 'Anthropic', maxTokenAllowed: 8000 }, - { name: 'claude-3-5-haiku-latest', label: 'Claude 3.5 Haiku (new)', provider: 'Anthropic', maxTokenAllowed: 8000 }, + { + name: 'claude-3-5-sonnet-latest', + label: 'Claude 3.5 Sonnet (new)', + provider:
'Anthropic', + maxTokenAllowed: 8000, + }, + { + name: 'claude-3-5-sonnet-20240620', + label: 'Claude 3.5 Sonnet (old)', + provider: 'Anthropic', + maxTokenAllowed: 8000, + }, + { + name: 'claude-3-5-haiku-latest', + label: 'Claude 3.5 Haiku (new)', + provider: 'Anthropic', + maxTokenAllowed: 8000, + }, { name: 'claude-3-opus-latest', label: 'Claude 3 Opus', provider: 'Anthropic', maxTokenAllowed: 8000 }, { name: 'claude-3-sonnet-20240229', label: 'Claude 3 Sonnet', provider: 'Anthropic', maxTokenAllowed: 8000 }, - { name: 'claude-3-haiku-20240307', label: 'Claude 3 Haiku', provider: 'Anthropic', maxTokenAllowed: 8000 } + { name: 'claude-3-haiku-20240307', label: 'Claude 3 Haiku', provider: 'Anthropic', maxTokenAllowed: 8000 }, ], - getApiKeyLink: "https://console.anthropic.com/settings/keys", + getApiKeyLink: 'https://console.anthropic.com/settings/keys', }, { name: 'Ollama', staticModels: [], getDynamicModels: getOllamaModels, - getApiKeyLink: "https://ollama.com/download", - labelForGetApiKey: "Download Ollama", - icon: "i-ph:cloud-arrow-down", - }, { + getApiKeyLink: 'https://ollama.com/download', + labelForGetApiKey: 'Download Ollama', + icon: 'i-ph:cloud-arrow-down', + }, + { name: 'OpenAILike', - staticModels: [ - { name: 'o1-mini', label: 'o1-mini', provider: 'OpenAILike' }, - { name: 'gpt-4o-mini', label: 'GPT-4o Mini', provider: 'OpenAILike' }, - ], - getDynamicModels: getOpenAILikeModels + staticModels: [], + getDynamicModels: getOpenAILikeModels, }, { name: 'Cohere', @@ -50,7 +63,7 @@ const PROVIDER_LIST: ProviderInfo[] = [ { name: 'c4ai-aya-expanse-8b', label: 'c4AI Aya Expanse 8b', provider: 'Cohere', maxTokenAllowed: 4096 }, { name: 'c4ai-aya-expanse-32b', label: 'c4AI Aya Expanse 32b', provider: 'Cohere', maxTokenAllowed: 4096 }, ], - getApiKeyLink: 'https://dashboard.cohere.com/api-keys' + getApiKeyLink: 'https://dashboard.cohere.com/api-keys', }, { name: 'OpenRouter', @@ -59,50 +72,145 @@ const PROVIDER_LIST: ProviderInfo[] = [ { name: 
'anthropic/claude-3.5-sonnet', label: 'Anthropic: Claude 3.5 Sonnet (OpenRouter)', - provider: 'OpenRouter' - , maxTokenAllowed: 8000 + provider: 'OpenRouter', + maxTokenAllowed: 8000, + }, + { + name: 'anthropic/claude-3-haiku', + label: 'Anthropic: Claude 3 Haiku (OpenRouter)', + provider: 'OpenRouter', + maxTokenAllowed: 8000, + }, + { + name: 'deepseek/deepseek-coder', + label: 'Deepseek-Coder V2 236B (OpenRouter)', + provider: 'OpenRouter', + maxTokenAllowed: 8000, + }, + { + name: 'google/gemini-flash-1.5', + label: 'Google Gemini Flash 1.5 (OpenRouter)', + provider: 'OpenRouter', + maxTokenAllowed: 8000, + }, + { + name: 'google/gemini-pro-1.5', + label: 'Google Gemini Pro 1.5 (OpenRouter)', + provider: 'OpenRouter', + maxTokenAllowed: 8000, }, - { name: 'anthropic/claude-3-haiku', label: 'Anthropic: Claude 3 Haiku (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 }, - { name: 'deepseek/deepseek-coder', label: 'Deepseek-Coder V2 236B (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 }, - { name: 'google/gemini-flash-1.5', label: 'Google Gemini Flash 1.5 (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 }, - { name: 'google/gemini-pro-1.5', label: 'Google Gemini Pro 1.5 (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 }, { name: 'x-ai/grok-beta', label: 'xAI Grok Beta (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 }, - { name: 'mistralai/mistral-nemo', label: 'OpenRouter Mistral Nemo (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 }, - { name: 'qwen/qwen-110b-chat', label: 'OpenRouter Qwen 110b Chat (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 8000 }, - { name: 'cohere/command', label: 'Cohere Command (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 4096 } + { + name: 'mistralai/mistral-nemo', + label: 'OpenRouter Mistral Nemo (OpenRouter)', + provider: 'OpenRouter', + maxTokenAllowed: 8000, + }, + { + name: 'qwen/qwen-110b-chat', + label: 'OpenRouter Qwen 110b Chat 
(OpenRouter)',
+        provider: 'OpenRouter',
+        maxTokenAllowed: 8000,
+      },
+      { name: 'cohere/command', label: 'Cohere Command (OpenRouter)', provider: 'OpenRouter', maxTokenAllowed: 4096 },
     ],
     getDynamicModels: getOpenRouterModels,
     getApiKeyLink: 'https://openrouter.ai/settings/keys',
-
-  }, {
+  },
+  {
     name: 'Google',
-    staticModels: [
-      { name: 'gemini-exp-1121', label: 'Gemini Experimental 1121', provider: 'Google' },
-      { name: 'gemini-1.5-pro-002', label: 'Gemini 1.5 Pro 002', provider: 'Google' },
-      { name: 'gemini-1.5-flash-latest', label: 'Gemini 1.5 Flash', provider: 'Google' },
-      { name: 'gemini-1.5-pro-latest', label: 'Gemini 1.5 Pro', provider: 'Google' }
+    staticModels: [
+      { name: 'gemini-1.5-flash-latest', label: 'Gemini 1.5 Flash', provider: 'Google', maxTokenAllowed: 8192 },
+      { name: 'gemini-1.5-flash-002', label: 'Gemini 1.5 Flash-002', provider: 'Google', maxTokenAllowed: 8192 },
+      { name: 'gemini-1.5-flash-8b', label: 'Gemini 1.5 Flash-8b', provider: 'Google', maxTokenAllowed: 8192 },
+      { name: 'gemini-1.5-pro-latest', label: 'Gemini 1.5 Pro', provider: 'Google', maxTokenAllowed: 8192 },
+      { name: 'gemini-1.5-pro-002', label: 'Gemini 1.5 Pro-002', provider: 'Google', maxTokenAllowed: 8192 },
+      { name: 'gemini-exp-1121', label: 'Gemini exp-1121', provider: 'Google', maxTokenAllowed: 8192 },
     ],
-    getApiKeyLink: 'https://aistudio.google.com/app/apikey'
-  }, {
+    getApiKeyLink: 'https://aistudio.google.com/app/apikey',
+  },
+  {
     name: 'Groq',
     staticModels: [
       { name: 'llama-3.1-70b-versatile', label: 'Llama 3.1 70b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
       { name: 'llama-3.1-8b-instant', label: 'Llama 3.1 8b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
       { name: 'llama-3.2-11b-vision-preview', label: 'Llama 3.2 11b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
       { name: 'llama-3.2-3b-preview', label: 'Llama 3.2 3b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
-      { name: 'llama-3.2-1b-preview', label: 'Llama 3.2 1b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 }
+      { name: 'llama-3.2-1b-preview', label: 'Llama 3.2 1b (Groq)', provider: 'Groq', maxTokenAllowed: 8000 },
     ],
-    getApiKeyLink: 'https://console.groq.com/keys'
+    getApiKeyLink: 'https://console.groq.com/keys',
   }, {
     name: 'HuggingFace',
     staticModels: [
-      { name: 'Qwen/Qwen2.5-Coder-32B-Instruct', label: 'Qwen2.5-Coder-32B-Instruct (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 },
-      { name: '01-ai/Yi-1.5-34B-Chat', label: 'Yi-1.5-34B-Chat (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 },
-      { name: 'codellama/CodeLlama-34b-Instruct-hf', label: 'CodeLlama-34b-Instruct (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 },
-      { name: 'NousResearch/Hermes-3-Llama-3.1-8B', label: 'Hermes-3-Llama-3.1-8B (HuggingFace)', provider: 'HuggingFace', maxTokenAllowed: 8000 }
+      {
+        name: 'Qwen/Qwen2.5-Coder-32B-Instruct',
+        label: 'Qwen2.5-Coder-32B-Instruct (HuggingFace)',
+        provider: 'HuggingFace',
+        maxTokenAllowed: 8000,
+      },
+      {
+        name: '01-ai/Yi-1.5-34B-Chat',
+        label: 'Yi-1.5-34B-Chat (HuggingFace)',
+        provider: 'HuggingFace',
+        maxTokenAllowed: 8000,
+      },
+      {
+        name: 'codellama/CodeLlama-34b-Instruct-hf',
+        label: 'CodeLlama-34b-Instruct (HuggingFace)',
+        provider: 'HuggingFace',
+        maxTokenAllowed: 8000,
+      },
+      {
+        name: 'NousResearch/Hermes-3-Llama-3.1-8B',
+        label: 'Hermes-3-Llama-3.1-8B (HuggingFace)',
+        provider: 'HuggingFace',
+        maxTokenAllowed: 8000,
+      },
+      {
+        name: 'Qwen/Qwen2.5-72B-Instruct',
+        label: 'Qwen2.5-72B-Instruct (HuggingFace)',
+        provider: 'HuggingFace',
+        maxTokenAllowed: 8000,
+      },
+      {
+        name: 'meta-llama/Llama-3.1-70B-Instruct',
+        label: 'Llama-3.1-70B-Instruct (HuggingFace)',
+        provider: 'HuggingFace',
+        maxTokenAllowed: 8000,
+      },
+      {
+        name: 'meta-llama/Llama-3.1-405B',
+        label: 'Llama-3.1-405B (HuggingFace)',
+        provider: 'HuggingFace',
+        maxTokenAllowed: 8000,
+      },
     ],
-    getApiKeyLink: 'https://huggingface.co/settings/tokens'
+    getApiKeyLink: 'https://huggingface.co/settings/tokens',
   }, {
@@ -111,23 +219,24 @@ const PROVIDER_LIST: ProviderInfo[] = [
       { name: 'gpt-4o-mini', label: 'GPT-4o Mini', provider: 'OpenAI', maxTokenAllowed: 8000 },
       { name: 'gpt-4-turbo', label: 'GPT-4 Turbo', provider: 'OpenAI', maxTokenAllowed: 8000 },
       { name: 'gpt-4', label: 'GPT-4', provider: 'OpenAI', maxTokenAllowed: 8000 },
-      { name: 'gpt-3.5-turbo', label: 'GPT-3.5 Turbo', provider: 'OpenAI', maxTokenAllowed: 8000 }
+      { name: 'gpt-3.5-turbo', label: 'GPT-3.5 Turbo', provider: 'OpenAI', maxTokenAllowed: 8000 },
     ],
-    getApiKeyLink: "https://platform.openai.com/api-keys",
-  }, {
+    getApiKeyLink: 'https://platform.openai.com/api-keys',
+  },
+  {
     name: 'xAI',
-    staticModels: [
-      { name: 'grok-beta', label: 'xAI Grok Beta', provider: 'xAI', maxTokenAllowed: 8000 }
-    ],
-    getApiKeyLink: 'https://docs.x.ai/docs/quickstart#creating-an-api-key'
-  }, {
+    staticModels: [{ name: 'grok-beta', label: 'xAI Grok Beta', provider: 'xAI', maxTokenAllowed: 8000 }],
+    getApiKeyLink: 'https://docs.x.ai/docs/quickstart#creating-an-api-key',
+  },
+  {
     name: 'Deepseek',
     staticModels: [
       { name: 'deepseek-coder', label: 'Deepseek-Coder', provider: 'Deepseek', maxTokenAllowed: 8000 },
-      { name: 'deepseek-chat', label: 'Deepseek-Chat', provider: 'Deepseek', maxTokenAllowed: 8000 }
+      { name: 'deepseek-chat', label: 'Deepseek-Chat', provider: 'Deepseek', maxTokenAllowed: 8000 },
     ],
-    getApiKeyLink: 'https://platform.deepseek.com/api_keys'
-  }, {
+    getApiKeyLink: 'https://platform.deepseek.com/apiKeys',
+  },
+  {
     name: 'Mistral',
     staticModels: [
       { name: 'open-mistral-7b', label: 'Mistral 7B', provider: 'Mistral', maxTokenAllowed: 8000 },
@@ -138,27 +247,29 @@ const PROVIDER_LIST: ProviderInfo[] = [
       { name: 'ministral-8b-latest', label: 'Mistral 8B', provider: 'Mistral', maxTokenAllowed: 8000 },
       { name: 'mistral-small-latest', label: 'Mistral Small', provider: 'Mistral', maxTokenAllowed: 8000 },
       { name: 'codestral-latest', label: 'Codestral', provider: 'Mistral', maxTokenAllowed: 8000 },
-      { name: 'mistral-large-latest', label: 'Mistral Large Latest', provider: 'Mistral', maxTokenAllowed: 8000 }
+      { name: 'mistral-large-latest', label: 'Mistral Large Latest', provider: 'Mistral', maxTokenAllowed: 8000 },
     ],
-    getApiKeyLink: 'https://console.mistral.ai/api-keys/'
-  }, {
+    getApiKeyLink: 'https://console.mistral.ai/api-keys/',
+  },
+  {
     name: 'LMStudio',
     staticModels: [],
     getDynamicModels: getLMStudioModels,
     getApiKeyLink: 'https://lmstudio.ai/',
     labelForGetApiKey: 'Get LMStudio',
-    icon: "i-ph:cloud-arrow-down",
-  }
+    icon: 'i-ph:cloud-arrow-down',
+  },
 ];

 export const DEFAULT_PROVIDER = PROVIDER_LIST[0];

-const staticModels: ModelInfo[] = PROVIDER_LIST.map(p => p.staticModels).flat();
+const staticModels: ModelInfo[] = PROVIDER_LIST.map((p) => p.staticModels).flat();

 export let MODEL_LIST: ModelInfo[] = [...staticModels];

 const getOllamaBaseUrl = () => {
   const defaultBaseUrl = import.meta.env.OLLAMA_API_BASE_URL || 'http://localhost:11434';
+
+  // Check if we're in the browser
   if (typeof window !== 'undefined') {
     // Frontend always uses localhost
@@ -168,47 +279,54 @@ const getOllamaBaseUrl = () => {
   // Backend: Check if we're running in Docker
   const isDocker = process.env.RUNNING_IN_DOCKER === 'true';

-  return isDocker
-    ? defaultBaseUrl.replace('localhost', 'host.docker.internal')
-    : defaultBaseUrl;
+  return isDocker ? defaultBaseUrl.replace('localhost', 'host.docker.internal') : defaultBaseUrl;
 };

 async function getOllamaModels(): Promise<ModelInfo[]> {
+  if (typeof window === 'undefined') {
+    return [];
+  }
+
   try {
-    const base_url = getOllamaBaseUrl();
-    const response = await fetch(`${base_url}/api/tags`);
-    const data = await response.json() as OllamaApiResponse;
+    const baseUrl = getOllamaBaseUrl();
+    const response = await fetch(`${baseUrl}/api/tags`);
+    const data = (await response.json()) as OllamaApiResponse;

     return data.models.map((model: OllamaModel) => ({
       name: model.name,
       label: `${model.name} (${model.details.parameter_size})`,
       provider: 'Ollama',
-      maxTokenAllowed:8000,
+      maxTokenAllowed: 8000,
     }));
   } catch (e) {
+    console.error('Error getting Ollama models:', e);
     return [];
   }
 }

 async function getOpenAILikeModels(): Promise<ModelInfo[]> {
   try {
-    const base_url = import.meta.env.OPENAI_LIKE_API_BASE_URL || '';
-    if (!base_url) {
+    const baseUrl = import.meta.env.OPENAI_LIKE_API_BASE_URL || '';
+
+    if (!baseUrl) {
       return [];
     }
-    const api_key = import.meta.env.OPENAI_LIKE_API_KEY ?? '';
-    const response = await fetch(`${base_url}/models`, {
+
+    const apiKey = import.meta.env.OPENAI_LIKE_API_KEY ?? '';
+    const response = await fetch(`${baseUrl}/models`, {
       headers: {
-        Authorization: `Bearer ${api_key}`
-      }
+        Authorization: `Bearer ${apiKey}`,
+      },
     });
-    const res = await response.json() as any;
+    const res = (await response.json()) as any;
+
     return res.data.map((model: any) => ({
       name: model.id,
       label: model.id,
-      provider: 'OpenAILike'
+      provider: 'OpenAILike',
     }));
   } catch (e) {
+    console.error('Error getting OpenAILike models:', e);
     return [];
   }
 }
@@ -221,51 +339,71 @@ type OpenRouterModelsResponse = {
     pricing: {
       prompt: number;
       completion: number;
-    }
-  }[]
+    };
+  }[];
 };

 async function getOpenRouterModels(): Promise<ModelInfo[]> {
-  const data: OpenRouterModelsResponse = await (await fetch('https://openrouter.ai/api/v1/models', {
-    headers: {
-      'Content-Type': 'application/json'
-    }
-  })).json();
+  const data: OpenRouterModelsResponse = await (
+    await fetch('https://openrouter.ai/api/v1/models', {
+      headers: {
+        'Content-Type': 'application/json',
+      },
+    })
+  ).json();

-  return data.data.sort((a, b) => a.name.localeCompare(b.name)).map(m => ({
-    name: m.id,
-    label: `${m.name} - in:$${(m.pricing.prompt * 1_000_000).toFixed(
-      2)} out:$${(m.pricing.completion * 1_000_000).toFixed(2)} - context ${Math.floor(
-      m.context_length / 1000)}k`,
-    provider: 'OpenRouter',
-    maxTokenAllowed:8000,
-  }));
+  return data.data
+    .sort((a, b) => a.name.localeCompare(b.name))
+    .map((m) => ({
+      name: m.id,
+      label: `${m.name} - in:$${(m.pricing.prompt * 1_000_000).toFixed(
+        2,
+      )} out:$${(m.pricing.completion * 1_000_000).toFixed(2)} - context ${Math.floor(m.context_length / 1000)}k`,
+      provider: 'OpenRouter',
+      maxTokenAllowed: 8000,
+    }));
 }

 async function getLMStudioModels(): Promise<ModelInfo[]> {
+  if (typeof window === 'undefined') {
+    return [];
+  }
+
   try {
-    const base_url = import.meta.env.LMSTUDIO_API_BASE_URL || 'http://localhost:1234';
-    const response = await fetch(`${base_url}/v1/models`);
-    const data = await response.json() as any;
+    const baseUrl = import.meta.env.LMSTUDIO_API_BASE_URL || 'http://localhost:1234';
+    const response = await fetch(`${baseUrl}/v1/models`);
+    const data = (await response.json()) as any;
+
    return data.data.map((model: any) => ({
       name: model.id,
       label: model.id,
-      provider: 'LMStudio'
+      provider: 'LMStudio',
     }));
   } catch (e) {
+    console.error('Error getting LMStudio models:', e);
     return [];
   }
 }

-
-
 async function initializeModelList(): Promise<ModelInfo[]> {
-  MODEL_LIST = [...(await Promise.all(
-    PROVIDER_LIST
-      .filter((p): p is ProviderInfo & { getDynamicModels: () => Promise<ModelInfo[]> } => !!p.getDynamicModels)
-      .map(p => p.getDynamicModels())))
-    .flat(), ...staticModels];
+  MODEL_LIST = [
+    ...(
+      await Promise.all(
+        PROVIDER_LIST.filter(
+          (p): p is ProviderInfo & { getDynamicModels: () => Promise<ModelInfo[]> } => !!p.getDynamicModels,
+        ).map((p) => p.getDynamicModels()),
+      )
+    ).flat(),
+    ...staticModels,
+  ];
   return MODEL_LIST;
 }

-export { getOllamaModels, getOpenAILikeModels, getLMStudioModels, initializeModelList, getOpenRouterModels, PROVIDER_LIST };
+export {
+  getOllamaModels,
+  getOpenAILikeModels,
+  getLMStudioModels,
+  initializeModelList,
+  getOpenRouterModels,
+  PROVIDER_LIST,
+};
diff --git a/app/utils/shell.ts b/app/utils/shell.ts
index d45e8a6..53b450f 100644
--- a/app/utils/shell.ts
+++ b/app/utils/shell.ts
@@ -52,67 +52,77 @@ export async function newShellProcess(webcontainer: WebContainer, terminal: ITer
   return process;
 }

-
+export type ExecutionResult = { output: string; exitCode: number } | undefined;
 export class BoltShell {
-  #initialized: (() => void) | undefined
-  #readyPromise: Promise<void>
-  #webcontainer: WebContainer | undefined
-  #terminal: ITerminal | undefined
-  #process: WebContainerProcess | undefined
-  executionState = atom<{ sessionId: string, active: boolean, executionPrms?: Promise<any> } | undefined>()
-  #outputStream: ReadableStreamDefaultReader<string> | undefined
-  #shellInputStream: WritableStreamDefaultWriter<string> | undefined
+  #initialized: (() => void) | undefined;
+  #readyPromise: Promise<void>;
+  #webcontainer: WebContainer | undefined;
+  #terminal: ITerminal | undefined;
+  #process: WebContainerProcess | undefined;
+  executionState = atom<{ sessionId: string; active: boolean; executionPrms?: Promise<any> } | undefined>();
+  #outputStream: ReadableStreamDefaultReader<string> | undefined;
+  #shellInputStream: WritableStreamDefaultWriter<string> | undefined;
+
   constructor() {
     this.#readyPromise = new Promise((resolve) => {
-      this.#initialized = resolve
-    })
+      this.#initialized = resolve;
+    });
   }
+
   ready() {
     return this.#readyPromise;
   }
-  async init(webcontainer: WebContainer, terminal: ITerminal) {
-    this.#webcontainer = webcontainer
-    this.#terminal = terminal
-    let callback = (data: string) => {
-      console.log(data)
-    }
-    let { process, output } = await this.newBoltShellProcess(webcontainer, terminal)
-    this.#process = process
-    this.#outputStream = output.getReader()
-    await this.waitTillOscCode('interactive')
-    this.#initialized?.()
-  }
-  get terminal() {
-    return this.#terminal
-  }
-  get process() {
-    return this.#process
-  }
-  async executeCommand(sessionId: string, command: string) {
-    if (!this.process || !this.terminal) {
-      return
-    }
-    let state = this.executionState.get()
-    //interrupt the current execution
-    // this.#shellInputStream?.write('\x03');
-    this.terminal.input('\x03');
-    if (state && state.executionPrms) {
-      await state.executionPrms
+
+  async init(webcontainer: WebContainer, terminal: ITerminal) {
+    this.#webcontainer = webcontainer;
+    this.#terminal = terminal;
+
+    const { process, output } = await this.newBoltShellProcess(webcontainer, terminal);
+    this.#process = process;
+    this.#outputStream = output.getReader();
+    await this.waitTillOscCode('interactive');
+    this.#initialized?.();
+  }
+
+  get terminal() {
+    return this.#terminal;
+  }
+
+  get process() {
+    return this.#process;
+  }
+
+  async executeCommand(sessionId: string, command: string): Promise<ExecutionResult> {
+    if (!this.process || !this.terminal) {
+      return undefined;
     }
+
+    const state = this.executionState.get();
+
+    /*
+     * interrupt the current execution
+     * this.#shellInputStream?.write('\x03');
+     */
+    this.terminal.input('\x03');
+
+    if (state && state.executionPrms) {
+      await state.executionPrms;
+    }
+
     //start a new execution
     this.terminal.input(command.trim() + '\n');

     //wait for the execution to finish
-    let executionPrms = this.getCurrentExecutionResult()
-    this.executionState.set({ sessionId, active: true, executionPrms })
+    const executionPromise = this.getCurrentExecutionResult();
+    this.executionState.set({ sessionId, active: true, executionPrms: executionPromise });

-    let resp = await executionPrms
-    this.executionState.set({ sessionId, active: false })
-    return resp
+    const resp = await executionPromise;
+    this.executionState.set({ sessionId, active: false });
+    return resp;
   }
+
   async newBoltShellProcess(webcontainer: WebContainer, terminal: ITerminal) {
     const args: string[] = [];
@@ -126,6 +136,7 @@ export class BoltShell {
     const input = process.input.getWriter();
     this.#shellInputStream = input;
+
     const [internalOutput, terminalOutput] = process.output.tee();

     const jshReady = withResolvers<void>();
@@ -162,34 +173,48 @@
     return { process, output: internalOutput };
   }
-  async getCurrentExecutionResult() {
-    let { output, exitCode } = await this.waitTillOscCode('exit')
+
+  async getCurrentExecutionResult(): Promise<ExecutionResult> {
+    const { output, exitCode } = await this.waitTillOscCode('exit');
     return { output, exitCode };
   }
+
   async waitTillOscCode(waitCode: string) {
     let fullOutput = '';
     let exitCode: number = 0;
-    if (!this.#outputStream) return { output: fullOutput, exitCode };
-    let tappedStream = this.#outputStream
+
+    if (!this.#outputStream) {
+      return { output: fullOutput, exitCode };
+    }
+
+    const tappedStream = this.#outputStream;

     while (true) {
       const { value, done } = await tappedStream.read();
-      if (done) break;
+
+      if (done) {
+        break;
+      }
+
       const text = value || '';
       fullOutput += text;

       // Check if command completion signal with exit code
-      const [, osc, , pid, code] = text.match(/\x1b\]654;([^\x07=]+)=?((-?\d+):(\d+))?\x07/) || [];
+      const [, osc, , , code] = text.match(/\x1b\]654;([^\x07=]+)=?((-?\d+):(\d+))?\x07/) || [];
+
       if (osc === 'exit') {
         exitCode = parseInt(code, 10);
       }
+
       if (osc === waitCode) {
         break;
       }
     }
+
     return { output: fullOutput, exitCode };
   }
 }
+
 export function newBoltShellProcess() {
   return new BoltShell();
 }
diff --git a/app/utils/types.ts b/app/utils/types.ts
index 171edc3..8742891 100644
--- a/app/utils/types.ts
+++ b/app/utils/types.ts
@@ -1,4 +1,3 @@
-
 interface OllamaModelDetails {
   parent_model: string;
   format: string;
@@ -29,10 +28,10 @@ export interface ModelInfo {
 }

 export interface ProviderInfo {
-  staticModels: ModelInfo[],
-  name: string,
-  getDynamicModels?: () => Promise<ModelInfo[]>,
-  getApiKeyLink?: string,
-  labelForGetApiKey?: string,
-  icon?:string,
-};
+  staticModels: ModelInfo[];
+  name: string;
+  getDynamicModels?: () => Promise<ModelInfo[]>;
+  getApiKeyLink?: string;
+  labelForGetApiKey?: string;
+  icon?: string;
+}
diff --git a/eslint.config.mjs b/eslint.config.mjs
index 123aaf1..160e5f3 100644
--- a/eslint.config.mjs
+++ b/eslint.config.mjs
@@ -12,6 +12,8 @@ export default [
       '@blitz/catch-error-name': 'off',
       '@typescript-eslint/no-this-alias': 'off',
       '@typescript-eslint/no-empty-object-type': 'off',
+      '@blitz/comment-syntax': 'off',
+      '@blitz/block-scope-case': 'off',
     },
   },
   {
diff --git a/package.json b/package.json
index cc1c256..c211182 100644
--- a/package.json
+++ b/package.json
@@ -11,8 +11,8 @@
     "dev": "remix vite:dev",
     "test": "vitest --run",
     "test:watch": "vitest",
-    "lint": "eslint --cache --cache-location ./node_modules/.cache/eslint .",
-    "lint:fix": "npm run lint -- --fix",
+    "lint": "eslint --cache --cache-location ./node_modules/.cache/eslint app",
+    "lint:fix": "npm run lint -- --fix && prettier app --write",
     "start": "bindings=$(./bindings.sh) && wrangler pages dev ./build/client $bindings",
     "dockerstart": "bindings=$(./bindings.sh) && wrangler pages dev ./build/client $bindings --ip 0.0.0.0 --port 5173 --no-show-interactive-dev-session",
     "dockerrun": "docker run -it -d --name bolt-ai-live -p 5173:5173 --env-file .env.local bolt-ai",
@@ -20,7 +20,8 @@
     "dockerbuild": "docker build -t bolt-ai:development -t bolt-ai:latest --target bolt-ai-development .",
     "typecheck": "tsc",
     "typegen": "wrangler types",
-    "preview": "pnpm run build && pnpm run start"
+    "preview": "pnpm run build && pnpm run start",
+    "prepare": "husky"
   },
   "engines": {
     "node": ">=18.18.0"
@@ -70,6 +71,7 @@
     "diff": "^5.2.0",
     "file-saver": "^2.0.5",
     "framer-motion": "^11.2.12",
+    "ignore": "^6.0.2",
     "isbot": "^4.1.0",
     "istextorbinary": "^9.5.0",
     "jose": "^5.6.3",
@@ -101,6 +103,7 @@
     "@types/react": "^18.2.20",
     "@types/react-dom": "^18.2.7",
     "fast-glob": "^3.3.2",
+    "husky": "9.1.7",
     "is-ci": "^3.0.1",
     "node-fetch": "^3.3.2",
     "prettier": "^3.3.2",
diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml
index 951d1a4..cd2355c 100644
--- a/pnpm-lock.yaml
+++ b/pnpm-lock.yaml
@@ -143,6 +143,9 @@ importers:
       framer-motion:
         specifier: ^11.2.12
         version: 11.2.12(react-dom@18.3.1(react@18.3.1))(react@18.3.1)
+      ignore:
+        specifier: ^6.0.2
+        version: 6.0.2
       isbot:
         specifier: ^4.1.0
         version: 4.4.0
@@ -231,6 +234,9 @@ importers:
       fast-glob:
         specifier: ^3.3.2
         version: 3.3.2
+      husky:
+        specifier: 9.1.7
+        version: 9.1.7
       is-ci:
         specifier: ^3.0.1
         version: 3.0.1
@@ -2482,7 +2488,7 @@ packages:
     resolution: {integrity: sha512-HpGFw18DgFWlncDfjTa2rcQ4W88O1mC8e8yZ2AvQY5KDaktSTwo+KRf6nHK6FRI5FyRyb/5T6+TSxfP7QyGsmQ==}

   bytes@3.0.0:
-    resolution: {integrity: sha512-pMhOfFDPiv9t5jjIXkHosWmkSyQbvsgEVNkz0ERHbuLh2T/7j4Mqqpz523Fe8MVY89KC6Sh/QfS2sM+SjgFDcw==}
+    resolution: {integrity: sha1-0ygVQE1olpn4Wk6k+odV3ROpYEg=}
     engines: {node: '>= 0.8'}

   bytes@3.1.2:
@@ -3382,6 +3388,11 @@
     resolution: {integrity: sha512-AXcZb6vzzrFAUE61HnN4mpLqd/cSIwNQjtNWR0euPm6y0iqx3G4gOXaIDdtdDwZmhwe82LA6+zinmW4UBWVePQ==}
     engines: {node: '>=16.17.0'}

+  husky@9.1.7:
+    resolution: {integrity: sha512-5gs5ytaNjBrh5Ow3zrvdUUY+0VxIuWVL4i9irt6friV+BqdCfmV11CQTWMiBYWHbXhco+J1kHfTOUkePhCDvMA==}
+    engines: {node: '>=18'}
+    hasBin: true
+
   iconv-lite@0.4.24:
     resolution: {integrity: sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA==}
     engines: {node: '>=0.10.0'}
@@ -3399,6 +3410,10 @@
     resolution: {integrity: sha512-5Fytz/IraMjqpwfd34ke28PTVMjZjJG2MPn5t7OE4eUCUNf8BAa7b5WUS9/Qvr6mwOQS7Mk6vdsMno5he+T8Xw==}
     engines: {node: '>= 4'}

+  ignore@6.0.2:
+    resolution: {integrity: sha512-InwqeHHN2XpumIkMvpl/DCJVrAHgCsG5+cn1XlnLWGwtZBm8QJfSusItfrwx81CTp5agNZqpKU2J/ccC5nGT4A==}
+    engines: {node: '>= 4'}
+
   immediate@3.0.6:
     resolution: {integrity: sha512-XXOFtyqDjNDAQxVfYxuF7g9Il/IbWmmlQg2MYKOH8ExIT1qg6xc4zyS3HaEEATgs1btfzxq15ciUiY7gjSXRGQ==}
@@ -9278,6 +9293,8 @@ snapshots:

   human-signals@5.0.0: {}

+  husky@9.1.7: {}
+
   iconv-lite@0.4.24:
     dependencies:
       safer-buffer: 2.1.2
@@ -9290,6 +9307,8 @@ snapshots:

   ignore@5.3.1: {}

+  ignore@6.0.2: {}
+
   immediate@3.0.6: {}

   immutable@4.3.7: {}
diff --git a/worker-configuration.d.ts b/worker-configuration.d.ts
index 1d8993b..9c074b8 100644
--- a/worker-configuration.d.ts
+++ b/worker-configuration.d.ts
@@ -9,4 +9,7 @@ interface Env {
   OPENAI_LIKE_API_BASE_URL: string;
   DEEPSEEK_API_KEY: string;
   LMSTUDIO_API_BASE_URL: string;
+  GOOGLE_GENERATIVE_AI_API_KEY: string;
+  MISTRAL_API_KEY: string;
+  XAI_API_KEY: string;
 }
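The `waitTillOscCode` hunk in `app/utils/shell.ts` changes the destructuring of the OSC `654` match from `[, osc, , pid, code]` to `[, osc, , , code]` — same `code` capture group, just dropping the unused `pid` binding. A standalone sketch of what that regex extracts, using the same pattern as the diff (the sample payloads below are hypothetical, modeled on the `exit=<pid>:<code>` shape the regex implies):

```typescript
// Parse a jsh OSC 654 sequence of the form ESC ] 654 ; name[=pid:code] BEL,
// using the exact regex from the diff. Capture group 1 is the osc name,
// group 4 is the exit code; groups 2-3 (the full payload and the pid) are skipped.
function parseOscExit(text: string): { osc?: string; exitCode?: number } {
  const [, osc, , , code] = text.match(/\x1b\]654;([^\x07=]+)=?((-?\d+):(\d+))?\x07/) || [];

  if (osc === 'exit') {
    return { osc, exitCode: parseInt(code, 10) };
  }

  return { osc };
}

// Hypothetical payloads for illustration.
console.log(parseOscExit('\x1b]654;exit=-1:0\x07')); // { osc: 'exit', exitCode: 0 }
console.log(parseOscExit('\x1b]654;interactive\x07')); // { osc: 'interactive' }
```

This mirrors how `executeCommand` decides when a command has finished: it streams terminal output until the `exit` marker appears, then reports the parsed exit code.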
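The reformatted `initializeModelList` is easier to follow once spelled out: narrow `PROVIDER_LIST` to providers exposing `getDynamicModels` via a type predicate, fetch them all in parallel with `Promise.all`, then append the static models. A minimal sketch of that pattern — the types and providers here are simplified stand-ins, not the repo's real `ProviderInfo`/`ModelInfo`:

```typescript
// Simplified stand-in types for illustration.
type ModelInfo = { name: string; provider: string };
type ProviderInfo = {
  name: string;
  staticModels: ModelInfo[];
  getDynamicModels?: () => Promise<ModelInfo[]>;
};

async function buildModelList(providers: ProviderInfo[]): Promise<ModelInfo[]> {
  // Static models are always available.
  const staticModels = providers.map((p) => p.staticModels).flat();

  // Type predicate keeps only providers with a fetcher, so .map can call it safely.
  const dynamicModels = (
    await Promise.all(
      providers
        .filter((p): p is ProviderInfo & { getDynamicModels: () => Promise<ModelInfo[]> } => !!p.getDynamicModels)
        .map((p) => p.getDynamicModels()),
    )
  ).flat();

  // Dynamic results first, mirroring the order in the diff.
  return [...dynamicModels, ...staticModels];
}

// Hypothetical providers for illustration.
const providers: ProviderInfo[] = [
  { name: 'Static', staticModels: [{ name: 'm1', provider: 'Static' }] },
  {
    name: 'Dynamic',
    staticModels: [],
    getDynamicModels: async () => [{ name: 'm2', provider: 'Dynamic' }],
  },
];

buildModelList(providers).then((list) => console.log(list.map((m) => m.name))); // [ 'm2', 'm1' ]
```

The type predicate is what lets TypeScript accept `p.getDynamicModels()` on the filtered array without a non-null assertion.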
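The `getOllamaBaseUrl` change only collapses the Docker ternary onto one line, but the branching is worth stating plainly: the browser always talks to localhost, while the backend rewrites the host to `host.docker.internal` when `RUNNING_IN_DOCKER` is set. A sketch of that decision logic in isolation — inputs are passed explicitly here for testability, whereas the real function reads `window`, `import.meta.env`, and `process.env`:

```typescript
// Resolve the Ollama base URL given the runtime environment.
function resolveOllamaBaseUrl(defaultBaseUrl: string, isBrowser: boolean, isDocker: boolean): string {
  if (isBrowser) {
    // Frontend always uses localhost
    return defaultBaseUrl;
  }

  // Backend running inside Docker must reach the host's Ollama daemon.
  return isDocker ? defaultBaseUrl.replace('localhost', 'host.docker.internal') : defaultBaseUrl;
}

console.log(resolveOllamaBaseUrl('http://localhost:11434', false, true)); // http://host.docker.internal:11434
console.log(resolveOllamaBaseUrl('http://localhost:11434', true, true)); // http://localhost:11434
```

Note that `String.prototype.replace` with a string pattern only substitutes the first occurrence, which is exactly what the hostname rewrite needs.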