bolt.diy/app
fernsdavid25 f7615093dd
Update max_tokens in constants.ts
max_tokens for Llama 3.1 models must be less than or equal to 8000, but it was set to 8192. Changing it to 8000 fixes the error.
2024-10-21 19:32:57 +05:30
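The commit above describes a one-line configuration change in constants.ts under lib. As a minimal sketch only, assuming the value is exported as a constant named MAX_TOKENS (neither the exact identifier nor the full file path is shown in this listing), the fix would look roughly like this in TypeScript:

```ts
// constants.ts (illustrative sketch; the path and constant name are assumptions)

// Llama 3.1 models reject completion requests whose max_tokens exceeds 8000,
// so the previous value of 8192 caused an API error. Cap it at 8000.
export const MAX_TOKENS = 8000; // previously 8192
```

Any call site that passes this constant as max_tokens then stays within the Llama 3.1 limit.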
| Name | Last commit message | Last commit date |
| --- | --- | --- |
| components | Fixing up codebase after merging pull requests | 2024-10-19 13:21:24 -05:00 |
| lib | Update max_tokens in constants.ts | 2024-10-21 19:32:57 +05:30 |
| routes | let the ollama models be auto generated from ollama api | 2024-10-18 14:34:08 +03:00 |
| styles | fix: remove monorepo | 2024-09-25 19:54:09 +01:00 |
| types | fix: remove monorepo | 2024-09-25 19:54:09 +01:00 |
| utils | Merge branch 'main' into main | 2024-10-19 12:44:01 -05:00 |
| entry.client.tsx | fix(browser-extensions): don't render directly in body | 2024-10-07 10:49:31 +02:00 |
| entry.server.tsx | let the ollama models be auto generated from ollama api | 2024-10-18 14:34:08 +03:00 |
| root.tsx | fix(browser-extensions): don't render directly in body | 2024-10-07 10:49:31 +02:00 |