mirror of
https://github.com/stackblitz-labs/bolt.diy
synced 2025-01-23 03:07:05 +00:00
f7615093dd
max_tokens for Llama 3.1 models must be less than or equal to 8000, but it is set to 8192; changing it to 8000 fixes the error.
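The fix described above amounts to capping the requested completion size at the model's limit. A minimal sketch of that idea in TypeScript, assuming a hypothetical per-model limit map (the names `MODEL_MAX_TOKENS` and `resolveMaxTokens` are illustrative, not the project's actual identifiers):

```typescript
// Hypothetical per-model output-token caps; some providers reject
// requests for Llama 3.1 models when max_tokens exceeds 8000.
const MODEL_MAX_TOKENS: Record<string, number> = {
  "llama-3.1-70b": 8000,
  "llama-3.1-8b": 8000,
};

// Default used when a model has no explicit cap.
const DEFAULT_MAX_TOKENS = 8192;

// Clamp the requested completion size to the model's limit so the
// provider does not reject the request with a max_tokens error.
function resolveMaxTokens(
  model: string,
  requested: number = DEFAULT_MAX_TOKENS,
): number {
  const cap = MODEL_MAX_TOKENS[model];
  return cap !== undefined ? Math.min(requested, cap) : requested;
}
```

With this in place, a request for `llama-3.1-70b` is capped at 8000 even when the caller asks for 8192, while models without a listed cap keep the requested value.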
Directory listing:

- .server/llm
- hooks
- persistence
- runtime
- stores
- webcontainer
- crypto.ts
- fetch.ts