bolt.diy/app/lib/.server
Commit f7615093dd by fernsdavid25
Update max_tokens in constants.ts
max_tokens for Llama 3.1 models must be less than or equal to 8000, but it is currently set to 8192. Changing it to 8000 fixes the error.
2024-10-21 19:32:57 +05:30
llm/ (last commit: Update max_tokens in constants.ts, 2024-10-21 19:32:57 +05:30)
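For context, a minimal sketch of what the fix in app/lib/.server/llm/constants.ts plausibly looks like. The constant name MAX_TOKENS and the call-site shape are assumptions inferred from the commit message, not verbatim code from the repository.

```ts
// app/lib/.server/llm/constants.ts (sketch; names are assumed, not verbatim)

// Llama 3.1 models cap max_tokens at 8000; the earlier value of 8192
// exceeded that limit and caused request errors.
export const MAX_TOKENS = 8000; // previously 8192
```

A consumer elsewhere in app/lib/.server/llm would then pass the constant straight through to the model call (e.g. `maxTokens: MAX_TOKENS`), so lowering it in this one place keeps every Llama 3.1 request within the 8000-token limit.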