Mirror of https://github.com/stackblitz-labs/bolt.diy, synced 2025-01-23 03:07:05 +00:00
f7615093dd
max_tokens for llama 3.1 models must be less than or equal to 8000, but it was set to 8192; changing it to 8000 fixes the error.
6 lines
216 B
TypeScript
// see https://docs.anthropic.com/en/docs/about-claude/models
export const MAX_TOKENS = 8000;

// limits the number of model responses that can be returned in a single request
export const MAX_RESPONSE_SEGMENTS = 2;
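For context, a minimal sketch of how a constant like MAX_TOKENS is typically threaded through to a provider request. This is not the repository's actual call site: the Vercel AI SDK usage, the Groq-style endpoint, the model id, and the './constants' import path are assumptions for illustration only.

// Hypothetical sketch: capping completion length with MAX_TOKENS.
// The SDK calls and model id below are assumptions, not bolt.diy's exact code.
import { streamText } from 'ai';
import { createOpenAI } from '@ai-sdk/openai';

import { MAX_TOKENS } from './constants';

// OpenAI-compatible provider setup (assumed). Llama 3.1 completions on such
// providers are capped at 8000 tokens, which is why MAX_TOKENS is 8000
// rather than the model's nominal 8192.
const provider = createOpenAI({
  baseURL: 'https://api.groq.com/openai/v1',
  apiKey: process.env.GROQ_API_KEY,
});

export function streamCompletion(prompt: string) {
  return streamText({
    model: provider('llama-3.1-70b-versatile'),
    maxTokens: MAX_TOKENS, // must be <= 8000 for llama 3.1 models
    prompt,
  });
}

The point of keeping the limit in a single exported constant is that every request path shares the same cap, so a provider-side change like this one is a one-line fix.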