bolt.diy/app/lib/.server/llm/constants.ts
fernsdavid25 f7615093dd
Update max_tokens in constants.ts
max_tokens for Llama 3.1 models must be less than or equal to 8000, but it was set to 8192. Changing it to 8000 fixes the error.
2024-10-21 19:32:57 +05:30


// see https://docs.anthropic.com/en/docs/about-claude/models
// capped at 8000 because Llama 3.1 models reject max_tokens values above 8000
export const MAX_TOKENS = 8000;
// limits the number of model responses that can be returned in a single request
export const MAX_RESPONSE_SEGMENTS = 2;
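To illustrate why a shared cap like this matters, here is a hypothetical sketch (not part of bolt.diy) of clamping a caller-requested completion budget to `MAX_TOKENS` before building a provider request. The `CompletionOptions` shape and `clampMaxTokens` helper are assumptions for illustration only.

```typescript
// Shared caps, mirroring the constants above.
const MAX_TOKENS = 8000;

// Hypothetical options shape: the caller may request a completion budget.
interface CompletionOptions {
  maxTokens?: number;
}

// Returns a max_tokens value that never exceeds the shared cap, so a
// request of 8192 is silently reduced to 8000 instead of failing.
function clampMaxTokens(options: CompletionOptions): number {
  return Math.min(options.maxTokens ?? MAX_TOKENS, MAX_TOKENS);
}

console.log(clampMaxTokens({ maxTokens: 8192 })); // 8000
console.log(clampMaxTokens({ maxTokens: 4000 })); // 4000
console.log(clampMaxTokens({})); // 8000
```

Clamping at one choke point keeps every provider call within the strictest model limit, rather than relying on each call site to remember the cap.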