bolt.diy/app/lib/modules/llm/providers

Latest commit 39a0724ef3 by Mohammad Saif Khan (2025-01-28 23:30:50 +05:30):
feat: add Gemini 2.0 Flash-thinking-exp-01-21 model with 65k token support (#1202)

Added the new gemini-2.0-flash-thinking-exp-01-21 model to the GoogleProvider's static model configuration. This model supports a significantly increased maxTokenAllowed limit of 65,536 tokens, letting it handle larger context windows than the existing Gemini entries, which were previously capped at 8k tokens. The model is labeled "Gemini 2.0 Flash-thinking-exp-01-21" for clear identification in the UI model dropdown.
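The static model entry described by this commit might look roughly like the sketch below. The exact ModelInfo shape in google.ts is not reproduced here; the interface and field names other than maxTokenAllowed (which the commit message names explicitly) are assumptions for illustration.

```typescript
// Hypothetical shape of a static model entry; field names beyond
// maxTokenAllowed are assumed, not verified against google.ts.
interface ModelInfo {
  name: string;            // model id sent to the API
  label: string;           // display name for the UI dropdown
  provider: string;
  maxTokenAllowed: number; // upper bound enforced per request
}

// The entry this commit adds to GoogleProvider's static model list.
const geminiFlashThinking: ModelInfo = {
  name: 'gemini-2.0-flash-thinking-exp-01-21',
  label: 'Gemini 2.0 Flash-thinking-exp-01-21',
  provider: 'Google',
  maxTokenAllowed: 65536, // up from the 8k cap on the earlier Gemini entries
};

console.log(geminiFlashThinking.maxTokenAllowed);
```

Keeping the limit in the static config (rather than hard-coding it at call sites) lets the UI and request-building code read one shared per-model bound.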
amazon-bedrock.ts feat: implement Claude 3, Claude3.5, Nova Pro, Nova Lite and Mistral model integration with AWS Bedrock (#974) 2025-01-06 17:49:16 +05:30
anthropic.ts refactor: refactored LLM Providers: Adapting Modular Approach (#832) 2024-12-21 11:45:17 +05:30
cohere.ts refactor: refactored LLM Providers: Adapting Modular Approach (#832) 2024-12-21 11:45:17 +05:30
deepseek.ts feat: added support for reasoning content (#1168) 2025-01-25 16:16:19 +05:30
github.ts bug fix for Open preview in a new tab. 2025-01-18 19:25:01 +01:00
google.ts feat: add Gemini 2.0 Flash-thinking-exp-01-21 model with 65k token support (#1202) 2025-01-28 23:30:50 +05:30
groq.ts feat: add deepseek-r1-distill-llama-70b to groq provider (#1187) 2025-01-27 18:08:46 +05:30
huggingface.ts fix: updated logger and model caching minor bugfix #release (#895) 2024-12-31 22:47:32 +05:30
hyperbolic.ts Update hyperbolic.ts 2025-01-01 17:59:11 +05:30
lmstudio.ts fix: docker prod env variable fix (#1170) 2025-01-25 03:52:26 +05:30
mistral.ts refactor: refactored LLM Providers: Adapting Modular Approach (#832) 2024-12-21 11:45:17 +05:30
ollama.ts fix: docker prod env variable fix (#1170) 2025-01-25 03:52:26 +05:30
open-router.ts fix: updated logger and model caching minor bugfix #release (#895) 2024-12-31 22:47:32 +05:30
openai-like.ts fix: updated logger and model caching minor bugfix #release (#895) 2024-12-31 22:47:32 +05:30
openai.ts fix: updated logger and model caching minor bugfix #release (#895) 2024-12-31 22:47:32 +05:30
perplexity.ts refactor: refactored LLM Providers: Adapting Modular Approach (#832) 2024-12-21 11:45:17 +05:30
together.ts fix: updated logger and model caching minor bugfix #release (#895) 2024-12-31 22:47:32 +05:30
xai.ts refactor: refactored LLM Providers: Adapting Modular Approach (#832) 2024-12-21 11:45:17 +05:30