bolt.new-any-llm/app
Martin Ouimet 76cc7a8139 feat: add Together AI integration and provider implementation guide
- Create detailed provider implementation guide with:
  - Architecture overview and implementation steps
  - Configuration patterns and best practices
  - Testing checklist and Docker integration guide
  - Worked example using the Together AI implementation
- Add Together AI as a new provider with:
  - Environment variables and Docker configuration
  - Support for Qwen, Llama, and Mixtral models
  - API key and base URL management
  - OpenAI-compatible API integration (see the sketch below)
2024-11-23 00:20:35 -05:00
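Since Together AI exposes an OpenAI-compatible endpoint, the provider can be driven through the generic OpenAI client of the Vercel AI SDK with a custom base URL and API key. The following is a minimal sketch only, not the repository's actual code: the function name, env-variable names, default base URL, and model identifiers are illustrative assumptions.

```ts
// Sketch of an OpenAI-compatible provider hook for Together AI.
// All names here (getTogetherAIModel, TOGETHER_API_KEY, TOGETHER_API_BASE_URL,
// the model IDs) are assumptions for illustration.
import { createOpenAI } from '@ai-sdk/openai';

export function getTogetherAIModel(modelId: string) {
  // Together AI speaks the OpenAI wire format, so the generic OpenAI client
  // can simply be pointed at a different base URL with the Together key.
  const together = createOpenAI({
    baseURL: process.env.TOGETHER_API_BASE_URL ?? 'https://api.together.xyz/v1',
    apiKey: process.env.TOGETHER_API_KEY,
  });
  return together(modelId);
}

// Models in the Qwen, Llama, and Mixtral families mentioned in the commit;
// the exact identifiers are examples only.
export const TOGETHER_MODELS = [
  'Qwen/Qwen2.5-Coder-32B-Instruct',
  'meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo',
  'mistralai/Mixtral-8x7B-Instruct-v0.1',
];
```

On the Docker side, the container would then only need the corresponding environment variables (the API key, and optionally an override for the base URL) passed through to the app.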
components Fix linting issues 2024-11-22 09:49:45 +01:00
lib feat: add Together AI integration and provider implementation guide 2024-11-23 00:20:35 -05:00
routes Merge remote-tracking branch 'upstream/main' into linting 2024-11-22 20:38:58 +01:00
styles fix: remove monorepo 2024-09-25 19:54:09 +01:00
types Lint-fix all files in app 2024-11-21 22:05:35 +01:00
utils feat: add Together AI integration and provider implementation guide 2024-11-23 00:20:35 -05:00
entry.client.tsx fix(browser-extensions): don't render directly in body 2024-10-07 10:49:31 +02:00
entry.server.tsx let the ollama models be auto generated from ollama api 2024-10-18 14:34:08 +03:00
root.tsx fix(browser-extensions): don't render directly in body 2024-10-07 10:49:31 +02:00