bolt.diy/.env.example
Feature/bayer mga provider (#33)
* fix: enhance Bayer MGA provider reliability and Docker integration

* Merge latest dev branch changes into Bayer MGA feature branch
* Improve Bayer MGA provider model filtering and error handling
* Add robust model validation with fallback mechanisms
* Enhance logging and debugging capabilities for model selection
* Add Bayer MGA environment variables to Docker configurations
* Update worker configuration with Bayer MGA API keys
* Add comprehensive Bayer MGA setup to .env.example
* Create standalone test script for Bayer MGA provider debugging
* Fix intermittent model selection issues beyond Claude 3.7 Sonnet
* Ensure provider switching works without breaking other providers

* Add Bayer MGA provider multi-model support and test coverage.

* Add Claude.md.

# Rename this file to .env once you have filled in the below environment variables!
# GitHub OAuth Authentication Configuration
# ---------------------------------------
# To set up GitHub OAuth:
# 1. Go to https://github.com/settings/developers
# 2. Click "New OAuth App"
# 3. Fill in the application details:
# - Application name: Buildify (or your preferred name)
# - Homepage URL: http://localhost:5173 (or your deployment URL)
# - Authorization callback URL: http://localhost:5173/auth/callback
# 4. Click "Register application"
# 5. Copy the Client ID and generate a new Client Secret
# 6. Paste them below
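# Example (illustrative placeholders only; use the values GitHub generates for your OAuth App):
# GITHUB_CLIENT_ID=<your-oauth-app-client-id>
# GITHUB_CLIENT_SECRET=<your-oauth-app-client-secret>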
GITHUB_CLIENT_ID=
GITHUB_CLIENT_SECRET=
# Session Secret - Used to encrypt session cookies
# Generate a random string (e.g., using `openssl rand -hex 32` in terminal)
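# Example (illustrative; the command above prints a 64-character hex string - use your own output):
# SESSION_SECRET=<64-character-hex-string-from-openssl>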
SESSION_SECRET=
# Get your Groq API Key here -
# https://console.groq.com/keys
# You only need this environment variable set if you want to use Groq models
GROQ_API_KEY=
# Get your HuggingFace API Key here -
# https://huggingface.co/settings/tokens
# You only need this environment variable set if you want to use HuggingFace models
HuggingFace_API_KEY=
# Get your OpenAI API Key by following these instructions -
# https://help.openai.com/en/articles/4936850-where-do-i-find-my-openai-api-key
# You only need this environment variable set if you want to use GPT models
OPENAI_API_KEY=
# Get your Anthropic API Key in your account settings -
# https://console.anthropic.com/settings/keys
# You only need this environment variable set if you want to use Claude models
ANTHROPIC_API_KEY=
# Get your OpenRouter API Key in your account settings -
# https://openrouter.ai/settings/keys
# You only need this environment variable set if you want to use OpenRouter models
OPEN_ROUTER_API_KEY=
# Get your Google Generative AI API Key by following these instructions -
# https://console.cloud.google.com/apis/credentials
# You only need this environment variable set if you want to use Google Generative AI models
GOOGLE_GENERATIVE_AI_API_KEY=
# You only need this environment variable set if you want to use Ollama models
# Do NOT use http://localhost:11434, as it can fail due to IPv6 issues
# Use http://127.0.0.1:11434 instead
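# For example, as it would appear in this file:
# OLLAMA_API_BASE_URL=http://127.0.0.1:11434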
OLLAMA_API_BASE_URL=
# You only need this environment variable set if you want to use OpenAI Like models
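# Example (illustrative only; the host, port, and path depend on your OpenAI-compatible server):
# OPENAI_LIKE_API_BASE_URL=http://127.0.0.1:4000/v1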
OPENAI_LIKE_API_BASE_URL=
# You only need this environment variable set if you want to use Together AI models
TOGETHER_API_BASE_URL=
# You only need this environment variable set if you want to use DeepSeek models through their API
DEEPSEEK_API_KEY=
# Get your OpenAI Like API Key
OPENAI_LIKE_API_KEY=
# Get your Together API Key
TOGETHER_API_KEY=
# You only need these environment variables set if you want to use Hyperbolic models
# Get your Hyperbolic API Key at https://app.hyperbolic.xyz/settings
# Example: HYPERBOLIC_API_BASE_URL="https://api.hyperbolic.xyz/v1/chat/completions"
HYPERBOLIC_API_KEY=
HYPERBOLIC_API_BASE_URL=
# Get your Mistral API Key by following these instructions -
# https://console.mistral.ai/api-keys/
# You only need this environment variable set if you want to use Mistral models
MISTRAL_API_KEY=
# Get your Cohere API Key by following these instructions -
# https://dashboard.cohere.com/api-keys
# You only need this environment variable set if you want to use Cohere models
COHERE_API_KEY=
# Get the LM Studio Base URL from the LM Studio Developer Console
# Make sure to enable CORS
# Do NOT use http://localhost:1234, as it can fail due to IPv6 issues
# Example: http://127.0.0.1:1234
LMSTUDIO_API_BASE_URL=
# Get your xAI API Key here -
# https://x.ai/api
# You only need this environment variable set if you want to use xAI models
XAI_API_KEY=
# Get your Perplexity API Key here -
# https://www.perplexity.ai/settings/api
# You only need this environment variable set if you want to use Perplexity models
PERPLEXITY_API_KEY=
# Get your AWS Bedrock configuration
# https://console.aws.amazon.com/iam/home
# Set AWS_BEDROCK_CONFIG to a JSON object that includes the following keys:
# - region: The AWS region where Bedrock is available.
# - accessKeyId: Your AWS access key ID.
# - secretAccessKey: Your AWS secret access key.
# - sessionToken (optional): Temporary session token if using an IAM role or temporary credentials.
# Example JSON:
# {"region": "us-east-1", "accessKeyId": "yourAccessKeyId", "secretAccessKey": "yourSecretAccessKey", "sessionToken": "yourSessionToken"}
AWS_BEDROCK_CONFIG=
# Get your Bayer MGA API Key from your internal Bayer MGA system
# You only need these environment variables set if you want to use Bayer MGA models
# Contact your system administrator for access to the internal Bayer MGA API
BAYER_MGA_API_KEY=
BAYER_MGA_API_BASE_URL=https://chat.int.bayer.com/api/v2
# Include this environment variable if you want more logging for debugging locally
VITE_LOG_LEVEL=debug
# Get your GitHub Personal Access Token here -
# https://github.com/settings/tokens
# This token is used for:
# 1. Importing/cloning GitHub repositories without rate limiting
# 2. Accessing private repositories
# 3. Automatic GitHub authentication (no need to manually connect in the UI)
#
# For classic tokens, ensure it has these scopes: repo, read:org, read:user
# For fine-grained tokens, ensure it has Repository and Organization access
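# Example (illustrative placeholder; classic tokens start with "ghp_", fine-grained tokens with "github_pat_"):
# VITE_GITHUB_ACCESS_TOKEN=ghp_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx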
VITE_GITHUB_ACCESS_TOKEN=
# Specify the type of GitHub token you're using
# Can be 'classic' or 'fine-grained'
# Classic tokens are recommended for broader access
VITE_GITHUB_TOKEN_TYPE=classic
# Example Context Values for qwen2.5-coder:32b
#
# DEFAULT_NUM_CTX=32768 # Consumes 36GB of VRAM
# DEFAULT_NUM_CTX=24576 # Consumes 32GB of VRAM
# DEFAULT_NUM_CTX=12288 # Consumes 26GB of VRAM
# DEFAULT_NUM_CTX=6144 # Consumes 24GB of VRAM
DEFAULT_NUM_CTX=