Merge pull request #9082 from Alex1607/main

FIX max_tokens not being set properly
Commit 40a4443949 by Timothy Jaeryang Baek, 2025-01-29 11:48:05 -08:00, committed by GitHub
GPG Key ID: B5690EEEBB952194 (no known key found for this signature in database)


```diff
@@ -666,6 +666,9 @@ def apply_params_to_form_data(form_data, model):
     if "temperature" in params:
         form_data["temperature"] = params["temperature"]
+    if "max_tokens" in params:
+        form_data["max_tokens"] = params["max_tokens"]
     if "top_p" in params:
         form_data["top_p"] = params["top_p"]
```