dc5771d6f0  2024-11-06 20:22:40 +00:00  Samuel
    fix conflicts

4e7951d5fc  2024-11-03 01:34:45 -08:00  Timothy J. Baek
    fix: allow openai list message format

b596b8f0cb  2024-10-27 13:41:50 +01:00  Hugo Haldi
    Fix: generate endpoint form model
    According to the [Ollama documentation], the `context` parameter, if specified, should be an array of int.

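The fix above concerns the typing of the generate form's `context` field. A minimal Pydantic sketch of the corrected field, assuming a form shaped roughly like the project's (the class and field names other than `context` are illustrative, not the actual model):

```python
from typing import Optional

from pydantic import BaseModel


class GenerateForm(BaseModel):
    """Hypothetical sketch of the generate endpoint form after the fix."""

    model: str
    prompt: str
    # Per the commit: `context` must be an array of int when provided,
    # matching the Ollama generate API, not a string or a single int.
    context: Optional[list[int]] = None


# Omitting `context` is valid; when present it must be a list of ints.
form = GenerateForm(model="llama3", prompt="hi", context=[1, 2, 3])
```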
b4acf689e3  2024-10-25 08:00:37 -05:00  Nate.Dorr
    update GenerateChatCompletionForm to default stream to true
    This makes the /api/chat/ endpoint stream by default.

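A minimal sketch of what that default looks like on the form model; the surrounding fields are assumptions, only the `stream: bool = True` default comes from the commit:

```python
from pydantic import BaseModel


class GenerateChatCompletionForm(BaseModel):
    """Hypothetical sketch of the chat completion form after the change."""

    model: str
    messages: list[dict]
    # When the request body omits `stream`, the form now resolves to
    # stream=True, so /api/chat/ streams unless explicitly disabled.
    stream: bool = True


# No `stream` key in the payload: streaming is on by default.
form = GenerateChatCompletionForm(model="llama3", messages=[])
```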
856c00bc2f  2024-10-24 14:22:04 -07:00  Timothy J. Baek
    fix: arena model exclude filter

05746a9960  2024-10-24 13:48:56 +00:00  Samuel
    feat: ollama non streaming case working

9936583477  2024-10-20 18:38:06 -07:00  Timothy J. Baek
    chore: format

c748d33192  2024-10-10 22:41:57 +01:00  Peter De-Ath
    support list[str] | str as input

885b9f1ece  2024-10-08 00:04:35 +01:00  Peter De-Ath
    refactor: Update GenerateEmbeddingsForm to support batch processing
    refactor: Update embedding batch size handling in RAG configuration
    refactor: add query_doc query caching
    refactor: update logging statements in generate_chat_completion function
    change embedding_batch_size to Optional

c7a0e45bea  2024-09-30 16:32:38 +02:00  Timothy J. Baek
    refac

c93a10388b  2024-09-28 19:51:28 +02:00  Timothy J. Baek
    refac

ee33b4e2a3  2024-09-25 22:34:02 +02:00  Timothy J. Baek
    fix: ollama /embed form_data

8426874426  2024-09-19 16:54:34 +02:00  Timothy J. Baek
    fix

f1fae805a2  2024-09-09 23:02:26 +01:00  Timothy J. Baek
    fix: separate /embed and /embedding ollama endpoint

1c20db775c  2024-09-07 05:12:46 +01:00  Timothy J. Baek
    refac: enable /api/embed

ff46fe2b4a  2024-09-07 03:09:57 +01:00  Timothy J. Baek
    refac

03d5a670f6  2024-09-04 16:54:48 +02:00  Timothy J. Baek
    refac: mv backend files to /open_webui dir