mirror of https://github.com/open-webui/open-webui

commit 867fddb821
parent de69c08a1d

chore: format
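Each hunk below touches one per-locale `translation.json` file: the keys are the English source strings and the values are the translations, with `""` marking entries that are not yet translated. As a point of reference, here is a minimal sketch of how such key/value pairs are typically consumed at runtime, assuming an i18next-style setup (the project's actual initialization may differ); the locale code, `returnEmptyString` choice, and sample entries are illustrative only:

```ts
// Minimal sketch, assuming an i18next-style i18n setup; not the project's
// actual initialization code.
import i18next from "i18next";

await i18next.init({
  lng: "ar-BH",          // illustrative locale
  fallbackLng: "en-US",
  // Treat "" values (untranslated entries) as missing so lookups fall back
  // to the key text instead of rendering an empty string.
  returnEmptyString: false,
  resources: {
    "ar-BH": {
      translation: {
        "This is an experimental feature, it may not function as expected and is subject to change at any time.":
          "هذه ميزة تجريبية، وقد لا تعمل كما هو متوقع وقد تتغير في أي وقت.",
        "This model is not publicly available. Please select another model.": "",
      },
    },
  },
});

// Translated key resolves to the Arabic value.
console.log(
  i18next.t(
    "This is an experimental feature, it may not function as expected and is subject to change at any time."
  )
);

// Untranslated ("") key falls back to the English key text.
console.log(
  i18next.t("This model is not publicly available. Please select another model.")
);
```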
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "وهذا يضمن حفظ محادثاتك القيمة بشكل آمن في قاعدة بياناتك الخلفية. شكرًا لك!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "وهذا يضمن حفظ محادثاتك القيمة بشكل آمن في قاعدة بياناتك الخلفية. شكرًا لك!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "هذه ميزة تجريبية، وقد لا تعمل كما هو متوقع وقد تتغير في أي وقت.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "هذا الخيار يحدد عدد الرموز التي يتم الاحتفاظ بها عند تحديث السياق. مثلاً، إذا تم ضبطه على 2، سيتم الاحتفاظ بآخر رمزين من السياق. الحفاظ على السياق يساعد في استمرارية المحادثة، لكنه قد يحد من التفاعل مع مواضيع جديدة.",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "يحدد هذا الخيار الحد الأقصى لعدد الرموز التي يمكن للنموذج توليدها في الرد. زيادته تتيح للنموذج تقديم إجابات أطول، لكنها قد تزيد من احتمالية توليد محتوى غير مفيد أو غير ذي صلة.",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "سيؤدي هذا الخيار إلى حذف جميع الملفات الحالية في المجموعة واستبدالها بالملفات التي تم تحميلها حديثًا.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Това гарантира, че ценните ви разговори се запазват сигурно във вашата бекенд база данни. Благодарим ви!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Това е експериментална функция, може да не работи според очакванията и подлежи на промяна по всяко време.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Тази опция ще изтрие всички съществуващи файлове в колекцията и ще ги замени с новокачени файлове.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "এটা নিশ্চিত করে যে, আপনার গুরুত্বপূর্ণ আলোচনা নিরাপদে আপনার ব্যাকএন্ড ডেটাবেজে সংরক্ষিত আছে। ধন্যবাদ!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "འདིས་ཁྱེད་ཀྱི་རྩ་ཆེའི་ཁ་བརྡ་དག་བདེ་འཇགས་ངང་ཁྱེད་ཀྱི་རྒྱབ་སྣེ་གནས་ཚུལ་མཛོད་དུ་ཉར་ཚགས་བྱེད་པ་ཁག་ཐེག་བྱེད། ཐུགས་རྗེ་ཆེ།",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "འདི་ནི་ཚོད་ལྟའི་རང་བཞིན་གྱི་ཁྱད་ཆོས་ཤིག་ཡིན། དེ་རེ་སྒུག་ལྟར་ལས་ཀ་བྱེད་མི་སྲིད། དེ་མིན་དུས་ཚོད་གང་རུང་ལ་འགྱུར་བ་འགྲོ་སྲིད།",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "འདེམས་ཀ་འདིས་ནང་དོན་གསར་སྒྱུར་བྱེད་སྐབས་ཊོཀ་ཀེན་ག་ཚོད་ཉར་ཚགས་བྱེད་དགོས་ཚོད་འཛིན་བྱེད། དཔེར་ན། གལ་ཏེ་ ༢ ལ་བཀོད་སྒྲིག་བྱས་ན། ཁ་བརྡའི་ནང་དོན་གྱི་ཊོཀ་ཀེན་མཐའ་མ་ ༢ ཉར་ཚགས་བྱེད་ངེས། ནང་དོན་ཉར་ཚགས་བྱས་ན་ཁ་བརྡའི་རྒྱུན་མཐུད་རང་བཞིན་རྒྱུན་སྲུང་བྱེད་པར་རོགས་པ་བྱེད་ཐུབ། འོན་ཀྱང་དེས་བརྗོད་གཞི་གསར་པར་ལན་འདེབས་བྱེད་པའི་ནུས་པ་ཉུང་དུ་གཏོང་སྲིད།",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "འདེམས་ཀ་འདིས་དཔེ་དབྱིབས་ཀྱིས་དེའི་ལན་ནང་བཟོ་ཐུབ་པའི་ཊོཀ་ཀེན་གྱི་གྲངས་མང་ཤོས་འཇོག་པ། ཚད་བཀག་འདི་མང་དུ་བཏང་ན་དཔེ་དབྱིབས་ཀྱིས་ལན་རིང་བ་སྤྲོད་པར་གནང་བ་སྤྲོད། འོན་ཀྱང་དེས་ཕན་ཐོགས་མེད་པའམ་འབྲེལ་མེད་ཀྱི་ནང་དོན་བཟོ་བའི་ཆགས་ཚུལ་མང་དུ་གཏོང་སྲིད།",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "འདེམས་ཀ་འདིས་བསྡུ་གསོག་ནང་གི་ཡོད་པའི་ཡིག་ཆ་ཡོངས་རྫོགས་བསུབ་ནས་དེ་དག་གསར་དུ་སྤར་བའི་ཡིག་ཆས་ཚབ་བྱེད་ངེས།",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "Aquest xat no apareixerà a l'historial i els teus missatges no es desaran.",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Això assegura que les teves converses valuoses queden desades de manera segura a la teva base de dades. Gràcies!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Aquesta és una funció experimental, és possible que no funcioni com s'espera i està subjecta a canvis en qualsevol moment.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "Aquesta opció controla quants tokens es conserven en actualitzar el context. Per exemple, si s'estableix en 2, es conservaran els darrers 2 tokens del context de conversa. Preservar el context pot ajudar a mantenir la continuïtat d'una conversa, però pot reduir la capacitat de respondre a nous temes.",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "Aquesta opció estableix el nombre màxim de tokens que el model pot generar en la seva resposta. Augmentar aquest límit permet que el model proporcioni respostes més llargues, però també pot augmentar la probabilitat que es generi contingut poc útil o irrellevant.",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Aquesta opció eliminarà tots els fitxers existents de la col·lecció i els substituirà per fitxers recentment penjats.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Kini nagsiguro nga ang imong bililhon nga mga panag-istoryahanay luwas nga natipig sa imong backend database. ",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "To zajišťuje, že vaše cenné konverzace jsou bezpečně uloženy ve vaší backendové databázi. Děkujeme!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Jedná se o experimentální funkci, nemusí fungovat podle očekávání a může být kdykoliv změněna.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Tato volba odstraní všechny existující soubory ve sbírce a nahradí je nově nahranými soubory.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "Denne chat vil ikke vises i historikken, og dine beskeder vil ikke blive gemt.",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Dette sikrer, at dine værdifulde samtaler gemmes sikkert i din backend-database. Tak!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Dette er en eksperimentel funktion, den fungerer muligvis ikke som forventet og kan ændres når som helst.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Denne indstilling sletter alle eksisterende filer i samlingen og erstatter dem med nyligt uploadede filer.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "Diese Unterhaltung erscheint nicht in deinem Chat-Verlauf. Alle Nachrichten sind privat und werden nicht gespeichert.",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Dies stellt sicher, dass Ihre wertvollen Chats sicher in Ihrer Backend-Datenbank gespeichert werden. Vielen Dank!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Dies ist eine experimentelle Funktion, sie funktioniert möglicherweise nicht wie erwartet und kann jederzeit geändert werden.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Diese Option löscht alle vorhandenen Dateien in der Sammlung und ersetzt sie durch neu hochgeladene Dateien.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "This ensures that your valuable conversations are securely saved to your backend database. Thank you! Much secure!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Αυτό διασφαλίζει ότι οι πολύτιμες συνομιλίες σας αποθηκεύονται με ασφάλεια στη βάση δεδομένων backend σας. Ευχαριστούμε!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Αυτή είναι μια πειραματική λειτουργία, μπορεί να μην λειτουργεί όπως αναμένεται και υπόκειται σε αλλαγές οποιαδήποτε στιγμή.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Αυτή η επιλογή θα διαγράψει όλα τα υπάρχοντα αρχεία στη συλλογή και θα τα αντικαταστήσει με νέα ανεβασμένα αρχεία.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "Este chat no aparecerá en el historial y los mensajes no se guardarán.",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Esto garantiza que sus valiosas conversaciones se guardan de forma segura en tu base de datos del servidor trasero (backend). ¡Gracias!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Esta es una característica experimental, por lo que puede no funcionar como se esperaba y está sujeta a cambios en cualquier momento.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "Esta opción controla cuántos tokens se conservan cuando se actualiza el contexto. Por ejemplo, si se establece en 2, se conservarán los primeros 2 tokens del contexto de la conversación. Conservar el contexto puede ayudar a mantener la continuidad de una conversación, pero puede reducir la habilidad para responder a nuevos temas.",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "Esta opción establece el número máximo de tokens que el modelo puede generar en sus respuestas. Aumentar este límite permite al modelo proporcionar respuestas más largas, pero también puede aumentar la probabilidad de que se genere contenido inútil o irrelevante.",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Esta opción eliminará todos los archivos existentes en la colección y los reemplazará con los nuevos archivos subidos.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "See tagab, et teie väärtuslikud vestlused salvestatakse turvaliselt teie tagarakenduse andmebaasi. Täname!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "See on katsetuslik funktsioon, see ei pruugi toimida ootuspäraselt ja võib igal ajal muutuda.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "See valik kontrollib, mitu tokenit säilitatakse konteksti värskendamisel. Näiteks kui see on määratud 2-le, säilitatakse vestluse konteksti viimased 2 tokenit. Konteksti säilitamine võib aidata säilitada vestluse järjepidevust, kuid võib vähendada võimet reageerida uutele teemadele.",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "See valik määrab maksimaalse tokenite arvu, mida mudel saab oma vastuses genereerida. Selle piirmäära suurendamine võimaldab mudelil anda pikemaid vastuseid, kuid võib suurendada ka ebavajaliku või ebaolulise sisu genereerimise tõenäosust.",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "See valik kustutab kõik olemasolevad failid kogust ja asendab need äsja üleslaaditud failidega.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Honek zure elkarrizketa baliotsuak modu seguruan zure backend datu-basean gordeko direla ziurtatzen du. Eskerrik asko!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Hau funtzionalitate esperimental bat da, baliteke espero bezala ez funtzionatzea eta edozein unetan aldaketak izatea.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Aukera honek bilduman dauden fitxategi guztiak ezabatuko ditu eta berriki kargatutako fitxategiekin ordezkatuko ditu.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "این اطمینان می\u200cدهد که مکالمات ارزشمند شما به طور امن در پایگاه داده پشتیبان ذخیره می\u200cشوند. متشکریم!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "این یک ویژگی آزمایشی است، ممکن است طبق انتظار کار نکند و در هر زمان ممکن است تغییر کند.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "این گزینه کنترل می\u200cکند که هنگام تازه\u200cسازی متن، چند توکن حفظ شوند. برای مثال، اگر روی 2 تنظیم شود، 2 توکن آخر متن مکالمه حفظ خواهند شد. حفظ متن می\u200cتواند به حفظ پیوستگی مکالمه کمک کند، اما ممکن است توانایی پاسخ به موضوعات جدید را کاهش دهد.",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "این گزینه حداکثر تعداد توکن\u200cهایی را که مدل می\u200cتواند در پاسخ خود تولید کند تنظیم می\u200cکند. افزایش این محدودیت به مدل اجازه می\u200cدهد پاسخ\u200cهای طولانی\u200cتری ارائه دهد، اما ممکن است احتمال تولید محتوای بی\u200cفایده یا نامربوط را نیز افزایش دهد.",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "این گزینه تمام فایل\u200cهای موجود در مجموعه را حذف کرده و با فایل\u200cهای جدید آپلود شده جایگزین می\u200cکند.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "Tämä keskustelu ei näy historiassa, eikä viestejäsi tallenneta.",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Tämä varmistaa, että arvokkaat keskustelusi tallennetaan turvallisesti backend-tietokantaasi. Kiitos!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Tämä on kokeellinen ominaisuus, se ei välttämättä toimi odotetulla tavalla ja se voi muuttua milloin tahansa.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Tämä vaihtoehto poistaa kaikki kokoelman nykyiset tiedostot ja korvaa ne uusilla ladatuilla tiedostoilla.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Cela garantit que vos conversations précieuses soient sauvegardées en toute sécurité dans votre base de données backend. Merci !",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Il s'agit d'une fonctionnalité expérimentale, elle peut ne pas fonctionner comme prévu et est sujette à modification à tout moment.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "Cette conversation n'apparaîtra pas dans l'historique et vos messages ne seront pas enregistrés.",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Cela garantit que vos conversations précieuses soient sauvegardées en toute sécurité dans votre base de données backend. Merci !",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Il s'agit d'une fonctionnalité expérimentale, elle peut ne pas fonctionner comme prévu et est sujette à modification à tout moment.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "Cette option détermine combien de Token sont conservés lors du rafraîchissement du contexte. Par exemple, avec une valeur de 2, les 2 derniers Token seront conservés. Cela aide à maintenir la continuité de la conversation, mais peut limiter la capacité à traiter de nouveaux sujets.",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "Cette option définit le nombre maximal de Token que le modèle peut générer dans sa réponse. Une valeur plus élevée permet des réponses plus longues, mais peut aussi générer du contenu moins pertinent.",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Cette option supprimera tous les fichiers existants dans la collection et les remplacera par les fichiers nouvellement téléchargés.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "פעולה זו מבטיחה שהשיחות בעלות הערך שלך יישמרו באופן מאובטח במסד הנתונים העורפי שלך. תודה!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "यह सुनिश्चित करता है कि आपकी मूल्यवान बातचीत आपके बैकएंड डेटाबेस में सुरक्षित रूप से सहेजी गई है। धन्यवाद!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Ovo osigurava da su vaši vrijedni razgovori sigurno spremljeni u bazu podataka. Hvala vam!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Ovo je eksperimentalna značajka, možda neće funkcionirati prema očekivanjima i podložna je promjenama u bilo kojem trenutku.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "Ez a csevegés nem jelenik meg az előzményekben, és az üzeneteid nem kerülnek mentésre.",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Ez biztosítja, hogy értékes beszélgetései biztonságosan mentésre kerüljenek a backend adatbázisban. Köszönjük!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Ez egy kísérleti funkció, lehet, hogy nem a várt módon működik és bármikor változhat.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "Ez az opció szabályozza, hány token marad meg a kontextus frissítésekor. Például, ha 2-re van állítva, a beszélgetés kontextusának utolsó 2 tokenje megmarad. A kontextus megőrzése segíthet a beszélgetés folytonosságának fenntartásában, de csökkentheti az új témákra való reagálás képességét.",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "Ez az opció beállítja a modell által generálható tokenek maximális számát a válaszban. Ezen limit növelése hosszabb válaszokat tesz lehetővé, de növelheti a nem hasznos vagy irreleváns tartalom generálásának valószínűségét.",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Ez az opció törli az összes meglévő fájlt a gyűjteményben és lecseréli őket az újonnan feltöltött fájlokkal.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Ini akan memastikan bahwa percakapan Anda yang berharga disimpan dengan aman ke basis data backend. Terima kasih!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Ini adalah fitur eksperimental, mungkin tidak berfungsi seperti yang diharapkan dan dapat berubah sewaktu-waktu.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "Ní bheidh an comhrá seo le feiceáil sa stair agus ní shábhálfar do theachtaireachtaí.",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Cinntíonn sé seo go sábhálfar do chomhráite luachmhara go daingean i do bhunachar sonraí cúltaca Go raibh maith agat!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Is gné turgnamhach í seo, b'fhéidir nach bhfeidhmeoidh sé mar a bhíothas ag súil leis agus tá sé faoi réir athraithe ag am ar bith.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "Rialaíonn an rogha seo cé mhéad comhartha a chaomhnaítear agus an comhthéacs á athnuachan. Mar shampla, má shocraítear go 2 é, coinneofar an 2 chomhartha dheireanacha de chomhthéacs an chomhrá. Is féidir le comhthéacs a chaomhnú cabhrú le leanúnachas comhrá a choinneáil, ach d’fhéadfadh sé laghdú a dhéanamh ar an gcumas freagairt do thopaicí nua.",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "Socraíonn an rogha seo an t-uaslíon comharthaí is féidir leis an tsamhail a ghiniúint ina fhreagra. Tríd an teorainn seo a mhéadú is féidir leis an tsamhail freagraí níos faide a sholáthar, ach d’fhéadfadh go méadódh sé an dóchúlacht go nginfear ábhar neamhchabhrach nó nach mbaineann le hábhar.",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Scriosfaidh an rogha seo gach comhad atá sa bhailiúchán agus cuirfear comhaid nua-uaslódála ina n-ionad.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "Questa chat non apparirà nella cronologia e i tuoi messaggi non verranno salvati.",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Ciò garantisce che le tue preziose conversazioni siano salvate in modo sicuro nel tuo database backend. Grazie!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Questa è una funzionalità sperimentale, potrebbe non funzionare come previsto ed è soggetta a modifiche in qualsiasi momento.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "Questa opzione controlla quanti token vengono preservati quando si aggiorna il contesto. Ad esempio, se impostato su 2, gli ultimi 2 token del contesto della conversazione verranno mantenuti. Preservare il contesto può aiutare a mantenere la continuità di una conversazione, ma potrebbe ridurre la capacità di rispondere a nuovi argomenti.",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "Questa opzione imposta il numero massimo di token che il modello può generare nella sua risposta. Aumentare questo limite consente al modello di fornire risposte più lunghe, ma potrebbe anche aumentare la probabilità che vengano generati contenuti non utili o irrilevanti.",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Questa opzione eliminerà tutti i file esistenti nella collezione e li sostituirà con i file appena caricati.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "これは、貴重な会話がバックエンドデータベースに安全に保存されることを保証します。ありがとうございます!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "実験的機能であり正常動作しない場合があります。",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "ეს უზრუნველყოფს, რომ თქვენი ღირებული საუბრები უსაფრთხოდ შეინახება თქვენს უკანაბოლო მონაცემთა ბაზაში. მადლობა!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "이렇게 하면 소중한 대화 내용이 백엔드 데이터베이스에 안전하게 저장됩니다. 감사합니다!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "이것은 실험적 기능으로, 예상대로 작동하지 않을 수 있으며 언제든지 변경될 수 있습니다.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "이 행동은 컬렉션에 존재하는 모든 파일을 삭제하고 새로 업로드된 파일들로 대체됩니다",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Tai užtikrina, kad Jūsų pokalbiai saugiai saugojami duomenų bazėje. Ačiū!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Tai eksperimentinė funkcija ir gali veikti nevisada.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Ini akan memastikan bahawa perbualan berharga anda disimpan dengan selamat ke pangkalan data 'backend' anda. Terima kasih!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "ni adalah ciri percubaan, ia mungkin tidak berfungsi seperti yang diharapkan dan tertakluk kepada perubahan pada bila-bila masa.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Dette sikrer at de verdifulle samtalene dine lagres sikkert i backend-databasen din. Takk!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Dette er en eksperimentell funksjon. Det er mulig den ikke fungerer som forventet, og den kan endres når som helst.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Dette alternativet sletter alle eksisterende filer i samlingen og erstatter dem med nyopplastede filer.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Dit zorgt ervoor dat je waardevolle gesprekken veilig worden opgeslagen in je backend database. Dank je wel!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Dit is een experimentele functie, het kan functioneren zoals verwacht en kan op elk moment veranderen.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "Deze optie bepaalt hoeveel tokens bewaard blijven bij het verversen van de context. Als deze bijvoorbeeld op 2 staat, worden de laatste 2 tekens van de context van het gesprek bewaard. Het behouden van de context kan helpen om de continuïteit van een gesprek te behouden, maar het kan de mogelijkheid om te reageren op nieuwe onderwerpen verminderen.",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "Deze optie stelt het maximum aantal tokens in dat het model kan genereren in zijn antwoord. Door dit limiet te verhogen, kan het model langere antwoorden geven, maar het kan ook de kans vergroten dat er onbehulpzame of irrelevante inhoud wordt gegenereerd.",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Deze optie verwijdert alle bestaande bestanden in de collectie en vervangt ze door nieuw geüploade bestanden.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "ਇਹ ਯਕੀਨੀ ਬਣਾਉਂਦਾ ਹੈ ਕਿ ਤੁਹਾਡੀਆਂ ਕੀਮਤੀ ਗੱਲਾਂ ਤੁਹਾਡੇ ਬੈਕਐਂਡ ਡਾਟਾਬੇਸ ਵਿੱਚ ਸੁਰੱਖਿਅਤ ਤੌਰ 'ਤੇ ਸੰਭਾਲੀਆਂ ਗਈਆਂ ਹਨ। ਧੰਨਵਾਦ!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "To gwarantuje, że Twoje wartościowe rozmowy są bezpiecznie zapisywane w bazie danych backendowej. Dziękujemy!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "To jest funkcja eksperymentalna, może nie działać zgodnie z oczekiwaniami i jest podatna na zmiany w dowolnym momencie.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Ta opcja usunie wszystkie istniejące pliki w kolekcji i zastąpi je nowo przesłanymi plikami.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Isso garante que suas conversas valiosas sejam salvas com segurança no banco de dados do backend. Obrigado!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Esta é uma funcionalidade experimental, pode não funcionar como esperado e está sujeita a alterações a qualquer momento.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Essa opção deletará todos os arquivos existentes na coleção e todos eles serão substituídos.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Isto garante que suas conversas valiosas sejam guardadas com segurança na sua base de dados de backend. Obrigado!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Isto é um recurso experimental, pode não funcionar conforme o esperado e está sujeito a alterações a qualquer momento.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Acest lucru asigură că conversațiile dvs. valoroase sunt salvate în siguranță în baza de date a backend-ului dvs. Mulțumim!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Aceasta este o funcție experimentală, poate să nu funcționeze așa cum vă așteptați și este supusă schimbării în orice moment.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Această opțiune va șterge toate fișierelor existente din colecție și le va înlocui cu fișierele nou încărcate.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "Этот чат не появится в истории, и ваши сообщения не будут сохранены.",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Это обеспечивает сохранение ваших ценных разговоров в безопасной базе данных на вашем сервере. Спасибо!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Это экспериментальная функция, она может работать не так, как ожидалось, и может быть изменена в любое время.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "Этот параметр определяет, сколько токенов сохраняется при обновлении контекста. Например, если задано значение 2, будут сохранены последние 2 токена контекста беседы. Сохранение контекста может помочь сохранить непрерывность беседы, но может уменьшить возможность отвечать на новые темы.",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "Этот параметр устанавливает максимальное количество токенов, которые модель может генерировать в своем ответе. Увеличение этого ограничения позволяет модели предоставлять более длинные ответы, но также может увеличить вероятность создания бесполезного или нерелевантного контента.",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Эта опция удалит все существующие файлы в коллекции и заменит их вновь загруженными файлами.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Týmto je zaistené, že vaše cenné konverzácie sú bezpečne uložené vo vašej backendovej databáze. Ďakujeme!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Toto je experimentálna funkcia, nemusí fungovať podľa očakávania a môže byť kedykoľvek zmenená.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Táto voľba odstráni všetky existujúce súbory v kolekcii a nahradí ich novo nahranými súbormi.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Ово осигурава да су ваши вредни разговори безбедно сачувани у вашој бекенд бази података. Хвала вам!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Detta säkerställer att dina värdefulla samtal sparas säkert till din backend-databas. Tack!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Detta är en experimentell funktion som kanske inte fungerar som förväntat och som kan komma att ändras när som helst.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "สิ่งนี้ทำให้มั่นใจได้ว่าการสนทนาที่มีค่าของคุณจะถูกบันทึกอย่างปลอดภัยในฐานข้อมูลแบ็กเอนด์ของคุณ ขอบคุณ!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "นี่เป็นฟีเจอร์ทดลอง อาจไม่ทำงานตามที่คาดไว้และอาจมีการเปลี่ยนแปลงได้ตลอดเวลา",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Bu, önemli konuşmalarınızın güvenli bir şekilde arkayüz veritabanınıza kaydedildiğini garantiler. Teşekkür ederiz!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Bu deneysel bir özelliktir, beklendiği gibi çalışmayabilir ve her an değişiklik yapılabilir.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Bu seçenek, koleksiyondaki tüm mevcut dosyaları silecek ve bunları yeni yüklenen dosyalarla değiştirecek.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Це забезпечує збереження ваших цінних розмов у безпечному бекенд-сховищі. Дякуємо!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Це експериментальна функція, вона може працювати не так, як очікувалося, і може бути змінена в будь-який час.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "Ця опція контролює, скільки токенів зберігається при оновленні контексту. Наприклад, якщо встановити значення 2, останні 2 токени контексту розмови будуть збережені. Збереження контексту допомагає підтримувати послідовність розмови, але може зменшити здатність реагувати на нові теми.",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "Ця опція встановлює максимальну кількість токенів, які модель може згенерувати у своїй відповіді. Збільшення цього ліміту дозволяє моделі надавати довші відповіді, але також може підвищити ймовірність генерації непотрібного або нерелевантного контенту.",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Цей варіант видалить усі існуючі файли в колекції та замінить їх новими завантаженими файлами.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "یہ یقینی بناتا ہے کہ آپ کی قیمتی گفتگو محفوظ طریقے سے آپ کے بیک اینڈ ڈیٹا بیس میں محفوظ کی گئی ہیں شکریہ!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "یہ ایک تجرباتی خصوصیت ہے، یہ متوقع طور پر کام نہ کر سکتی ہو اور کسی بھی وقت تبدیل کی جا سکتی ہے",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "اس اختیار سے مجموعہ میں موجود تمام فائلز حذف ہو جائیں گی اور ان کی جگہ نئی اپ لوڈ کردہ فائلز لی جائیں گی",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "Cuộc trò chuyện này sẽ không xuất hiện trong lịch sử và tin nhắn của bạn sẽ không được lưu.",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "Điều này đảm bảo rằng các nội dung chat có giá trị của bạn được lưu an toàn vào cơ sở dữ liệu backend của bạn. Cảm ơn bạn!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "Đây là tính năng thử nghiệm, có thể không hoạt động như mong đợi và có thể thay đổi bất kỳ lúc nào.",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "Tùy chọn này kiểm soát số lượng token được bảo tồn khi làm mới ngữ cảnh. Ví dụ: nếu đặt thành 2, 2 token cuối cùng của ngữ cảnh hội thoại sẽ được giữ lại. Bảo tồn ngữ cảnh có thể giúp duy trì tính liên tục của cuộc trò chuyện, nhưng nó có thể làm giảm khả năng phản hồi các chủ đề mới.",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "Tùy chọn này đặt số lượng token tối đa mà mô hình có thể tạo ra trong phản hồi của nó. Tăng giới hạn này cho phép mô hình cung cấp câu trả lời dài hơn, nhưng nó cũng có thể làm tăng khả năng tạo ra nội dung không hữu ích hoặc không liên quan.",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "Tùy chọn này sẽ xóa tất cả các tệp hiện có trong bộ sưu tập và thay thế chúng bằng các tệp mới được tải lên.",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "此聊天不会出现在历史记录中,且您的消息不会被保存。",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "这将确保您的宝贵对话被安全地保存到后台数据库中。感谢!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "这是一个实验功能,可能不会如预期那样工作,而且可能随时发生变化。",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "此选项控制刷新上下文时保留多少 Token。例如,如果设置为 2,则将保留对话上下文的最后 2 个 Token。保留上下文有助于保持对话的连续性,但可能会降低响应新主题的能力。",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "此项用于设置模型在其响应中可以生成的最大 Token 数。增加此限制可让模型提供更长的答案,但也可能增加生成无用或不相关内容的可能性。",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "此选项将会删除文件集中所有文件,并用新上传的文件替换。",
@@ -1140,6 +1140,7 @@
"This chat won’t appear in history and your messages will not be saved.": "此對話不會出現在歷史記錄中,且您的訊息將不被儲存。",
"This ensures that your valuable conversations are securely saved to your backend database. Thank you!": "這確保您寶貴的對話會安全地儲存到您的後端資料庫。謝謝!",
"This is an experimental feature, it may not function as expected and is subject to change at any time.": "這是一個實驗性功能,它可能無法如預期運作,並且可能會隨時變更。",
"This model is not publicly available. Please select another model.": "",
"This option controls how many tokens are preserved when refreshing the context. For example, if set to 2, the last 2 tokens of the conversation context will be retained. Preserving context can help maintain the continuity of a conversation, but it may reduce the ability to respond to new topics.": "此選項控制在重新整理上下文時保留多少 token。例如,如果設定為 2,則會保留對話上下文的最後 2 個 token。保留上下文有助於保持對話的連貫性,但也可能降低對新主題的回應能力。",
"This option sets the maximum number of tokens the model can generate in its response. Increasing this limit allows the model to provide longer answers, but it may also increase the likelihood of unhelpful or irrelevant content being generated.": "此選項設定模型在其回應中可以生成的最大 token 數量。增加此限制允許模型提供更長的答案,但也可能增加生成無用或不相關內容的可能性。",
"This option will delete all existing files in the collection and replace them with newly uploaded files.": "此選項將刪除集合中的所有現有檔案,並用新上傳的檔案取代它們。",