Merge pull request #575 from ayanahye/doc-updates-faq-release

Update FAQ with Release Info
Tim Jaeryang Baek 2025-06-16 11:25:04 +04:00 committed by GitHub
commit 8d09730ca2


@@ -8,8 +8,8 @@ import { TopBanners } from "@site/src/components/TopBanners";
<TopBanners />
### 💡 Why Docker?
We understand Docker might not be everyone's preference; however, this approach is central to our project's design and operational efficiency. We view the project's commitment to Docker as a fundamental aspect and encourage those looking for different deployment methods to explore community-driven alternatives.
#### **Q: How do I customize the logo and branding?**
@@ -91,6 +91,7 @@ Everything you need to run Open WebUI, including your data, remains within your
```bash
docker run -d -p 3000:8080 -e HF_ENDPOINT=https://hf-mirror.com/ --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
#### **Q: RAG with Open WebUI is very bad or not working at all. Why?**
**A:** If you're using **Ollama**, be aware that Ollama sets the context length to **2048 tokens by default**. This means that none of the retrieved data might be used because it doesn't fit within the available context window.
@@ -100,6 +101,7 @@ To improve the performance of Retrieval-Augmented Generation (**RAG**) with Open
To do this, configure your **Ollama model params** to allow a larger context window. You can check and modify this setting directly in your chat or from the model editor page to enhance the RAG experience significantly.
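As a rough illustration (assuming you manage models with the Ollama CLI; the base model name `llama3` and the 8192-token value below are placeholders, not recommendations), a Modelfile can be used to create a model variant with a larger context window:

```bash
# Hypothetical sketch: create an Ollama model variant with a larger context window
# so that retrieved RAG chunks actually fit. "llama3" and 8192 are placeholders.
cat > Modelfile <<'EOF'
FROM llama3
PARAMETER num_ctx 8192
EOF

ollama create llama3-8k -f Modelfile
```

Alternatively, the same context length parameter can be raised from within Open WebUI itself, either per chat or from the model editor page, as described above.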
#### **Q: Is MCP (Model Context Protocol) supported in Open WebUI?**
**A:** [Yes, Open WebUI officially supports MCP Tool Servers—but exclusively through an **OpenAPI-compliant proxy**](/openapi-servers/mcp) ([openapi-servers](https://github.com/open-webui/openapi-servers)) for optimal compatibility, security, and maintainability.
To bridge MCP (and other backend protocols), we provide a purpose-built proxy implementation available at: 👉 [https://github.com/open-webui/mcpo](https://github.com/open-webui/mcpo)
@@ -113,6 +115,12 @@ This design choice is motivated by several core principles:
In summary: MCP is supported — as long as the MCP Tool Server is fronted by an OpenAPI-compatible proxy. This architectural decision is deliberate and ensures that Open WebUI remains scalable, secure, and maintainable.
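As a minimal, hedged sketch of what fronting an MCP server with mcpo can look like (the specific MCP server command and port below are placeholders; consult the mcpo README for exact usage and flags):

```bash
# Sketch: run the mcpo proxy so an MCP tool server is exposed as an OpenAPI endpoint.
# The command after "--" is a placeholder MCP server; the port is arbitrary.
uvx mcpo --port 8000 -- uvx mcp-server-time --local-timezone=America/New_York
```

The resulting OpenAPI endpoint (for example, http://localhost:8000) can then be added to Open WebUI as a tool server.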
#### **Q: How often is Open WebUI updated?** (Release Schedule)
**A:** We aim to ship **major releases weekly**, with **bug fixes and minor updates delivered as needed**. However, this is not a rigid schedule—some weeks may see multiple releases, while others might have none at all.
To stay informed, you can follow release notes and announcements on our [GitHub Releases page](https://github.com/open-webui/open-webui/releases).
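If you prefer the command line, one optional way to check recent releases (assuming the GitHub CLI `gh` is installed) is:

```bash
# Optional: list the most recent Open WebUI releases with the GitHub CLI.
gh release list --repo open-webui/open-webui --limit 5
```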
#### **Need Further Assistance?**
If you have any further questions or concerns, please reach out to our [GitHub Issues page](https://github.com/open-webui/open-webui/issues) or our [Discord channel](https://discord.gg/5rJgQTnV4s) for more help and information.