# Ollama Web UI: A User-Friendly Web Interface for Chat Interactions 👋

![GitHub stars](https://img.shields.io/github/stars/ollama-webui/ollama-webui?style=social)
![GitHub forks](https://img.shields.io/github/forks/ollama-webui/ollama-webui?style=social)
![GitHub watchers](https://img.shields.io/github/watchers/ollama-webui/ollama-webui?style=social)
![GitHub repo size](https://img.shields.io/github/repo-size/ollama-webui/ollama-webui)
![GitHub language count](https://img.shields.io/github/languages/count/ollama-webui/ollama-webui)
![GitHub top language](https://img.shields.io/github/languages/top/ollama-webui/ollama-webui)
![GitHub last commit](https://img.shields.io/github/last-commit/ollama-webui/ollama-webui?color=red)
![Hits](https://hits.seeyoufarm.com/api/count/incr/badge.svg?url=https%3A%2F%2Fgithub.com%2Follama-webui%2Follama-webui&count_bg=%2379C83D&title_bg=%23555555&icon=&icon_color=%23E7E7E7&title=hits&edge_flat=false)
[![Discord](https://img.shields.io/badge/Discord-Ollama_Web_UI-blue?logo=discord&logoColor=white)](https://discord.gg/5rJgQTnV4s)
[![](https://img.shields.io/static/v1?label=Sponsor&message=%E2%9D%A4&logo=GitHub&color=%23fe8e86)](https://github.com/sponsors/tjbck)

ChatGPT-Style Web Interface for Ollama 🦙

![Ollama Web UI Demo](./demo.gif)

## Features ⭐

- 🖥️ **Intuitive Interface**: Our chat interface takes inspiration from ChatGPT, ensuring a user-friendly experience.
- 📱 **Responsive Design**: Enjoy a seamless experience on both desktop and mobile devices.
- ⚡ **Swift Responsiveness**: Enjoy fast and responsive performance.
- 🚀 **Effortless Setup**: Install seamlessly using Docker for a hassle-free experience.
- 💻 **Code Syntax Highlighting**: Enjoy enhanced code readability with our syntax highlighting feature.
- ✒️🔢 **Full Markdown and LaTeX Support**: Elevate your LLM experience with comprehensive Markdown and LaTeX capabilities for enriched interaction.
- 📥🗑️ **Download/Delete Models**: Easily download or remove models directly from the web UI.
- 🤖 **Multiple Model Support**: Seamlessly switch between different chat models for diverse interactions.
- ⚙️ **Multi-Model Conversations**: Effortlessly engage with several models simultaneously, harnessing their unique strengths for optimal responses.
- 🤝 **OpenAI Model Integration**: Seamlessly utilize OpenAI models alongside Ollama models for a versatile conversational experience.
- 🔄 **Regeneration History Access**: Easily revisit and explore your entire regeneration history.
- 📜 **Chat History**: Effortlessly access and manage your conversation history.
- 📤📥 **Import/Export Chat History**: Seamlessly move your chat data in and out of the platform.
- 🗣️ **Voice Input Support**: Engage with your model through voice interactions and enjoy the convenience of talking to your model directly. Optionally, voice input can be sent automatically after 3 seconds of silence for a streamlined experience.
- ⚙️ **Fine-Tuned Control with Advanced Parameters**: Gain a deeper level of control by adjusting parameters such as temperature and defining your system prompts to tailor the conversation to your specific preferences and needs.
- 🔐 **Auth Header Support**: Effortlessly enhance security by adding Authorization headers to Ollama requests directly from the web UI settings, ensuring access to secured Ollama servers.
- 🔗 **External Ollama Server Connection**: Seamlessly link to an external Ollama server hosted on a different address by configuring the environment variable during the Docker build phase. You can also set the external server connection URL from the web UI after the build.
- 🔒 **Backend Reverse Proxy Support**: Strengthen security by enabling direct communication between the Ollama Web UI backend and Ollama, eliminating the need to expose Ollama over LAN.
- 🌟 **Continuous Updates**: We are committed to improving Ollama Web UI with regular updates and new features.

## How to Install 🚀

### Installing Both Ollama and Ollama Web UI Using Docker Compose

If you don't have Ollama installed yet, you can use the provided Docker Compose file for a hassle-free installation. Simply run the following command:

```bash
docker compose up -d --build
```

This command will install both Ollama and Ollama Web UI on your system. If needed, modify the `compose.yaml` file to enable GPU support or to expose the Ollama API outside the container stack.

### Installing Ollama Web UI Only

#### Prerequisites

Make sure you have the latest version of Ollama installed before proceeding with the installation. You can find the latest version of Ollama at [https://ollama.ai/](https://ollama.ai/).

##### Checking Ollama

After installing Ollama, verify that it is running by opening [http://127.0.0.1:11434/](http://127.0.0.1:11434/) in your web browser. Note that the port number may differ based on your system configuration.
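If you prefer the command line, a quick `curl` against the same address works too. This is a minimal sketch; recent Ollama builds typically answer the root endpoint with a short plain-text status message, but the exact wording may vary by version:

```bash
# Ask the Ollama server for its root status page;
# a healthy server typically replies with "Ollama is running".
curl http://127.0.0.1:11434/
```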
#### Using Docker 🐳

If Ollama is hosted on your local machine and accessible at [http://127.0.0.1:11434/](http://127.0.0.1:11434/), run the following command:

```bash
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```

Alternatively, if you prefer to build the container yourself, use the following commands:

```bash
docker build -t ollama-webui .
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway --name ollama-webui --restart always ollama-webui
```

Your Ollama Web UI should now be hosted at [http://localhost:3000](http://localhost:3000) and accessible over your LAN (or network). Enjoy! 😄

#### Accessing External Ollama on a Different Server

Change the `OLLAMA_API_BASE_URL` environment variable to match the external Ollama server URL:

```bash
docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api --name ollama-webui --restart always ghcr.io/ollama-webui/ollama-webui:main
```

Alternatively, if you prefer to build the container yourself, use the following commands:

```bash
docker build -t ollama-webui .
docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api --name ollama-webui --restart always ollama-webui
```
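Because the container is started with a fixed `--name`, changing `OLLAMA_API_BASE_URL` later (or pulling a newer image) means recreating the container. Below is a minimal sketch of that flow, assuming the container was started with the commands above and using the same placeholder URL:

```bash
# Stop and remove the existing container, fetch the latest image,
# then start a fresh instance pointing at the (new) Ollama server URL.
docker stop ollama-webui && docker rm ollama-webui
docker pull ghcr.io/ollama-webui/ollama-webui:main
docker run -d -p 3000:8080 \
  -e OLLAMA_API_BASE_URL=https://example.com/api \
  --name ollama-webui --restart always \
  ghcr.io/ollama-webui/ollama-webui:main
```

The same pattern applies to the local-Ollama variant; just keep the `--add-host=host.docker.internal:host-gateway` flag in place of the `-e` variable.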
## How to Build for Static Deployment

1. Clone and enter the project:

   ```sh
   git clone https://github.com/ollama-webui/ollama-webui.git
   pushd ./ollama-webui/
   ```

2. Create and edit `.env`:

   ```sh
   cp -RPp example.env .env
   ```

3. Install node dependencies:

   ```sh
   npm i
   ```

4. Run in dev mode, or build the site for deployment:

   - Test in dev mode:

     ```sh
     npm run dev
     ```

   - Build for deployment:

     ```sh
     # `PUBLIC_API_BASE_URL` will overwrite the value in `.env`
     PUBLIC_API_BASE_URL='https://example.com/api' npm run build
     ```

5. Test the build with `caddy` (or the server of your choice):

   ```sh
   curl https://webi.sh/caddy | sh
   PUBLIC_API_BASE_URL='https://localhost/api' npm run build
   caddy run --envfile .env --config ./Caddyfile.localhost
   ```

## Troubleshooting

See [TROUBLESHOOTING.md](/TROUBLESHOOTING.md) for information on how to troubleshoot, and/or join our [Ollama Web UI Discord community](https://discord.gg/5rJgQTnV4s).

## What's Next? 🚀

### To-Do List 📝

Here are some exciting tasks on our to-do list:

- 🔐 **Access Control**: Securely manage requests to Ollama by utilizing the backend as a reverse proxy gateway, ensuring only authenticated users can send specific requests.
- 🧪 **Research-Centric Features**: Empower researchers in the fields of LLM and HCI with a comprehensive web UI for conducting user studies. Stay tuned for ongoing feature enhancements (e.g., surveys, analytics, and participant tracking) to facilitate their research.
- 📈 **User Study Tools**: Provide specialized tools, like heat maps and behavior tracking modules, to empower researchers in capturing and analyzing user behavior patterns with precision and accuracy.
- 📚 **Enhanced Documentation**: Elevate your setup and customization experience with improved, comprehensive documentation.

Feel free to contribute and help us make Ollama Web UI even better! 🙌

## Supporters ✨

A big shoutout to our amazing supporters who are helping to make this project possible! 🙏

### Platinum Sponsors 🤍

- [Prof. Lawrence Kim @ SFU](https://www.lhkim.com/)

## License 📜

This project is licensed under the [MIT License](LICENSE) - see the [LICENSE](LICENSE) file for details. 📄

## Support 💬

If you have any questions, suggestions, or need assistance, please open an issue or join our [Ollama Web UI Discord community](https://discord.gg/5rJgQTnV4s) or [Ollama Discord community](https://discord.gg/ollama) to connect with us! 🤝

---

Created by [Timothy J. Baek](https://github.com/tjbck) - Let's make Ollama Web UI even more amazing together! 💪