From 87eb53b4cd01cb9c29f5f26d9ddfbbf8d09916a3 Mon Sep 17 00:00:00 2001 From: Luciano Tonet Date: Wed, 28 Feb 2024 02:24:59 -0300 Subject: [PATCH 01/11] Update index.md --- docs/getting-started/index.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/getting-started/index.md b/docs/getting-started/index.md index 9db92c2..e71afc8 100644 --- a/docs/getting-started/index.md +++ b/docs/getting-started/index.md @@ -170,7 +170,7 @@ git clone https://github.com/open-webui/open-webui.git cd open-webui/ # Copying required .env file -cp -RPp example.env .env +cp -RPp .env.example .env # Building Frontend Using Node npm i From b4ca364b2ee582bb4bdbe12971a24bc7ad69236b Mon Sep 17 00:00:00 2001 From: lainedfles <126992880+lainedfles@users.noreply.github.com> Date: Wed, 28 Feb 2024 22:59:54 +0000 Subject: [PATCH 02/11] Update intro.md to add rootless (Podman) local-only & auto-update Add instructions for configuration of rootless (Podman) local-only Open WebUI with Systemd service and auto-update capability. --- docs/intro.md | 47 +++++++++++++++++++++++++++++++++++++++++++++++ 1 file changed, 47 insertions(+) diff --git a/docs/intro.md b/docs/intro.md index 23acc6f..12afe10 100644 --- a/docs/intro.md +++ b/docs/intro.md @@ -75,6 +75,53 @@ Don't forget to explore our sibling project, [Open WebUI Community](https://open docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main ``` +## Installing with Podman +
+<summary>Rootless (Podman) local-only Open WebUI with Systemd service and auto-update</summary>
+
+- **Important:** Consult the Docker documentation because much of the configuration and syntax is interchangeable with [Podman](https://github.com/containers/podman). See also [rootless_tutorial](https://github.com/containers/podman/blob/main/docs/tutorials/rootless_tutorial.md). This example requires the [slirp4netns](https://github.com/rootless-containers/slirp4netns) network backend so that the server listens, and communicates with Ollama, over localhost only.
+
+1. Pull the latest image:
+   ```bash
+   podman pull ghcr.io/open-webui/open-webui:main
+   ```
+2. Create a new container using the desired configuration:
+
+   **Note:** `-p 127.0.0.1:3000:8080` ensures that we listen only on localhost, while `--network slirp4netns:allow_host_loopback=true` permits the container to access Ollama when it also listens strictly on localhost. `--add-host=ollama.local:10.0.2.2 --env 'OLLAMA_API_BASE_URL=http://ollama.local:11434/api'` adds a hosts record to the container and configures Open WebUI to use the friendly hostname. `10.0.2.2` is the default slirp4netns address used for localhost mapping. `--env 'ANONYMIZED_TELEMETRY=False'` isn't necessary since Chroma telemetry has been disabled in the code, but it is included as an example.
+   ```bash
+   podman create -p 127.0.0.1:3000:8080 --network slirp4netns:allow_host_loopback=true --add-host=ollama.local:10.0.2.2 --env 'OLLAMA_API_BASE_URL=http://ollama.local:11434/api' --env 'ANONYMIZED_TELEMETRY=False' -v open-webui:/app/backend/data --label io.containers.autoupdate=registry --name open-webui ghcr.io/open-webui/open-webui:main
+   ```
+3. Prepare for systemd user service:
+   ```bash
+   mkdir -p ~/.config/systemd/user/
+   ```
+4. Generate user service with Podman:
+   ```bash
+   podman generate systemd --new open-webui > ~/.config/systemd/user/open-webui.service
+   ```
+5. Reload systemd configuration:
+   ```bash
+   systemctl --user daemon-reload
+   ```
+6. Enable and validate new service:
+   ```bash
+   systemctl --user enable open-webui.service
+   systemctl --user start open-webui.service
+   systemctl --user status open-webui.service
+   ```
+7. Enable and validate Podman auto-update:
+   ```bash
+   systemctl --user enable podman-auto-update.timer
+   systemctl --user enable podman-auto-update.service
+   systemctl --user status podman-auto-update.timer
+   ```
+   Perform a dry run with the following command (omit `--dry-run` to force an update):
+   ```bash
+   podman auto-update --dry-run
+   ```
+
+</details>
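+
+Optionally, once the service is running, you can sanity-check the rootless setup from the host. This check is not part of the original instructions; it assumes `curl` is available on the host and reuses the service name and port published by the commands above:
+
+```bash
+# The user service should report "active", and the WebUI should answer on the loopback address only
+systemctl --user is-active open-webui.service
+curl -I http://127.0.0.1:3000
+```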
+ ## Troubleshooting If you're facing various issues like "Open WebUI: Server Connection Error", see [TROUBLESHOOTING](/getting-started/troubleshooting) for information on how to troubleshoot and/or join our [Open WebUI Discord community](https://discord.gg/5rJgQTnV4s). From 8e013387e587eeb1719962f950fdc11d769546b1 Mon Sep 17 00:00:00 2001 From: "Timothy J. Baek" Date: Wed, 28 Feb 2024 15:19:47 -0800 Subject: [PATCH 03/11] Update index.md --- docs/getting-started/index.md | 50 +++++++++++++++++++++++++++++++++++ 1 file changed, 50 insertions(+) diff --git a/docs/getting-started/index.md b/docs/getting-started/index.md index e71afc8..177644e 100644 --- a/docs/getting-started/index.md +++ b/docs/getting-started/index.md @@ -130,6 +130,56 @@ title: "πŸš€ Getting Started" ./run-compose.sh --enable-gpu --build ``` +## Installing with Podman + +
+<summary>Rootless (Podman) local-only Open WebUI with Systemd service and auto-update</summary>
+
+- **Important:** Consult the Docker documentation because much of the configuration and syntax is interchangeable with [Podman](https://github.com/containers/podman). See also [rootless_tutorial](https://github.com/containers/podman/blob/main/docs/tutorials/rootless_tutorial.md). This example requires the [slirp4netns](https://github.com/rootless-containers/slirp4netns) network backend so that the server listens, and communicates with Ollama, over localhost only.
+
+1. Pull the latest image:
+   ```bash
+   podman pull ghcr.io/open-webui/open-webui:main
+   ```
+2. Create a new container using the desired configuration:
+
+   **Note:** `-p 127.0.0.1:3000:8080` ensures that we listen only on localhost, while `--network slirp4netns:allow_host_loopback=true` permits the container to access Ollama when it also listens strictly on localhost. `--add-host=ollama.local:10.0.2.2 --env 'OLLAMA_API_BASE_URL=http://ollama.local:11434/api'` adds a hosts record to the container and configures Open WebUI to use the friendly hostname. `10.0.2.2` is the default slirp4netns address used for localhost mapping. `--env 'ANONYMIZED_TELEMETRY=False'` isn't necessary since Chroma telemetry has been disabled in the code, but it is included as an example.
+
+   ```bash
+   podman create -p 127.0.0.1:3000:8080 --network slirp4netns:allow_host_loopback=true --add-host=ollama.local:10.0.2.2 --env 'OLLAMA_API_BASE_URL=http://ollama.local:11434/api' --env 'ANONYMIZED_TELEMETRY=False' -v open-webui:/app/backend/data --label io.containers.autoupdate=registry --name open-webui ghcr.io/open-webui/open-webui:main
+   ```
+
+3. Prepare for systemd user service:
+   ```bash
+   mkdir -p ~/.config/systemd/user/
+   ```
+4. Generate user service with Podman:
+   ```bash
+   podman generate systemd --new open-webui > ~/.config/systemd/user/open-webui.service
+   ```
+5. Reload systemd configuration:
+   ```bash
+   systemctl --user daemon-reload
+   ```
+6. Enable and validate new service:
+   ```bash
+   systemctl --user enable open-webui.service
+   systemctl --user start open-webui.service
+   systemctl --user status open-webui.service
+   ```
+7. Enable and validate Podman auto-update:
+   ```bash
+   systemctl --user enable podman-auto-update.timer
+   systemctl --user enable podman-auto-update.service
+   systemctl --user status podman-auto-update.timer
+   ```
+   Perform a dry run with the following command (omit `--dry-run` to force an update):
+   ```bash
+   podman auto-update --dry-run
+   ```
+
+</details>
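+
+If the WebUI does not come up, the container logs and the generated unit file are the first places to look. These commands are an optional troubleshooting aid rather than part of the original steps, and they assume the container keeps the `open-webui` name used above:
+
+```bash
+# Show recent container output and the unit produced by `podman generate systemd`
+podman logs open-webui
+systemctl --user cat open-webui.service
+```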
+ ### Alternative Installation Methods For other ways to install, like using Kustomize or Helm, check out [INSTALLATION](/getting-started/installation). Join our [Open WebUI Discord community](https://discord.gg/5rJgQTnV4s) for more help and information. From 1c01f5c5de185233a75d96b870477335cc3f9cb1 Mon Sep 17 00:00:00 2001 From: "Timothy J. Baek" Date: Wed, 28 Feb 2024 15:20:09 -0800 Subject: [PATCH 04/11] Update intro.md --- docs/intro.md | 47 ----------------------------------------------- 1 file changed, 47 deletions(-) diff --git a/docs/intro.md b/docs/intro.md index 12afe10..23acc6f 100644 --- a/docs/intro.md +++ b/docs/intro.md @@ -75,53 +75,6 @@ Don't forget to explore our sibling project, [Open WebUI Community](https://open docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main ``` -## Installing with Podman -
-Rootless (Podman) local-only Open WebUI with Systemd service and auto-update - -- **Important:** Consult the Docker documentation because much of the configuration and syntax is interchangeable with [Podman](https://github.com/containers/podman). See also [rootless_tutorial](https://github.com/containers/podman/blob/main/docs/tutorials/rootless_tutorial.md). This example requires the [slirp4netns](https://github.com/rootless-containers/slirp4netns) network backend to facilitate server listen and Ollama communication over localhost only. - -1. Pull the latest image: - ```bash - podman pull ghcr.io/open-webui/open-webui:main - ``` -2. Create a new container using desired configuration: - - **Note:** `-p 127.0.0.1:3000:8080` ensures that we listen only on localhost, `--network slirp4netns:allow_host_loopback=true` permits the container to access Ollama when it also listens strictly on localhost. `--add-host=ollama.local:10.0.2.2 --env 'OLLAMA_API_BASE_URL=http://ollama.local:11434/api'` adds a hosts record to the container and configures open-webui to use the friendly hostname. `10.0.2.2` is the default slirp4netns address used for localhost mapping. `--env 'ANONYMIZED_TELEMETRY=False'` isn't necessary since Chroma telemetry has been disabled in the code but is included as an example. - ```bash - podman create -p 127.0.0.1:3000:8080 --network slirp4netns:allow_host_loopback=true --add-host=ollama.local:10.0.2.2 --env 'OLLAMA_API_BASE_URL=http://ollama.local:11434/api' --env 'ANONYMIZED_TELEMETRY=False' -v open-webui:/app/backend/data --label io.containers.autoupdate=registry --name open-webui ghcr.io/open-webui/open-webui:main - ``` -4. Prepare for systemd user service: - ```bash - mkdir -p ~/.config/systemd/user/ - ``` -5. Generate user service with Podman: - ```bash - podman generate systemd --new open-webui > ~/.config/systemd/user/open-webui.service - ``` -6. Reload systemd configuration: - ```bash - systemctl --user daemon-reload - ``` -7. Enable and validate new service: - ```bash - systemctl --user enable open-webui.service - systemctl --user start open-webui.service - systemctl --user status open-webui.service - ``` -8. Enable and validate Podman auto-update: - ```bash - systemctl --user enable podman-auto-update.timer - systemctl --user enable podman-auto-update.service - systemctl --user status podman-auto-update.timer - ``` - Dry run with the following command (omit `--dry-run` to force an update): - ```bash - podman auto-update --dry-run - ``` - -
- ## Troubleshooting If you're facing various issues like "Open WebUI: Server Connection Error", see [TROUBLESHOOTING](/getting-started/troubleshooting) for information on how to troubleshoot and/or join our [Open WebUI Discord community](https://discord.gg/5rJgQTnV4s). From 4ebf66d69a4d1f196e3fddfb3f5040f2cfbc30ce Mon Sep 17 00:00:00 2001 From: Ismael Date: Wed, 28 Feb 2024 23:30:55 -0400 Subject: [PATCH 05/11] Update migration.md to fix a typo on the word "volume" The word "volume" was spelled as "volumen" in part of the comments. --- docs/migration.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/migration.md b/docs/migration.md index 98b1087..9b27f9d 100644 --- a/docs/migration.md +++ b/docs/migration.md @@ -39,7 +39,7 @@ docker run --rm -v ollama-webui:/from -v open-webui:/to alpine ash -c "cd /from [insert the equivalent command that you used to install with the new Docker image name] ``` -Once you verify that all the data has been migrated you can erase the old volumen using the following command: +Once you verify that all the data has been migrated you can erase the old volume using the following command: ```bash docker volume rm ollama-webui From cba88f7b6bf7faca7e2bd736b03bf144293d2dde Mon Sep 17 00:00:00 2001 From: "Timothy J. Baek" Date: Thu, 29 Feb 2024 17:20:30 -0800 Subject: [PATCH 06/11] doc: update --- docs/getting-started/index.md | 110 +++++++++++++++++----------------- docs/intro.md | 22 +++++-- 2 files changed, 73 insertions(+), 59 deletions(-) diff --git a/docs/getting-started/index.md b/docs/getting-started/index.md index 177644e..72f72e1 100644 --- a/docs/getting-started/index.md +++ b/docs/getting-started/index.md @@ -15,31 +15,31 @@ title: "πŸš€ Getting Started" ::: -## Before You Begin - +
+Before You Begin 1. **Installing Docker:** - - **For Windows and Mac Users:** +- **For Windows and Mac Users:** - - Download Docker Desktop from [Docker's official website](https://www.docker.com/products/docker-desktop). - - Follow the installation instructions provided on the website. After installation, open Docker Desktop to ensure it's running properly. + - Download Docker Desktop from [Docker's official website](https://www.docker.com/products/docker-desktop). + - Follow the installation instructions provided on the website. After installation, open Docker Desktop to ensure it's running properly. - - **For Ubuntu and Other Linux Users:** - - Open your terminal. - - Set up your Docker apt repository according to the [Docker documentation](https://docs.docker.com/engine/install/ubuntu/#install-using-the-repository) - - Update your package index: - ```bash - sudo apt-get update - ``` - - Install Docker using the following command: - ```bash - sudo apt-get install docker-ce docker-ce-cli containerd.io - ``` - - Verify the Docker installation with: - ```bash - sudo docker run hello-world - ``` - This command downloads a test image and runs it in a container, which prints an informational message. +- **For Ubuntu and Other Linux Users:** + - Open your terminal. + - Set up your Docker apt repository according to the [Docker documentation](https://docs.docker.com/engine/install/ubuntu/#install-using-the-repository) + - Update your package index: + ```bash + sudo apt-get update + ``` + - Install Docker using the following command: + ```bash + sudo apt-get install docker-ce docker-ce-cli containerd.io + ``` + - Verify the Docker installation with: + ```bash + sudo docker run hello-world + ``` + This command downloads a test image and runs it in a container, which prints an informational message. 2. **Ensure You Have the Latest Version of Ollama:** @@ -48,41 +48,9 @@ title: "πŸš€ Getting Started" 3. **Verify Ollama Installation:** - After installing Ollama, check if it's working by visiting [http://127.0.0.1:11434/](http://127.0.0.1:11434/) in your web browser. Remember, the port number might be different for you. -## Installing with Docker 🐳 +
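+
+As an optional alternative to the browser check, you can query Ollama from a terminal. This extra step is not part of the original guide and assumes `curl` is installed; adjust the port if yours differs:
+
+```bash
+# A reachable Ollama instance replies with a short "Ollama is running" message
+curl http://127.0.0.1:11434/
+```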
-- **Important:** When using Docker to install Open WebUI, make sure to include the `-v open-webui:/app/backend/data` in your Docker command. This step is crucial as it ensures your database is properly mounted and prevents any loss of data. - -- **If Ollama is on your computer**, use this command: - - ```bash - docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main - ``` - -- **To build the container yourself**, follow these steps: - - ```bash - docker build -t open-webui . - docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always open-webui - ``` - -- After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). - -### Using Ollama on a Different Server - -- To connect to Ollama on another server, change the `OLLAMA_API_BASE_URL` to the server's URL: - - ```bash - docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main - ``` - - Or for a self-built container: - - ```bash - docker build -t open-webui . - docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always open-webui - ``` - -### Installing Ollama and Open WebUI Together +## One-line Command to Install Ollama and Open WebUI Together #### Using Docker Compose @@ -130,6 +98,38 @@ title: "πŸš€ Getting Started" ./run-compose.sh --enable-gpu --build ``` +## Quick Start with Docker 🐳 + +:::info +When using Docker to install Open WebUI, make sure to include the `-v open-webui:/app/backend/data` in your Docker command. This step is crucial as it ensures your database is properly mounted and prevents any loss of data. +::: + +- **If Ollama is on your computer**, use this command: + + ```bash + docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main + ``` + +- **If Ollama is on a Different Server**, use this command: + +- To connect to Ollama on another server, change the `OLLAMA_API_BASE_URL` to the server's URL: + + ```bash + docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main + ``` + +- After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! πŸ˜„ + +#### Open WebUI: Server Connection Error + +If you're experiencing connection issues, it’s often due to the WebUI docker container not being able to reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) inside the container . Use the `--network=host` flag in your docker command to resolve this. Note that the port changes from 3000 to 8080, resulting in the link: `http://localhost:8080`. + +**Example Docker Command**: + +```bash +docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main +``` + ## Installing with Podman
diff --git a/docs/intro.md b/docs/intro.md index 23acc6f..9ba572d 100644 --- a/docs/intro.md +++ b/docs/intro.md @@ -57,7 +57,11 @@ hide_title: true Don't forget to explore our sibling project, [Open WebUI Community](https://openwebui.com/), where you can discover, download, and explore customized Modelfiles. Open WebUI Community offers a wide range of exciting possibilities for enhancing your chat interactions with Ollama! πŸš€ -## Installing with Docker 🐳 +### Quick Start with Docker 🐳 + +:::info +When using Docker to install Open WebUI, make sure to include the `-v open-webui:/app/backend/data` in your Docker command. This step is crucial as it ensures your database is properly mounted and prevents any loss of data. +::: - **If Ollama is on your computer**, use this command: @@ -65,9 +69,7 @@ Don't forget to explore our sibling project, [Open WebUI Community](https://open docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main ``` -- After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). - -#### Using Ollama on a Different Server +- **If Ollama is on a Different Server**, use this command: - To connect to Ollama on another server, change the `OLLAMA_API_BASE_URL` to the server's URL: @@ -75,6 +77,18 @@ Don't forget to explore our sibling project, [Open WebUI Community](https://open docker run -d -p 3000:8080 -e OLLAMA_API_BASE_URL=https://example.com/api -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main ``` +- After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! πŸ˜„ + +#### Open WebUI: Server Connection Error + +If you're experiencing connection issues, it’s often due to the WebUI docker container not being able to reach the Ollama server at 127.0.0.1:11434 (host.docker.internal:11434) inside the container . Use the `--network=host` flag in your docker command to resolve this. Note that the port changes from 3000 to 8080, resulting in the link: `http://localhost:8080`. + +**Example Docker Command**: + +```bash +docker run -d --network=host -v open-webui:/app/backend/data -e OLLAMA_API_BASE_URL=http://127.0.0.1:11434/api --name open-webui --restart always ghcr.io/open-webui/open-webui:main +``` + ## Troubleshooting If you're facing various issues like "Open WebUI: Server Connection Error", see [TROUBLESHOOTING](/getting-started/troubleshooting) for information on how to troubleshoot and/or join our [Open WebUI Discord community](https://discord.gg/5rJgQTnV4s). From 3e8587f43755790402fb922d0f914c7fb57b49b9 Mon Sep 17 00:00:00 2001 From: "Timothy J. Baek" Date: Fri, 1 Mar 2024 20:41:03 -0800 Subject: [PATCH 07/11] doc: research --- docs/research.md | 28 ++++++++++++++++++++++++++++ docs/roadmap.md | 9 +++++++-- 2 files changed, 35 insertions(+), 2 deletions(-) create mode 100644 docs/research.md diff --git a/docs/research.md b/docs/research.md new file mode 100644 index 0000000..39a5917 --- /dev/null +++ b/docs/research.md @@ -0,0 +1,28 @@ +--- +sidebar_position: 7 +title: "πŸ§‘β€πŸ”¬ Research Tools" +--- + +# πŸ§‘β€πŸ”¬ Research Tools + +## Interested in Research Collaboration? + +πŸ” **Do you want to collaborate?** We're eager to explore partnership opportunities! 
In addition to our ongoing efforts in maintaining our Open WebUI repository, we are enthusiastic about building a custom pipeline with a tailored UI specifically designed to meet your research requirements. + +πŸ§ͺ **Research-Centric Features:** + +- Our goal is to provide a comprehensive web UI catering to the demands of conducting user studies, especially in the dynamic fields of AI and HCI. +- Features include surveys, analytics, and participant tracking, all meticulously crafted to streamline your research processes. + +πŸ“ˆ **User Study Tools:** + +- We empower researchers with precision and accuracy by offering specialized tools such as heat maps and behavior tracking modules. +- These tools are indispensable in capturing and analyzing intricate user behavior patterns. + +Moreover, we are committed to supporting survey and analytics features. Our approach involves building custom pipelines, complete with an intuitive UI, tailored to the unique requirements of each project. We understand that one size does not fit all when it comes to research, and thus, our solutions are custom-made and exclusive to each case. + +Please note that for custom projects, we adhere to our regular rate as listed on the sponsorship page. Additionally, we kindly request being listed as one of the co-authors. This ensures that our collaboration yields the best possible outcome, leveraging my expertise as a core maintainer to deliver high-quality results. We provide continuous support throughout the project lifecycle to ensure smooth integration and satisfaction with the final deliverables. + +However, we are open to exploring collaborative opportunities free of charge under certain circumstances, such as projects with significant potential for mutual benefit or those aligned with our research interests. So, don't hesitate to reach out if you're interested in collaborating! + +**Contact:** [jaeryang_baek[at]sfu[dot]com](mailto:jaeryang_baek@sfu.com) diff --git a/docs/roadmap.md b/docs/roadmap.md index 1d1a197..c6170cb 100644 --- a/docs/roadmap.md +++ b/docs/roadmap.md @@ -13,8 +13,13 @@ Here are some exciting tasks on our roadmap: - βš™οΈ **Custom Python Backend Actions**: Empower your Open WebUI by creating or downloading custom Python backend actions. Unleash the full potential of your web interface with tailored actions that suit your specific needs, enhancing functionality and versatility. - πŸ”§ **Fine-tune Model (LoRA)**: Fine-tune your model directly from the user interface. This feature allows for precise customization and optimization of the chat experience to better suit your needs and preferences. - 🧠 **Long-Term Memory**: Witness the power of persistent memory in our agents. Enjoy conversations that feel continuous as agents remember and reference past interactions, creating a more cohesive and personalized user experience. -- πŸ§ͺ **Research-Centric Features**: Empower researchers in the fields of LLM and HCI with a comprehensive web UI for conducting user studies. Stay tuned for ongoing feature enhancements (e.g., surveys, analytics, and participant tracking) to facilitate their research. -- πŸ“ˆ **User Study Tools**: Providing specialized tools, like heat maps and behavior tracking modules, to empower researchers in capturing and analyzing user behavior patterns with precision and accuracy. - πŸ“š **Enhanced Documentation**: Elevate your setup and customization experience with improved, comprehensive documentation. 
+### πŸ§‘β€πŸ”¬ Research Tools + +- πŸ§ͺ **Research-Centric Features**: Empower researchers in the fields of LLM and HCI with a comprehensive web UI for conducting user studies. Stay tuned for ongoing feature enhancements (e.g., surveys, analytics, and participant tracking) to facilitate their research. +- πŸ“ˆ **User Study Tools**: Providing specialized tools, like heat maps and behavior tracking modules, to empower researchers in capturing and analyzing user behavior patterns with precision and accuracy. + +Read more about our [research offerings](/research) + Feel free to contribute and help us make Open WebUI even better! πŸ™Œ From 46e1f71d4c2794238da9830d072f310f02485d6d Mon Sep 17 00:00:00 2001 From: Timothy Jaeryang Baek Date: Sat, 2 Mar 2024 02:38:24 -0500 Subject: [PATCH 08/11] Update research.md --- docs/research.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/research.md b/docs/research.md index 39a5917..d192d8c 100644 --- a/docs/research.md +++ b/docs/research.md @@ -25,4 +25,4 @@ Please note that for custom projects, we adhere to our regular rate as listed on However, we are open to exploring collaborative opportunities free of charge under certain circumstances, such as projects with significant potential for mutual benefit or those aligned with our research interests. So, don't hesitate to reach out if you're interested in collaborating! -**Contact:** [jaeryang_baek[at]sfu[dot]com](mailto:jaeryang_baek@sfu.com) +**Contact:** [jaeryang_baek[at]sfu[dot]ca](mailto:jaeryang_baek@sfu.ca) From 53deaafb3ee8db0d3ebd9669c00533c23fad9abe Mon Sep 17 00:00:00 2001 From: "Timothy J. Baek" Date: Sat, 2 Mar 2024 19:37:32 -0800 Subject: [PATCH 09/11] Update index.md --- docs/tutorial-deployment/index.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/docs/tutorial-deployment/index.md b/docs/tutorial-deployment/index.md index eb8da99..ac26284 100644 --- a/docs/tutorial-deployment/index.md +++ b/docs/tutorial-deployment/index.md @@ -17,3 +17,5 @@ title: "☁️ Deployment" + + From 720c9ff77ce8886c663562844d3e7008ea062056 Mon Sep 17 00:00:00 2001 From: Timothy Jaeryang Baek Date: Sat, 2 Mar 2024 23:50:23 -0500 Subject: [PATCH 10/11] Update research.md --- docs/research.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/research.md b/docs/research.md index d192d8c..7a26500 100644 --- a/docs/research.md +++ b/docs/research.md @@ -5,9 +5,9 @@ title: "πŸ§‘β€πŸ”¬ Research Tools" # πŸ§‘β€πŸ”¬ Research Tools -## Interested in Research Collaboration? +## Interested in Using Open WebUI for Research? -πŸ” **Do you want to collaborate?** We're eager to explore partnership opportunities! In addition to our ongoing efforts in maintaining our Open WebUI repository, we are enthusiastic about building a custom pipeline with a tailored UI specifically designed to meet your research requirements. +πŸ” **Are you interested in leveraging Open WebUI for your research?** We're excited about the prospect of collaborating with you! Alongside our continuous work on maintaining the Open WebUI repository, we're keen on developing a customized pipeline featuring a tailored UI crafted specifically to fulfill your research needs. 
πŸ§ͺ **Research-Centric Features:** From f37ed1e89d09cb8c8122eae1329d39fe6f7d150a Mon Sep 17 00:00:00 2001 From: Timothy Jaeryang Baek Date: Sat, 2 Mar 2024 23:51:54 -0500 Subject: [PATCH 11/11] Update research.md --- docs/research.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/docs/research.md b/docs/research.md index 7a26500..830b6d3 100644 --- a/docs/research.md +++ b/docs/research.md @@ -1,9 +1,9 @@ --- sidebar_position: 7 -title: "πŸ§‘β€πŸ”¬ Research Tools" +title: "πŸ§‘β€πŸ”¬ Open WebUI for Research" --- -# πŸ§‘β€πŸ”¬ Research Tools +# πŸ§‘β€πŸ”¬ Open WebUI for Research ## Interested in Using Open WebUI for Research?