Add documentation around RamaLama

I've had a lot of people asking me how to do this, so I'm just
documenting it.

Signed-off-by: Eric Curtin <ecurtin@redhat.com>

@@ -147,6 +147,9 @@ After installing, visit:
You are now ready to start using Open WebUI!
## Using Open WebUI with RamaLama
If you're using Open WebUI with RamaLama, be sure to check out our [Starting with RamaLama Guide](/getting-started/quick-start/starting-with-ramalama) to learn how to manage your RamaLama instances with Open WebUI.
## Using Open WebUI with Ollama
If you're using Open WebUI with Ollama, be sure to check out our [Starting with Ollama Guide](/getting-started/quick-start/starting-with-ollama) to learn how to manage your Ollama instances with Open WebUI.

@@ -0,0 +1,37 @@
---
sidebar_position: 1
title: "🦙 Starting With RamaLama"
---
## Overview
Open WebUI makes it easy to connect and manage your **RamaLama** instance. This guide will walk you through setting up the connection, managing models, and getting started.
---
## Step 1: Starting a RamaLama server
Run RamaLama with a model of your choice, for example:
```bash
ramalama serve granite3-moe
```
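By default, `ramalama serve` listens on port 8080, which is the port the Open WebUI command in the next step assumes. If that port is already in use, you can pick another one (a sketch, assuming your RamaLama version supports the `--port` flag; adjust the URL in Step 2 to match):
```bash
# Serve the model on port 9090 instead of the default 8080
ramalama serve --port 9090 granite3-moe
```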
---
## Step 2: Starting an Open WebUI server
In another terminal, start Open WebUI:
```bash
podman run -it --rm \
  --network slirp4netns:allow_host_loopback=true \
  -e OPENAI_API_BASE_URL=http://host.containers.internal:8080 \
  -p 3000:8080 \
  -v open-webui:/app/backend/data:Z \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```
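The `--network slirp4netns:allow_host_loopback=true` option is what lets the container reach the RamaLama server running on the host via `host.containers.internal`. If you run Docker rather than Podman, a rough equivalent (a sketch, assuming Docker's `host-gateway` mapping is available, as it is on recent releases) would be:
```bash
# Docker's host-gateway mapping plays the role of Podman's allow_host_loopback
docker run -it --rm \
  --add-host=host.docker.internal:host-gateway \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:8080 \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```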
---
## Step 3: Connect via your web browser
In a browser, go to http://127.0.0.1:3000
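If the page loads but no models show up, you can sanity-check that the RamaLama server is reachable from the host (a quick check, assuming the default port of 8080 and an OpenAI-compatible API, which is what `OPENAI_API_BASE_URL` above relies on):
```bash
# Should return a JSON list containing the model you served in Step 1
curl http://127.0.0.1:8080/v1/models
```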
Have fun!