diff --git a/docs/getting-started/quick-start/tab-docker/Podman.md b/docs/getting-started/quick-start/tab-docker/Podman.md
index 9cf11fd..8b3738b 100644
--- a/docs/getting-started/quick-start/tab-docker/Podman.md
+++ b/docs/getting-started/quick-start/tab-docker/Podman.md
@@ -8,7 +8,7 @@ Podman is a daemonless container engine for developing, managing, and running OC
 - **Run a Container:**
 
   ```bash
-  podman run -d --name openwebui -p 3000:8080 ghcr.io/open-webui/open-webui:main
+  podman run -d --name openwebui -p 3000:8080 -v open-webui:/app/backend/data ghcr.io/open-webui/open-webui:main
   ```
 
 - **List Running Containers:**
@@ -19,10 +19,17 @@ Podman is a daemonless container engine for developing, managing, and running OC
 
 ## Networking with Podman
 
-If networking issues arise, you may need to adjust your network settings:
+If networking issues arise, use slirp4netns to adjust the container's network settings so that it can reach ports on your host machine.
+
+Ensure you have [slirp4netns installed](https://github.com/rootless-containers/slirp4netns?tab=readme-ov-file#install), remove the previous container (if one exists) with `podman rm openwebui`, and start a new container with:
 
 ```bash
---network=slirp4netns:allow_host_loopback=true
+podman run -d --network=slirp4netns:allow_host_loopback=true --name openwebui -p 3000:8080 -v open-webui:/app/backend/data ghcr.io/open-webui/open-webui:main
 ```
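+
+After starting the container, you can confirm it is running and that the UI answers on the published port (a quick check, assuming the default `3000:8080` mapping):
+
+```bash
+podman ps --filter name=openwebui
+curl -I http://localhost:3000
+```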
 
+If you are running Ollama directly on your computer (not inside a container), open the Open WebUI interface, navigate to Settings > Admin Settings > Connections, and create a new Ollama API connection to `http://10.0.2.2:[OLLAMA PORT]`. By default, Ollama listens on port 11434.
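+
+With `allow_host_loopback=true`, the address `10.0.2.2` inside the container points to the host's loopback interface. Before configuring the connection, you can confirm Ollama is listening on the host (assuming the default port):
+
+```bash
+curl http://localhost:11434
+```
+
+A running Ollama instance responds with "Ollama is running".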
+
 Refer to the Podman [documentation](https://podman.io/) for advanced configurations.