Ollama + Open WebUI docker compose

This setup should work right out of the box.
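
For reference, a minimal bring-up sketch, assuming the compose file below is saved as docker-compose.yml in the current directory:

```sh
# Start both containers in the background
docker compose up -d

# Open WebUI is then reachable on the host at http://localhost:3000
# (host port 3000 maps to the container's 8080; see ports: below)

# Follow the logs if anything fails to start
docker compose logs -f
```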

Using Ollama on another machine

If Ollama is being hosted on a separate machine:

  1. Comment out the entire ollama service.
  2. Comment out the extra_hosts: section of the open-webui service.
  3. Update the OLLAMA_BASE_URL environment variable to point to the Ollama service on the other machine (see the sketch after this list).
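
A sketch of how the compose file might look after those changes, using 192.168.1.50 as a placeholder for the other machine's address:

```yaml
services:
  # ollama service removed; it runs on the other machine
  open-webui:
    ports:
      - 3000:8080
    # extra_hosts: not needed when Ollama is reached over the network
    restart: unless-stopped
    image: ghcr.io/open-webui/open-webui:main
    volumes:
      - ./open-webui:/app/backend/data
    environment:
      - 'OLLAMA_BASE_URL=http://192.168.1.50:11434'  # placeholder address
      - 'WEBUI_SECRET_KEY='
```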

Additionally, on the machine hosting Ollama, see the FAQ page on allowing network connections: Ollama defaults to accepting connections from localhost only. You'll want to set the OLLAMA_HOST environment variable to 0.0.0.0 on that machine, not in this docker compose.
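
One way to do this, assuming a Linux host where the standard installer set Ollama up as a systemd service:

```sh
# On the machine running Ollama (not in this compose file).
# Add an environment override to the systemd unit:
sudo systemctl edit ollama.service
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl daemon-reload
sudo systemctl restart ollama
```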

```yaml
services:
  ollama:
    volumes:
      - ./ollama:/root/.ollama
    # Uncomment to expose the Ollama API on the host as well:
    #ports:
    #  - 11434:11434
    image: ollama/ollama
    restart: unless-stopped
  open-webui:
    ports:
      - 3000:8080
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped
    image: ghcr.io/open-webui/open-webui:main
    volumes:
      - ./open-webui:/app/backend/data
    environment:
      - 'OLLAMA_BASE_URL=http://ollama:11434'
      - 'WEBUI_SECRET_KEY='
```