This setup should work right out of the box.
If Ollama is being hosted on a separate machine:
- Comment out the whole `ollama` service
- Comment out the `extra_hosts:` section of the `open-webui` service
- Update the `OLLAMA_BASE_URL` environment variable to point to the Ollama service on the other machine
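A minimal sketch of what those changes might look like in `docker-compose.yml` (the image tags and the `192.168.1.50` address are illustrative only; substitute your own service definitions and the actual IP or hostname of the machine running Ollama):

```yaml
services:
  # ollama:                 # <- whole service commented out
  #   image: ollama/ollama
  #   volumes:
  #     - ollama:/root/.ollama

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    # extra_hosts:          # <- commented out
    #   - host.docker.internal:host-gateway
    environment:
      # Point at the remote Ollama instance instead of a local container
      - OLLAMA_BASE_URL=http://192.168.1.50:11434
```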
Additionally, on the machine running Ollama, see the FAQ page on allowing network connections: Ollama defaults to accepting only localhost connections, so you'll want to set the `OLLAMA_HOST` environment variable to `0.0.0.0` on that machine - not in this docker compose.
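For example, on the remote machine this could look like the following (the systemd unit name assumes Ollama was installed via its official Linux installer; adjust if your setup differs):

```shell
# Run Ollama directly, listening on all interfaces:
OLLAMA_HOST=0.0.0.0 ollama serve

# Or, if Ollama runs as a systemd service, add the variable to its unit:
#   sudo systemctl edit ollama.service
# then in the editor add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
# and restart:
#   sudo systemctl restart ollama.service
```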