Lukasz Cepowski (cepa)

cepa / ollama_gpu_docker_vm.md
Last active January 8, 2026 03:49
# Setup Ollama with Open WebUI in Docker on Ubuntu Server VM with NVIDIA GPU on Proxmox

## Set up the VM on Proxmox

You can skip this section if you are installing directly on a physical machine. Create the VM with:

  • 16 vCPUs (or as many as your host can spare)
  • 32 GB RAM (or more; extra RAM keeps models cached and avoids heavy disk I/O)
  • 32 GB vda (or more, for `/`)
  • 256 GB vdb (or more, for `/srv`, which holds the Ollama and Docker data; you can easily resize it later)
  • UEFI (OVMF) firmware
  • Q35 machine type (required for PCIe passthrough)
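The specs above can be sketched as a single `qm create` invocation on the Proxmox host. The VM ID (`9000`), VM name, storage pool (`local-lvm`), bridge (`vmbr0`), and ISO path are assumptions; adjust them to your environment.

```shell
# Hypothetical example: create the VM from the Proxmox host shell.
# 9000, local-lvm, vmbr0, and the ISO filename are placeholders.
qm create 9000 \
  --name ollama-gpu \
  --cores 16 --memory 32768 \
  --machine q35 \
  --bios ovmf --efidisk0 local-lvm:1 \
  --scsihw virtio-scsi-pci \
  --virtio0 local-lvm:32 \
  --virtio1 local-lvm:256 \
  --net0 virtio,bridge=vmbr0 \
  --cdrom local:iso/ubuntu-24.04-live-server-amd64.iso
```

The two `--virtio` disks show up inside the guest as `vda` (root) and `vdb` (data for `/srv`), matching the layout above.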

## Install Ubuntu Server 24.04
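If the installer only sets up the root disk, the second disk still needs to be formatted and mounted at `/srv` before installing Docker and Ollama. A minimal sketch, assuming the data disk appears as `/dev/vdb` (verify with `lsblk` first):

```shell
# Confirm which device is the 256 GB data disk before formatting!
lsblk

# Format the data disk and mount it at /srv (DESTROYS existing data on vdb).
sudo mkfs.ext4 /dev/vdb
sudo mkdir -p /srv
echo '/dev/vdb /srv ext4 defaults 0 2' | sudo tee -a /etc/fstab
sudo mount /srv
df -h /srv
```

Mounting by UUID (`blkid /dev/vdb`) instead of device name is more robust if you later add or reorder disks.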