@Raju — forked from awni/mlx_lm_open_webui.md, created March 1, 2026
Open WebUI with MLX LM

Setup

Install packages:

pip install open-webui mlx-lm

Start Open WebUI server:

open-webui serve

Set up the connection to the MLX LM server:

  1. Click your profile icon -> Settings -> Connections
  2. Click the plus sign to add a new connection
  3. Enter the mlx-lm server address (e.g. http://127.0.0.1:8090/v1)
  4. Enter none for the API key
  5. Click Save

Start the mlx-lm server:

mlx_lm.server --port 8090

Choose an MLX LM model and start chatting.

Note: only models that are already downloaded show up in the list. If you don't have any, download one with the mlx-lm CLI. For example:

mlx_lm.generate --model mlx-community/gemma-3-4b-it-qat-4bit --prompt "Hi"
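Since the mlx-lm server exposes an OpenAI-compatible API, you can also talk to it directly from a script instead of through Open WebUI. A minimal sketch in Python, assuming the server is running on port 8090 as above and the model from the download example (the payload fields follow the standard chat completions schema):

```python
import json
from urllib import request

# Same base URL as the connection configured in Open WebUI above.
BASE_URL = "http://127.0.0.1:8090/v1"

def chat_payload(prompt, model="mlx-community/gemma-3-4b-it-qat-4bit"):
    """Build an OpenAI-style chat completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }

def chat(prompt):
    """Send the prompt to the local mlx-lm server and return the reply text."""
    req = request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(chat_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("Hi"))
```

This uses only the standard library; swap in the `openai` client if you prefer, pointing its `base_url` at the same address.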