Created January 16, 2026 19:48
Ollama configuration (llama3.2) for Wave Terminal
{
  "ollama-llama": {
    "display:name": "Ollama - llama3.2",
    "display:order": 1,
    "display:icon": "microchip",
    "display:description": "Local llama3.2:latest model via Ollama",
    "ai:apitype": "openai-chat",
    "ai:model": "llama3.2:latest",
    "ai:thinkinglevel": "medium",
    "ai:endpoint": "http://localhost:11434/v1/chat/completions",
    "ai:apitoken": "ollama"
  }
}
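Because `ai:apitype` is set to `openai-chat`, this preset points Wave Terminal at Ollama's OpenAI-compatible chat-completions endpoint. A minimal sketch of the request shape such an endpoint expects, using the values from the config above (the prompt text and helper name are illustrative, not part of Wave Terminal):

```python
import json
import urllib.request

# Values taken from the configuration block above.
ENDPOINT = "http://localhost:11434/v1/chat/completions"
MODEL = "llama3.2:latest"
API_TOKEN = "ollama"  # Ollama ignores the token's value, but the OpenAI API shape expects one


def build_request(prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request for a local Ollama server."""
    body = json.dumps({
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
        method="POST",
    )


req = build_request("Say hello in one word.")
print(json.loads(req.data)["model"])  # llama3.2:latest

# To actually send it, `ollama serve` must be running with llama3.2 pulled:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The commented-out call at the end shows how the request would be sent; it requires a running Ollama instance on port 11434.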