Published on Convert on 13/10/2025.
Chat UIs are fine for trying a model. For workflows, scripts, or tools, you need the model running as a server that other apps can call.
- LM Studio: Turn on Developer Mode in the sidebar. The API URL shown at the top is what apps use. Test with cURL: `curl {url}/v1/models`.
- Ollama (recommended for a real server): Install it from ollama.com. In the terminal, run `ollama pull <model>`, then `ollama serve`. The server runs at `http://127.0.0.1:11434`.
- Wider access: To reach Ollama from other devices, or from n8n running in Docker, find your local IP, set `export OLLAMA_HOST=0.0.0.0`, run `ollama serve` again, and test with `curl {your_ip}:11434`.
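To show what "a server other apps can call" looks like from code, here is a minimal sketch of a Python client for Ollama's `/api/generate` endpoint at the default address above. It uses only the standard library; the model name you pass in is whatever you pulled with `ollama pull`.

```python
import json
import urllib.request

# Default Ollama address from the steps above; change the host if you
# exported OLLAMA_HOST=0.0.0.0 and are calling from another device.
OLLAMA_URL = "http://127.0.0.1:11434"

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint.

    "stream": False asks for one complete JSON response instead of
    newline-delimited chunks.
    """
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def generate(model: str, prompt: str) -> str:
    """Send the prompt to the local server and return the reply text."""
    with urllib.request.urlopen(build_generate_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

With `ollama serve` running, `generate("<model>", "Say hello")` returns the model's reply as a string; `<model>` is a placeholder for whatever name `ollama pull` fetched.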
The article also explains APIs and endpoints in plain terms and points to Part 4 for connecting a chat UI to this server.