You can easily connect OpenClaw to your local Ollama model. OpenClaw requires an API key for all of its integrations, even when connecting to local, unauthenticated services like Ollama, so you simply need to provide a dummy key.[2][6]
Method 1: Ollama Launch
Ollama recently introduced a native integration command for OpenClaw that handles the setup for you.[4][5]
- Ensure Ollama is running and you have downloaded a model (e.g., ollama pull qwen3:8b).[8]
- Open your terminal and run: ollama launch openclaw.
- Ollama will automatically install OpenClaw (if needed), set up the necessary background gateway, configure the dummy API key, and let you select one of your downloaded local models from a menu.
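Before running the launch command, you can confirm that Ollama is actually reachable on its default port. The sketch below queries Ollama's /api/tags endpoint, which lists the models you have pulled locally (the endpoint is part of Ollama's HTTP API; the base address matches Ollama's default of 127.0.0.1:11434):

```python
import json
import urllib.request

OLLAMA_BASE = "http://127.0.0.1:11434"  # Ollama's default listen address

def tags_url(base: str) -> str:
    """Endpoint that lists locally downloaded models."""
    return base.rstrip("/") + "/api/tags"

def list_local_models(base: str = OLLAMA_BASE) -> list[str]:
    """Return the names of the models Ollama has available locally."""
    with urllib.request.urlopen(tags_url(base), timeout=5) as resp:
        payload = json.load(resp)
    return [m["name"] for m in payload.get("models", [])]

if __name__ == "__main__":
    # Prints something like ['qwen3:8b'] if the pull step above succeeded.
    print(list_local_models())
```

If this raises a connection error, start the Ollama server first; if it returns an empty list, pull a model before launching OpenClaw.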
Method 2: Manual
If you prefer to configure OpenClaw manually or are integrating it into an existing setup, you can set the dummy key yourself using environment variables or OpenClaw’s config file.
- Set the Environment Variable: Before starting OpenClaw, export a fake API key. Any string will work.[15][2]
export OLLAMA_API_KEY="ollama-local"
- Start the OpenClaw Gateway: With the environment variable active, start the gateway.[8]
openclaw gateway
- Configure via CLI (Alternative): Alternatively, you can permanently save the dummy key into OpenClaw’s configuration file.[9][2]
openclaw config set models.providers.ollama.apiKey "ollama-local"
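Both manual routes end in the same place: OpenClaw just needs some non-empty string where an API key would normally go. The sketch below mirrors the two steps above in Python, assuming OpenClaw reads OLLAMA_API_KEY from the environment and that the CLI command persists the dotted path as nested JSON (the exact on-disk config format is an assumption, not documented here):

```python
import json
import os

DUMMY_KEY = "ollama-local"  # any non-empty string works; Ollama ignores it

# Route 1: environment variable, equivalent to the export command above.
os.environ.setdefault("OLLAMA_API_KEY", DUMMY_KEY)

# Route 2: the nested entry that `openclaw config set
# models.providers.ollama.apiKey ...` would save, assuming a JSON
# config file keyed by the dotted path (hypothetical file layout).
def set_dotted(config: dict, dotted_path: str, value: str) -> dict:
    """Write `value` into `config` under a dotted key path."""
    node = config
    keys = dotted_path.split(".")
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = value
    return config

config = set_dotted({}, "models.providers.ollama.apiKey", DUMMY_KEY)
print(json.dumps(config, indent=2))
```

The environment variable only lasts for the current shell session, while the config-file route survives restarts, which is why the CLI command is described as permanent.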
Once the dummy key is set, OpenClaw will automatically discover the models available on your local Ollama instance (running at http://127.0.0.1:11434) and allow you to use them without any external API calls.
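To convince yourself that requests really stay local, you can talk to Ollama's own HTTP API directly. The sketch below builds a non-streaming request against Ollama's /api/generate endpoint (part of Ollama's documented API); the model name is whatever you pulled earlier:

```python
import json
import urllib.request

OLLAMA_BASE = "http://127.0.0.1:11434"  # local entrypoint; no external calls

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming POST against Ollama's /api/generate endpoint."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_BASE + "/api/generate",
        data=body.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

if __name__ == "__main__":
    req = build_generate_request("qwen3:8b", "Say hello in one word.")
    with urllib.request.urlopen(req, timeout=120) as resp:
        print(json.load(resp)["response"])
```

Note that no API key appears anywhere in the request: the dummy key exists only to satisfy OpenClaw's configuration checks, not Ollama's.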