Local Models

Run OpenClaw completely offline using local LLMs (LM Studio, Ollama).
OpenClaw supports running with local Large Language Models (LLMs), allowing for:
- Privacy: Data never leaves your machine.
- Offline Use: Run without an internet connection.
- Cost: No API fees.
Recommended Setup: LM Studio
LM Studio is the easiest way to serve local models behind an OpenAI-compatible API.
1. Install LM Studio: Download it from lmstudio.ai.
2. Load a Model: Search for and download a model.
   - Recommendation: MiniMax-Text-01 or Llama-3-70B (if hardware permits).
   - Avoid small (<7B) quantized models for complex agent tasks.
3. Start Server: In LM Studio, go to the Local Server tab and click Start Server.
   - Ensure it’s running on http://127.0.0.1:1234.
4. Configure OpenClaw: Add the local provider to ~/.openclaw/openclaw.json.

```json5
{
  models: {
    providers: {
      lmstudio: {
        baseUrl: "http://127.0.0.1:1234/v1",
        apiKey: "lm-studio", // Value doesn't matter
        models: [
          {
            id: "local-model", // Match the ID in LM Studio
            name: "My Local Model",
            contextWindow: 32000
          }
        ]
      }
    }
  },
  agents: {
    defaults: {
      model: { primary: "lmstudio/local-model" }
    }
  }
}
```
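Ollama can be wired up the same way, since it exposes an OpenAI-compatible endpoint on port 11434. The sketch below assumes the same provider schema as the LM Studio example; the provider key, model ID, and contextWindow are illustrative placeholders, so match the ID to whatever `ollama list` reports on your machine.

```json5
// Sketch: an Ollama provider, assuming the same schema as the LM Studio example above.
// The provider key, model id, and contextWindow are illustrative placeholders.
{
  models: {
    providers: {
      ollama: {
        baseUrl: "http://127.0.0.1:11434/v1", // Ollama's OpenAI-compatible endpoint
        apiKey: "ollama", // Value doesn't matter
        models: [
          {
            id: "llama3.1:8b", // Must match a model name from `ollama list`
            name: "Llama 3.1 8B (Ollama)",
            contextWindow: 32000 // Assumed; set this to your model's actual context length
          }
        ]
      }
    }
  },
  agents: {
    defaults: {
      model: { primary: "ollama/llama3.1:8b" }
    }
  }
}
```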
Hybrid Mode (Best of Both Worlds)
You can use a local model for “easy” tasks (chat) and a hosted model (Claude/GPT-4) for complex reasoning or as a fallback. For example, keep the cloud model as primary with the local model as a fallback:
```json5
{
  agents: {
    defaults: {
      model: {
        primary: "anthropic/claude-3-5-sonnet", // Smart cloud model
        fallbacks: ["lmstudio/local-model"] // Local fallback
      }
    }
  }
}
```
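Conversely, if most of your sessions are lightweight chat, you could flip the arrangement so the local model is primary and the hosted model is the fallback. This is a sketch assuming the same agents.defaults.model schema shown above:

```json5
// Sketch: local-first setup, assuming the same schema as the example above.
{
  agents: {
    defaults: {
      model: {
        primary: "lmstudio/local-model", // Local model handles everyday chat
        fallbacks: ["anthropic/claude-3-5-sonnet"] // Cloud model picks up when the local one can't
      }
    }
  }
}
```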