Problem: Home Assistant Needs Natural Language AI Control
Home Assistant is powerful but complex. You've set up dozens of automations, but controlling them still requires memorizing entity names, writing YAML, or navigating through multiple screens. Voice assistants like Alexa or Google Home work, but they're limited and cloud-dependent.
You'll learn how to:
- Install OpenClaw as a Home Assistant add-on
- Connect OpenClaw to control devices via natural language
- Set up voice control with local processing
- Create autonomous smart home automations
Time: 20 min | Level: Intermediate
Why This Solution Works
OpenClaw (formerly Moltbot/Clawdbot) is an open-source AI agent with 68,000+ GitHub stars that runs locally on your machine. Unlike cloud-based assistants, it keeps your smart home data private while providing Claude-powered natural language understanding.
What makes it different:
- 100% local processing - no data leaves your network
- Persistent memory - remembers your preferences and context
- Autonomous actions - can proactively trigger automations
- 50+ integrations - connects to Home Assistant natively
Common use cases:
- "Turn on movie mode" → Dims lights, closes blinds, starts TV
- "I'm going to bed" → Locks doors, arms security, sets thermostat
- "What's the temperature in the bedroom?" → Natural language queries
- Proactive: "It's getting cold, I turned up the heat"
Prerequisites
Before starting, ensure you have:
- Home Assistant OS or Supervised installation (required for add-ons)
- Anthropic API key - Get one at https://console.anthropic.com
- 4GB+ RAM on your Home Assistant host (8GB recommended)
- Node.js 22+ if running OpenClaw separately (add-on handles this)
Cost estimate: OpenClaw is free. API costs ~$0.01-0.05 per conversation with Claude Sonnet 4.
Solution: Install OpenClaw Add-on
Step 1: Add OpenClaw Repository to Home Assistant
Navigate to Settings → Add-ons → Add-on Store. Click the ⋮ menu (top right) and select Repositories.
Add this repository URL:
https://github.com/techartdev/OpenClawHomeAssistant
Expected: Repository appears in your add-on store after refresh.
If it fails:
- Error: "Invalid URL" → Verify you copied the full URL, including `https://`
- Can't find the ⋮ menu → You need Home Assistant OS or Supervised (not Container/Core)
Step 2: Install OpenClaw Assistant Add-on
From the Add-on Store, find OpenClaw Assistant (should appear under the repository you just added).
Click Install. This takes 5-10 minutes as it downloads dependencies.
```yaml
# The add-on automatically configures:
gateway:
  mode: local   # Runs on your network
auth:
  mode: token   # Secure token-based auth
```
Why this works: The add-on runs OpenClaw as a supervised service with persistent storage at /config inside the container.
Step 3: Configure OpenClaw via Terminal
Once installed, open the add-on page and click OPEN WEB UI. You'll see an embedded terminal.
Run the onboarding wizard:
openclaw onboard
The wizard will prompt you for:
- Anthropic API key - Paste your key from console.anthropic.com
- Assistant name - Pick something (e.g., "Jarvis", "Alfred")
- Communication channel - Choose "None" for now (we'll use Home Assistant's UI)
- Skills to enable - Select "Home Assistant" when prompted
Expected: Configuration saves to /config/openclaw/config.json
If it fails:
- Error: "Invalid API key" → Double-check the key has no extra spaces
- Terminal won't open → Set `enable_terminal: true` in the add-on config
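For reference, the wizard writes plain JSON. A hypothetical sketch of what the file might contain (the field names here are illustrative, not OpenClaw's actual schema):

```json
{
  "assistant": { "name": "Jarvis" },
  "provider": { "type": "anthropic" },
  "skills": ["homeassistant"]
}
```

Note that the Home Assistant token is stored separately under `/config/secrets/` (see Step 4).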
Step 4: Set Up Home Assistant Integration
OpenClaw needs a long-lived access token to control your devices.
Create a token:
- Home Assistant → Profile (bottom left) → Security tab
- Scroll to Long-Lived Access Tokens
- Click CREATE TOKEN
- Name it "OpenClaw" and copy the token
Back in the OpenClaw terminal:
openclaw configure
Select Home Assistant from the integrations menu, then paste your token.
Configuration saved at: /config/secrets/homeassistant.token
Step 5: Install ha-mcp Skill for Device Control
The ha-mcp skill provides OpenClaw with Model Context Protocol access to Home Assistant entities.
In the OpenClaw terminal:
openclaw skill install ha-mcp
This installs the Home Assistant Model Context Protocol server, which lets OpenClaw:
- List all devices and entities
- Read current states (temperature, light status, etc.)
- Control devices (turn on/off, set brightness, etc.)
- Query sensor data
Expected: Output shows "Skill installed successfully"
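Under the hood, this kind of access maps onto Home Assistant's standard REST API. As a rough sketch of what a query like "list my lights" resolves to (the URL and token are placeholders, and the MCP server's actual implementation may differ):

```javascript
// Sketch: list light entities via Home Assistant's REST API.
// HA_URL and HA_TOKEN are placeholders for your own instance.
const HA_URL = 'http://homeassistant.local:8123';
const HA_TOKEN = process.env.HA_TOKEN;

// Pure helper: pick light entities out of a /api/states response
function filterLights(states) {
  return states
    .filter((s) => s.entity_id.startsWith('light.'))
    .map((s) => `${s.entity_id}: ${s.state}`);
}

async function listLights() {
  const res = await fetch(`${HA_URL}/api/states`, {
    headers: { Authorization: `Bearer ${HA_TOKEN}` }
  });
  return filterLights(await res.json());
}
```

The `Authorization: Bearer <token>` header is the same long-lived access token you created in Step 4.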
Step 6: Start the Gateway and Test
Start OpenClaw's gateway service:
openclaw gateway start
Expected: Gateway runs on port 18789 by default.
Test it with a simple query through the Control UI:
- From the add-on page, click "Open Gateway Web UI" button (opens in new tab)
- In the chat interface, type: "List my lights"
OpenClaw should respond with your Home Assistant light entities.
If it fails:
- Gateway won't start → Check logs with `openclaw gateway logs`
- Can't list devices → Verify token permissions in Home Assistant
- WebSocket errors → Try accessing the gateway directly at `http://homeassistant.local:18789`
Set Up Voice Control (Optional)
Step 7: Configure OpenClaw as Conversation Agent
OpenClaw can be set as the conversation agent for Home Assistant's Assist pipelines, enabling voice control through ESPHome devices or the Home Assistant app.
Install the custom integration:
- Go to HACS → Integrations → ⋮ → Custom repositories
- Add: `https://github.com/Djelibeybi/openclaw_conversation`
- Install OpenClaw Conversation
- Restart Home Assistant
Configure the pipeline:
- Settings → Voice assistants
- Click + ADD ASSISTANT
- Set Conversation agent to OpenClaw
- Choose your preferred STT/TTS engines
Now you can say: "Hey Jarvis, turn on movie mode" to your voice satellite.
Step 8: Create Autonomous Automations
OpenClaw can proactively trigger actions based on context. Create a custom skill:
openclaw skill create morning-routine
Edit the skill file (opens in default editor):
```javascript
// morning-routine/index.js
export async function execute() {
  const hour = new Date().getHours();
  // Only act during the morning window
  if (hour >= 6 && hour < 9) {
    const bedroom = await homeassistant.getState('binary_sensor.bedroom_motion');
    if (bedroom.state === 'on') {
      // Motion detected: someone is up, start the routine
      await homeassistant.callService('scene.turn_on', {
        entity_id: 'scene.morning'
      });
      return "Good morning! I've started your routine.";
    }
  }
}

// Run this skill every 5 minutes
export const schedule = '*/5 * * * *';
```
Why this works: OpenClaw runs the skill on a cron schedule. When motion is detected in the morning, it triggers your scene without you asking.
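The schedule string uses standard five-field cron syntax (minute, hour, day of month, month, day of week). Just to illustrate how a field like `*/5` is interpreted, here is a minimal matcher for a single field (an illustration, not OpenClaw's actual scheduler):

```javascript
// Illustrative cron field matcher: supports "*", "*/n", and plain numbers
function fieldMatches(field, value) {
  if (field === '*') return true;               // any value
  if (field.startsWith('*/')) {                 // every n units
    return value % Number(field.slice(2)) === 0;
  }
  return Number(field) === value;               // exact match
}

// '*/5 * * * *' → run whenever the minute is divisible by 5
fieldMatches('*/5', 10); // true
fieldMatches('*/5', 7);  // false
```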
Verification
Test your complete setup:
```bash
# Check gateway status
openclaw gateway status

# Send a test command through the CLI
openclaw message send --target control-ui --message "What's the living room temperature?"
```
You should see:
- Gateway shows "Running" status
- OpenClaw responds with current temperature from your sensor
- Control UI displays the conversation
Additional tests:
- Voice command: "Turn off all lights" (if voice is configured)
- Natural language: "Make the bedroom warmer"
- Context-aware: "It's too bright" → adjusts based on time/room
What You Learned
- OpenClaw runs as a Home Assistant add-on with full local control
- The `ha-mcp` skill enables natural language device commands
- Voice control works through Home Assistant's Assist pipelines
- Autonomous skills can trigger actions based on context and schedules
Limitations to know:
- Requires Anthropic API (cloud LLM) unless you configure local models
- Complex automations may need custom skill development
- WebSocket connections can be flaky through Ingress proxies
Next steps:
Advanced Configuration
Use Local LLMs Instead of Anthropic
To avoid API costs and ensure complete privacy:
```bash
# Install Ollama on your network (name the container so exec can find it)
docker run -d -p 11434:11434 --name ollama ollama/ollama
# Pull a model
docker exec -it ollama ollama pull llama3.2
# Configure OpenClaw
openclaw configure
# Select "Ollama" as provider
# URL: http://your-server:11434
# Model: llama3.2
```
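Before pointing OpenClaw at Ollama, you can confirm the server is reachable: Ollama exposes a `/api/tags` endpoint that lists pulled models. A quick check (the host URL is a placeholder for your server):

```javascript
// Check which models a local Ollama server has pulled
const OLLAMA_URL = 'http://your-server:11434';

// Pure helper: extract model names from an /api/tags response
function modelNames(tags) {
  return (tags.models || []).map((m) => m.name);
}

async function checkOllama() {
  const res = await fetch(`${OLLAMA_URL}/api/tags`);
  const names = modelNames(await res.json());
  console.log(names.length
    ? `Available models: ${names.join(', ')}`
    : 'No models pulled yet — run: ollama pull llama3.2');
}
```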
Trade-off: Local models are faster and private but less capable than Claude Sonnet 4 for complex reasoning.
Add Proactive Notifications
Create a skill that checks weather and reminds you:
```javascript
// weather-alert/index.js
export async function execute() {
  const weather = await homeassistant.getState('weather.home');
  // Guard: some weather entities expose no forecast attribute
  const today = weather.attributes.forecast?.[0];
  if (today && today.precipitation > 0.5) {
    await sendNotification('Remember your umbrella! Rain expected today.');
  }
}

export const schedule = '0 7 * * *'; // Run at 7 AM daily
```
Multi-Instance Setup for Different Rooms
Run separate OpenClaw instances for different areas:
```yaml
# Add-on config for bedroom instance
gateway_public_url: http://homeassistant.local:18790
ha_entity_filter: "bedroom_*"
```
Each instance can have different personalities and access scopes.
Troubleshooting
Gateway Won't Start
Symptom: openclaw gateway status shows "Stopped"
Fix:
```bash
# Check detailed logs
openclaw gateway logs --lines 100

# Common issue: port already in use
# Change the port in config
openclaw configure
# Select Gateway → Port → 18790
```
OpenClaw Can't Control Devices
Symptom: Commands like "turn on lights" fail with "permission denied"
Fix:
- Verify token hasn't expired: Profile → Security → Long-Lived Access Tokens
- Regenerate the token and update it:

```bash
echo "YOUR_NEW_TOKEN" > /config/secrets/homeassistant.token
openclaw gateway restart
```
High API Costs
Symptom: Anthropic bill is higher than expected
Solutions:
- Switch to Claude Haiku (faster, cheaper): `openclaw configure` → Provider → Model → `claude-haiku-4-5-20251001`
- Add rate limiting in the add-on config:

```yaml
rate_limit:
  max_requests_per_hour: 100
```
- Use local LLM (see Advanced Configuration above)
Voice Commands Don't Work
Symptom: Voice satellite doesn't trigger OpenClaw
Fix:
- Check Assist pipeline: Settings → Voice assistants → OpenClaw
- Ensure conversation agent is selected (not default)
- Test with text first: Home Assistant → Assist → Type "list lights"
- If text works but voice doesn't, issue is with STT/TTS configuration, not OpenClaw
Security Considerations
What OpenClaw has access to:
- Full control of all Home Assistant entities
- Your home network (if system access is enabled)
- Files in the `/config` directory
Best practices:
- Restrict API scope: Create a separate HA user with limited permissions
- Sandbox mode: Run OpenClaw in sandbox initially, grant system access only if needed
- Review skills: Inspect code before installing from community registry
- Network isolation: Run HA on isolated VLAN if exposing OpenClaw externally
Token security:
```bash
# Verify permissions
ls -la /config/secrets/homeassistant.token
# Should show: -rw------- (only owner can read/write)
# If it's more permissive, tighten it:
chmod 600 /config/secrets/homeassistant.token
```
Performance Optimization
Reduce Latency
Problem: Commands take 3-5 seconds to execute
Solutions:
- Use Claude Haiku for faster responses (500ms vs 2s)
- Cache frequently used queries:
```javascript
// In a custom skill: cache entity states for ~1 minute
const cachedStates = {};

export async function getState(entityId) {
  // Key includes the current minute, so entries go stale after 60s
  const cacheKey = `${entityId}-${(Date.now() / 60000) | 0}`;
  if (!cachedStates[cacheKey]) {
    cachedStates[cacheKey] = await homeassistant.getState(entityId);
  }
  return cachedStates[cacheKey];
}
// Note: stale keys are never evicted here; fine for a process that
// restarts regularly, but add cleanup for a long-running gateway
```
- Run on dedicated hardware (RPi 4 8GB minimum)
Memory Management
Problem: Add-on uses >2GB RAM
Fix:
```yaml
# Add-on config
memory_limit: 1536mb
```
Or reduce skill count:
```bash
openclaw skill list
openclaw skill disable skill-name-here
```
Community Skills for Home Assistant
OpenClaw's public registry has 3,000+ community-built skills as of February 2026. Here are top picks for smart homes:
Essential Skills:
- `wyoming-clawdbot` - Wyoming Protocol for voice pipelines
- `ha-scene-builder` - Natural language scene creation
- `energy-monitor` - Track and optimize power usage
- `climate-optimizer` - AI-powered HVAC scheduling
Install example:
openclaw skill install ha-scene-builder
Then use: "Create a movie night scene with dim lights and TV on"
Integration with Other Platforms
Combine with Node-RED
OpenClaw can trigger Node-RED flows:
```javascript
// In an OpenClaw skill
await fetch('http://nodered:1880/endpoint/trigger-flow', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ event: 'openclaw_command' })
});
```
MQTT Integration
Publish OpenClaw actions to MQTT for other systems:
```bash
openclaw skill install mqtt-publisher
openclaw configure
# Add MQTT broker: homeassistant.local:1883
```
Real-World Examples
Morning Routine
What it does: Gradual wake-up automation based on calendar
```javascript
// Gradual wake-up: ramp the bedroom light before the first calendar event
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

export async function execute() {
  const calendar = await homeassistant.getState('calendar.work');
  const firstEvent = calendar.attributes.start_time;
  const wakeTime = new Date(firstEvent);
  wakeTime.setMinutes(wakeTime.getMinutes() - 30);

  if (new Date() >= wakeTime) {
    // Ramp from 1% to 100% over ~15 minutes (100 steps × 9 seconds)
    for (let i = 1; i <= 100; i++) {
      await homeassistant.callService('light.turn_on', {
        entity_id: 'light.bedroom',
        brightness_pct: i
      });
      await sleep(9000);
    }
  }
}
```
Energy Optimization
What it does: Monitors solar production and delays high-power tasks
```javascript
export async function execute() {
  const solar = await homeassistant.getState('sensor.solar_production');
  const dishwasher = await homeassistant.getState('switch.dishwasher');

  // Entity states arrive as strings; convert before comparing watts
  if (Number(solar.state) > 2000 && dishwasher.state === 'off') {
    await sendNotification('Excess solar power! Good time to run the dishwasher.');
    // Auto-start if the user opted in to previous suggestions
    if (await getUserPreference('auto_dishwasher') === true) {
      await homeassistant.callService('switch.turn_on', {
        entity_id: 'switch.dishwasher'
      });
    }
  }
}

export const schedule = '*/15 * * * *';
```
Security Patrol
What it does: AI analyzes camera feeds for unusual activity
```javascript
export async function execute() {
  const motion = await homeassistant.getState('binary_sensor.front_door_motion');
  if (motion.state === 'on') {
    const image = await homeassistant.getState('camera.front_door');
    // Use Claude vision to analyze the snapshot
    const analysis = await analyzeImage(image.attributes.entity_picture);
    if (analysis.includes('person') && !analysis.includes('package delivery')) {
      await sendNotification('Unusual activity at front door', {
        image: image.attributes.entity_picture
      });
    }
  }
}
```
Cost Analysis
Monthly operational costs (typical home):
| Component | Cost |
|---|---|
| OpenClaw software | $0 (open source) |
| Anthropic API (Claude Sonnet 4) | $2-8 (100-500 requests/day) |
| Hardware (if dedicated) | $0 (uses existing HA server) |
| Total | $2-8/month |
Cost reduction strategies:
- Use Claude Haiku: ~60% cheaper
- Local LLM (Ollama): $0 API costs
- Cache responses: Reduce duplicate API calls
- Scheduled skills only: Avoid always-on listening
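To sanity-check your own bill, the arithmetic behind these estimates is simply per-conversation cost × daily volume × 30. A trivial helper (the example inputs are placeholders, not quoted pricing):

```javascript
// Back-of-envelope monthly API cost (USD). Inputs are your own
// measurements; check current Anthropic pricing for real figures.
function monthlyCost(costPerConversation, conversationsPerDay) {
  return costPerConversation * conversationsPerDay * 30;
}

console.log(monthlyCost(0.01, 20)); // ~6, i.e. about $6/month at light usage
```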
Migration Path
From Google Assistant / Alexa
What to change:
- Recreate routines as OpenClaw skills
- Retrain commands (more natural language supported)
- Set up voice satellites if needed
Advantages:
- More complex automations possible
- No cloud dependency
- Better context understanding
- Learns preferences over time
What you lose:
- Third-party skills (Spotify, etc. - use native integrations)
- Shopping lists (recreate in HA or Notion integration)
From Home Assistant Automations
Keep existing automations, add OpenClaw for:
- Natural language triggers
- Complex decision-making
- Proactive suggestions
- Voice control
Hybrid approach works best: YAML for deterministic logic, OpenClaw for AI-powered decisions.
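In the hybrid approach, a plain YAML automation can hand free-form text to the conversation agent via Home Assistant's `conversation.process` service. A sketch, assuming the custom integration registers OpenClaw as a selectable agent (the `agent_id` below is a placeholder):

```yaml
# Example: a deterministic trigger asks the AI agent to decide the details
automation:
  - alias: "Ask OpenClaw when someone arrives"
    trigger:
      - platform: state
        entity_id: person.me
        to: "home"
    action:
      - service: conversation.process
        data:
          text: "I'm home, set up the house for the evening"
          agent_id: conversation.openclaw  # placeholder id
```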
Conclusion
You now have a fully functional AI-powered smart home controller that:
- ✅ Understands natural language commands
- ✅ Runs 100% locally for privacy
- ✅ Can trigger autonomous automations
- ✅ Integrates with Home Assistant natively
- ✅ Supports voice control through Assist
Total setup time: 20 minutes for basic install, 1-2 hours for advanced features.
Tested on Home Assistant OS 2026.2, OpenClaw v0.5.25, Anthropic Claude Sonnet 4. Raspberry Pi 4 8GB recommended minimum.