# home-assistant-architect
Complete Home Assistant platform with frontend design, energy management, cameras, sensors, local LLM integration, and Ubuntu server deployment. 15 agents, 16 MCP tools.
## Installation
```bash
npx claude-plugins install @Lobbi-Docs/claude-orchestration/home-assistant-architect
```
## Contents
Folders: agents, commands, hooks, mcp-server, skills, templates
Files: README.md
## Documentation
Comprehensive Claude Code plugin for Home Assistant automation, local LLM integration, and Ubuntu server management. This plugin provides specialized sub-agents, hooks, tools, and skills to streamline smart home development and operations.
## Features
### Sub-Agents (8 Specialized)
| Agent | Purpose | Model |
|---|---|---|
| ha-device-controller | Control devices, entities, and services | Sonnet |
| ha-automation-architect | Design and optimize automations | Sonnet |
| ha-diagnostics | Troubleshoot issues and logs | Sonnet |
| local-llm-manager | Deploy Ollama and local LLMs | Opus |
| ubuntu-ha-deployer | Deploy HA on Ubuntu servers | Sonnet |
| ha-voice-assistant | Configure voice pipelines | Sonnet |
| ha-energy-optimizer | Analyze and optimize energy usage | Sonnet |
| ha-security-auditor | Audit security configuration | Opus |
### Commands
| Command | Description |
|---|---|
| /ha-control | Control entities with natural language |
| /ha-automation | Create and manage automations |
| /ha-deploy | Deploy HA on Ubuntu |
| /ha-diagnose | Diagnose issues |
| /ollama-setup | Setup local LLM |
| /ha-voice | Configure voice assistant |
| /ha-backup | Backup and restore |
| /ha-mcp | Configure MCP server |
### Skills
- ha-core: REST/WebSocket API patterns
- ha-automation: YAML automation best practices
- local-llm: Ollama integration patterns
- ubuntu-deployment: Docker deployment
### Hooks
- State change monitoring
- Health checks on task completion
- YAML validation before writes
- Security scanning
- Backup reminders
### MCP Server
Full Model Context Protocol server with 11 tools:
- Entity state management
- Service calls
- History queries
- Automation management
- Ollama integration
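As a rough sketch of what a history-query tool does under the hood: Home Assistant exposes `GET /api/history/period/<start>` with a `filter_entity_id` query parameter. The entity id and time window below are illustrative:

```python
from datetime import datetime, timedelta, timezone
from urllib.parse import urlencode

def history_url(base_url: str, entity_id: str, hours: int = 24) -> str:
    """Build the GET URL for /api/history/period over the last `hours` hours."""
    start = (datetime.now(timezone.utc) - timedelta(hours=hours)).isoformat()
    query = urlencode({"filter_entity_id": entity_id})
    return f"{base_url.rstrip('/')}/api/history/period/{start}?{query}"

url = history_url("http://homeassistant.local:8123", "sensor.home_power")
```

The request carries the usual `Authorization: Bearer <token>` header and returns a list of state-change lists, one per entity.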
## Installation
### Prerequisites
- Claude Code CLI installed
- Home Assistant instance (2025.1.0+)
- Long-Lived Access Token from HA
### Plugin Installation
```bash
# Clone the plugin
cd ~/.claude/plugins
git clone https://github.com/Lobbi-Docs/claude.git

# Or use the plugin manager
/plugin-install home-assistant-architect
```
## Configuration
Set environment variables:
```bash
export HA_URL="http://homeassistant.local:8123"
export HA_TOKEN="your-long-lived-access-token"
export OLLAMA_URL="http://localhost:11434"
```
Or add to `.env`:
```
HA_URL=http://homeassistant.local:8123
HA_TOKEN=your-token
OLLAMA_URL=http://localhost:11434
HA_VOICE_MODEL=llama3.2:3b
```
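A quick way to confirm the token works before going further is to hit `GET /api/`, which answers `{"message": "API running."}` when authentication succeeds. A minimal stdlib-only check, assuming `HA_URL` and `HA_TOKEN` are set as above:

```python
import os
import urllib.request

def api_root(base_url: str) -> str:
    """Normalize HA_URL (with or without a trailing slash) into the REST root."""
    return base_url.rstrip("/") + "/api/"

def check_token() -> bool:
    """True when GET /api/ answers with HTTP 200 for this token."""
    req = urllib.request.Request(
        api_root(os.environ["HA_URL"]),
        headers={"Authorization": f"Bearer {os.environ['HA_TOKEN']}"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status == 200
```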
## Quick Start
### Control Devices
```
# Natural language control
/ha-control turn on living room lights at 50%
/ha-control set thermostat to 72 degrees
/ha-control lock front door
```
### Create Automations
```
# Create from description
/ha-automation create Turn on porch lights at sunset

# Debug automation
/ha-automation debug motion_lights
```
### Deploy on Ubuntu
```
# Full stack deployment
/ha-deploy full --dir /opt/homeassistant

# Check status
/ha-deploy status
```
### Setup Local LLM
```
# Install and configure Ollama
/ollama-setup install
/ollama-setup pull llama3.2:3b
/ollama-setup configure
```
## Architecture
```
┌─────────────────────────────────────────────────────────────────────┐
│                Home Assistant Architect Plugin                      │
├─────────────────────────────────────────────────────────────────────┤
│                                                                     │
│  ┌──────────────┐   ┌──────────────┐   ┌──────────────┐             │
│  │    Agents    │   │   Commands   │   │    Skills    │             │
│  │  (8 types)   │   │  (8 types)   │   │  (4 types)   │             │
│  └──────────────┘   └──────────────┘   └──────────────┘             │
│         │                  │                  │                     │
│         └──────────────────┼──────────────────┘                     │
│                            │                                        │
│                     ┌──────▼──────┐                                 │
│                     │  MCP Server │                                 │
│                     │  (11 tools) │                                 │
│                     └──────┬──────┘                                 │
│                            │                                        │
│          ┌─────────────────┼─────────────────┐                      │
│          │                 │                 │                      │
│   ┌──────▼──────┐  ┌───────▼───────┐  ┌─────▼──────┐                │
│   │    Home     │  │    Ollama     │  │   Ubuntu   │                │
│   │  Assistant  │  │  (Local LLM)  │  │   Server   │                │
│   │     API     │  │               │  │            │                │
│   └─────────────┘  └───────────────┘  └────────────┘                │
│                                                                     │
└─────────────────────────────────────────────────────────────────────┘
```
## MCP Server Setup
…(truncated)
## Included Skills
This plugin includes 5 skill definitions:
### camera-nvr
<details>
<summary>View skill definition</summary>
# Camera and NVR Skill
Home Assistant camera integration and Frigate NVR patterns.
## Activation Triggers
- Working with camera entities
- Configuring Frigate NVR
- Setting up motion detection
- Creating camera dashboards
- Recording management
## Core Patterns
### Frigate Configuration
```yaml
# frigate.yaml
mqtt:
  enabled: true
  host: mosquitto
  user: frigate
  password: "{FRIGATE_MQTT_PASSWORD}"

detectors:
  coral:
    type: edgetpu
    device: usb

cameras:
  front_door:
    ffmpeg:
      inputs:
        - path: rtsp://user:pass@192.168.1.100:554/stream1
          roles:
            - detect
            - record
        - path: rtsp://user:pass@192.168.1.100:554/stream2
          roles:
            - rtmp
    detect:
      width: 1920
      height: 1080
      fps: 5
    objects:
      track:
        - person
        - car
        - dog
      filters:
        person:
          min_area: 5000
          threshold: 0.7
    zones:
      porch:
        coordinates: 0,1080,500,1080,500,0,0,0
        objects:
          - person
    record:
      enabled: true
      retain:
        days: 7
        mode: motion
      events:
        retain:
          default: 30

go2rtc:
  streams:
    front_door:
      - rtsp://user:pass@192.168.1.100:554/stream1
```
### Camera YAML Configuration
```yaml
# Generic camera
camera:
  - platform: generic
    name: Front Yard
    still_image_url: http://192.168.1.100/snapshot.jpg
    stream_source: rtsp://user:pass@192.168.1.100:554/stream
```
...(truncated)
</details>
### energy-management
<details>
<summary>View skill definition</summary>
# Energy Management Skill
Home Assistant energy monitoring and optimization patterns.
## Activation Triggers
- Working with energy dashboard
- Solar/battery integration
- EV charger configuration
- Utility rate management
- Power monitoring
## Core Patterns
### Energy Dashboard Configuration
```yaml
# configuration.yaml
energy:
# The Energy dashboard is configured via UI
# Settings > Dashboards > Energy
```
### Power Monitoring Sensors
```yaml
# Whole home power
sensor:
  - platform: template
    sensors:
      home_power:
        friendly_name: "Home Power"
        unit_of_measurement: "W"
        device_class: power
        value_template: >
          {{ states('sensor.main_meter_power') | float(0) }}
      home_energy_daily:
        friendly_name: "Daily Energy"
        unit_of_measurement: "kWh"
        device_class: energy
        value_template: >
          {{ states('sensor.main_meter_energy') | float(0) }}
```
### Solar Integration
```yaml
# Solar production template
template:
  - sensor:
      - name: "Solar Power"
        unit_of_measurement: "W"
        device_class: power
        state_class: measurement
        state: "{{ states('sensor.inverter_power') | float(0) }}"
      - name: "Solar Energy Today"
        unit_of_measurement: "kWh"
        device_class: energy
        state_class: total_increasing
        state: "{{ states('sensor.inverter_energy_today') | float(0) }}"
      - name: "Solar Self Consumption"
        unit_of_measurement: "kWh"
```
...(truncated)
</details>
### ha-automation
<details>
<summary>View skill definition</summary>
# Home Assistant Automation Skill
Automation YAML patterns, triggers, conditions, actions, and best practices for Home Assistant.
## Activation Triggers
Activate this skill when:
- Creating or modifying automations
- Working with automation YAML
- Designing trigger logic
- Building condition chains
- Creating action sequences
## Automation Structure
```yaml
# Complete automation template
alias: "Descriptive Name"
id: unique_automation_id  # Optional but recommended
description: "What this automation does and why"
mode: single  # single, restart, queued, parallel

# Variables available throughout
variables:
  room: "living_room"
  default_brightness: 80

trigger:
  - platform: state
    entity_id: binary_sensor.motion
    to: "on"
    id: motion_detected  # For use in choose

condition:
  - condition: sun
    after: sunset

action:
  - service: light.turn_on
    target:
      entity_id: "light.{{ room }}"
    data:
      brightness_pct: "{{ default_brightness }}"
```
## Trigger Types
### State Trigger
```yaml
trigger:
  - platform: state
    entity_id: binary_sensor.door
    from: "off"
    to: "on"
    for:
      seconds: 30
    # attribute: battery_level  # Optional: trigger on this attribute instead of state
```
### Time Trigger
```yaml
trigger:
  - platform: time
    at: "06:30:00"

  # Multiple times
  - platform: time
    at:
      - "07:00:00"
      - "12:00:00"
      - "18:00:00"

  # Input datetime
  - platform: time
    at: input_datetime.wake_up_time
```
### Time Pattern
trig
...(truncated)
</details>
### ha-core
<details>
<summary>View skill definition</summary>
# Home Assistant Core Skill
Core Home Assistant API integration patterns, authentication, and entity management.
## Activation Triggers
Activate this skill when working with:
- Home Assistant REST API
- WebSocket API connections
- Entity state management
- Service calls
- Event bus subscriptions
## Authentication
### Long-Lived Access Token
```python
import httpx

class HAClient:
    def __init__(self, url: str, token: str):
        self.url = url.rstrip('/')
        self.headers = {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json"
        }

    async def get_states(self) -> list:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{self.url}/api/states",
                headers=self.headers
            )
            response.raise_for_status()
            return response.json()

    async def get_state(self, entity_id: str) -> dict:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{self.url}/api/states/{entity_id}",
                headers=self.headers
            )
            response.raise_for_status()
            return response.json()

    async def call_service(
        self,
        domain: str,
        service: str,
        data: dict = None,
        target: dict = None
    ) -> dict:
        payload = {}
        if data:
            payload.update(data)
        if target:
            payload["target
```
...(truncated)
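The client's service calls target Home Assistant's `POST /api/services/<domain>/<service>` endpoint; a small standalone sketch of the URL shape (the entity id is illustrative):

```python
def service_endpoint(base_url: str, domain: str, service: str) -> str:
    """POST target for a Home Assistant service call, e.g. light.turn_on."""
    return f"{base_url.rstrip('/')}/api/services/{domain}/{service}"

# The JSON body carries the service data, e.g. {"entity_id": "light.living_room"}
url = service_endpoint("http://homeassistant.local:8123", "light", "turn_on")
```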
</details>
### local-llm
<details>
<summary>View skill definition</summary>
# Local LLM Integration Skill
Deploy and integrate local LLMs with Ollama, LocalAI, and Home Assistant for privacy-focused voice assistants and automation.
## Activation Triggers
Activate this skill when:
- Setting up Ollama or LocalAI
- Configuring local voice assistants
- Integrating LLMs with Home Assistant
- Optimizing local model performance
- Building LLM-powered automations
## Ollama Installation
### Ubuntu/Debian
```bash
# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh
# Start as service
sudo systemctl enable ollama
sudo systemctl start ollama
# Pull models
ollama pull llama3.2:3b
ollama pull fixt/home-3b-v3 # HA-optimized
```
### Docker
```yaml
# docker-compose.yaml
services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    restart: unless-stopped
    ports:
      - "11434:11434"
    volumes:
      - ./ollama:/root/.ollama
    # GPU support (NVIDIA)
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```
## Ollama API
### Generate Completion
```python
import httpx

async def generate(prompt: str, model: str = "llama3.2:3b") -> str:
    async with httpx.AsyncClient() as client:
        response = await client.post(
            "http://localhost:11434/api/generate",
            json={
                "model": model,
                "prompt": prompt,
                "stream": False,
                "options": {
```
...(truncated)
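Ollama also exposes `GET /api/tags` for listing locally installed models, which is a handy check that the pulls above succeeded. A stdlib-only sketch:

```python
import json
import urllib.request

def parse_tags(payload: dict) -> list[str]:
    """Extract model names from an Ollama GET /api/tags response body."""
    return [m["name"] for m in payload.get("models", [])]

def list_models(base_url: str = "http://localhost:11434") -> list[str]:
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=10) as resp:
        return parse_tags(json.load(resp))
```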
</details>
## Source
[View on GitHub](https://github.com/Lobbi-Docs/claude)