# home-assistant-architect

Complete Home Assistant platform with frontend design, energy management, cameras, sensors, local LLM integration, and Ubuntu server deployment. 15 agents, 16 MCP tools.

- **Author:** Brookside BI
- **Namespace:** @Lobbi-Docs/claude-orchestration
- **Category:** smart-home
- **Version:** 2.0.0
- **Stars:** 1
- **Downloads:** 3
- **self.md:** verified


## Installation

```bash
npx claude-plugins install @Lobbi-Docs/claude-orchestration/home-assistant-architect
```

## Contents

**Folders:** agents, commands, hooks, mcp-server, skills, templates

**Files:** README.md

## Documentation

Comprehensive Claude Code plugin for Home Assistant automation, local LLM integration, and Ubuntu server management. This plugin provides specialized sub-agents, hooks, tools, and skills to streamline smart home development and operations.

## Features

### Sub-Agents (8 Specialized)

| Agent | Purpose | Model |
| --- | --- | --- |
| `ha-device-controller` | Control devices, entities, and services | Sonnet |
| `ha-automation-architect` | Design and optimize automations | Sonnet |
| `ha-diagnostics` | Troubleshoot issues and logs | Sonnet |
| `local-llm-manager` | Deploy Ollama and local LLMs | Opus |
| `ubuntu-ha-deployer` | Deploy HA on Ubuntu servers | Sonnet |
| `ha-voice-assistant` | Configure voice pipelines | Sonnet |
| `ha-energy-optimizer` | Analyze and optimize energy usage | Sonnet |
| `ha-security-auditor` | Audit security configuration | Opus |

### Commands

| Command | Description |
| --- | --- |
| `/ha-control` | Control entities with natural language |
| `/ha-automation` | Create and manage automations |
| `/ha-deploy` | Deploy HA on Ubuntu |
| `/ha-diagnose` | Diagnose issues |
| `/ollama-setup` | Set up a local LLM |
| `/ha-voice` | Configure voice assistant |
| `/ha-backup` | Backup and restore |
| `/ha-mcp` | Configure MCP server |

### Skills

### Hooks

### MCP Server

Full Model Context Protocol server with 11 tools.

## Installation

### Prerequisites

### Plugin Installation

```bash
# Clone the plugin
cd ~/.claude/plugins
git clone https://github.com/Lobbi-Docs/claude.git

# Or use the plugin manager
/plugin-install home-assistant-architect
```

### Configuration

Set environment variables:

```bash
export HA_URL="http://homeassistant.local:8123"
export HA_TOKEN="your-long-lived-access-token"
export OLLAMA_URL="http://localhost:11434"
```

Or add to `.env`:

```bash
HA_URL=http://homeassistant.local:8123
HA_TOKEN=your-token
OLLAMA_URL=http://localhost:11434
HA_VOICE_MODEL=llama3.2:3b
```
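Before running any command, it can help to confirm the URL and token actually work. A minimal stdlib sketch (the helper names are mine, not part of the plugin) that calls Home Assistant's documented `GET /api/` health endpoint using the variables above:

```python
import os
import urllib.request


def ha_headers(token: str) -> dict:
    """Auth headers required by every Home Assistant REST call."""
    return {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }


def api_url(base: str, path: str = "/api/") -> str:
    """Join HA_URL and an API path without doubling slashes."""
    return base.rstrip("/") + path


def check_ha() -> bool:
    """True if the HA REST API answers GET /api/ with HTTP 200."""
    req = urllib.request.Request(
        api_url(os.environ["HA_URL"]),
        headers=ha_headers(os.environ["HA_TOKEN"]),
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status == 200
```

A 401 response here almost always means the long-lived access token is missing or expired.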

## Quick Start

### Control Devices

```bash
# Natural language control
/ha-control turn on living room lights at 50%
/ha-control set thermostat to 72 degrees
/ha-control lock front door
```

### Create Automations

```bash
# Create from a description
/ha-automation create Turn on porch lights at sunset

# Debug an automation
/ha-automation debug motion_lights
```

### Deploy on Ubuntu

```bash
# Full-stack deployment
/ha-deploy full --dir /opt/homeassistant

# Check status
/ha-deploy status
```

### Set Up a Local LLM

```bash
# Install and configure Ollama
/ollama-setup install
/ollama-setup pull llama3.2:3b
/ollama-setup configure
```

## Architecture

```
┌───────────────────────────────────────────────────────┐
│            Home Assistant Architect Plugin            │
├───────────────────────────────────────────────────────┤
│                                                       │
│  ┌────────────┐  ┌────────────┐  ┌────────────┐       │
│  │   Agents   │  │  Commands  │  │   Skills   │       │
│  │  (8 types) │  │  (8 types) │  │  (4 types) │       │
│  └──────┬─────┘  └──────┬─────┘  └──────┬─────┘       │
│         └───────────────┼───────────────┘             │
│                  ┌──────▼──────┐                      │
│                  │  MCP Server │                      │
│                  │  (11 tools) │                      │
│                  └──────┬──────┘                      │
│         ┌───────────────┼───────────────┐             │
│  ┌──────▼─────┐  ┌──────▼──────┐  ┌─────▼─────┐       │
│  │    Home    │  │   Ollama    │  │   Ubuntu  │       │
│  │  Assistant │  │ (Local LLM) │  │   Server  │       │
│  │    API     │  │             │  │           │       │
│  └────────────┘  └─────────────┘  └───────────┘       │
│                                                       │
└───────────────────────────────────────────────────────┘
```

## MCP Server Setup

…(truncated)

## Included Skills

This plugin includes 5 skill definitions:

### camera-nvr

<details>
<summary>View skill definition</summary>

# Camera and NVR Skill

Home Assistant camera integration and Frigate NVR patterns.

## Activation Triggers

## Core Patterns

### Frigate Configuration

```yaml
# frigate.yaml
mqtt:
  enabled: true
  host: mosquitto
  user: frigate
  password: "{FRIGATE_MQTT_PASSWORD}"

detectors:
  coral:
    type: edgetpu
    device: usb

cameras:
  front_door:
    ffmpeg:
      inputs:
        - path: rtsp://user:pass@192.168.1.100:554/stream1
          roles:
            - detect
            - record
        - path: rtsp://user:pass@192.168.1.100:554/stream2
          roles:
            - rtmp
    detect:
      width: 1920
      height: 1080
      fps: 5
    objects:
      track:
        - person
        - car
        - dog
      filters:
        person:
          min_area: 5000
          threshold: 0.7
    zones:
      porch:
        coordinates: 0,1080,500,1080,500,0,0,0
        objects:
          - person
    record:
      enabled: true
      retain:
        days: 7
        mode: motion
      events:
        retain:
          default: 30

go2rtc:
  streams:
    front_door:
      - rtsp://user:pass@192.168.1.100:554/stream1
```
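Frigate publishes detection events as JSON on the `frigate/events` MQTT topic, which is how a zone like `porch` above becomes useful in automations. A small sketch (field names follow Frigate's event schema; the helper itself is illustrative, not part of the plugin) that checks whether a person is tracked inside a zone:

```python
import json


def person_in_zone(payload: str, zone: str = "porch") -> bool:
    """Parse a frigate/events MQTT message and report whether a person
    is currently tracked inside the given zone."""
    event = json.loads(payload)
    after = event.get("after") or {}
    return (
        event.get("type") in ("new", "update")  # ignore "end" events
        and after.get("label") == "person"
        and zone in after.get("entered_zones", [])
    )
```

In Home Assistant this same check is usually expressed as an MQTT trigger plus a template condition, but the decision logic is identical.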

### Camera YAML Configuration

```yaml
# Generic camera
camera:
  - platform: generic
    name: Front Yard
    still_image_url: http://192.168.1.100/snapshot.jpg
    stream_source: rtsp://user:pass@192.168.1.100:554/stream
```


...(truncated)

</details>

### energy-management

<details>
<summary>View skill definition</summary>

# Energy Management Skill

Home Assistant energy monitoring and optimization patterns.

## Activation Triggers

- Working with energy dashboard
- Solar/battery integration
- EV charger configuration
- Utility rate management
- Power monitoring

## Core Patterns

### Energy Dashboard Configuration

```yaml
# configuration.yaml
energy:

# The Energy dashboard is configured via UI
# Settings > Dashboards > Energy

```

### Power Monitoring Sensors

```yaml
# Whole home power
sensor:
  - platform: template
    sensors:
      home_power:
        friendly_name: "Home Power"
        unit_of_measurement: "W"
        device_class: power
        value_template: >
          {{ states('sensor.main_meter_power') | float(0) }}

      home_energy_daily:
        friendly_name: "Daily Energy"
        unit_of_measurement: "kWh"
        device_class: energy
        value_template: >
          {{ states('sensor.main_meter_energy') | float(0) }}
```
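The sensors above distinguish power (W) from energy (kWh), and the Energy dashboard needs the latter. Home Assistant converts between them natively with the Riemann sum integration helper; a rough sketch of the underlying math (the function name is mine, purely illustrative):

```python
def watts_to_kwh(samples: list[tuple[float, float]]) -> float:
    """Trapezoidal integration of (unix_seconds, watts) samples into kWh,
    the same W -> kWh conversion HA's Riemann sum helper performs."""
    kwh = 0.0
    for (t0, w0), (t1, w1) in zip(samples, samples[1:]):
        kwh += (w0 + w1) / 2 * (t1 - t0) / 3_600_000  # W*s -> kWh
    return kwh
```

For example, a steady 1000 W over one hour integrates to 1 kWh.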

### Solar Integration

```yaml
# Solar production template
template:
  - sensor:
      - name: "Solar Power"
        unit_of_measurement: "W"
        device_class: power
        state_class: measurement
        state: "{{ states('sensor.inverter_power') | float(0) }}"

      - name: "Solar Energy Today"
        unit_of_measurement: "kWh"
        device_class: energy
        state_class: total_increasing
        state: "{{ states('sensor.inverter_energy_today') | float(0) }}"

      - name: "Solar Self Consumption"
        unit_of_measurement: "kWh"
```

...(truncated)

</details>

### ha-automation


<details>
<summary>View skill definition</summary>

# Home Assistant Automation Skill

Automation YAML patterns, triggers, conditions, actions, and best practices for Home Assistant.

## Activation Triggers

Activate this skill when:
- Creating or modifying automations
- Working with automation YAML
- Designing trigger logic
- Building condition chains
- Creating action sequences

## Automation Structure

```yaml
# Complete automation template
alias: "Descriptive Name"
id: unique_automation_id  # Optional but recommended
description: "What this automation does and why"
mode: single  # single, restart, queued, parallel

# Variables available throughout
variables:
  room: "living_room"
  default_brightness: 80

trigger:
  - platform: state
    entity_id: binary_sensor.motion
    to: "on"
    id: motion_detected  # For use in choose

condition:
  - condition: sun
    after: sunset

action:
  - service: light.turn_on
    target:
      entity_id: "light.{{ room }}"
    data:
      brightness_pct: "{{ default_brightness }}"

```

## Trigger Types

### State Trigger

```yaml
trigger:
  - platform: state
    entity_id: binary_sensor.door
    from: "off"
    to: "on"
    for:
      seconds: 30
    # attribute: battery_level  # Optional: watch an attribute instead of state
```

### Time Trigger

```yaml
trigger:
  - platform: time
    at: "06:30:00"

  # Multiple times
  - platform: time
    at:
      - "07:00:00"
      - "12:00:00"
      - "18:00:00"

  # Input datetime helper
  - platform: time
    at: input_datetime.wake_up_time
```

### Time Pattern

trig

...(truncated)

</details>

### ha-core

<details>
<summary>View skill definition</summary>

# Home Assistant Core Skill

Core Home Assistant API integration patterns, authentication, and entity management.

## Activation Triggers

Activate this skill when working with:
- Home Assistant REST API
- WebSocket API connections
- Entity state management
- Service calls
- Event bus subscriptions

## Authentication

### Long-Lived Access Token

```python
import httpx

class HAClient:
    def __init__(self, url: str, token: str):
        self.url = url.rstrip('/')
        self.headers = {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json"
        }

    async def get_states(self) -> list:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{self.url}/api/states",
                headers=self.headers
            )
            response.raise_for_status()
            return response.json()

    async def get_state(self, entity_id: str) -> dict:
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{self.url}/api/states/{entity_id}",
                headers=self.headers
            )
            response.raise_for_status()
            return response.json()

    async def call_service(
        self,
        domain: str,
        service: str,
        data: dict = None,
        target: dict = None
    ) -> dict:
        payload = {}
        if data:
            payload.update(data)
        if target:
            payload["target
```

...(truncated)

</details>

### local-llm

<details>
<summary>View skill definition</summary>

# Local LLM Integration Skill

Deploy and integrate local LLMs with Ollama, LocalAI, and Home Assistant for privacy-focused voice assistants and automation.

## Activation Triggers

Activate this skill when:
- Setting up Ollama or LocalAI
- Configuring local voice assistants
- Integrating LLMs with Home Assistant
- Optimizing local model performance
- Building LLM-powered automations

## Ollama Installation

### Ubuntu/Debian
```bash
# Install Ollama
curl -fsSL https://ollama.com/install.sh | sh

# Start as service
sudo systemctl enable ollama
sudo systemctl start ollama

# Pull models
ollama pull llama3.2:3b
ollama pull fixt/home-3b-v3  # HA-optimized

```

### Docker

```yaml
# docker-compose.yaml
services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    restart: unless-stopped
    ports:
      - "11434:11434"
    volumes:
      - ./ollama:/root/.ollama
    # GPU support (NVIDIA)
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```

## Ollama API

### Generate Completion

```python
import httpx

async def generate(prompt: str, model: str = "llama3.2:3b") -> str:
    async with httpx.AsyncClient() as client:
        response = await client.post(
            "http://localhost:11434/api/generate",
            json={
                "model": model,
                "prompt": prompt,
                "stream": False,
                "options": {
```

...(truncated)

</details>

## Source

[View on GitHub](https://github.com/Lobbi-Docs/claude)