geepers-agents

Multi-agent orchestration system with 51 specialized agents for development workflows, code quality, deployment, research, games, and more. Includes orchestrators for checkpoint, deploy, quality, fullstack, research, games, corpus, web, and python workflows.

Author Luke Steuber
Namespace @jeremylongshore/claude-code-plugins-plus
Category ai-ml
Version 1.0.0
Stars 1,193
Downloads 2

Installation

```shell
npx claude-plugins install @jeremylongshore/claude-code-plugins-plus/geepers-agents
```

Contents

Folders: agents, checkpoint, corpus, deploy, fullstack, games, master, product, python, quality, research, standalone, system, web

Files: LICENSE, README.md

Documentation

Multi-agent orchestration system with MCP tools and Claude Code plugin agents.

Installation

From PyPI (MCP tools)

```shell
pip install geepers

# With optional dependencies
pip install geepers[all]
pip install geepers[anthropic,openai]
```

As Claude Code Plugin (agents)

```
/plugin add lukeslp/geepers
```

What’s Included

43 Specialized Agents

Markdown-defined agents for Claude Code that provide specialized workflows:

| Category | Agents | Purpose |
|---|---|---|
| Master | conductor_geepers | Intelligent routing to specialists |
| Checkpoint | scout, repo, status, snippets, orchestrator | Session maintenance |
| Deploy | caddy, services, validator, orchestrator | Infrastructure |
| Quality | a11y, perf, api, deps, critic, orchestrator | Code audits |
| Fullstack | db, design, react, orchestrator | End-to-end features |
| Research | data, links, diag, citations, orchestrator | Data gathering |
| Games | game, gamedev, godot, orchestrator | Game development |
| Corpus | corpus, corpus_ux, orchestrator | Linguistics/NLP |
| Web | flask, orchestrator | Web applications |
| Python | pycli, orchestrator | Python projects |

90+ MCP Tools

Six specialized MCP servers expose tools for:

FREE Alternative: Use Ollama for Local LLM

Want to run geepers without paying for LLM APIs? Replace Anthropic/OpenAI/xAI with Ollama for $0/month.

Quick Comparison

| Component | Paid (Cloud APIs) | FREE (Ollama) |
|---|---|---|
| LLM Provider | Anthropic/OpenAI/xAI | Ollama (local) |
| Monthly Cost | $50-200/mo | $0/mo |
| Privacy | Data sent to cloud | 100% local |
| API Keys | Required (3+ keys) | None required |
| Rate Limits | Yes (varies by tier) | Unlimited |
| Latency | 2-5s (network) | 1-3s (local) |

Savings: $600-2,400/year for multi-agent orchestration.

Why Ollama for Geepers?

Benefits:

Recommended Models:

Setup Guide

1. Install Ollama

```shell
# macOS
brew install ollama
brew services start ollama

# Linux
curl -fsSL https://ollama.com/install.sh | sh
sudo systemctl start ollama

# Pull model (4GB download)
ollama pull llama3.2
```

See ollama-local-ai plugin for detailed setup.
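Before pointing geepers at the local server, it is worth confirming that Ollama is up and the model has been pulled. A minimal sketch in Python, using Ollama's standard `/api/tags` model-listing endpoint (the helper name `model_available` is ours, not part of geepers or Ollama):

```python
import json
import urllib.request

def model_available(tags_response: dict, name: str) -> bool:
    """True if a model whose name starts with `name` appears in /api/tags output."""
    return any(m["name"].startswith(name) for m in tags_response.get("models", []))

# Live check -- requires a running `ollama serve` on the default port:
# with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
#     print(model_available(json.load(resp), "llama3.2"))
```

If the model is missing, `ollama pull llama3.2` fixes it; if the connection is refused, the server is not running.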

2. Install Geepers with Local LLM Support

```shell
# Install without paid provider dependencies
pip install geepers

# No need for [anthropic,openai] extras!
```

3. Configure Ollama as LLM Provider

Create ~/.geepers/config.yaml:

```yaml
llm:
  provider: ollama
  base_url: http://localhost:11434
  model: llama3.2
  temperature: 0.7

# No API keys required!
```
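How geepers wires these settings to the server internally is not shown here, but a request against Ollama's own `/api/generate` endpoint with the same model and temperature looks like this (a minimal sketch; both function names are ours, not part of geepers):

```python
import json
import urllib.request

def build_generate_payload(prompt: str, model: str = "llama3.2",
                           temperature: float = 0.7) -> dict:
    """Assemble a non-streaming request body for Ollama's /api/generate."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "options": {"temperature": temperature},
    }

def ollama_generate(prompt: str, base_url: str = "http://localhost:11434") -> str:
    """POST the payload and return the generated text. Requires `ollama serve`."""
    data = json.dumps(build_generate_payload(prompt)).encode()
    req = urllib.request.Request(f"{base_url}/api/generate", data=data,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

No key ever enters the request: authentication simply does not exist on a local Ollama instance.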

4. Update MCP Config

```json
{
  "mcpServers": {
    "geepers": {
      "command": "geepers-unified",
      "env": {
        "GEEPERS_LLM_PROVIDER": "ollama",
        "OLLAMA_BASE_URL": "http://localhost:11434",
        "OLLAMA_MODEL": "llama3.2"
      }
    }
  }
}
```

Cost Comparison: 43 Agents

Cloud APIs (Anthropic/OpenAI)

```
43 agents × 1000 calls/month × $0.002/call = $86/month
Annual cost: $1,032
```

Required API Keys:

Ollama (Local LLM)

```
43 agents × unlimited calls/month × $0 = $0/month
Annual cost: $0
```

Required:

Savings: $1,200-2,760/year
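The cloud-side arithmetic above can be reproduced in a few lines of Python (the call volume and per-call price are the document's own assumptions, not measured rates):

```python
# Reproduce the cloud-API cost estimate quoted above.
agents = 43             # specialized agents in the plugin
calls_per_month = 1000  # assumed calls per agent per month
cost_per_call = 0.002   # assumed blended $/call across providers

monthly = agents * calls_per_month * cost_per_call
annual = monthly * 12
print(f"${monthly:.0f}/month, ${annual:,.0f}/year")  # $86/month, $1,032/year
```

Scaling the per-call price up or down shifts the figure linearly, which is where the wider $1,200-2,760/year savings range comes from.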

Migration Examples

Before (Paid APIs)

```shell
# Install with paid dependencies
pip install geepers[anthropic,openai]

# Set API keys
export ANTHROPIC_API_KEY=sk-ant-...
export OPENAI_API_KEY=sk-...
export XAI_API_KEY=xai-...
```

Monthly Cost: $100-230

After (Ollama)

```shell
# Install without paid dependencies
pip install geepers

# Start Ollama (one-time setup)
ollama pull llama3.2
ollama serve

# Configure geepers
export GEEPERS_LLM_PROVIDER=ollama
export OLLAMA_BASE_URL=http://localhost:11434
```

Monthly Cost: $0

Real Use Case: Multi-Agent Session

Scenario: Running geepers_orchestrator_checkpoint (5 agent calls per session)

Cloud APIs Version

#

...(truncated)

## Source

[View on GitHub](https://github.com/jeremylongshore/claude-code-plugins-plus-skills)
Tags: ai-ml, orchestration, multi-agent, development-workflow, code-quality, deployment, research, games, accessibility, performance, flask, react, godot, corpus-linguistics, mcp