# geepers-agents
Multi-agent orchestration system with 51 specialized agents for development workflows, code quality, deployment, research, games, and more. Includes orchestrators for checkpoint, deploy, quality, fullstack, research, games, corpus, web, and python workflows.
## Installation

```shell
npx claude-plugins install @jeremylongshore/claude-code-plugins-plus/geepers-agents
```
## Contents
Folders: agents, checkpoint, corpus, deploy, fullstack, games, master, product, python, quality, research, standalone, system, web
Files: LICENSE, README.md
## Documentation
Multi-agent orchestration system with MCP tools and Claude Code plugin agents.
### Installation

#### From PyPI (MCP tools)

```shell
pip install geepers

# With optional dependencies
pip install geepers[all]
pip install geepers[anthropic,openai]
```
#### As Claude Code Plugin (agents)

```
/plugin add lukeslp/geepers
```
## What’s Included
### 43 Specialized Agents
Markdown-defined agents for Claude Code that provide specialized workflows:
| Category | Agents | Purpose |
|---|---|---|
| Master | conductor_geepers | Intelligent routing to specialists |
| Checkpoint | scout, repo, status, snippets, orchestrator | Session maintenance |
| Deploy | caddy, services, validator, orchestrator | Infrastructure |
| Quality | a11y, perf, api, deps, critic, orchestrator | Code audits |
| Fullstack | db, design, react, orchestrator | End-to-end features |
| Research | data, links, diag, citations, orchestrator | Data gathering |
| Games | game, gamedev, godot, orchestrator | Game development |
| Corpus | corpus, corpus_ux, orchestrator | Linguistics/NLP |
| Web | flask, orchestrator | Web applications |
| Python | pycli, orchestrator | Python projects |
### 90+ MCP Tools
Six specialized MCP servers expose tools for:
- geepers-unified - All tools in one server
- geepers-providers - 13 LLM providers (Anthropic, OpenAI, xAI, etc.)
- geepers-data - 29+ data sources (Census, arXiv, GitHub, NASA, etc.)
- geepers-cache - Redis-backed caching
- geepers-utility - Document parsing, citations, TTS
- geepers-websearch - Multi-engine web search
## FREE Alternative: Use Ollama for Local LLM
Want to run geepers without paying for LLM APIs? Replace Anthropic/OpenAI/xAI with Ollama for $0/month.
### Quick Comparison
| Component | Paid (Cloud APIs) | FREE (Ollama) |
|---|---|---|
| LLM Provider | Anthropic/OpenAI/xAI | Ollama (local) |
| Monthly Cost | $50-200/mo | $0/mo |
| Privacy | Data sent to cloud | 100% local |
| API Keys | Required (3+ keys) | None required |
| Rate Limits | Yes (varies by tier) | Unlimited |
| Latency | 2-5s (network) | 1-3s (local) |
Savings: $600-2,400/year for multi-agent orchestration.
### Why Ollama for Geepers?
Benefits:
- Zero Cost: No API usage fees for 43 agents
- Privacy: All 90+ MCP tools run locally
- Unlimited: Run as many agent calls as needed
- Offline: No internet required after model download
- GDPR/HIPAA: Compliant by default (local-only)
Recommended Models:
- llama3.2:7b - Best for general agents (4GB)
- mistral:7b - Fast and efficient (4GB)
- codellama:13b - Code-focused agents (7GB)
- mixtral:8x7b - Advanced reasoning (26GB)
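As an illustrative aid (not part of geepers itself), the table of recommended models above can be encoded as a small picker that selects the largest model fitting a given memory budget; the names and approximate sizes are copied from the list:

```python
# Illustrative helper: choose the largest recommended model that
# fits within a memory/disk budget. Model names and sizes (GB)
# come from the list above.
MODELS = [
    ("llama3.2:7b", 4),    # general agents
    ("mistral:7b", 4),     # fast and efficient
    ("codellama:13b", 7),  # code-focused agents
    ("mixtral:8x7b", 26),  # advanced reasoning
]

def pick_model(budget_gb: float) -> str:
    """Return the largest listed model whose size fits in budget_gb."""
    fitting = [(size, name) for name, size in MODELS if size <= budget_gb]
    if not fitting:
        raise ValueError(f"No recommended model fits in {budget_gb} GB")
    return max(fitting)[1]
```

For example, with roughly 8 GB free this picks `codellama:13b`, while 32 GB allows `mixtral:8x7b`.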
### Setup Guide

#### 1. Install Ollama

```shell
# macOS
brew install ollama
brew services start ollama

# Linux
curl -fsSL https://ollama.com/install.sh | sh
sudo systemctl start ollama

# Pull model (4GB download)
ollama pull llama3.2
```
See ollama-local-ai plugin for detailed setup.
#### 2. Install Geepers with Local LLM Support

```shell
# Install without paid provider dependencies
pip install geepers
# No need for [anthropic,openai] extras!
```
#### 3. Configure Ollama as LLM Provider

Create `~/.geepers/config.yaml`:

```yaml
llm:
  provider: ollama
  base_url: http://localhost:11434
  model: llama3.2
  temperature: 0.7
# No API keys required!
```
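If you prefer to script the setup, a minimal sketch (assuming the schema shown above) that writes the config file:

```python
# Minimal sketch: write the Ollama provider config shown above to
# ~/.geepers/config.yaml. The keys follow the example; adjust them
# if your geepers version uses a different schema.
from pathlib import Path
import textwrap

def write_geepers_config(home: Path = Path.home()) -> Path:
    """Create <home>/.geepers/config.yaml with local-Ollama settings."""
    cfg_dir = home / ".geepers"
    cfg_dir.mkdir(parents=True, exist_ok=True)
    config = textwrap.dedent("""\
        llm:
          provider: ollama
          base_url: http://localhost:11434
          model: llama3.2
          temperature: 0.7
    """)
    path = cfg_dir / "config.yaml"
    path.write_text(config)
    return path
```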
#### 4. Update MCP Config

```json
{
  "mcpServers": {
    "geepers": {
      "command": "geepers-unified",
      "env": {
        "GEEPERS_LLM_PROVIDER": "ollama",
        "OLLAMA_BASE_URL": "http://localhost:11434",
        "OLLAMA_MODEL": "llama3.2"
      }
    }
  }
}
```
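The same entry can be generated programmatically, e.g. to merge into an existing Claude Code settings file. A sketch, using only the command name and env vars from the snippet above (where the settings file lives varies by setup, so writing it is left to you):

```python
# Sketch: build the mcpServers entry above as a dict, ready to
# json.dump into your Claude Code config.
import json

def geepers_mcp_entry(model: str = "llama3.2") -> dict:
    """Return the mcpServers block for a local-Ollama geepers setup."""
    return {
        "mcpServers": {
            "geepers": {
                "command": "geepers-unified",
                "env": {
                    "GEEPERS_LLM_PROVIDER": "ollama",
                    "OLLAMA_BASE_URL": "http://localhost:11434",
                    "OLLAMA_MODEL": model,
                },
            }
        }
    }

print(json.dumps(geepers_mcp_entry(), indent=2))
```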
### Cost Comparison: 43 Agents

#### Cloud APIs (Anthropic/OpenAI)

```
43 agents × 1000 calls/month × $0.002/call = $86/month
Annual cost: $1,032
```
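As a quick sanity check, the estimate above works out as follows (the per-call rate is the assumed blended figure from the formula):

```python
# Back-of-the-envelope check of the cloud-API estimate above.
agents = 43
calls_per_agent_per_month = 1000
cost_per_call = 0.002  # USD, assumed blended rate
monthly = agents * calls_per_agent_per_month * cost_per_call
annual = monthly * 12
print(f"${monthly:.0f}/month, ${annual:,.0f}/year")  # $86/month, $1,032/year
```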
Required API Keys:
- Anthropic Claude API: $50-100/mo
- OpenAI GPT-4: $30-80/mo
- xAI Grok: $20-50/mo
- Total: $100-230/mo
#### Ollama (Local LLM)

```
43 agents × unlimited calls/month × $0 = $0/month
Annual cost: $0
```
Required:
- Hardware you already own
- One-time model download (4-26GB)
- Total: $0/mo
Savings: $1,200-2,760/year
### Migration Examples

#### Before (Paid APIs)

```shell
# Install with paid dependencies
pip install geepers[anthropic,openai]

# Set API keys
export ANTHROPIC_API_KEY=sk-ant-...
export OPENAI_API_KEY=sk-...
export XAI_API_KEY=xai-...
```
Monthly Cost: $100-230
#### After (Ollama)

```shell
# Install without paid dependencies
pip install geepers

# Start Ollama (one-time setup)
ollama pull llama3.2
ollama serve

# Configure geepers
export GEEPERS_LLM_PROVIDER=ollama
export OLLAMA_BASE_URL=http://localhost:11434
```
Monthly Cost: $0
### Real Use Case: Multi-Agent Session
Scenario: Running `geepers_orchestrator_checkpoint` (5 agent calls per session)
#### Cloud APIs Version
...(truncated)
## Source
[View on GitHub](https://github.com/jeremylongshore/claude-code-plugins-plus-skills)