Fabrizio Ferri-Benedetti's Context Curator Approach

Fabrizio Ferri-Benedetti is a Principal Technical Writer at Elastic and co-maintainer of OpenTelemetry docs. Based in Barcelona, he writes about the intersection of technical writing and AI at passo.uno. His approach: tech writers should build their own LLM tools rather than depend on generic solutions.

Fabrizio created Impersonaid for testing documentation against LLM-powered user personas, Valegen for generating Vale linter rules with RAG, and various MCP servers for documentation workflows.

Background

He describes himself as knowing “enough coding to be dangerous” and creates custom tools when existing solutions fall short.

The Context Curator Philosophy

Fabrizio argues that technical writers should become “context curators” for AI systems. The core insight: developers increasingly rely on AI tools that consume documentation as context. Well-structured docs directly improve AI performance.

From his blog post:

“All AI requires to do the right thing is great context and a gentle nudge.”

This shifts the writer’s role from presentation-focused content to designing context-rich information structures. Specific recommendations:

| Technique | Purpose |
| --- | --- |
| CLAUDE.md files | Project guidelines and commands for AI context |
| Context folders | Organized directories for AI consumption |
| Semantic tagging | Information architecture for machine parsing |
| llms.txt files | LLM-optimized content delivery |
| DITA markup | Standardized semantic formats |
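
As a concrete example of the first row, here is a minimal sketch of what a project's CLAUDE.md might contain. The commands and rules are illustrative placeholders, not taken from any specific project or from Fabrizio's posts:

# Guidelines for the docs site

## Commands
- Build and preview: npm run build && npm run serve
- Lint prose: vale content/

## Writing rules
- Use second person and present tense.
- Every how-to page starts with a "Prerequisites" section.
- Never edit files under build/; they are generated.

llms.txt plays a similar role at the site level: a single curated Markdown file served from the site root that points crawlers and agents to the most useful pages and says what each one covers.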

Impersonaid: Documentation User Testing

Impersonaid simulates user interactions with documentation using LLM-powered personas. Instead of traditional user research, it provides quick feedback loops during development.

Install and run:

git clone https://github.com/theletterf/impersonaid.git
cd impersonaid
npm install && npm link
impersonaid create-sample --name beginner_developer
impersonaid simulate --persona beginner_developer \
  --doc "https://your-docs.com/quickstart" \
  --request "How do I authenticate?"

Each persona is defined by a set of attributes that shape how it reads and responds to the documentation.

The tool supports OpenAI, Claude, Gemini, OpenRouter, and Ollama as model providers. Results are saved as Markdown with the persona profile, the request, and the simulated response.

Valegen: RAG-Powered Linter Rules

Valegen generates Vale linter rules from natural language descriptions. Rather than writing YAML configurations manually, describe what you want in plain English.

The tool uses retrieval-augmented generation:

  1. Search a vector database of Vale documentation and style guides
  2. Retrieve relevant examples from Vale core, Google, and Microsoft styles
  3. Ground the LLM response in actual syntax patterns
  4. Generate three rule approaches with confidence ratings
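
Before the install steps, here is a short Python sketch of the retrieve-then-generate pattern these four steps describe. It is not Valegen's actual code: the example corpus, the keyword-overlap scoring, and the prompt wording are stand-ins for its vector database and model calls.

# Toy knowledge base of existing Vale rules (Valegen indexes real style guides).
CORPUS = [
    ("existence rule that flags banned words like utilize",
     'extends: existence\nmessage: "Avoid \'%s\'."\nlevel: warning\ntokens:\n  - utilize'),
    ("substitution rule that replaces utilize with use",
     'extends: substitution\nmessage: "Use \'%s\' instead of \'%s\'."\nswap:\n  utilize: use'),
    ("capitalization rule for headings",
     'extends: capitalization\nmessage: "Use sentence case."\nmatch: $sentence\nscope: heading'),
]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Steps 1-2: find the most relevant stored rules (here, by word overlap)."""
    words = set(query.lower().split())
    ranked = sorted(CORPUS, key=lambda item: len(words & set(item[0].split())), reverse=True)
    return [snippet for _, snippet in ranked[:k]]

def build_prompt(request: str) -> str:
    """Step 3: ground the model in real syntax before asking for new rules."""
    examples = "\n---\n".join(retrieve(request))
    return (
        "Write three candidate Vale rules, each with a confidence rating, for:\n"
        f"{request}\n\nBase the YAML strictly on these existing rules:\n{examples}"
    )

# Step 4 would send this prompt to Gemini, GPT, or Claude; here it is just printed.
print(build_prompt("flag the word utilize and suggest use"))

The grounding in step 3 is the whole point: the model copies syntax from retrieved rules instead of guessing it.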

Install and run:

git clone https://github.com/theletterf/valegen.git
cd valegen
python -m venv venv && source venv/bin/activate
pip install -r requirements.txt
python setup.py  # Builds vector database
flask run --port 5001

Supports Gemini, GPT, and Claude. The RAG approach prevents hallucinated YAML that looks plausible but fails.

Aikidocs: Own Your Prompts

For custom documentation workflows, Fabrizio built Aikidocs, a Node.js tool that processes local files as context for LLM requests.

His philosophy: “Tech writers must own the prompt.”

The workflow:

  1. Initialize a git repository (critical for rollback when LLMs stray)
  2. Define folder structure: context, prompts, credentials, output
  3. Configure LLM provider and API key
  4. Run the script to process context
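
A repository set up this way might look roughly like the following. The top-level name and the comments are illustrative assumptions; the four folders come from step 2 above:

my-docs-project/
├── context/        # local files the LLM should read before answering
├── prompts/        # the prompts you own, versioned and refined over time
├── credentials/    # LLM provider settings and API key
└── output/         # generated drafts, committed so you can roll back (step 1)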

Key insight: use iterative prompting rather than one comprehensive request. Build up the result through sequential refinements, for example by asking for an outline first, then a draft of one section, then targeted style fixes, instead of asking for the finished document in a single prompt.

Daily AI Uses

Fabrizio uses Claude and Copilot for professional work. Specific applications:

| Task | Approach |
| --- | --- |
| Markdown fixing | Tables and links faster than regex |
| Code boilerplate | React components, UI widgets |
| Code explanations | Understanding unfamiliar functions |
| Writer’s block | Request closing paragraphs for blog posts |
| Style matching | Rewrite content to match voice |
| User testing | Computer Use for automated screenshots |
| Test generation | Selenium code for documentation workflows |
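
For the last row, the kind of check an LLM might draft looks roughly like this Python/Selenium sketch; the URL and selector are placeholders in the same spirit as the Impersonaid example above, not a real docs site:

from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    # Follow the quickstart exactly as the documentation describes it.
    driver.get("https://your-docs.com/quickstart")
    # The section the docs promise should actually exist on the page.
    heading = driver.find_element(By.CSS_SELECTOR, "h2#authentication")
    assert "authenticate" in heading.text.lower()
finally:
    driver.quit()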

From his AI usage post:

“LLMs augment existing skills by removing tedious work at margins, not replacing core work requiring judgment and creativity.”

Other Projects

His GitHub includes:

| Project | Description | Stars |
| --- | --- | --- |
| english-lang | “The English Programming Language” | 505 |
| otel-sonifier | Calm monitoring for OpenTelemetry Collector | 13 |
| Vale-MCP | MCP server for Vale linting | 16 |

He also contributed to making Splunk’s Vale style guide open source.

Key Takeaways

| Principle | Implementation |
| --- | --- |
| Build your own tools | Custom scripts beat generic solutions |
| Context over prompts | Structure docs for AI consumption |
| Test with personas | Simulate users before publishing |
| Use RAG for rules | Ground generation in real examples |
| Git everything | Version control enables rollback |

Next: Jesse Vincent’s Superpowers Framework

Topics: ai-documentation technical-writing tools mcp