20.1 Generate docs from code structure

Goal: produce docs that reflect reality

Docs become truthful when they’re derived from reality:

  • code structure,
  • public interfaces,
  • tests and examples,
  • schemas and error contracts.

The model is good at turning these into readable docs. Your job is to provide the right inputs and require “unknown” when evidence is missing.

The rule

If it’s not in code/tests/contracts, it’s not a fact. Force the model to label unknowns.

Inputs that keep docs grounded

High-signal doc inputs:

  • README (current),
  • file tree,
  • entrypoints,
  • public API definitions (routes, schemas, types),
  • tests (especially integration and golden tests),
  • config examples (.env.example),
  • error taxonomy (status codes/outcome categories).

Low-signal inputs:

  • old conversations,
  • generic “best practices,”
  • random code files without context.
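To keep the high-signal inputs cheap to regenerate, you can script the bundling. A minimal sketch in Python, assuming a git repo with a README.md and a .env.example at the root (adjust names to your project):

import subprocess
from pathlib import Path

def gather_evidence(repo: Path) -> str:
    """Concatenate high-signal doc inputs into one prompt-ready bundle."""
    sections = []
    # File tree: tracked files only, via git
    tree = subprocess.run(
        ["git", "ls-files"], cwd=repo, capture_output=True, text=True
    ).stdout
    sections.append("FILE TREE\n" + tree)
    # Current README and config example, if present
    for name in ("README.md", ".env.example"):
        path = repo / name
        if path.exists():
            sections.append(name.upper() + "\n" + path.read_text())
    return "\n\n".join(sections)

print(gather_evidence(Path(".")))

Rerun it before every doc pass so the model always sees the current state, not a stale snapshot.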

Start with a file map (index)

Before writing docs, build a file map:

  • 1–2 sentences per module/directory,
  • entrypoints and critical paths,
  • where to find configs and tests.

This becomes the foundation for module docs and onboarding docs.
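To make the target concrete, a file map entry can be as short as this (the module names here are hypothetical):

src/cli/            Entrypoint. Parses flags, dispatches commands; exit codes live here.
src/api/            HTTP routes plus request/response schemas. Critical path for all traffic.
src/core/           Business logic shared by the CLI and the API.
tests/integration/  End-to-end tests; golden files under tests/golden/.
config/             Unknown: appears unused; ask the owner.

Note the last line: an honest "unknown" in the file map is worth more than a confident guess.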

Extract contracts (APIs, CLIs, schemas)

Docs should be organized around contracts:

  • CLI contract: commands, flags, exit codes, examples
  • HTTP/API contract: endpoints, request/response JSON, status codes
  • Schema contracts: JSON schema versions and validation rules
  • Operational contracts: timeouts, retries, rate limits, budgets

Contracts are what users depend on, so they are exactly what the docs must get right.
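For example, a CLI contract entry might look like this (the tool name and flags are hypothetical):

mytool export --format json|csv --out FILE
  Exit 0: success, file written
  Exit 1: validation error (malformed input)
  Exit 2: usage error (unknown flag or missing argument)
  Example: mytool export --format json --out report.json

Every claim in that block is checkable: run the command, read the exit code.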

Draft docs from contracts (not from vibes)

Once you have the file map and contracts, ask the model to draft docs with strict rules:

  • only describe behavior supported by code/tests,
  • include examples (commands or JSON),
  • include error behavior and edge cases,
  • mark unknowns explicitly.
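Here is the "mark unknowns" rule in practice; the retry behavior below is a hypothetical excerpt, not a real project:

The client retries failed requests up to 3 times with exponential backoff (verified by tests/test_retry.py). Unknown: whether retries apply to non-idempotent POST requests; no code path or test confirms this.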

Verification: how to spot-check quickly

Don’t try to “prove docs correct” by reading everything. Spot-check:

  • run the quickstart commands,
  • verify CLI flags and exit codes match reality,
  • verify API example payloads match schemas/tests,
  • verify env vars match .env.example.

If docs are wrong, tighten the contract inputs and regenerate.
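Part of this spot-check can itself be scripted. A sketch, assuming the docs live in README.md and env vars use UPPER_SNAKE_CASE names; expect some false positives from acronyms:

import re
from pathlib import Path

# Vars defined in .env.example (lines like "API_KEY=...")
defined = {
    line.split("=", 1)[0].strip()
    for line in Path(".env.example").read_text().splitlines()
    if "=" in line and not line.lstrip().startswith("#")
}

# Uppercase tokens mentioned in the docs (will also catch acronyms like HTTP)
mentioned = set(re.findall(r"\b[A-Z][A-Z0-9_]{2,}\b", Path("README.md").read_text()))

print("Defined but never documented:", sorted(defined - mentioned))
print("Documented but never defined:", sorted(mentioned - defined))

Treat the output as a triage list, not a verdict: anything in the second bucket is a candidate hallucination to chase down.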

Copy-paste prompts

Prompt A: build a file map

Create a file map from this file tree and entrypoints.

Rules:
- 1–2 sentences per important directory/module
- Identify entrypoints, critical paths, tests, configs
- If something is unclear, mark as unknown and ask a question

Context:
- File tree: ...
- Entrypoints: ...

Stop after the file map.

Prompt B: draft docs from contracts

Draft documentation for this project using only the evidence below.

Rules:
- Do not invent behavior not supported by the evidence
- If unknown, explicitly say unknown
- Include examples and error behavior

Evidence:
- File map: ...
- CLI/API contracts: ...
- Schemas: ...
- Tests: ...

Output:
- A README section draft (quickstart + usage)
- A reference section draft (contracts + errors)

Where to go next