8.5 "Fix it without breaking it": regression discipline

The goal: fix the bug and keep the system stable

It’s easy to “fix” a bug by rewriting code until the failure disappears. That’s also how you create regressions and fragile systems.

Regression discipline is the skill of making the system more reliable every time you fix something.

A good bug fix has two outputs

(1) The bug is fixed. (2) A test exists that proves it stays fixed.

The rule: prove the bug exists before you fix it

Before you change implementation, do one of:

  • write a failing regression test, or
  • capture a deterministic reproduction command and expected output.

This prevents “I think it’s fixed” and forces the fix to be evidence-based.
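
A minimal sketch of what that looks like in practice, assuming a Python project tested with pytest; parse_price, the pricing module, and the empty-string bug are hypothetical stand-ins for your own code and failure:

  # test_regression_parse_price.py
  # Regression test written BEFORE the fix, so it must fail on the current code.
  from pricing import parse_price  # hypothetical module under test


  def test_parse_price_handles_empty_string():
      # Expected behavior: empty input means "no price", not a crash.
      assert parse_price("") is None

Run it, watch it fail for the reason you expect, and only then touch the implementation.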

If you can’t reproduce, don’t refactor

When you can’t reproduce, big changes are gambling. Your first job is to make the bug observable.
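
One way to make a flaky failure observable is to capture the exact input that triggers it. A sketch under assumed names (process_order and random_order are hypothetical; the point is to record a seed or payload you can replay on demand):

  # repro_hunt.py -- run the flaky path until it fails, then save the offending
  # input so the failure becomes a deterministic reproduction.
  import json
  import random

  from orders import process_order, random_order  # hypothetical functions


  def hunt(max_attempts: int = 10_000) -> None:
      for attempt in range(max_attempts):
          seed = random.randrange(2**32)
          order = random_order(seed=seed)  # deterministic given the seed
          try:
              process_order(order)
          except Exception:
              # Persist everything needed to replay this exact failure.
              with open("failing_case.json", "w") as f:
                  json.dump({"seed": seed, "order": order}, f, indent=2)
              print(f"Reproduced on attempt {attempt}; seed={seed}")
              raise


  if __name__ == "__main__":
      hunt()

Once a failing case is saved, it can feed the regression test described above.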

A regression-safe workflow

  1. Reproduce: one command/test that fails.
  2. Lock: add a regression test (preferred) or document a repeatable check.
  3. Fix minimally: smallest diff that makes the test pass.
  4. Verify broadly: run the full test suite (or at least the relevant subset).
  5. Commit: a clean commit message linking “bug → test → fix.”

This workflow is how you make every bug fix a net improvement.
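
For step 5, one commit shape that keeps the “bug → test → fix” link visible (the function, test name, and issue reference are placeholders):

  fix: return None for empty price strings instead of raising

  Bug: parse_price("") crashed with ValueError ([link to issue or report])
  Test: test_parse_price_handles_empty_string (fails before this commit)
  Fix: guard the empty-string case before parsing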

Diff discipline (smallest fix)

Use constraints that prevent the model from “fixing” by rewriting:

  • diff-only changes,
  • tight file scope,
  • line budget,
  • no refactors unless required.

If the bug is caused by messy structure, you still fix minimally first, then refactor in a separate step with tests green.
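
Concretely, a constraint block in the same spirit as the templates later in this section might read (the file names and line budget are placeholders):

  Constraints:
  - Touch only src/pricing.py and tests/test_regression_parse_price.py
  - Keep the diff under 20 changed lines
  - No renames, no formatting-only changes, no new dependencies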

Separate “fix” from “improve”

Fix the bug first. Then decide whether a refactor is worth doing. Combining the two makes review harder and muddies the history (git blame).

Choosing the right regression test

The “right” regression test is the smallest test that fails on the buggy behavior and passes on the fixed behavior.

  • Unit test when the bug is in pure logic.
  • Integration test when the bug is in wiring/IO boundaries (CLI, HTTP routes).
  • Golden test when output format must remain stable (JSON schema outputs).

Keep regression tests readable. Future-you will thank you.
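
As an illustration of the golden-test option, a minimal sketch assuming pytest and a hypothetical build_report function whose JSON output should stay stable:

  # test_report_golden.py
  import json
  from pathlib import Path

  from reports import build_report  # hypothetical module under test

  GOLDEN = Path(__file__).parent / "golden" / "report_basic.json"


  def test_report_matches_golden():
      actual = build_report(customer_id=42)
      expected = json.loads(GOLDEN.read_text())
      # Compare parsed structures, not raw strings, so formatting noise can't fail the test.
      assert actual == expected

Regenerate the golden file deliberately, and review the diff, when the format is meant to change; never let it update silently.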

Cleanup after the fix (without scope creep)

After the fix is locked in, you may do a small cleanup pass:

  • remove temporary debug logs (or demote them to DEBUG),
  • improve error messages if they’re unclear,
  • add one or two extra edge-case tests if they’re cheap.

But keep cleanup bounded. Don’t turn a bug fix into a rewrite.
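
For the “demote debug logs” item, a sketch using Python’s standard logging module (the function and message are illustrative):

  import logging

  logger = logging.getLogger(__name__)


  def parse_price(raw: str):
      # Before (temporary debugging aid):
      # print(f"parse_price got raw={raw!r}")

      # After: kept, but only emitted when DEBUG logging is enabled.
      logger.debug("parse_price got raw=%r", raw)
      ...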

Copy-paste prompt templates

Template A: write a regression test only

We have a bug. Write a regression test that reproduces it.

Constraints:
- Do NOT change implementation yet
- Keep the test minimal and deterministic

Evidence:
- Expected behavior: [...]
- Actual behavior: [...]
- Repro steps / failing output: [...]

Output:
- Diff-only changes (tests only)

Template B: minimal fix that keeps tests intact

Implement the smallest fix to make the regression test pass.

Constraints:
- Diff-only changes
- No refactors beyond what’s required
- Do not weaken or delete the regression test
- No new dependencies

A reusable checklist

  • Reproduce: can I make it fail on demand?
  • Observe: do I have the full error output and inputs?
  • Protect: do I have a regression test?
  • Fix minimally: is the diff small and scoped?
  • Verify: did I run the relevant tests and a couple of manual checks?
  • Lock in: did I commit at a green state?

Where to go next