Define workflow outputs as Definitions of Done with checkable acceptance criteria — not vague quality aspirations
Make workflow output specifications operational by defining them as Definitions of Done with explicit, checkable acceptance criteria rather than vague quality aspirations.
Why This Is a Rule
"The report should be comprehensive and well-organized" is a quality aspiration, not an output specification. It tells you the direction but not the destination: you can't verify whether it's been achieved, because "comprehensive" and "well-organized" mean different things to different people, and different things to the same person on different days. This vagueness produces two failure modes: perfectionism spirals, because "comprehensive" is never comprehensive enough (see the related rule: define completion as binary observables such as "draft exists," not subjective evaluations such as "draft is good"), and quality inconsistency, because the standard shifts with mood and energy.
Agile's Definition of Done (DoD) pattern solves this by converting quality aspirations into checkable acceptance criteria: "Report includes sections 1-5, each section has at least 3 data points with citations, executive summary is under 200 words, all figures have captions." Each criterion is binary (met or not met), observable (anyone can verify), and stable (the standard doesn't shift with the checker's mood).
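As an illustration, a DoD like the one above can be expressed as a set of named boolean predicates, so "done" is simply their conjunction. This is a minimal sketch: the report fields and thresholds are hypothetical, modeled on the example criteria, not taken from any real workflow.

```python
# Hypothetical report metadata; in practice these fields would be
# extracted from the actual artifact.
report = {
    "sections": [1, 2, 3, 4, 5],
    "cited_data_points_per_section": [3, 4, 3, 5, 3],
    "executive_summary_words": 180,
    "all_figures_captioned": True,
}

# Each acceptance criterion is binary (met or not met) and observable:
# any checker running these predicates reaches the same verdict.
criteria = {
    "includes sections 1-5":
        lambda r: r["sections"] == [1, 2, 3, 4, 5],
    "each section has >= 3 data points with citations":
        lambda r: all(n >= 3 for n in r["cited_data_points_per_section"]),
    "executive summary under 200 words":
        lambda r: r["executive_summary_words"] < 200,
    "all figures have captions":
        lambda r: r["all_figures_captioned"],
}

results = {name: check(report) for name, check in criteria.items()}
done = all(results.values())  # "done" is the conjunction of the checks
```

Because every predicate returns a plain boolean, a failed check names exactly which criterion blocked completion, rather than leaving the executor to argue about "comprehensive."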
The key distinction is between specification and aspiration. A specification tells you when to stop and what "done" looks like. An aspiration tells you what direction to go but never tells you when you've arrived. Workflows need specifications at their outputs because specifications are termination conditions — they tell the executor when the work is complete and can be handed off.
When This Fires
- When defining what a workflow should produce before building the workflow
- When different executions of the same workflow produce inconsistent quality
- When the person who receives the workflow's output keeps requesting revisions (specification gap)
- Complements the binary-completion-criteria rule (define completion as binary observables such as "draft exists," not subjective evaluations such as "draft is good") by supplying the DoD framework for output quality
Common Failure Mode
Subjective DoD items that look checkable but aren't: "Code is clean," "Design is professional," "Writing is clear." These pass the format test (they're short, they seem specific) but fail the verification test (two people will disagree about whether they're met). Every DoD item must be verifiable by a stranger who doesn't share your taste or standards.
The Protocol
1. For each workflow output, write a Definition of Done as a numbered list of acceptance criteria.
2. Apply the stranger verification test to each criterion: could someone who doesn't know your preferences verify whether it is met?
3. Convert subjective criteria to observable ones: "Well-written" → "Passes spell-check, sentences average under 20 words, Flesch reading-ease score above 60." "Professional design" → "Uses brand colors, all text is legible at 100% zoom, no orphaned headers."
4. Include both content criteria (what must be present) and quality criteria (what standards the content must meet).
5. Test the DoD: have someone else evaluate the output against your criteria. If they reach a different conclusion than you do, the criteria aren't operational enough.
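The observable rewrites in step (3) can be checked mechanically. The sketch below computes average sentence length and the 0-100 Flesch reading-ease score; the syllable counter is a crude vowel-group approximation (real readability tools use pronunciation dictionaries), and the sample text and thresholds are illustrative only.

```python
import re

def _sentences(text):
    return [s for s in re.split(r"[.!?]+", text) if s.strip()]

def _words(text):
    return re.findall(r"[A-Za-z']+", text)

def avg_sentence_length(text):
    # Mean words per sentence: observable, stable, checkable by anyone.
    return len(_words(text)) / len(_sentences(text))

def flesch_reading_ease(text):
    # Flesch formula: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    # Syllables approximated as vowel groups (at least 1 per word).
    words = _words(text)
    syllables = sum(
        max(1, len(re.findall(r"[aeiouyAEIOUY]+", w))) for w in words
    )
    return (206.835
            - 1.015 * (len(words) / len(_sentences(text)))
            - 84.6 * (syllables / len(words)))

text = "The cat sat on the mat. It was happy there."

# "Well-written" converted into two binary, observable criteria:
checks = {
    "sentences average under 20 words": avg_sentence_length(text) < 20,
    "Flesch reading-ease above 60": flesch_reading_ease(text) > 60,
}
```

The point is not that these particular thresholds are right, but that each check returns a boolean a stranger can reproduce, which is what makes the criterion part of a specification rather than an aspiration.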